
ARKit support for iOS via Unity-ARKit-Plugin

Discussion in 'ARKit' started by jimmya, Jun 5, 2017.

  1. nynohu

    nynohu

    Joined:
    May 5, 2012
    Posts:
    5
    Hi everyone,

I'm using the newest Unity (2018.1.0f2) and Unity ARKit Plugin 1.2. I want to deploy an AR application for both iOS and Android, but I run into a problem when I import both the ARKit plugin and ARCore into the same project. The Unity ARKit plugin makes some odd changes that affect ARCore: the ARCore demo runs normally until I import the ARKit plugin. When I build the HelloAR example (made by Google) with the ARKit plugin imported, the screen on my S8 shows strange particle dots, as in the attached photo. Please point me to the settings needed to avoid this collision.

    Thanks.
     


  2. theiaibanez

    theiaibanez

    Joined:
    Feb 2, 2017
    Posts:
    2
    Hi everyone,

I'm trying to use ARKit 1.5's image recognition in Unity and show an object on the recognized image, but I can't get the object's rotation to match the arImageAnchor rotation.

I'll attach a screenshot; it shows a real plane and an object plane whose rotations differ.

How can I make the 3D object match the real plane?
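For comparison, the GenerateImageAnchor.cs example applies the anchor transform roughly like the following sketch (assuming the plugin's UnityARMatrixOps helpers and the ARImageAnchorAddedEvent callback; prefabToPlace is a placeholder name):

Code (CSharp):
using UnityEngine;
using UnityEngine.XR.iOS;

public class PlaceOnImageAnchor : MonoBehaviour
{
    public GameObject prefabToPlace;   // placeholder: the object to show on the image

    void OnEnable()
    {
        UnityARSessionNativeInterface.ARImageAnchorAddedEvent += OnImageAnchorAdded;
    }

    void OnDisable()
    {
        UnityARSessionNativeInterface.ARImageAnchorAddedEvent -= OnImageAnchorAdded;
    }

    void OnImageAnchorAdded(ARImageAnchor arImageAnchor)
    {
        // Convert the anchor's ARKit matrix into a Unity position and rotation
        Vector3 position = UnityARMatrixOps.GetPosition(arImageAnchor.transform);
        Quaternion rotation = UnityARMatrixOps.GetRotation(arImageAnchor.transform);

        // Instantiating with the anchor rotation keeps the object aligned with the real image
        Instantiate(prefabToPlace, position, rotation);
    }
}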

    Thanks
     


  3. borderlineinteractive

    borderlineinteractive

    Joined:
    Sep 20, 2015
    Posts:
    13
    Hi,

It has been quite a while since I posted my question. Parenting to the head anchor does not work, and since the original release I have not been able to do what I intended; it seems this is indeed impossible. What I basically want is to use the rear-facing camera to track the position of the smartphone (i.e. the camera) in 3D space, and combine that with tracking the face position and extracting blend shapes via the front-facing camera. This seems to be impossible: as soon as the UnityARFaceMeshManager script is active, the UnityARCameraManager appears to be blocked. I guess this is due to an incompatibility between the

    ARKitWorldTrackingSessionConfiguration

    and the

    ARKitFaceTrackingConfiguration

i.e. it is impossible to use both at the same time. Is there any way around this? It would be cool to be able to walk around a room in a standard AR setup using ARKitWorldTrackingSessionConfiguration and combine that with the ability to extract head orientation and blendshapes via ARKitFaceTrackingConfiguration as input.
     
  4. oliver_Random42

    oliver_Random42

    Joined:
    Oct 13, 2016
    Posts:
    10
    Hello,

    I am having an issue with ARKit when moving between scenes using the SceneManager.

I have a project with multiple scenes in one app; only one of them is meant to have the ARKit image-target functionality.

When I leave the ARKit scene for another and then return, the application crashes when I re-scan the image target.

In my console output I notice, before the crash, that there are two instances of

UnityEngine.XR.iOS.ARImageAnchorAdded:Invoke(ARImageAnchor)

and I have an inkling that a second instance of the ARKit functionality has been created.

I am not sure whether this is correct, but please find attached a copy of my console output.

If I am right (I could be completely wrong), is there a way to ensure that there is only one instance of ARKit running when changing scenes?

    Thanks in advance!

    O.
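One possible explanation, sketched below: if a handler subscribes to the plugin's static UnityARSessionNativeInterface.ARImageAnchorAddedEvent and never unsubscribes, the old delegate survives the scene change and the callback fires twice after reloading. A minimal sketch of pairing subscribe and unsubscribe (class and handler names are placeholders):

Code (CSharp):
using UnityEngine;
using UnityEngine.XR.iOS;

public class ImageAnchorListener : MonoBehaviour
{
    void OnEnable()
    {
        // Subscribe when the AR scene loads
        UnityARSessionNativeInterface.ARImageAnchorAddedEvent += OnImageAnchorAdded;
    }

    void OnDisable()
    {
        // Unsubscribe when leaving the scene so a reload doesn't register a second handler
        UnityARSessionNativeInterface.ARImageAnchorAddedEvent -= OnImageAnchorAdded;
    }

    void OnImageAnchorAdded(ARImageAnchor anchor)
    {
        Debug.Log("Image anchor added: " + anchor.referenceImageName);
    }
}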
     


  5. Mike-B

    Mike-B

    Joined:
    Nov 18, 2012
    Posts:
    13
Hi all,

I downloaded the plugin from the Asset Store yesterday.
The plugin says it supports OpenGL ES 2 & 3, but I get this error in Xcode when trying to build:
    Undefined symbols for architecture arm64:
    "_MTLCreateSystemDefaultDevice", referenced from:
    -[UnityARSession setupMetal] in ARSessionNative.o
    ld: symbol(s) not found for architecture arm64


The issue page says it was fixed, then broken again after face tracking was added. Any plans for a fix or workaround?
Thanks.
     
  6. shaundavies

    shaundavies

    Joined:
    Jan 31, 2017
    Posts:
    44
I am attempting to use a phone periscope with ARKit, and I am getting an extreme amount of drift because of this. I am guessing I have to remap the Euler angles for the camera. Would that be correct?



Edit: Would it be fair to assume that what I'm looking for is UnityARMatrix4x4? I am confused about where the data is being taken from, however. The values are public, but I don't know where they are sourced from.
     
    Last edited: May 22, 2018
  7. RyanYN

    RyanYN

    Joined:
    Aug 13, 2013
    Posts:
    17
Hey guys, can ARKit identify a 2D texture the way Vuforia does?
If not, can I use both ARKit and Vuforia in one scene in Unity?
Thanks.
     
  8. Griffo

    Griffo

    Joined:
    Jul 5, 2011
    Posts:
    664
    Hi,

I'm using the two-camera setup for scale, but with the second camera (ContentCamera) I'm getting this error:

Screen position out of view frustum (screen pos 2436.000000, 0.000000, 30.000000) (Camera rect 0 0 2436 1125)

I've narrowed it down to this part of the CameraScaler.cs script:

Code (CSharp):
if (scaledCamera != null && cameraScale > 0.0001f && cameraScale < 10000.0f)
{
    Matrix4x4 matrix = UnityARSessionNativeInterface.GetARSessionNativeInterface().GetCameraPose();
    float invScale = 1.0f / m_scale; //cameraScale;
    Vector3 cameraPos = UnityARMatrixOps.GetPosition(matrix);
    Vector3 vecAnchorToCamera = cameraPos - scaledObjectOrigin;
    scaledCamera.transform.localPosition = scaledObjectOrigin + (vecAnchorToCamera * invScale);
    scaledCamera.transform.localRotation = UnityARMatrixOps.GetRotation(matrix);

    //this needs to be adjusted for near/far
    scaledCamera.projectionMatrix = UnityARSessionNativeInterface.GetARSessionNativeInterface().GetCameraProjection();
}
    Can anyone tell me how to correct this please? Thanks.
     
  9. sam598

    sam598

    Joined:
    Sep 21, 2014
    Posts:
    49
    The camera calibration is set internally by the ARKit API. There is currently no way to set custom calibration, and because of the precision of ARKit's lens calibration model, I doubt there ever will be.
     
  10. oliver_Random42

    oliver_Random42

    Joined:
    Oct 13, 2016
    Posts:
    10
    Is it possible to use the ARKit remote to test Image Detection Targets?

    Cheers,

    O.
     
  11. sam598

    sam598

    Joined:
    Sep 21, 2014
    Posts:
    49
    Option 1: Use the AR Image Anchor feature included in ARKit 1.5

    Option 2: Use the latest version of Vuforia and use the "ground plane" feature. This uses the tracking data from ARKit for the camera position, while also tracking image planes. It is not possible to run the individual ARKit and Vuforia plugins in the same scene.
     
  12. cdtaylor

    cdtaylor

    Joined:
    Jun 5, 2017
    Posts:
    2
    Hi all, I'm having an issue with the Occlusion sample code - UnityAROcclusion.

Basically, it doesn't do any occlusion. I came here before experimenting too much, for two reasons: there are absolutely no errors at compile time or run time (plenty of logging at runtime, but no errors), and I'm viewing this code as a "reference" implementation, so I'm reluctant to deviate too much from it.

I've set up my dev environment to properly meet the prerequisites. The point cloud is visible, and the reference cube and the little gnome dude appear at ground level.

Has anyone had to make any specific changes to get this to function correctly?
    Thanks,
     
  13. SoerenL

    SoerenL

    Joined:
    Mar 24, 2014
    Posts:
    12
Hi, the Spring 2018 Update lists Unity 2017.1 or above as a requirement, but the Unity ARKit Plugin FAQ says the requirement is: "Unity version 5.6.2 or later, or version 2017.1.0 or later."

I'm currently using the Spring 2018 Update with Unity 5.6.5. Is it recommended that I upgrade to Unity 2017.1?
     
  14. Burglecut

    Burglecut

    Joined:
    Jun 27, 2015
    Posts:
    8

Use an occlusion material; occlusion culling is used for performance ;-)
     
  15. Burglecut

    Burglecut

    Joined:
    Jun 27, 2015
    Posts:
    8
  16. phili_maas

    phili_maas

    Joined:
    Dec 11, 2016
    Posts:
    21
Same here.
Even after deleting all post-processing objects and components, the ARKit scene is still messed up, with a jittering video feed, low FPS, and weird tracking. I'm not sure what is polluting the scene. Does anybody know how to completely get rid of post-processing in a scene?
     
  17. shin_unity197

    shin_unity197

    Joined:
    Oct 30, 2017
    Posts:
    18
    Hi,
I was wondering if it is possible to lower the frame rate of the video feed that ARKit uses while maintaining Unity's targetFrameRate (i.e. ARKit video at 30 fps, Unity build at 60 fps).
     
  18. Pilltech101

    Pilltech101

    Joined:
    Jul 25, 2014
    Posts:
    14
    Hello,

I am having trouble getting an object's rotation to stay put. I am new to ARKit but not a stranger to Unity. Could someone point me in the right direction?
     
  19. bronydell

    bronydell

    Joined:
    Aug 21, 2014
    Posts:
    3
When I try to use
Code (CSharp):
m_session.GetTrackingQuality()
in the editor, I'm getting an error. Any suggestions?
     
    Last edited: Jun 2, 2018
  20. purplehaze90

    purplehaze90

    Joined:
    Feb 3, 2017
    Posts:
    13
Apple Mach-O Linker (ld) error when building a Unity ARKit project

I am trying to build a Unity AR project onto my iPhone SE. However, I am getting the following error:
Code (CSharp):
Undefined symbols for architecture arm64:
  "void RegisterClass<WorldAnchor>(char const*)", referenced from:
      RegisterAllClasses() in UnityClassRegistration.o
ld: symbol(s) not found for architecture arm64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
I checked and found there is a cpp file, UnityClassRegistration, under the Native folder. It has the line
class WorldAnchor; template <> void RegisterClass<WorldAnchor>(const char*);
I am on Unity 2018.1.1. Removing the WorldAnchor registration lets me get past this error, but I want to know why it appears in the first place. Do I need to add something to the Unity scene to get past it? My Xcode version is 9.4.
     
  21. Pilltech101

    Pilltech101

    Joined:
    Jul 25, 2014
    Posts:
    14
I ran into this problem when putting a World Anchor component on a game object. I assumed it was because it is not an ARKit component but a Vuforia component.
     
    Last edited: Jun 3, 2018
  22. purplehaze90

    purplehaze90

    Joined:
    Feb 3, 2017
    Posts:
    13
How did you get past this error? Did updating Xcode help? I am on the latest version.
     
  23. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    793
    You can use the TrackingStatusChangedEvent instead. See RelocalizationControl.cs in the ARKit 1.5 examples to see how to use it.
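A minimal sketch of subscribing to that event (assuming it is exposed as UnityARSessionNativeInterface.ARSessionTrackingChangedEvent with a UnityARCamera argument; RelocalizationControl.cs shows the exact name and fields):

Code (CSharp):
using UnityEngine;
using UnityEngine.XR.iOS;

public class TrackingStateLogger : MonoBehaviour
{
    void OnEnable()
    {
        UnityARSessionNativeInterface.ARSessionTrackingChangedEvent += OnTrackingChanged;
    }

    void OnDisable()
    {
        UnityARSessionNativeInterface.ARSessionTrackingChangedEvent -= OnTrackingChanged;
    }

    void OnTrackingChanged(UnityARCamera camera)
    {
        // Log the new tracking state (and the reason when tracking is limited)
        Debug.Log("Tracking state: " + camera.trackingState + " reason: " + camera.trackingReason);
    }
}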
     
    Burglecut and bronydell like this.
  24. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    793
The only control you have is to select one of the video formats that ARKit reports to you; see the UnityARVideoFormats scene.
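A minimal sketch of listing the formats the device reports (assuming the ARKit 1.5 branch's UnityARVideoFormat.SupportedVideoFormats() helper and its fields, as used in the UnityARVideoFormats example scene):

Code (CSharp):
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS;

public class VideoFormatLister : MonoBehaviour
{
    void Start()
    {
        // Ask ARKit which capture formats the current device supports
        List<UnityARVideoFormat> formats = UnityARVideoFormat.SupportedVideoFormats();
        foreach (UnityARVideoFormat f in formats)
        {
            Debug.Log(f.imageResolutionWidth + "x" + f.imageResolutionHeight + " @ " + f.framesPerSecond + " fps");
        }
        // A chosen format's videoFormatPtr would then be assigned to the session
        // configuration's videoFormat field before (re)running the session.
    }
}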
     
  25. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    793
Where does the plugin say it supports OpenGL ES? If it does, that is a mistake; the plugin requires Metal to work.
     
  26. Pilltech101

    Pilltech101

    Joined:
    Jul 25, 2014
    Posts:
    14
I removed the World Anchor component from the scene completely, and then I was able to build.
     
    purplehaze90 likes this.
  27. MattMurphy

    MattMurphy

    Joined:
    Dec 24, 2013
    Posts:
    110
@jimmya ARKit 2 please? Model recognition? Sharing experiences? Is this going to be in the plugin, or more of a native feature?
     
  28. Pilltech101

    Pilltech101

    Joined:
    Jul 25, 2014
    Posts:
    14
  29. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    793
    MattMurphy likes this.
  30. dyuldashev

    dyuldashev

    Joined:
    Mar 3, 2016
    Posts:
    61
I have multiple scenes in my iOS ARInterface app, around 6-7. I enabled light estimation in some of them. Looking at the ARInterface code, I can see that DontDestroyOnLoad() is used with the ARRoot objects, meaning the intent is to keep a single GameObject for recognizing the ground instead of instantiating it again and again across scenes. I don't do that; in each scene I initialize ARInterface separately. However, my iOS app crashes from time to time after being used for a few minutes, and sometimes the bounding planes are extremely unstable, moving away from the camera. Could initializing ARInterface repeatedly in every scene be causing these issues? Thanks. @jimmya
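For reference, a minimal sketch of the keep-one-root pattern described above, using Unity's standard DontDestroyOnLoad (the class name is a placeholder):

Code (CSharp):
using UnityEngine;

// Keeps a single AR root alive across scene loads instead of re-initializing it per scene
public class PersistentARRoot : MonoBehaviour
{
    static PersistentARRoot instance;

    void Awake()
    {
        if (instance != null)
        {
            // A copy already survived from a previous scene; discard this one
            Destroy(gameObject);
            return;
        }
        instance = this;
        DontDestroyOnLoad(gameObject);
    }
}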
     
  31. JelmerV

    JelmerV

    Joined:
    Nov 29, 2012
    Posts:
    180
Cool stuff in ARKit 2, and a big thumbs up to the Unity team for making it available from day one!
There's one feature I'm a bit disappointed by: the environment probe. The reflection map seems to be of very low quality and very low contrast (it's very grey-ish). I changed the materials to 100% sharp reflective to better see what reflections are produced.
I also turned the quality of the Unity reflection probe up to 1024, but this didn't change anything, so I assume the map that Apple/ARKit generates is just very low quality (thus only usable for glossy reflections or as lighting).
Is it correct that the map is of low quality?
Is there a way to see the map generated by ARKit?

Besides that, after a minute or two, tracking completely freezes on my 2017 iPad with iOS 12.
     
  32. rob_ice

    rob_ice

    Joined:
    Nov 11, 2016
    Posts:
    112
Using the ARKit 2.0 branch, I have noticed that it is no longer possible to stop plane detection after a certain point, because another ARKitWorldPositionRemote prefab is spawned in any scene that uses the ARCamera controller script. I haven't found a way to prevent that extra remote-connection script from spawning in any of the example scenes.

Before switching to this branch it was simple enough to stop plane detection, but as of this new branch it seems impossible.
     
  33. jerry2157

    jerry2157

    Joined:
    Mar 23, 2014
    Posts:
    17
@jimmya Hi Jimmy! How are you doing? First of all, thank you for the ARInterface, it works great!
But I'm having a problem with it and AdMob, and it would be great if you could help me, please.
I developed a game using the last commit of your ARInterface on GitHub, and everything worked. I added some Unity UI buttons (not Unity GUI) and it worked great. The problem is that when I added the official AdMob SDK to monetize the game, the UI stopped showing in the scene where AR is activated (tracking still works). Curiously, if I load a scene with UI the way you do in the Tanks Networking demo with the 'ARPlaneChooser' script, it works!

Do you have any tips for this issue?
Do you recommend using the old Unity GUI like your examples do?
     
  34. J_P_

    J_P_

    Joined:
    Jan 9, 2010
    Posts:
    1,021
It seems that when Control Center is brought up (swipe up from the bottom), the app is paused but can resume cleanly without losing tracking, even if the phone was moved around while Control Center was up. That's not the case when the home button is pressed.

But I can't find a way to detect whether the app was put in the background or Control Center was brought up. I'd rather not reinitialize all the AR stuff every time Control Center is opened when it's unnecessary, but I do need to do it when the user presses the home button to put the app in the background.

edit: ah, I see Relocalization is now a thing; I'll look into that.

edit2: Relocalization doesn't seem to work that great :(

edit3: but I was able to check the TrackingChanged events and handle it there, to differentiate between breaking pauses and harmless pauses :)
     
    Last edited: Jun 8, 2018
  35. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    793
    With ARKit 2 feature ARWorldMap, you might be able to save the ARWorldMap from the session at specific intervals, and when you completely lose tracking, you can relocalize to the last known good session to get back.
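A minimal sketch of that idea, assuming the ARKit 2.0 branch's GetCurrentWorldMapAsync callback and the ARWorldMap Save/Load helpers shown in the UnityARWorldMap example (the file name and the way the session configuration is obtained are placeholders):

Code (CSharp):
using System.IO;
using UnityEngine;
using UnityEngine.XR.iOS;

public class WorldMapCheckpointer : MonoBehaviour
{
    string MapPath { get { return Path.Combine(Application.persistentDataPath, "checkpoint.worldmap"); } }

    // Call this on a timer, e.g. InvokeRepeating("SaveCheckpoint", 10f, 10f)
    public void SaveCheckpoint()
    {
        UnityARSessionNativeInterface.GetARSessionNativeInterface().GetCurrentWorldMapAsync(OnWorldMap);
    }

    void OnWorldMap(ARWorldMap worldMap)
    {
        if (worldMap != null)
        {
            worldMap.Save(MapPath);
        }
    }

    // Call this when tracking is lost for good: reload the last map and relocalize to it
    public void RestoreCheckpoint(ARKitWorldTrackingSessionConfiguration config)
    {
        ARWorldMap worldMap = ARWorldMap.Load(MapPath);
        if (worldMap == null) return;

        config.worldMap = worldMap;
        UnityARSessionRunOption runOption =
            UnityARSessionRunOption.ARSessionRunOptionRemoveExistingAnchors |
            UnityARSessionRunOption.ARSessionRunOptionResetTracking;
        UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfigAndOptions(config, runOption);
    }
}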
     
  36. novaVision

    novaVision

    Joined:
    Nov 9, 2014
    Posts:
    303
Is there any progress on a fix for this issue from the ARKit plugin developers? I can't use ARKit Remote at all, and the method provided in the mentioned thread didn't help me.
     
    joe_carrot likes this.
  37. demanzonderjas

    demanzonderjas

    Joined:
    Mar 1, 2018
    Posts:
    4
@jimmya, I am working on a project that needs a 3D model which, when placed via the ImageAnchor, should always face the camera. Currently it uses the rotation of the detected image plane, which only works from one side. Do you have a hint on how to go about this? How do I know which side of the image marker I am on (front or back)?
     
  38. Griffo

    Griffo

    Joined:
    Jul 5, 2011
    Posts:
    664
Why don't you just put a script on the 3D model to LookAt the camera? Put it in OnEnable() if you don't want it to keep looking at the camera.

Or just LookAt around the Y axis only ..
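A minimal sketch of that second option, rotating only around the Y axis so the model stays upright while turning toward the camera:

Code (CSharp):
using UnityEngine;

public class FaceCameraAroundY : MonoBehaviour
{
    void OnEnable()
    {
        // Direction to the camera, flattened so we only rotate around the Y axis
        Vector3 toCamera = Camera.main.transform.position - transform.position;
        toCamera.y = 0f;
        if (toCamera.sqrMagnitude > 0.0001f)
        {
            transform.rotation = Quaternion.LookRotation(toCamera);
        }
    }
}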
     
  39. shaundavies

    shaundavies

    Joined:
    Jan 31, 2017
    Posts:
    44
Has anyone noticed that ARKit quaternions are not treated the same as GameObject rotations in Unity? With ARKit, the quaternion values flip from positive to negative, whereas regular GameObjects go from positive values back down to zero and then to negative values. Is there a way to get some consistency between Unity GameObject and ARKit rotation values?
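One thing to keep in mind is that q and -q represent the same rotation, so one way to get consistent values is to flip the sign of an incoming quaternion whenever it lands in the opposite hemisphere from the previous sample; a minimal sketch:

Code (CSharp):
using UnityEngine;

public static class QuaternionUtils
{
    // Returns 'current' with its sign flipped if needed so it lies in the same
    // hemisphere as 'reference'; both represent the same rotation either way.
    public static Quaternion EnsureSameHemisphere(Quaternion reference, Quaternion current)
    {
        if (Quaternion.Dot(reference, current) < 0f)
        {
            return new Quaternion(-current.x, -current.y, -current.z, -current.w);
        }
        return current;
    }
}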
     
  40. Fantas_Feng

    Fantas_Feng

    Joined:
    Mar 13, 2018
    Posts:
    2
  41. demanzonderjas

    demanzonderjas

    Joined:
    Mar 1, 2018
    Posts:
    4
Thanks @Griffo, I will look into it. I do, however, want to keep the rotation of the plane that was recognized via the image marker, so the corrected rotation of the model should be either 0 (front, already fine) or the current Y rotation minus 180 degrees (back). The image marker has a center area for the GameObject to spawn on, and it needs to stay on that center area with the front side active.
     
  42. Fl0oW

    Fl0oW

    Joined:
    Apr 24, 2017
    Posts:
    20
If I understood correctly, the ARKit plugin doesn't currently support the new Lightweight Render Pipeline in Unity 2018.1 (or the other way around), so we can't use Shader Graph at the moment. Is that correct, and if so, is there a timeline for when it might be supported?
     
  43. Griffo

    Griffo

    Joined:
    Jul 5, 2011
    Posts:
    664
    @unity_tqlj7BfvM9vR_A Use

Code (CSharp):
public Transform m_HitTransform;

m_HitTransform.rotation = hit.transform.rotation;
when placing the GO to get the plane rotation, then set your GO's rotation to that ..

Taken from an AR script ..

Code (CSharp):
private void Update ()
{
    if ((!IsPointerOverUIObject()) && (Input.GetMouseButtonDown(0)))
    {
        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        RaycastHit hit;

        if (Physics.Raycast(ray, out hit, maxRayDistance))
        {
            Debug.Log(" XX .. RaycastHit hit Layer is .. " + hit.transform.gameObject.layer + "    RaycastHit hit name is .. " + hit.transform.name);
        }
        //we'll try to hit one of the plane collider gameobjects that were generated by the plugin
        //effectively similar to calling HitTest with ARHitTestResultType.ARHitTestResultTypeExistingPlaneUsingExtent
        if (Physics.Raycast(ray, out hit, maxRayDistance, collisionLayer))
        {
            //we're going to get the position from the contact point
            m_HitTransform.position = hit.point;
            Debug.Log(string.Format("x:{0:0.######} y:{1:0.######} z:{2:0.######}", m_HitTransform.position.x, m_HitTransform.position.y, m_HitTransform.position.z));

            //and the rotation from the transform of the plane collider
            m_HitTransform.rotation = hit.transform.rotation;
        }
    }
}

// Taken from http://answers.unity3d.com/questions/1115464/ispointerovergameobject-not-working-with-touch-inp.html#answer-1115473
private bool IsPointerOverUIObject ()
{
    PointerEventData eventDataCurrentPosition = new PointerEventData(EventSystem.current);
    eventDataCurrentPosition.position = new Vector2(Input.mousePosition.x, Input.mousePosition.y);
    List<RaycastResult> results = new List<RaycastResult>();
    EventSystem.current.RaycastAll(eventDataCurrentPosition, results);
    return results.Count > 0;
}
     
    Last edited: Jun 14, 2018
  44. demanzonderjas

    demanzonderjas

    Joined:
    Mar 1, 2018
    Posts:
    4

@Griffo, thanks for thinking along! In the meantime I solved it with the following code for the AddImageAnchor function in GenerateImageAnchor.cs:

Code (CSharp):
void AddImageAnchor(ARImageAnchor arImageAnchor)
{
    Debug.Log ("image anchor added");
    if (arImageAnchor.referenceImageName == referenceImage.imageName) {
        Vector3 position = UnityARMatrixOps.GetPosition (arImageAnchor.transform);
        Quaternion rotation = UnityARMatrixOps.GetRotation (arImageAnchor.transform);

        imageAnchorGO = Instantiate<GameObject> (prefabToGenerate, position, rotation);
        modelContainer = GameObject.Find("ModelContainer");

        Vector3 fixedRotation = new Vector3(0, 0, 0);
        RaycastHit hit;
        // check if the front/backside is hit and then change the position accordingly
        if (Physics.Raycast(Camera.main.transform.position, transform.TransformDirection(Vector3.forward), out hit, Mathf.Infinity)) {
            Debug.Log("Did Hit");
            Debug.Log("Name: " + hit.transform.gameObject.name);

            if (hit.transform.gameObject.name == "FrontCollider") {
                fixedRotation = new Vector3(0, 0, 0);
            } else {
                fixedRotation = new Vector3(0, 180f, 0);
            }
        }

        modelContainer.transform.localEulerAngles = fixedRotation;

        modelController.SaveModels();
        phaseController.StartNextPhase();
    }
}
     
    Griffo likes this.
  45. Mike-B

    Mike-B

    Joined:
    Nov 18, 2012
    Posts:
    13
    It used to be in the plugin readme, when 5.6 was still supported... apparently not anymore :)
     
  46. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    793
    AFAIK that was never there - I wrote the readme, so I would think I'd know about it.
     
  47. Griffo

    Griffo

    Joined:
    Jul 5, 2011
    Posts:
    664
A game I'm working on; play it on the kitchen table or in the park ..

You play in third person, but I'm going to work on first person.





     
    Last edited: Jun 16, 2018
    castana1962 likes this.
  48. OLGV

    OLGV

    Joined:
    Oct 4, 2012
    Posts:
    26
Does anyone know of a way to display a tiled image (i.e. a pattern) on ARKit's debugPlanePrefab, which is used to visualise the detected AR planes?

At the moment it uses a texture that gets stretched according to the size of the detected surface, whereas I would like to show a repeating pattern, like ARCore does.
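One possible approach, sketched below: scale the material's mainTextureScale with the plane's current world size every frame so the pattern repeats instead of stretching (assumes the renderer sits on the same GameObject as this script and that its scale tracks the detected extent; tilesPerMeter is a placeholder, and the exact factor depends on the prefab's mesh):

Code (CSharp):
using UnityEngine;

public class TilePlaneTexture : MonoBehaviour
{
    public float tilesPerMeter = 2f;   // placeholder density of the pattern

    Renderer planeRenderer;

    void Start()
    {
        planeRenderer = GetComponent<Renderer>();
    }

    void Update()
    {
        // Repeat the texture in proportion to the plane's world size instead of stretching it
        Vector3 size = transform.lossyScale;
        planeRenderer.material.mainTextureScale = new Vector2(size.x * tilesPerMeter, size.z * tilesPerMeter);
    }
}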

    Thank you,
     
  49. GookGo

    GookGo

    Joined:
    May 5, 2018
    Posts:
    38
If I have fully rigged the character's face, do I still need blendshapes for face tracking on the iPhone X?
     
  50. JelmerV

    JelmerV

    Joined:
    Nov 29, 2012
    Posts:
    180
What is the best approach right now for developing cross-platform mobile AR (ARCore & ARKit)?
Is ARInterface the best option?