
ARKit 2.0 Beta support

Discussion in 'AR' started by jimmya, Jun 5, 2018.

  1. castana1962

    Joined:
    Apr 10, 2013
    Posts:
    400
    Hi Jimmy,
    Sorry for my ignorance and my limited English, but I am trying out the SharedSpheres AR project, and after importing it into Unity I get the following errors (14):
    You are trying to import an asset which contains a global game manager. This is not allowed.
    UnityEditorInternal.InternalEditorUtility:projectWindowDrag(HierarchyProperty, Boolean)
    UnityEngine.GUIUtility:processEvent(Int32, IntPtr)
    Could you help me fix these errors?
    Thanks for your help!
     
    rob_ice likes this.
  2. jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    This can be found with a simple Google search, and although this is not the forum for it, I will post a solution here:
    https://www.oodlestechnologies.com/...to-Devices-without-an-Apple-Developer-Account
     
    castana1962 likes this.
  3. jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    You do not need to import that - it is already a Unity project, so you can just open it in Unity (2017.4 or later).
     
    castana1962 likes this.
  4. castana1962

    Joined:
    Apr 10, 2013
    Posts:
    400
  5. castana1962

    Joined:
    Apr 10, 2013
    Posts:
    400
    Hi again,
    I will do that!
    Thanks for your patience - I am just getting started as an AR developer.
     
  6. jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Please ask in https://forum.unity.com/forums/ios-and-tvos.27/
     
    castana1962 likes this.
  7. edwon

    Joined:
    Apr 24, 2011
    Posts:
    266
    For some reason my main camera is not moving as I move the phone around. It's just sitting there. It was working completely fine an hour ago. The only thing that's changed, as far as I can tell, is that I upgraded to the latest beta (iOS 12 beta 2).

    Is iOS 12 beta 2 breaking ARKit 2.0 Unity support for anyone else?
     
    Last edited: Jun 20, 2018
  8. mkusan

    Joined:
    Nov 22, 2016
    Posts:
    7
    Hello! First of all, great work on the plugin, thank you! It was easy to move from the ARKit 1.5 version of the plugin to the ARKit 2.0 beta, and everything works perfectly with my current project.

    I am checking out the environment texturing example and wondering: is it possible to apply this to objects already instantiated in the scene? Any advice on approaching that problem?

    I imagine I should use Manual mode. While instantiating an object into the scene, I would also create an EnvironmentProbeAnchor at the object's position - would that work?

    Thanks!
     
  9. jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    I think it should work with Automatic as well, since your objects will just pick up the closest probe - read the manual page on Reflection Probes. Mainly, just make sure your shaders make use of the environment map.
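    For example, a minimal sketch of the material side, assuming a Standard-shader material (the component name is just illustrative):
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    // Make sure a spawned object's renderer actually samples the generated
    // environment probes and is glossy enough for the reflections to show.
    public class MakeReflective : MonoBehaviour
    {
        void Start()
        {
            var rend = GetComponent<Renderer>();
            rend.reflectionProbeUsage = ReflectionProbeUsage.BlendProbes;
            rend.material.SetFloat("_Metallic", 1.0f);   // Standard shader properties
            rend.material.SetFloat("_Glossiness", 0.9f); // high smoothness = visible reflections
        }
    }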
     
    SourceKraut and mkusan like this.
  10. mkusan

    Joined:
    Nov 22, 2016
    Posts:
    7
    @jimmya yes, you're correct. It works perfectly with the Automatic setting on! Thank you!
     
  11. castana1962

    Joined:
    Apr 10, 2013
    Posts:
    400
  12. ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,450
    Currently finding that changing the plane detection to horizontal doesn't stop vertical plane detection with ARKit 2.0, and I cannot see where in the session or config it's set to vertical.

    My camera manager script is set to horizontal only, and all the example scenes seem to exhibit this issue with ARKit 2.0.

    Has anyone else had this issue, or am I going crazy haha
     
  13. akhella

    Joined:
    Apr 14, 2015
    Posts:
    9
    Hi everyone

    First off, BIG thanks to the Unity team for integrating ARKit 2.0 support so quickly!

    I've been testing the image anchor sample included in the ARKit 2.0 beta branch from BitBucket, and it seems to be more image recognition (as in ARKit 1.x) than image tracking.

    Has image tracking from ARKit 2.0 been integrated yet? If so, what's the best way to get real-time tracking, similar to the Xcode examples from Apple?

    Thanks!
     
  14. rob_ice

    Joined:
    Nov 11, 2016
    Posts:
    112
    @jimmya We have identified that enabling Environment Texturing (Automatic) turns vertical plane detection on automatically, despite the user not wanting vertical plane detection.

    Have you and the other devs noticed this? We would like to work around it, either by automatically checking for vertical planes and removing them, or by fixing it altogether. Another option for the app we are developing is to check on our AR hit test whether the surface is horizontal or vertical; however, this hasn't been simple to figure out!

    Ultimately, we want to switch off plane detection once it has detected one plane and not have more generate, but still have reflection probes generating on that one plane.
     
    mkusan likes this.
  15. mkusan

    Joined:
    Nov 22, 2016
    Posts:
    7
    I have a similar case in my app: it detects a horizontal plane and then I switch off plane detection. I also use environment texturing, but I haven't noticed it detecting vertical planes yet - though that may be because I don't have clear vertical lines where I'm testing, so it always detects a horizontal one. How did you notice it?

    And for distinguishing between horizontal and vertical - correct me if I'm wrong - I think you can use
    ARPlaneAnchor.alignment to get the orientation of the anchor added to the scene:

    public enum ARPlaneAnchorAlignment : long
    {
        /** A plane that is horizontal with respect to gravity. */
        ARPlaneAnchorAlignmentHorizontal,

        /** A plane that is parallel with respect to gravity. */
        ARPlaneAnchorAlignmentVertical
    }
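    A quick sketch of reading it from the plugin's anchor events (the same hook UnityARAnchorManager subscribes to) - treat the exact names as assumptions for your branch:
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.iOS;

    // Logs each detected plane's alignment as ARKit adds it.
    public class PlaneAlignmentLogger : MonoBehaviour
    {
        void OnEnable()  { UnityARSessionNativeInterface.ARAnchorAddedEvent += OnPlaneAdded; }
        void OnDisable() { UnityARSessionNativeInterface.ARAnchorAddedEvent -= OnPlaneAdded; }

        void OnPlaneAdded(ARPlaneAnchor anchor)
        {
            Debug.Log("Plane " + anchor.identifier + " alignment: " + anchor.alignment);
        }
    }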
     
    rob_ice likes this.
  16. rob_ice

    Joined:
    Nov 11, 2016
    Posts:
    112
    To reproduce, open the ARKit scene and, on the AR camera manager, change environment texturing to Manual or Automatic - it seems to just keep making vertical planes as well as horizontal ones, despite the plane detection mode being set to horizontal only.

    Currently using an ARHitTestResult here:
    Code (CSharp):
    bool DoARRaycast(Touch touch, ref ARHitTestResult hitOut)
    {
        var screenPosition = Camera.main.ScreenToViewportPoint(touch.position);
        ARPoint point = new ARPoint()
        {
            x = screenPosition.x,
            y = screenPosition.y
        };

        var hitResults = UnityARSessionNativeInterface.GetARSessionNativeInterface().HitTest(point, ARHitTestResultType.ARHitTestResultTypeExistingPlane);
        //var hitResults = UnityARSessionNativeInterface.GetARSessionNativeInterface().HitTest(point, ARHitTestResultType.ARHitTestResultTypeEstimatedHorizontalPlane);
        if (hitResults.Count < 1)
            return false;

        hitOut = hitResults[0];
        return true;
    }
    Then I check the result to move the object along a plane; however, checking the plane's rotation isn't reliable, and I can't seem to find a way to check the alignment of the plane from the ARHitTestResult - it just gives me an anchor identifier.
    Code (CSharp):
    // Check AR plane
    ARHitTestResult arHit = new ARHitTestResult();
    if (DoARRaycast(touch, ref arHit))
    {
        if (m_ARHit.anchorIdentifier != arHit.anchorIdentifier)
        {
            Quaternion rotation = UnityARMatrixOps.GetRotation(arHit.worldTransform);

            // Unreliable heuristic: treat a small Y rotation as a vertical plane
            if (rotation.eulerAngles.y <= 20)
            {
                Debug.Log("vertical plane? eulerX " + rotation.eulerAngles.x + " eulerY " + rotation.eulerAngles.y + " eulerZ " + rotation.eulerAngles.z);
                transform.position = UnityARMatrixOps.GetPosition(arHit.worldTransform);
                //transform.rotation = UnityARMatrixOps.GetRotation(arHit.worldTransform);
            }
            else
            {
                // We've hit a different plane that isn't vertical, so move immediately
                Debug.Log("horizontal plane? eulerX " + rotation.eulerAngles.x + " eulerY " + rotation.eulerAngles.y + " eulerZ " + rotation.eulerAngles.z);
                transform.position = UnityARMatrixOps.GetPosition(arHit.worldTransform);
                //transform.rotation = UnityARMatrixOps.GetRotation(arHit.worldTransform);
            }
        }

        m_LastHitType = HitType.ARPlane;
        m_ARHit = arHit;
    }
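    One way around the anchor-ID problem (a sketch, not tested here): cache each plane's alignment as its anchor arrives via the plugin's anchor events, then look the hit result up by its anchorIdentifier instead of guessing from rotation.
    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.iOS;

    // Maps anchor identifiers to plane alignments as anchors come and go.
    public class PlaneAlignmentLookup : MonoBehaviour
    {
        readonly Dictionary<string, ARPlaneAnchorAlignment> alignments =
            new Dictionary<string, ARPlaneAnchorAlignment>();

        void OnEnable()
        {
            UnityARSessionNativeInterface.ARAnchorAddedEvent += Remember;
            UnityARSessionNativeInterface.ARAnchorUpdatedEvent += Remember;
            UnityARSessionNativeInterface.ARAnchorRemovedEvent += Forget;
        }

        void OnDisable()
        {
            UnityARSessionNativeInterface.ARAnchorAddedEvent -= Remember;
            UnityARSessionNativeInterface.ARAnchorUpdatedEvent -= Remember;
            UnityARSessionNativeInterface.ARAnchorRemovedEvent -= Forget;
        }

        void Remember(ARPlaneAnchor anchor) { alignments[anchor.identifier] = anchor.alignment; }
        void Forget(ARPlaneAnchor anchor)   { alignments.Remove(anchor.identifier); }

        // Call with arHit.anchorIdentifier from DoARRaycast above.
        public bool IsVertical(ARHitTestResult hit)
        {
            ARPlaneAnchorAlignment alignment;
            return alignments.TryGetValue(hit.anchorIdentifier, out alignment)
                && alignment == ARPlaneAnchorAlignment.ARPlaneAnchorAlignmentVertical;
        }
    }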
     
    Last edited: Jun 22, 2018
  17. rob_ice

    Joined:
    Nov 11, 2016
    Posts:
    112
    My only other solution right now is to write something into UnityARGeneratePlanes that deletes any vertical planes that appear by referencing their anchors and deleting those as well as the gameobjects.. not ideal!
     
  18. mkusan

    Joined:
    Nov 22, 2016
    Posts:
    7
    You could check in UnityARAnchorManager, in the method:

    Code (CSharp):
    public void AddAnchor (ARPlaneAnchor arPlaneAnchor) {
        GameObject go = ARUtility.CreatePlaneInScene (arPlaneAnchor);
        go.AddComponent<DontDestroyOnLoad> (); // so these GOs persist across scene loads
        ARPlaneAnchorGameObject arpag = new ARPlaneAnchorGameObject ();
        arpag.planeAnchor = arPlaneAnchor;
        arpag.gameObject = go;
        planeAnchorMap.Add (arPlaneAnchor.identifier, arpag);
    }
    Add something like this to that method:

    Debug.Log("arPlaneAnchor.alignment: " + arPlaneAnchor.alignment);

    For me it prints out:

    arPlaneAnchor.alignment: ARPlaneAnchorAlignmentHorizontal

    You are definitely using UnityARAnchorManager, because UnityARGeneratePlanes holds a reference to it.

    You could then decide not to create a plane (the first line of the method) if arPlaneAnchor.alignment returns ARPlaneAnchorAlignmentVertical, as sketched below.
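    A minimal sketch of that guard, using the enum posted above:
    Code (CSharp):
    public void AddAnchor (ARPlaneAnchor arPlaneAnchor) {
        // Ignore vertical planes entirely so no GameObject is ever created for them.
        if (arPlaneAnchor.alignment == ARPlaneAnchorAlignment.ARPlaneAnchorAlignmentVertical)
            return;

        GameObject go = ARUtility.CreatePlaneInScene (arPlaneAnchor);
        go.AddComponent<DontDestroyOnLoad> (); // so these GOs persist across scene loads
        ARPlaneAnchorGameObject arpag = new ARPlaneAnchorGameObject ();
        arpag.planeAnchor = arPlaneAnchor;
        arpag.gameObject = go;
        planeAnchorMap.Add (arPlaneAnchor.identifier, arpag);
    }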
     
    rob_ice likes this.
  19. rob_ice

    Joined:
    Nov 11, 2016
    Posts:
    112
    This is just the check I needed! Thank you so much - I also managed to create a workaround for the issue altogether by adding this to UnityARGeneratePlane.cs:

    Code (CSharp):
    void Update()
    {
        if (unityARAnchorManager.GetCurrentPlaneAnchors().Count == 1 && settouchprompt == false)
        {
            cameraManagerScript.StopPlaneDetection();
            settouchprompt = true;
        }
    }
    then having some additional session modifications in the camera manager script to disable plane detection and environment probe mapping at the same time. (It seems to still generate one or two environment probes, which is luckily enough for our app to get some nice reflections on reflective surfaces.)

    Code (CSharp):
    public void StopPlaneDetection()
    {
        planeDetection = UnityARPlaneDetection.None;
        config.planeDetection = UnityARPlaneDetection.None;
        config.alignment = startAlignment;
        config.getPointCloudData = getPointCloud;
        config.enableLightEstimation = enableLightEstimation;
        config.enableAutoFocus = enableAutoFocus;
        config.maximumNumberOfTrackedImages = maximumNumberOfTrackedImages;
        config.environmentTexturing = UnityAREnvironmentTexturing.UnityAREnvironmentTexturingNone;
        Debug.Log("disabledplanedetection");
        m_session.RunWithConfig(config);
    }
    I will give your code a shot to see if I can keep environment probes enabled for horizontal surfaces only.
     
  20. HulloImJay

    Joined:
    Mar 26, 2012
    Posts:
    89
    I'm getting a lot of tracking issues when testing on my iPhone 7 Plus. I'm told by team members with iPhone Xs that they don't see this.

    This usually occurs after some seconds, but sometimes I can test for several minutes before seeing it. Sometimes tracking resumes again, but often not. There are no errors in the log, and Unity itself appears to still be running just fine. These issues occur in every scene (my own or the samples), even if I disable a lot of features (such as the environment textures or ambient lighting). I do not have trouble with the previous ARKit plugins on the same hardware setup.

    iPhone 7 Plus
    ARKit 2.0 beta test
    Unity 2018.1.5f1
    ARKit plugin, beta 2 branch, June 19th ("merged in face-gc")
    iOS 12 beta 2
    XCode 10.0 beta 2
     
  21. sam598

    Joined:
    Sep 21, 2014
    Posts:
    60
    I have similar issues on an iPhone 7 Plus, iOS 12 beta 2.

    Hopefully it's just a prerelease issue since ARKit has been one of the most stable AR platforms.
     
  22. imaginethepoet

    Joined:
    Aug 23, 2016
    Posts:
    44
    I'm having the same problem deploying to iOS 12 with Xcode: the builds just crash. I've also just tried the armap example with Xcode 10 beta 2 and iOS 12 beta 2 - same problem. Going to try a different Unity version and see if that solves it.
     
  23. edwon

    Joined:
    Apr 24, 2011
    Posts:
    266
    For anyone else having this issue: it turned out to be the Start Alignment parameter on the Unity AR Camera Manager being set to Camera, which didn't work for me. Switching it over to Gravity and Heading fixed it.
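    In code, the equivalent would be something like this (a sketch, assuming the plugin's UnityARAlignment enum and the config/session fields used by UnityARCameraManager):
    Code (CSharp):
    // Run the session with gravity-and-heading alignment instead of camera alignment.
    config.alignment = UnityARAlignment.UnityARAlignmentGravityAndHeading;
    m_session.RunWithConfig(config);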
     
  24. BrandonYJ

    Joined:
    Jan 5, 2017
    Posts:
    3
    Summary: I downloaded the ARKit 2 object scanning sample from https://developer.apple.com/documentation/arkit/scanning_and_detecting_3d_objects . After I build it in Xcode to my iPad on iOS 12 beta, it keeps crashing.

    Steps to Reproduce: Download the sample code from https://developer.apple.com/documentation/arkit/scanning_and_detecting_3d_objects > build with Xcode 10.0 beta to an iPad on iOS 12 beta > open the app and it crashes.

    Expected Results: To be able to open the app and scan the object.

    Actual Results: Can't even open the app, since it crashes on launch.

    Version/Build: 12
     
  25. prasetion

    Joined:
    Apr 3, 2014
    Posts:
    28
    hi @jimmya, I have tested ARKit 2.0 with Unity 2018.1.3, Xcode 10.0 beta and iOS 12 beta on my iPhone 7. I want to try the AR image anchor example, and I have successfully built it to my iPhone. But there is a problem: in GenerateImageAnchor.cs, I see there are delegates AddImageAnchor, UpdateImageAnchor and RemoveImageAnchor. When I build and watch the log in Xcode, I can see the Debug.Log from AddImageAnchor when the image is tracked, but when the image is untracked I never see the Debug.Log from RemoveImageAnchor, and there is no error in the output. Am I missing something? Under what condition should the Debug.Log in RemoveImageAnchor fire? thanks and cheers ;)
     
  26. mkusan

    Joined:
    Nov 22, 2016
    Posts:
    7
    Hello again!

    I have another question about environment texturing. I am not sure whether this is a topic for this thread or for Unity in general, but maybe someone can point me in the right direction.

    Environment texturing works fine in my scene, although it's a bit "too strong". I know these kinds of things are usually adjusted in the material or shader of the objects in question, but I have to find an alternative way. I see that I can tweak some variables on the environment probes, but I didn't manage to dampen the whole effect. Any ideas on how to approach this? Is there something in the input from ARKit that I could tinker with?
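    One blunt option I'm considering (a sketch): scale down the intensity of the generated reflection probes themselves instead of editing every material.
    Code (CSharp):
    using UnityEngine;

    // Dampens the contribution of every reflection probe in the scene.
    public class DampenProbes : MonoBehaviour
    {
        [Range(0f, 1f)] public float intensity = 0.5f;

        void Update()
        {
            // Probes are generated over time, so re-apply each frame (fine for a test).
            foreach (var probe in FindObjectsOfType<ReflectionProbe>())
                probe.intensity = intensity;
        }
    }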

    Thanks!
     
  27. dagon

    Joined:
    Jan 4, 2013
    Posts:
    20
  28. Tuitive

    Joined:
    Mar 25, 2013
    Posts:
    36
    Can anyone confirm that ARKit 2 Remote works with the latest plugin and iOS 12? It connects when I hit Play, but I get frozen video (yet a tracked point cloud/objects) in the Editor, and a smooth video feed but no traces of AR on the device. I rebuilt the Remote app using the supplied scene in the ARKit 2 plugin. I'm using the USB connection, not WiFi. Here are my other specs:

    MacbookPro, late 2013
    High Sierra 10.13.5
    Xcode 10.0 beta 2
    Unity 2017.4.1

    iPhone 8
    iOS 12.0 beta 2
     
  29. robtow

    Joined:
    Jun 28, 2018
    Posts:
    1
    I have a naive question, as a newbie to Unity...
    ...I am able to compile, deploy, and run "tongueAndEyes" on my iPhone X running iOS 12 beta, using Xcode 10 beta.

    I read "The other improvement is that it now does eye gaze tracking. You receive a transform that describes where each eye on the face is pointed at, as well as the position of the object which is being looked at."

    My question is: where in the source code of the example can I actually see the values of the left and right eye gaze? Where is the drawing of the xyz axes for the left and right eye gazes done? It is not immediately obvious.

    Thanks in advance!
     
  30. edwon

    Joined:
    Apr 24, 2011
    Posts:
    266
    Yeah, it doesn't do much better than that for me either. I do get a little bit of video streaming into Unity, but it's like 3 fps.

    Your scene objects / UI etc. only ever show up in the Unity Editor, though; that's always been the case with the ARKit Remote app. The phone is only the camera (this is probably due to the complexity/bandwidth of streaming the simulation back to the phone).
     
  31. Tuitive

    Joined:
    Mar 25, 2013
    Posts:
    36
    I'd be thrilled with 3 fps. I'm literally getting 0 fps, other than a faint ghosting of a video stream overlaying the frozen video frame. I found a video of the exact problem I'm having.

    I guess I'll try this guy's hack fix.

    Thanks for reminding me about this; I had forgotten that this is how ARKit Remote works, which is different from the non-AR Remote.
     
  32. edwon

    Joined:
    Apr 24, 2011
    Posts:
    266
    Tuitive likes this.
  33. Tuitive

    Joined:
    Mar 25, 2013
    Posts:
    36
  34. imaginethepoet

    Joined:
    Aug 23, 2016
    Posts:
    44
    Could anyone explain in simple terms what each of these examples is and what it should do? I haven't been able to find anything on that. I have it all running now, but I'm trying to understand: do I use the object scanner example to capture an object, and then test the detection? What about the Hello World prefab one? Thanks.

    For those who might have issues: I also had to add "Privacy - Camera Usage Description" to my Info.plist and set a string to get this to work (sketch below).
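    If you'd rather not edit the plist by hand, a sketch of the editor-side equivalent (Unity then writes NSCameraUsageDescription into the generated Info.plist for you):
    Code (CSharp):
    using UnityEditor;

    public static class CameraUsageSetup
    {
        [MenuItem("Tools/Set Camera Usage Description")]
        static void Set()
        {
            // Becomes the "Privacy - Camera Usage Description" entry in Info.plist.
            PlayerSettings.iOS.cameraUsageDescription = "Camera access is required for AR.";
        }
    }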

    I also had a lot of provisioning issues.
     
  35. jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    You should be submitting a bug report to Apple for that. https://developer.apple.com/bug-reporting/
     
    rob_ice likes this.
  36. jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    You might want to read docs/What'sNewInARKit2.md.
     
  37. harperAtustwo

    Joined:
    Nov 15, 2016
    Posts:
    25
    Has anyone found a solution for getting the camera feed to work with the Lightweight Render Pipeline preview? The camera feed turns black. I guess that's a result of the shader on the AR camera not being compatible with the Lightweight Render Pipeline preview. Unity's ARVideo class sets these properties on the shader:
    Code (CSharp):
    public void OnPreRender()
    {
        if (!bCommandBufferInitialized) {
            InitializeCommandBuffer ();
        }

        m_ClearMaterial.SetTexture("_textureY", _videoTextureY);
        m_ClearMaterial.SetTexture("_textureCbCr", _videoTextureCbCr);

        m_ClearMaterial.SetMatrix("_DisplayTransform", _displayTransform);
    }
    This is the shader: https://bitbucket.org/Unity-Technol...hader?at=default&fileviewer=file-view-default I am sure that support is coming in the future, but does anyone know a way around this at the moment? I would like to make some pretty stuff in AR using the Lightweight Render Pipeline.
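    The underlying issue is that scriptable render pipelines never call OnPreRender, so UnityARVideo never updates the material. A sketch of one possible workaround, assuming a 2018.x LWRP preview that exposes RenderPipeline.beginCameraRendering (later versions moved it to RenderPipelineManager), with hypothetical fields mirroring UnityARVideo - the background shader itself would still need an LWRP-compatible pass:
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Experimental.Rendering;

    // Pushes the ARKit video textures from the SRP camera callback
    // instead of the legacy OnPreRender hook.
    public class ARVideoLWRPBridge : MonoBehaviour
    {
        public Material clearMaterial;     // the YUV background material
        public Texture2D videoTextureY;    // hypothetical handles to the native video textures
        public Texture2D videoTextureCbCr;
        public Matrix4x4 displayTransform;

        void OnEnable()  { RenderPipeline.beginCameraRendering += OnBeginCamera; }
        void OnDisable() { RenderPipeline.beginCameraRendering -= OnBeginCamera; }

        void OnBeginCamera(Camera cam)
        {
            clearMaterial.SetTexture("_textureY", videoTextureY);
            clearMaterial.SetTexture("_textureCbCr", videoTextureCbCr);
            clearMaterial.SetMatrix("_DisplayTransform", displayTransform);
        }
    }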

    Thanks.
     
  38. Henrik-Flink

    Joined:
    Feb 1, 2013
    Posts:
    25
    A quick question: is there any reason why the TongueAndEyes example uses the ARCameraTracker and not the UnityARCameraManager? Thanks!
     
  39. jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    No specific reason - it just happens that ARCameraTracker does just one thing (move the camera), while the other initializes the AR session as well as moving the camera. But in the case of face tracking, you cannot use UnityARCameraManager, because it initializes a WorldTrackingConfiguration instead of a FaceTrackingConfiguration.
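    The camera-moving half is tiny - roughly this, using the session's camera pose (a sketch of the pattern, not the exact ARCameraTracker source):
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.iOS;

    // Moves whatever it is attached to (the AR camera) to the tracked device pose.
    public class MinimalCameraTracker : MonoBehaviour
    {
        void Update()
        {
            Matrix4x4 pose = UnityARSessionNativeInterface.GetARSessionNativeInterface().GetCameraPose();
            transform.localPosition = UnityARMatrixOps.GetPosition(pose);
            transform.localRotation = UnityARMatrixOps.GetRotation(pose);
        }
    }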
     
    rob_ice likes this.
  40. Henrik-Flink

    Joined:
    Feb 1, 2013
    Posts:
    25
    Understood! Another thing: the lookAtPoint one can get from the face anchor - is that relative to the face anchor, as it says in the Apple documentation, or is it in world space? I'm wondering the same about the eye transforms: do they need to be converted if I want their positions in Unity world space?
     
    Last edited: Jun 30, 2018
  41. krisventure

    Joined:
    Mar 24, 2016
    Posts:
    118
    Is there any way to develop and test a multiplayer ARKit 2 app with only one registered Apple Developer account and one iPhone? Or do I need to buy another iPhone and enroll in the beta iOS developer program with a second account in order to have both iPhones upgraded to iOS 12?
     
  42. davejones1

    Joined:
    Jan 19, 2018
    Posts:
    183
    You can test on multiple iOS devices with one Apple Developer account.
     
  43. HulloImJay

    Joined:
    Mar 26, 2012
    Posts:
    89
    @jimmya

    After updating to iOS 12 beta 3, all of my ARKit 2.0 scenes freeze just after starting. Is it possible that something needs updating in the Unity plugin, or is this likely a device issue or something else?
     
    esoinila and rob_ice like this.
  44. HulloImJay

    Joined:
    Mar 26, 2012
    Posts:
    89
    I finally clued in to the cause :p I was restoring map data when my scene started, and loading maps saved under beta 2 in beta 3 caused ARKit to freeze up (while Unity's main thread was fine). Resetting the map solves it, and newly saved maps load just fine.
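    For anyone restoring maps at startup, the load path in question looks roughly like this, following the plugin's ARWorldMap sample (treat the exact names as assumptions if your branch differs) - skipping or deleting the stale map file is what unfreezes things:
    Code (CSharp):
    // Load a previously saved map and restart the session with it.
    ARWorldMap map = ARWorldMap.Load(path);
    if (map != null)
    {
        config.worldMap = map; // skip this (or delete the file) if it was saved under an older beta
    }
    UnityARSessionRunOption runOption =
        UnityARSessionRunOption.ARSessionRunOptionResetTracking |
        UnityARSessionRunOption.ARSessionRunOptionRemoveExistingAnchors;
    m_session.RunWithConfigAndOptions(config, runOption);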
     
    Nester808 likes this.
  45. imaginethepoet

    Joined:
    Aug 23, 2016
    Posts:
    44
    Does your generated object in an AR reference set need to be smaller than the reference marker? I can't seem to move my image and have the generated object follow along with it.

    If I'm reading the demo correctly - I created a new image reference set (that works) - do I need to modify the script so that the generated prefab follows the movement of the image marker? I'm not sure if that makes sense. If it doesn't, I'll try again later...
     
  46. edwon

    Joined:
    Apr 24, 2011
    Posts:
    266
    I'm not getting any values out of anchorData.lookAtPoint in the Editor when using the AR face tracking remote. All the other elements and blend shapes are working, but anything involving the new pupil or tongue tracking is not.
     
    Stage64 likes this.
  47. gmxtian

    Joined:
    Jul 10, 2012
    Posts:
    8
  48. jose999

    Joined:
    Oct 20, 2015
    Posts:
    2
    Hello,

    Like some users in this thread, I'm having issues with the ARKitRemote iOS app: it just crashes after pressing Connect in the Unity scene. I'm using the latest Xcode and Unity betas and selecting iOS 12 as the deployment target in Xcode.

    Any solution?
     
    luciewj and Pilltech101 like this.
  49. Pilltech101

    Joined:
    Jul 25, 2014
    Posts:
    35
    Hello everyone. For some reason, like jose999, my remote editor just stopped working out of the blue. I have tried everything to get it back up, but nothing is working. Would anyone have any good ideas on what I can do to fix this problem?
    Thanks in advance
     
  50. unity_pm

    Joined:
    Apr 7, 2018
    Posts:
    3
    Are the positions and rotations of the virtual objects we want to restore when a new AR session starts the only things that need to be saved? Or does ARKit 2.0 automatically save and reload the AR world map?

    Hi Jimmya, I just want all the details on this.