
Resolved ARKit support for iOS via Unity-ARKit-Plugin

Discussion in 'AR' started by jimmya, Jun 5, 2017.

Thread Status:
Not open for further replies.
  1. lusas

    lusas

    Joined:
    Mar 18, 2015
    Posts:
    6
    Thanks mimminito.

    @mimminito, @jimmya
Another question: I'm struggling to find an event to hook into to be notified when the tracking status changes.
From the ARKit documentation:
    Instance Method
    session(_:cameraDidChangeTrackingState:)

I can find UnityARSessionNativeInterface.GetARSessionNativeInterface().GetARTrackingQuality, but I don't really want to call it every frame.
     
  2. Inderdeep

    Inderdeep

    Joined:
    Sep 9, 2014
    Posts:
    2
@jimmya I kind of fixed this problem by converting the terrain into a mesh and then baking the lightmap data with everything set to static; after that, I un-static everything before running the game. The question I want to ask: is there anything bad about this hackish approach? For example, are shadows now being calculated for all objects, or will Unity skip calculating shadows if lightmap data is present?
     
  3. Blarp

    Blarp

    Joined:
    May 13, 2014
    Posts:
    269
I put in the 3DoF component after the fact, still acting wonky. halp @jimmya
     
  4. jessevan

    jessevan

    Joined:
    Jan 3, 2017
    Posts:
    35
I wasn't suggesting that you should use 3DoF, just that your video shows there is a problem with tracking, i.e., the camera is not moving relative to your object. So before you focus on the portal effect, I would take a step back and see if you can't get tracking to work.
     
  5. rlzh

    rlzh

    Joined:
    Sep 23, 2013
    Posts:
    2

I think I am noticing the same thing with regards to objects drifting when they are scaled bigger. Were you able to resolve the issue? If so, how?

    @jimmya is this a limitation with tracking in ARKit?
     
    Last edited: Jul 17, 2017
  6. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
Use the UnityARCameraNearFar component on your camera (as the example scenes do) and it will update the clip planes based on your camera's near and far z values.
     
  7. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
Currently you will have to do this every frame; we are looking into implementing the tracking-changed callback.
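
In the meantime, a minimal polling wrapper could look like the sketch below (untested; it assumes GetARTrackingQuality() returns the plugin's ARTrackingQuality enum, and the namespace may differ between plugin versions):

```
using UnityEngine;
using UnityEngine.XR.iOS; // adjust to your plugin version if needed

// Polls the tracking quality once per frame and raises a C# event
// only when the value actually changes.
public class TrackingQualityWatcher : MonoBehaviour
{
    public event System.Action<ARTrackingQuality> TrackingQualityChanged;

    private ARTrackingQuality lastQuality;

    void Update()
    {
        var session = UnityARSessionNativeInterface.GetARSessionNativeInterface();
        ARTrackingQuality quality = session.GetARTrackingQuality();
        if (quality != lastQuality)
        {
            lastQuality = quality;
            if (TrackingQualityChanged != null)
                TrackingQualityChanged(quality);
        }
    }
}
```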
     
  8. jessevan

    jessevan

    Joined:
    Jan 3, 2017
    Posts:
    35
@jimmya I've confirmed that when using `config.alignment = UnityARAlignment.UnityARAlignmentGravityAndHeading;` it is the ARPlaneAnchors that jump around, not the camera. I've logged a few frames of a plane's position.

    ```
    F9F2D3B3-0100-0000-060D-2C4CFEFFFFFF: Anchor Translation (-0.3, -0.5, 0.1)
    F9F2D3B3-0100-0000-060D-2C4CFEFFFFFF: Anchor Translation (0.1, -0.5, -0.3)
    F9F2D3B3-0100-0000-060D-2C4CFEFFFFFF: Anchor Translation (-0.3, -0.5, 0.0)
    F9F2D3B3-0100-0000-060D-2C4CFEFFFFFF: Anchor Translation (0.2, -0.5, 0.2)
    F9F2D3B3-0100-0000-060D-2C4CFEFFFFFF: Anchor Translation (0.2, -0.5, -0.3)
    F9F2D3B3-0100-0000-060D-2C4CFEFFFFFF: Anchor Translation (-0.3, -0.5, 0.0)
    F9F2D3B3-0100-0000-060D-2C4CFEFFFFFF: Anchor Translation (0.1, -0.5, 0.3)
    ```

    You can see it pops back and forth between a few possible values. Do you think this is a bug in ARKit?

     
    rlzh likes this.
  9. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    It's possible - let me check.
     
    jessevan likes this.
  10. Blarp

    Blarp

    Joined:
    May 13, 2014
    Posts:
    269
    I can get tracking, and I can get portal, but putting them together in the same scene is the main issue.

This is the portal source I'm building off of:
    https://github.com/KillianMcCabe/SmoothPortals

    This is my source:
    https://github.com/UnityEQ/ARPortal
     
  11. chenditc_hyfield

    chenditc_hyfield

    Joined:
    Dec 28, 2016
    Posts:
    3
Is OpenGL ES 2 rendering a possible option to support?

I tried to integrate this plugin into my existing project, which requires OpenGL ES 2 to render the screen (for legacy reasons). If I simply drop the plugin in, it is able to run the world tracking, but the camera image does not render into the scene. I guess that's because I was using OpenGL as the rendering API, while this plugin sends the image through the Metal API.

Is there a way to enable OpenGL ES 2 rendering with this plugin?
     
  12. lusas

    lusas

    Joined:
    Mar 18, 2015
    Posts:
    6
Is anyone using the ARKit asset from the Asset Store? I'm trying to compile an empty project and it fails to build in Xcode:

    Screen Shot 2017-07-18 at 09.31.24.png
     
  13. Blarp

    Blarp

    Joined:
    May 13, 2014
    Posts:
    269
Had the same problem. I updated to iOS 11 beta 3, the newest Unity beta, and Xcode 9 beta 3.

When I was having the problem, I was running beta 2 of iOS and Xcode. I forget the Unity version; I just constantly update it.
     
    christophergoy, Gharry and jimmya like this.
  14. rahuxx

    rahuxx

    Joined:
    May 8, 2009
    Posts:
    537
How do I do image-based triggering with the Unity ARKit plugin?
I have an image from which I want to load a 3D model.
Any suggestion would be helpful.
     
  15. mimminito

    mimminito

    Joined:
    Feb 10, 2010
    Posts:
    780
You can't. ARKit does not support image-based tracking. If you would like to do this, you would be better off using something like Vuforia to handle the tracking.
The differences between the two are:
- ARKit uses VIO (Visual Inertial Odometry) to track an environment. It analyses the camera frames and combines this with the sensors on the device to provide robust tracking.
- Vuforia uses natural feature tracking to recognise an image and track its location. Once the image is out of view, however, the tracking is lost.

    You could combine the two, so you can initiate your experience using Vuforia and then switch to ARKit, but you may get some flicker or a stop/start between the two camera systems.
     
  16. OneOfHaze

    OneOfHaze

    Joined:
    Jun 1, 2017
    Posts:
    7
    Blarp and rahuxx like this.
  17. Racancoj

    Racancoj

    Joined:
    Jan 15, 2015
    Posts:
    1
Does anybody know how to detect whether the camera is detecting a plane or not? Currently I'm working just with horizontal planes, but sometimes the camera is not detecting any planes, so I would like to tell the user whether or not they can touch the screen to instantiate an object. Thanks!
     
  18. jenielkp

    jenielkp

    Joined:
    Jul 18, 2017
    Posts:
    1
    Now WRLD is supporting ARKit too. Check their blog for future updates.
     
  19. mimminito

    mimminito

    Joined:
    Feb 10, 2010
    Posts:
    780
You can check the tracking state to see if it's valid or not (UnityARTrackingState). You can also check the tracking state reason to warn your users of any issues with tracking (https://developer.apple.com/documentation/arkit/arcamera.trackingstatereason). For planes, I would check the UnityARAnchorManager to see if any planes are being tracked.
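
Something like this might work (untested sketch; it assumes UnityARAnchorManager exposes its tracked planes through a GetCurrentPlaneAnchors() accessor, so check the plugin source for the exact API):

```
using UnityEngine;
using UnityEngine.XR.iOS; // adjust to your plugin version if needed

// Exposes a simple bool you can poll before letting the user place objects.
public class PlaneAvailability : MonoBehaviour
{
    private UnityARAnchorManager anchorManager;

    void Start()
    {
        // UnityARAnchorManager subscribes itself to the plane anchor events.
        anchorManager = new UnityARAnchorManager();
    }

    public bool HasTrackedPlane()
    {
        // Assumed accessor; see UnityARAnchorManager in the plugin source.
        return anchorManager.GetCurrentPlaneAnchors().Count > 0;
    }
}
```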
     
  20. rracancoj

    rracancoj

    Joined:
    Jul 18, 2017
    Posts:
    4
Thanks for your reply! I just need something like a bool variable that tells me whether I'm able to place an object or not. I have been trying to figure it out with UnityARAnchorManager but did not succeed. Where do you think I can find out whether a plane is being detected?
     
  21. osarda

    osarda

    Joined:
    Aug 20, 2014
    Posts:
    6
    Hello Unity community,

Just started checking out the Unity ARKit plugin, and I've got a couple of questions. I was a user of Vuforia before this, and it worked quite well, but not as robustly as the tracking in the new Apple AR API.

I have imported a couple of legacy characters into my ARKit scene, and they seem to drift forward nonstop even though I have 'Foot IK' checked in the Animator. My humanoid characters animate properly, but I can't figure out where the drift is coming from. Any ideas?

Also: the Camera Parent should probably be zeroed out in position and rotation, but is it okay to tweak the translation and rotation of the child Main Camera? The characters always start in the wrong place and with the wrong rotation. Is there a way to make sure a game object appears at exactly the right scale, rotation, and position when the AR experience starts?

I'm trying to figure this out, but there isn't much documentation or many tutorials out there yet, for obvious reasons. Any help or advice would be greatly appreciated!
    Thanks in advance.

    OSARDA
     
  22. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Not easily - the ARKit framework has a dependency on Metal.
     
  23. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
Use the HitTest API (exposed via the plugin) and filter with existingPlaneUsingExtent in the result types: https://developer.apple.com/documentation/arkit/arhittestresult.resulttype
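
As a rough example (untested sketch; it assumes the plugin's HitTest takes an ARPoint in normalized screen coordinates and that UnityARMatrixOps is available, as in the example scenes):

```
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS; // adjust to your plugin version if needed

// Returns true (and the hit position) if there is an existing plane,
// within its extent, under the given screen point.
public class PlaneHitTester : MonoBehaviour
{
    public bool TryHitPlane(Vector2 screenPos, out Vector3 worldPos)
    {
        // ARKit hit tests use normalized screen coordinates.
        var point = new ARPoint
        {
            x = screenPos.x / Screen.width,
            y = screenPos.y / Screen.height
        };

        List<ARHitTestResult> results = UnityARSessionNativeInterface
            .GetARSessionNativeInterface()
            .HitTest(point, ARHitTestResultType.ARHitTestResultTypeExistingPlaneUsingExtent);

        foreach (ARHitTestResult result in results)
        {
            worldPos = UnityARMatrixOps.GetPosition(result.worldTransform);
            return true;
        }

        worldPos = Vector3.zero;
        return false;
    }
}
```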
     
  24. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    There are some resources in the README.md, TUTORIAL.txt and SCENES.txt files in the project. There are also a few tutorial videos linked in posts in this forum. As mentioned, you should try and understand how ARKit works to begin with, then see how the plugin projects implement some of the functionality.

The main camera in the scene is not something you move around; it gets set to (0,0,0) when you begin the session, and moving the device around controls the movement of that camera. You should place the gameobjects you want in the world, and not as children of the camera.
     
  25. rracancoj

    rracancoj

    Joined:
    Jul 18, 2017
    Posts:
    4
I really appreciate your reply, but what I'm looking for is to know, before the user taps on the screen, whether there is an available plane, as shown in this demo. I've been trying with UnityARSessionNativeInterface.ARAnchorAddedEvent but did not find a way to get what I'm looking for.
     


  26. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793

You don't have to wait for the user to tap to do a HitTest: keep doing a HitTest on the center of the screen every frame, and update the "focus" square based on the result (they solidify it when they have found a suitable plane under the center of the screen).

BTW, the ARAnchorAddedEvent should also work. You can have it trigger and generate an object at the center of the plane it has found (it will work differently from the HitTest above, since the center of the plane found could be offscreen). This is exactly what we do in UnityARAnchorManager.
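
For the event route, a minimal sketch (untested; it assumes the event delivers an ARPlaneAnchor whose transform UnityARMatrixOps can convert, so double-check against the plugin source):

```
using UnityEngine;
using UnityEngine.XR.iOS; // adjust to your plugin version if needed

// Spawns a marker at the center of every plane ARKit reports.
public class PlaneAnchorSpawner : MonoBehaviour
{
    public GameObject markerPrefab; // assign in the inspector

    void OnEnable()
    {
        UnityARSessionNativeInterface.ARAnchorAddedEvent += OnAnchorAdded;
    }

    void OnDisable()
    {
        UnityARSessionNativeInterface.ARAnchorAddedEvent -= OnAnchorAdded;
    }

    void OnAnchorAdded(ARPlaneAnchor anchor)
    {
        // Convert the ARKit anchor transform into Unity world space.
        Vector3 pos = UnityARMatrixOps.GetPosition(anchor.transform);
        Quaternion rot = UnityARMatrixOps.GetRotation(anchor.transform);
        Instantiate(markerPrefab, pos, rot);
    }
}
```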
     
  27. eddiecohen

    eddiecohen

    Joined:
    Apr 15, 2016
    Posts:
    1
    Does anybody have a good solution to efficiently distribute builds over the cloud? Unity Cloud Build does not support Xcode 9 yet.
     
  28. jessevan

    jessevan

    Joined:
    Jan 3, 2017
    Posts:
    35
If you're looking to beta test with a larger audience, TestFlight works fine with Xcode 9.
     
  29. jessevan

    jessevan

    Joined:
    Jan 3, 2017
    Posts:
    35
@jimmya did you end up replicating this? What do you think: should I open a ticket with Apple, or is there a chance it's related to the plugin?
     
  30. rahuxx

    rahuxx

    Joined:
    May 8, 2009
    Posts:
    537
    @jimmya,

We are using an iPad mini 2 and an iPad Air (2nd generation). Will these work for Unity ARKit development, or do I need an iPhone 6s or iPad (2017)?
     
  31. mimminito

    mimminito

    Joined:
    Feb 10, 2010
    Posts:
    780
    Those devices will not work.
    The devices that use A9 or A10 chips are:
    • iPhone 6s and 6s Plus
    • iPhone 7 and 7 Plus
    • iPhone SE
    • iPad Pro (9.7, 10.5 or 12.9)
    • iPad (2017)
     
    jimmya likes this.
  32. theiajsanchez

    theiajsanchez

    Joined:
    Feb 2, 2017
    Posts:
    18
    Hi everyone,

Maybe someone can help me with this problem: I need to get the image that the ARCamera is showing in order to broadcast it.

So I tried to get the ARVideoTextureHandles from the ARSession interface; after that, I use the UpdateExternalTexture method from the Texture2D class to copy it, encode it to PNG, and convert it to base64 so I can send it through a network connection.

When I receive that base64 string on the server and decode it, the image is full of gray pixels.

Can someone help me send the camera image over the network?

    Thanks in advance
     
  33. mimminito

    mimminito

    Joined:
    Feb 10, 2010
    Posts:
    780
So you're just trying to broadcast the device's camera feed, not the AR experience? This line from the TUTORIAL.txt file might help you understand why you are seeing those pixels:

"7. On the main camera for the scene, add the UnityARVideo MonoBehaviour component, and set the clear material in the inspector to point to the YUVMaterial in the project. You can look in the source at what this does: every frame, it takes the two textures that make up the video that ARKit wants to display from the camera, and uses the YUVMaterial shader to combine them into the background that is rendered by the camera. (see UnityARVideo.cs)"

I think someone has asked this question before on this thread; if you do a little search, you'll find a suggestion was posted.
     
    theiajsanchez and jimmya like this.
  34. idorurez

    idorurez

    Joined:
    Dec 28, 2015
    Posts:
    13
    Does anyone know if I can "transport" the Camera/user to a different part of a scene? Or similarly, maybe swap to a different camera?

    Playing with the ARCameraManager components hasn't yielded any results.

    Can anyone confirm that the position of the Unity camera will always be tied to the ARKit's camera?
     
  35. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Unity camera that is specified in ARCameraManager will always have its transform updated by the input pose provided by ARKit. If you want to move to another area of the scene, you can instead move the rest of the scene in the opposite direction in relation to the camera as has been described in previous posts. You can use other cameras that are not affected by the transform sent by ARKit, but then basically you will not be using ARKit to "steer" the camera.
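
A minimal sketch of that idea (untested; worldRoot here is a hypothetical parent transform you give all of your placed content, i.e. everything except the ARKit-driven camera):

```
using UnityEngine;

// "Teleports" the user by shifting the world instead of the camera,
// since ARKit keeps driving the camera's real pose.
public class WorldTeleporter : MonoBehaviour
{
    public Transform worldRoot; // parent of all placed content
    public Transform arCamera;  // the camera driven by ARKit

    // After this call, the camera appears to stand at targetPosition
    // relative to the content.
    public void TeleportCameraTo(Vector3 targetPosition)
    {
        Vector3 offset = targetPosition - arCamera.position;
        worldRoot.position -= offset;
    }
}
```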
     
  36. idorurez

    idorurez

    Joined:
    Dec 28, 2015
    Posts:
    13
    Thank you for confirming my suspicions, and I appreciate the feedback. I was hoping I would not have to move the entire scene.

To clarify: if I update the camera in ARCameraManager with SetCamera midway through a session, will it be controlled by the input pose from ARKit? Should I maybe also reinitialize the tracking session?
     
  37. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    Hi all,

    We have added support for three features that were missing.
    1.) Add/Remove Anchors
    2.) Callbacks for ARKit Session Interruption status
    3.) A callback for ARKit tracking changed events.

    There is a new example scene named AddRemoveAnchorScene, that uses a new component named UnityARUserAnchorComponent. This component adds/removes anchors to ARKit based on the life cycle of the object. This is one way to use the add/remove anchor API, but don't feel limited by our implementation. If you want to implement your own version, there are methods on UnityARSessionNativeInterface named AddUserAnchor, AddUserAnchorFromGameObject, and RemoveUserAnchor.

    ARKit provides callbacks for when a session is interrupted (app going to background/phone call/etc.), and when the interruption has ended. You can now subscribe to these events through the plugin. The events exist on the UnityARSessionNativeInterface and are named ARSessionInterruptedEvent and ARSessioninterruptionEndedEvent.

    ARKit also provides a callback for when the tracking state changes. You can now subscribe to the ARSessionTrackingChangedEvent on UnityARSessionNativeInterface.
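
For example, a minimal subscription sketch (untested; the handler signatures below are assumptions, so check UnityARSessionNativeInterface for the exact delegate types):

```
using UnityEngine;
using UnityEngine.XR.iOS; // adjust to your plugin version if needed

// Logs the new session callbacks as they fire.
public class SessionEventLogger : MonoBehaviour
{
    void OnEnable()
    {
        UnityARSessionNativeInterface.ARSessionInterruptedEvent += OnInterrupted;
        UnityARSessionNativeInterface.ARSessioninterruptionEndedEvent += OnInterruptionEnded;
        UnityARSessionNativeInterface.ARSessionTrackingChangedEvent += OnTrackingChanged;
    }

    void OnDisable()
    {
        UnityARSessionNativeInterface.ARSessionInterruptedEvent -= OnInterrupted;
        UnityARSessionNativeInterface.ARSessioninterruptionEndedEvent -= OnInterruptionEnded;
        UnityARSessionNativeInterface.ARSessionTrackingChangedEvent -= OnTrackingChanged;
    }

    void OnInterrupted()
    {
        Debug.Log("ARKit session interrupted");
    }

    void OnInterruptionEnded()
    {
        Debug.Log("ARKit session interruption ended");
    }

    // Assumed to receive the session's UnityARCamera; adjust if the
    // delegate differs in your plugin version.
    void OnTrackingChanged(UnityARCamera camera)
    {
        Debug.Log("ARKit tracking state changed: " + camera.trackingState);
    }
}
```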

    The asset store package will take a couple of days to update, but you can get it from the bitbucket repository immediately here.
    As always, please let us know if you have any feedback. We will be watching the forums.

    Cheers,
    Chris

    edit: Mentioning people that requested these
    @lusas @rockstarsaad @roooo
     
    Last edited: Jul 19, 2017
  38. TJUnityBuilder

    TJUnityBuilder

    Joined:
    Jul 19, 2017
    Posts:
    2
I built and loaded UnityParticlePainter with no problem, but I'm only getting a blue screen and the Color Picker. Is there something I'm doing wrong?
     
  39. Staus

    Staus

    Joined:
    Jul 7, 2014
    Posts:
    13
Thanks for the great work, guys! Just voicing as well that Vuforia support would be really appreciated! :) We're a team building a big installation with multiple users walking around in a VR environment. We've been doing lots of other kinds of tracking, but the flexibility and ease of implementation of ARKit is really a plus. The only downside is that we're afraid of users getting sidetracked. Of course we'll give them extra physical space for the expected drifting (this is art ;) It's not meant to be perfect), but having different areas where a marker could get the scene back on track would just be so useful!

I've done a little test of my own in ARKit: by calibrating 3 predefined locations, setting the scene at their average, and rotating the scene towards the green location, I could get a room mesh to fit quite well (see the result around 53 sec):


The green mesh was scanned using a HoloLens and was just for reference. I'm sorry, it probably looks pretty confusing, but it helped to get an idea of the tracking quality. You can especially see drifting along the staircase and upstairs when looking at the pillar. And later, when I go downstairs again, the 3 locations are slightly misplaced.
On the plus side, I learned that a HoloLens scan is 1:1 with the ARKit world.

I actually find this challenge super fun :) Someone mentioned beacons as well. I've also been considering simply having some dedicated Vive tracking areas and then just using ARKit as transportation tracking between the Vive areas. So every time the user gets close to the Vive border, ARKit would take over, while at the same time also doing a slight adjustment of the world so it aligns again... of course this is not really easy to scale and implement o_O
     
    mimminito and jimmya like this.
  40. IsaiahT

    IsaiahT

    Joined:
    Oct 10, 2014
    Posts:
    1
Hi there, I had the same problem previously; here is how I solved it, along with my guess at the cause.

I have two copies of Xcode on my Mac (8.1 and 9 beta 3). I manually opened the Simulator under
"/Applications/Xcode-beta.app/Contents/Developer/Applications/",
closed it, and restarted Xcode, and everything works.

My guess is that Xcode 9 beta was somehow referring to the old Xcode 8 path for the Simulator.

Hope it helps.
     
  41. oliver-jones

    oliver-jones

    Joined:
    Nov 8, 2010
    Posts:
    25
Hi, I have the Near/Far script already attached, but it's still clipping at a very close range. Any ideas?
     
  42. mmortall

    mmortall

    Joined:
    Dec 28, 2010
    Posts:
    89
I was trying to build an AR application with Unity-ARKit-Plugin for the iOS Simulator, but it failed to compile because the plugin uses some Metal classes, and in Simulator mode only OpenGL is supported.

Errors in ARSessionNative.mm:

`CVMetalTextureCacheRef _textureCache;`: unknown type name, and so on
     
    Last edited: Jul 20, 2017
  43. jessevan

    jessevan

    Joined:
    Jan 3, 2017
    Posts:
    35
    I don't have a solution, but I have experienced the same issue. I've confirmed that I have the code from this commit, but I still find that the far clipping plane is set to 30. I didn't dig any deeper.
     
  44. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    Hi @mmortall,
    As of right now, only Metal is supported for the ARKit plugin.
    Cheers,
    Chris
     
  45. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    Hi @theiajsanchez,
It sounds like you may be sending only one of the textures over the network: the luminance map, which would be grayscale. One option would be to render to a render texture, then read the pixels back as RGB, encode them as PNG, and send that over the network. The other option is to send both textures, the Y and the UV, over the network, and then combine them on the other side of the connection in a shader, like we do in the plugin. Let me know if this helps.
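
Roughly something like this for the first option (untested sketch; it assumes the camera you point it at is the one drawing the AR background via UnityARVideo, and how you ship the resulting string is up to you):

```
using System;
using UnityEngine;

// Renders the AR camera into a RenderTexture, reads it back as RGB,
// and PNG-encodes it as a base64 string.
public class FrameSender : MonoBehaviour
{
    public Camera arCamera; // the camera rendering the AR background

    private RenderTexture rt;
    private Texture2D readback;

    void Start()
    {
        rt = new RenderTexture(256, 256, 24, RenderTextureFormat.ARGB32);
        readback = new Texture2D(256, 256, TextureFormat.RGB24, false);
    }

    public string CaptureFrameAsBase64Png()
    {
        // Render one frame of the camera into our RenderTexture.
        var previousTarget = arCamera.targetTexture;
        arCamera.targetTexture = rt;
        arCamera.Render();
        arCamera.targetTexture = previousTarget;

        // Read the pixels back on the CPU as RGB.
        var previousActive = RenderTexture.active;
        RenderTexture.active = rt;
        readback.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
        readback.Apply();
        RenderTexture.active = previousActive;

        return Convert.ToBase64String(readback.EncodeToPNG());
    }
}
```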
    Cheers,
    Chris
     
    theiajsanchez likes this.
  46. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    Hey @Staus,
    When you say "Vuforia support would be really appreciated!" Do you mean the features that Vuforia has, or the actual Vuforia sdk? Could you expand on that a bit?
    Cheers,
    Chris
     
  47. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Did you actually change the far clipping plane value on the camera to be larger? The script reads the value you have set on the camera.
     
    Last edited: Jul 20, 2017
  48. Snaxz

    Snaxz

    Joined:
    May 13, 2013
    Posts:
    15
Hi all, I'm getting two different crashes in Xcode on two different iOS devices:

    Dev set up:
- Unity 2017.1.0f3
- Xcode 9.0 beta 3 (9M174d)
- macOS High Sierra

    Phones / Crash Info:
    - iPhone 6s - iOS11 (15A5278f)
Xcode crash on run: "dyld: Symbol not found: _OBJC_CLASS_$_MTLToolsArgumentEncoder
    Referenced from: /Developer/Library/PrivateFrameworks/MTLToolsDeviceSupport.framework/libMTLInterpose.dylib
    Expected in: /System/Library/PrivateFrameworks/MetalTools.framework/MetalTools
    in /Developer/Library/PrivateFrameworks/MTLToolsDeviceSupport.framework/libMTLInterpose.dylib"

However, this app does run on the device by itself.

    - iPhone 6s - iOS11 (15A5318g)
Xcode crash on run: "Uncaught exception: UnrecognizedARTrackingStateReason: Unrecognized ARTrackingStateReason: 1"

When the app is run on the device alone, it crashes as soon as the camera turns on.

    Thoughts?
    Thanks!
    J
     
  49. theiajsanchez

    theiajsanchez

    Joined:
    Feb 2, 2017
    Posts:
    18
Hi @christophergoy, thanks for answering! Can you give us some help with the steps you mentioned? Actually I don't know how to render the texture into a render texture, nor how to read the pixels back as RGB.

    This is what we tried (without success):
Code (CSharp):
```
using System;
using UnityEngine;

public class SuperControlador : MonoBehaviour
{
    private WebSocketController ws;

    private Texture2D tex;

    // Use this for initialization
    void Start()
    {
        Application.runInBackground = true;

        ws = WebSocketController.GetInstance();

        tex = new Texture2D(100, 100);
    }

    void Update()
    {
        RenderTexture rt = new RenderTexture(100, 100, 24, RenderTextureFormat.ARGB32, RenderTextureReadWrite.Default);
        rt.antiAliasing = 1;
        rt.filterMode = FilterMode.Bilinear;
        rt.useMipMap = false;
        rt.wrapMode = TextureWrapMode.Clamp;

        if (rt.Create())
        {
            Shader.SetGlobalTexture(Shader.PropertyToID("_textureY"), rt);

            RenderTexture.active = rt;
            tex.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
            tex.Apply();

            ws.SendMessage(Convert.ToBase64String(tex.EncodeToPNG()), "stream");
        }
    }
}
```

    Thanks in advance
     
  50. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
Hi @Snaxz,
    The second crash is due to your plugin being out of date. If you update to the latest (either bitbucket or asset store) you should not get the UnrecognizedARTrackingStateReason assert anymore.

    For the first crash, is the iOS 11 build older or newer?
    Cheers,
    Chris
     
  51. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    Hey, is this the code you are using which sends the grayscale image? Or is this new code that doesn't work at all?
    Cheers,
    Chris
     