
ARKit support for iOS via Unity-ARKit-Plugin

Discussion in 'ARKit' started by jimmya, Jun 5, 2017.

  1. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    792
    These are just regular Unity shadows, as described here. In our case we use the directional light in the scene to cast shadows. There is no determination of the actual light direction from the real scene, though you could use your own calculations to change the direction of the directional light, for example.
     
    Aestial likes this.
  2. kchlab

    kchlab

    Joined:
    Jul 8, 2015
    Posts:
    1
    When the app enters the background, the AR world coordinates get messed up. Should I reset AR tracking and remove anchors every time the app comes back to the foreground, or is there a proper way to handle AR tracking across the app lifecycle?
     
  3. christophergoy

    christophergoy

    Unity Technologies

    Joined:
    Sep 16, 2015
    Posts:
    484
    Hey @Grislymanor,
    Thanks for the feedback. My language may have been too harsh. I encourage posting generic bugs to a wider audience rather than in a thread dedicated to a specific feature, so that someone who isn't using ARKit but has the same issue might see them.
    Cheers,
    Chris
     
    KwahuNashoba and wdrescher like this.
  4. christophergoy

    christophergoy

    Unity Technologies

    Joined:
    Sep 16, 2015
    Posts:
    484
    Hey @ironcobratv,
    Looking at the log I see this
    Code (CSharp):
    2017-07-03 10:50:48.759698-0500 arkitscene[501:105951] [Technique] World tracking performance is being affected by resource constraints [1]
    This might mean that the camera tracking isn't working very well and the camera may be 'flying' away due to performance constraints. Are you doing any heavy scripting? You can try to orient yourself by placing objects near the camera in the scene. If you see them flying away, you know that something bad is happening :)

    Cheers,
    Chris
     
  5. christophergoy

    christophergoy

    Unity Technologies

    Joined:
    Sep 16, 2015
    Posts:
    484
    Hey @kchlab,
    You can check the tracking status to see if ARKit is having trouble tracking when coming to the foreground from the background. And yes, the typical pattern is to reset the tracking state of ARKit, like you proposed above, if things go awry.
    Cheers,
    Chris
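    A hedged sketch of that reset-on-foreground pattern, using the configuration class and run options that appear elsewhere in this thread (the lifecycle hook and plane-detection settings are illustrative choices):

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

// Illustrative sketch: reset ARKit tracking when the app returns to the
// foreground, since anchors may no longer line up after an interruption.
public class ForegroundReset : MonoBehaviour
{
    void OnApplicationPause(bool paused)
    {
        if (!paused) // coming back to the foreground
        {
            var config = new ARKitWorldTackingSessionConfiguration(
                UnityARAlignment.UnityARAlignmentGravity,
                UnityARPlaneDetection.Horizontal);
            UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfigAndOptions(
                config,
                UnityARSessionRunOption.ARSessionRunOptionResetTracking |
                UnityARSessionRunOption.ARSessionRunOptionRemoveExistingAnchors);
        }
    }
}
```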
     
  6. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    807
    What's the recommended way to import models into Unity at real-world coordinates so they match 1:1 with ARKit?
     
  7. christophergoy

    christophergoy

    Unity Technologies

    Joined:
    Sep 16, 2015
    Posts:
    484
    Hi @ina,
    Both ARKit and Unity use a world scale where 1 unit = 1 meter, so any modeling software you use should export at that same scale.
    Cheers,
    Chris
     
  8. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    792
    This is answered in the WWDC session; <https://developer.apple.com/videos/play/wwdc2017/602/?time=1582>

    "During an interruption, it's also important to note that because no tracking is happening, the relative position of your device won't be available.
    So if you had anchors or physical locations in the scene, they may no longer be aligned if there was movement during this interruption. So for this, you may want to optionally restart your experience when you come back from an interruption."
     
  9. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    807
    Not sure if this is a feature request, but could the generated plane have evenly spaced UVs, so that a grid texture can be tiled on evenly, similar to the original ARKit example on stage where a grid is shown?
     
  10. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    792
    This should be pretty straightforward in Unity: you can use a wrappable grid texture and have a script on the plane that updates the material tiling to match the plane extents.
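    A minimal sketch of such a script (assuming, as the plugin's generated plane prefabs do, that the plane's extents are encoded in its transform scale; the class name and tile size are illustrative):

```csharp
using UnityEngine;

// Illustrative sketch: tiles a wrappable grid texture across a generated plane
// so each grid cell keeps a fixed real-world size as the plane extents grow.
public class GridTiler : MonoBehaviour
{
    public float metersPerTile = 0.1f; // one grid cell per 10 cm

    private Material material;

    void Start()
    {
        material = GetComponent<Renderer>().material;
    }

    void Update()
    {
        // Assumes the plane's extents are reflected in localScale;
        // mainTextureScale is the script equivalent of the material Tiling field.
        Vector3 scale = transform.localScale;
        material.mainTextureScale = new Vector2(scale.x / metersPerTile,
                                                scale.z / metersPerTile);
    }
}
```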
     


  11. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    807
    Have you been able to get Vuforia image targets to work with ARKit? It seems only one or the other works: if the Vuforia camera activates, the ARKit camera does not, and vice versa.
     
  12. sugarbank

    sugarbank

    Joined:
    Aug 27, 2015
    Posts:
    6
  13. OneOfHaze

    OneOfHaze

    Joined:
    Jun 1, 2017
    Posts:
    7
    Hi,

    I'm trying to use the texture from my iPad's camera with ARKit. In the editor everything works fine, and the ARKit demo build works fine on the iPad when I'm not trying to use the camera. However, my app crashes when I try to use the WebCamTexture. The crash happens when I call the Play() method. Here is my code:
    Code (CSharp):
    void InitWebCam()
    {
        WebCamDevice[] devices = WebCamTexture.devices;

        if (devices.Length > 0) {
            Debug.Log ("found a webcam");
            webCamTexture = new WebCamTexture ();
            webCamTexture.Play ();
        }
    }
    Is it not possible to use the WebCamTexture with ARKit? In Xcode I've added
    Code (CSharp):
    <key>NSCameraUsageDescription</key>
    <string>This application will use the camera for Augmented Reality.</string>
    to the plist file, because the ARKit demo build wouldn't work without it.

    Basically I'm just trying to get a Texture2D of the camera feed. I also tried to use the ARTextureHandles class but wasn't sure how to get a working texture from there.

    Any help is appreciated!
     
  14. Dami-Juana

    Dami-Juana

    Joined:
    Jul 21, 2014
    Posts:
    18
    Hello everyone,

    I'd like to know if it is possible to develop with ARKit from the Windows side of the force.
    We currently have a HoloLens project that we want to adapt for iOS devices, and I wanted to know if the Unity 5.6.1p1 patch for Windows includes what's necessary to build a valid ARKit .xcodeproj.

    I didn't find anything on the web confirming that it is not possible (on the other hand, I haven't seen anybody developing with ARKit from a Windows PC either).

    Thank you!
     
  15. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    792
    ARKit basically takes over the camera when activated, so you will not be able to get a camera feed directly if you want ARKit to work.
    The UnityARVideo script uses the two ARTextureHandles and the YUV shader in a command buffer to render the video to the background. There are various ways to save your own copy of this: one would be to create a RenderTexture and have the command buffer blit to it as well. Then you can use it as a regular texture.
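    A hedged sketch of that RenderTexture approach (this assumes the same YUV material that UnityARVideo drives with the ARTextureHandles; the class name and camera event are illustrative, not the plugin's shipped code):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Illustrative sketch: in addition to the background render that UnityARVideo
// sets up, blit a copy of the camera feed into a RenderTexture you can sample.
public class ARVideoCopy : MonoBehaviour
{
    public Material yuvMaterial;      // the same YUV material UnityARVideo uses
    public RenderTexture videoCopy;   // your own copy of the camera feed

    void Start()
    {
        var cb = new CommandBuffer();
        // Blitting with the YUV material converts the camera's two
        // texture planes to RGB as it copies into videoCopy.
        cb.Blit(null, videoCopy, yuvMaterial);
        GetComponent<Camera>().AddCommandBuffer(CameraEvent.BeforeForwardOpaque, cb);
    }
}
```

    After this runs each frame, videoCopy can be assigned anywhere a regular texture is expected.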
     
    kvzantenisaac likes this.
  16. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    792
    Unfortunately, there is no easy way to develop for iOS on Windows. There may be various things you can put together, but the simplest way is just to get a mac (you could even get a mac mini for starters). You could certainly share the unity project between mac and windows (for the most part).
     
    Last edited: Jul 6, 2017
  17. Thirty9

    Thirty9

    Joined:
    Apr 6, 2015
    Posts:
    16
    Hello!
    I'm having a problem when the app is backgrounded. When I bring my game back to the foreground, all the objects are gone or floating in the air. So I tried to pause the AR session in the OnApplicationFocus function. However, after the session is paused, I am not able to resume it by calling UnityARSessionNativeInterface.GetARSessionNativeInterface ().RunWithConfig.

    Does anyone know how to fix the background/foreground issue?

    Thanks!
     
    DerekLerner and jessevan like this.
  18. phil-harvey

    phil-harvey

    Joined:
    Aug 27, 2014
    Posts:
    60
  19. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    792
  20. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    792
    Try RunWithConfigAndOptions - and ResetTracking and RemoveExistingAnchors for your RunOptions
     
    DerekLerner likes this.
  21. spaces_brad

    spaces_brad

    Joined:
    Apr 18, 2017
    Posts:
    2
    There is an even easier way. Just make a large inward-facing room box and put a doorway on one side. Surface the inside of the room with the camera blit texture, which is always in screen-space coords. Then in the doorway make a single poly that faces outside of the room, and put the camera blit texture on that.
    Place the room box in your 3D scene. Place your scene in ARKit.
    Walk around, done.
     
    artfish and jimmya like this.
  22. jessevan

    jessevan

    Joined:
    Jan 3, 2017
    Posts:
    35
    I'm experiencing the same issue. Those run options seem counterintuitive to me; I'd like the objects to maintain their positions.
     
  23. Dami-Juana

    Dami-Juana

    Joined:
    Jul 21, 2014
    Posts:
    18
    Hi again!
    Thank you for your answer, jimmya.
    Usually, when I develop for iOS with Unity, I do my project on Windows, build it for iOS the same way I would build it for Android or Standalone, and only then export the result to a Mac to fire up Xcode (or use the 'iOS builder for Windows' asset to simulate Xcode behaviour).

    Is it possible, when using ARKit (like in any other iOS project), to develop a project on Windows with Unity 5.6.1p1 and only use a Mac with Xcode at the end of the production line?

    Thank you
     
  24. jkparamlabs

    jkparamlabs

    Joined:
    Jun 21, 2017
    Posts:
    3
    I tried the UnityAROcclusion scene, but the GameObjects still show when seen through a plane. Can you explain what the scene does exactly and how you have implemented occlusion?
     
  25. OneOfHaze

    OneOfHaze

    Joined:
    Jun 1, 2017
    Posts:
    7
    This was the answer I was looking for. Got it working now! Big thanks!
     
    jimmya likes this.
  26. Bravo_cr

    Bravo_cr

    Joined:
    Jul 19, 2012
    Posts:
    148
    I would like to bring back two issues we are facing that have already been posted:
    1. How to scale the world without touching any Transform. If you scale a Transform you are going to have problems with lighting, shadows, physics, and particles, and it is not even possible if you are using static meshes, as in a scenario.
    2. How to initialize the origin when you need to. If you have placed a scenario which is marked as static, you can't move it.
     
  27. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    792
    Please read the forums before you post your questions. In this case, it appears this and previous posts answer your question?
     
    KwahuNashoba and DerekLerner like this.
  28. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    792
    From the scenes.txt file:
    UnityAROcclusion.unity: This scene shows how to occlude virtual objects by using an occluder material.

    The GeneratePlanes GameObject now creates instances of an occluderPlanePrefab, which uses an occlusionPlaneMaterial, which in turn uses MobileOcclusion.shader. The material has its render queue set to "Geometry-10", so it renders before any other geometry in the scene. The shader then writes to the depth buffer only, so any objects rendered behind this object will be rejected by the z-test and not rendered.
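    The depth-only idea can be sketched in ShaderLab (a minimal stand-in illustrating the technique, not the shipped MobileOcclusion.shader source):

```shaderlab
Shader "Custom/DepthOnlyOccluder"
{
    SubShader
    {
        // Render before regular geometry so the depth buffer is
        // filled in time to reject anything drawn behind the plane.
        Tags { "Queue" = "Geometry-10" }

        Pass
        {
            ZWrite On     // write depth...
            ColorMask 0   // ...but no color, so the camera feed shows through
        }
    }
}
```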

    Someone trying it out:
     
    Last edited: Jul 7, 2017
  29. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    792
    As I said in my previous answer, there may be ways to do it, but it is not a pipeline that is supported by Unity, and you will complicate your life with an obtuse build pipeline on top of all the other experimental things going on here (beta iOS, beta Xcode, experimental plugin, etc.). If you plan on doing any iOS development at all, I would highly recommend getting at least a Mac mini ($500).
     
  30. jessevan

    jessevan

    Joined:
    Jan 3, 2017
    Posts:
    35
    Thanks for the link. My apologies for making you repeat yourself.

    I realize this has nothing to do with the Unity framework, but it seems like an ARKit bug to me. I don't expect it to relocalize in all situations, but something seems off about how it currently works.

    Using the starter project in Xcode, I can cover the camera, shake the heck out of the phone, and run to a new location, causing the model to no longer be anchored. When I return to the location where I started, it immediately pops back into place. But if I take that same sample app and background it, it more often than not fails to relocalize afterwards.
     
  31. phil-harvey

    phil-harvey

    Joined:
    Aug 27, 2014
    Posts:
    60
  32. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    792
    Cool! There are now so many objects that you can create using Blocks and drag and drop into your Unity project (since it has an OBJ importer)... can't wait to see how you guys use these :)
     
  33. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    792
    Nope, but you could create a shader with "ColorMask 0" that will do the same thing (see MobileOcclusion.shader).
     
  34. DerekLerner

    DerekLerner

    Joined:
    Jun 12, 2017
    Posts:
    19
    I'd like to know the best way to give a user the ability to tap the screen and have the AR object(s) reload and re-lock in position based on the user's current camera view. Is this as simple as an on-click GUI button that reloads the current scene? Any suggestions or fingers pointed in the right direction (...or a script) are appreciated. I'm basically looking for a Unity-ARKit-Plugin solution to what is discussed here...

     
  35. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    792
    No worries man - but it does take a toll :p . When you run to a new location, the model will still be anchored since the session is still active and it can use the sensors and/or camera to keep track of where it is compared to where it was. When you background, your session no longer has access to the sensors or camera, so you lose tracking. When you next come to the foreground, it cannot know where the last tracking stopped. There may be a way to prevent this if ARKit kept the session alive and connected to the sensors (i.e. it was somehow part of the OS rather than the app). But then you would get the overhead of the framework on all your apps even those that did not use ARKit.
     
    DerekLerner likes this.
  36. christophergoy

    christophergoy

    Unity Technologies

    Joined:
    Sep 16, 2015
    Posts:
    484
    Hey @DerekLerner,
    The APIs that Apple has exposed for us to do something like this are the RunWithConfigAndOptions methods. If you take a look at those, you can restart the tracking session. There is an option to leave the current ARAnchors in place, or remove them. I think this is what you are asking about. If you choose to remove the tracking data, all ARAnchors will be removed, but anything that you placed will remain (since they aren't associated with anchors). I don't know if you can "re-lock" the scene to where it was, but you can certainly reload the scene if you want to re-orient the user in the augmented world you've created. Let me know if that answers your question.
    Cheers,
    Chris
     
    Last edited: Jul 7, 2017
    DerekLerner likes this.
  37. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    792
    You can check the swift code and see that all they basically do is an equivalent of what I stated above (and previous posts). When you do that, your camera in the unity scene should automatically reset to the scene origin.
     
    DerekLerner likes this.
  38. jessevan

    jessevan

    Joined:
    Jan 3, 2017
    Posts:
    35
    I suppose there is no point arguing about what it *could* do; we know what it does do. Still, I'm going to :D.

    My point about shaking the phone, covering the camera, and running around is that it can relocalize using purely visual information after it has lost track of its location, even after being fed lots of garbage inertial data. It makes sense that it can do this; that's what SLAM does.

    You wouldn't need to track in the background for this to work, but you would have to ensure that the visual landmarks and map aren't deleted when the app is backgrounded. Neither the video nor the documentation makes it sound like the phone can't relocalize. In fact, the run options make it sound like the opposite is true. What is the point of `resetTracking: false` if it never relocalizes? In fact, what's the point of `pause` if it behaves like `stop`?

    Lastly, I'm not saying it would always work, but it most definitely could work a lot of the time. This is a core feature of SLAM.

    I've submitted a bug report to Apple; we'll see if they think it's a bug.
     
    DerekLerner likes this.
  39. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    792
    Have they stated anywhere that it is SLAM? They have stated that they use Visual Inertial Odometry to do world tracking.
     
    KwahuNashoba likes this.
  40. DerekLerner

    DerekLerner

    Joined:
    Jun 12, 2017
    Posts:
    19
    @jimmya & @christophergoy

    Thank you both for your replies. Yes, basically this: "reload the scene if you want to re-orient the user in the augmented world". I think @jessevan and I are discussing similar concerns: when an app goes to the background and then to the foreground after the user has moved to a new location, there are unexpected results. For what I'm working on right now, reloading the scene might solve this, so I'll keep searching for a good on-click script for that approach.

    Separately... it seems I'm now in over my head, as I have no idea how or where exactly to modify or add "RunWithConfigAndOptions - and ResetTracking and RemoveExistingAnchors for your RunOptions". Am I too far off in thinking this might take place in UnityARCameraManager.cs?

    I appreciate your patience and help as I'm still learning about all of this.
     
  41. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    792
    You need some script code to run when you press your button. Check the links.
     
    KwahuNashoba and DerekLerner like this.
  42. Hazneliel

    Hazneliel

    Joined:
    Nov 14, 2013
    Posts:
    165
    This is really interesting, will you develop a plugin so we can use Unity with the uPlay Stealth headset?
     
    jkparamlabs likes this.
  43. DerekLerner

    DerekLerner

    Joined:
    Jun 12, 2017
    Posts:
    19
    Thanks! @jimmya

    Guess I'm still not doing this correctly. I tried the code below, based on piecing together your reply.

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.UI;
    using System.Collections;

    public class Reload : MonoBehaviour
    {
        public void ReTrack()
        {
            ARKitWorldTackingSessionConfiguration sessionConfig = new ARKitWorldTackingSessionConfiguration (UnityARAlignment.UnityARAlignmentGravity, UnityARPlaneDetection.Horizontal);
            UnityARSessionNativeInterface.GetARSessionNativeInterface ().RunWithConfigAndOptions (sessionConfig, UnityARSessionRunOption.ARSessionRunOptionRemoveExistingAnchors | UnityARSessionRunOption.ARSessionRunOptionResetTracking);
        }
    }

    Receiving the following errors...

     
  44. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    792
    Are you missing the `UnityEngine.XR.iOS` using directive?
     
  45. HonorableDaniel

    HonorableDaniel

    Joined:
    Feb 28, 2007
    Posts:
    2,812
    I have a few thoughts/questions.
    1) This plugin is currently blocking development on other platforms. Can you fix this?
    2) When are you going to integrate this into the main engine?
    3) The documentation seems like it's severely lacking. Also, when is ARKit going to be added to the official unity scripting reference?
    4) Better demos, if you could make one like the one that Apple provides that would be much appreciated.
    5) I can't figure out how to disable ARKit once it's enabled. I'm trying to make a toggle button that toggles AR mode on/off. Any hint?
     
    Last edited: Jul 8, 2017
  46. DerekLerner

    DerekLerner

    Joined:
    Jun 12, 2017
    Posts:
    19
    Hi @jimmya,
    Thanks for replying during the weekend.
    That error does say I'm missing UnityEngine.XR.iOS.
    I'm using Unity 5.6.1p1.
    How do I install UnityEngine.XR.iOS?
    Can't express enough appreciation for all of your help.
     
  47. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    792
    1) This plugin is experimental, so you should not be using it in production. Nevertheless, what do you mean by it blocking development on other platforms? How?
    2) Not anytime soon. First of all, ARKit has not been released yet: it's still in beta.
    3) What docs have you read and how are they lacking? Have you read this forum, the TUTORIAL.txt and other links provided here? Also see answer to question 2. Also, if you want to contribute some documentation, you can send it to me or do a pull request on bitbucket.
    4) See this post.
    5) UnityARSessionNativeInterface.GetARSessionNativeInterface ().Pause ();

    In general, we have made the plugin open-source so that people can alter it to fit their needs, but also so that people can contribute back to improve the experience for others. So if you think there is something lacking that you can fix, please submit a pull request for it on bitbucket.
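    A toggle along those lines might look like the following sketch (the class name is illustrative; it assumes the same configuration values the sample scenes use):

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

// Illustrative sketch of an AR on/off toggle: Pause() stops the session,
// and RunWithConfig() starts it again with a fresh configuration.
public class ARToggle : MonoBehaviour
{
    private bool arEnabled = true;

    public void ToggleAR() // hook this up to your UI button's OnClick
    {
        var session = UnityARSessionNativeInterface.GetARSessionNativeInterface();
        if (arEnabled)
        {
            session.Pause();
        }
        else
        {
            ARKitWorldTackingSessionConfiguration config =
                new ARKitWorldTackingSessionConfiguration(
                    UnityARAlignment.UnityARAlignmentGravity,
                    UnityARPlaneDetection.Horizontal);
            session.RunWithConfig(config);
        }
        arEnabled = !arEnabled;
    }
}
```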
     
    Last edited: Jul 8, 2017
  48. HonorableDaniel

    HonorableDaniel

    Joined:
    Feb 28, 2007
    Posts:
    2,812
    Thanks for the info.
    Compiler errors come up when attempting to build on other platforms. I should be able to import ARKit and not have it affect/block Android, Mac, etc.
    Ok, makes sense.
    Just found TUTORIAL.txt, it is helpful, thanks.
    Thanks
     
    GenieQuest likes this.
  49. morty42

    morty42

    Joined:
    Jan 31, 2017
    Posts:
    1
    Thanks for the great plugin jimmya. Really fun to play with.
    How would you compare performance against a SceneKit based project using ARSCNView?
    Is there overhead in getting the camera texture into Unity?
    Thanks
     
    jimmya likes this.
  50. zeb33

    zeb33

    Joined:
    Nov 17, 2014
    Posts:
    91
    Hi
    What is the best approach for allowing a user to select a plane found in the scene? I see the hit test example can report what type of hit was made, but can you then remove all other planes, keeping only the selected one alive?