Unity-ARKit-Plugin Step by Step

Discussion in 'AR' started by jimmya, Sep 13, 2017.

  1. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Unity-ARKit-Plugin Step by Step

    1. Read up on ARKit and how it works at a high level (https://developer.apple.com/documentation/arkit/understanding_augmented_reality). This plugin provides the scripting API which corresponds to the ARKit native interface, and then builds on it by creating some GameObject components that use that interface.

    2. Please reference our example scenes in this project to see how we built up the provided ARKit example app. UnityARSessionNativeInterface.cs and the NativeInterface folder contain the lowest-level API. See below for details.

    3. Obtain an AR native session interface using UnityARSessionNativeInterface.GetARSessionNativeInterface(). Let's call this m_session in this tutorial. (See UnityARCameraManager.cs)

    4. Create an ARSession by calling m_session.RunWithConfig(config), where config is either an ARKitWorldTrackingSessionConfiguration (6DOF) or an ARKitSessionConfiguration (3DOF), with the corresponding parameters set to determine what kind of session you want. You may also initialize with m_session.RunWithConfigAndOption(config, option), where option allows you to reset the session if one has been started previously. (See UnityARCameraManager.cs, and the combined sketch of steps 3-7 after this list.)

    5. Every update, use m_session.GetCameraPose() to get ARKit's understanding of the camera position and rotation. Use the provided utility functions, which convert between coordinate systems, to determine what the position and rotation of the camera in the Unity scene should be, e.g. camera.transform.localPosition = UnityARMatrixOps.GetPosition(matrix); camera.transform.localRotation = UnityARMatrixOps.GetRotation(matrix); (See UnityARCameraManager.cs)

    6. Every update, use m_session.GetCameraProjection() to get ARKit's understanding of the camera projection parameters, and assign it to camera.projectionMatrix on the Unity camera. (See UnityARCameraManager.cs)

    7. On the main camera for the scene, add the UnityARVideo MonoBehaviour component, and set the clear material in the inspector to point to the YUVMaterial in the project. You can see what this does in the source: every frame, it takes the two textures that make up the video that ARKit wants to display from the camera, and uses the YUVMaterial shader to combine them into the background rendered by the camera. (See UnityARVideo.cs)

    8. At this point, you should be able to place some 3D objects in the scene, build and run it, see the objects you placed, and move the camera around the scene by moving your device in the world.

    9. You can use the HitTest API provided by ARKit to interact with the scene. See https://developer.apple.com/documentation/arkit/arhittestresult.resulttype for the types of things you can hit in the scene. m_session.HitTest(point, resultTypes) returns a list of hit results, which you can use to determine where to place your virtual objects. (See UnityARHitTestExample.cs, and the hit-test sketch after this list.)

    10. If you have asked ARKit to do plane detection for you in the session configuration, you can hook into the events that ARKit returns for those via the plugin: UnityARSessionNativeInterface.ARAnchorAddedEvent, UnityARSessionNativeInterface.ARAnchorUpdatedEvent, and UnityARSessionNativeInterface.ARAnchorRemovedEvent. The delegates for all of these take the form AnchorUpdate(ARPlaneAnchor arPlaneAnchor). You can choose to render GameObjects that correspond with these planes, or just use them to "anchor" your virtual content. (See UnityARAnchorManager.cs, and the event sketch after this list.)

    11. If you would like to get ARKit frame updates, which include the point cloud data (https://developer.apple.com/documentation/arkit/arpointcloud), hook into the event UnityARSessionNativeInterface.ARFrameUpdatedEvent. The delegate is of the form ARFrameUpdated(unityarcamera), and you can get point cloud data from unityarcamera.pointCloudData. (See PointCloudParticleExample.cs)

    12. The ARFrameUpdatedEvent from the previous step can also be used to update the camera position, rotation, and projection matrix (instead of steps 5 and 6), but be aware that those updates arrive at ARKit's frame rate rather than at the update rate of the Unity rendering engine. You may have to use the provided utility functions to get the results in the right coordinate system.

    13. You can obtain the light estimation for the scene (https://developer.apple.com/documentation/arkit/arlightestimate) by calling m_session.GetARAmbientIntensity() every Update. (See UnityARAmbient.cs, and the frame-update sketch covering steps 11-13 after this list.)
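
    Putting steps 3 through 7 together, here is a minimal camera-manager sketch in the spirit of UnityARCameraManager.cs. It assumes the plugin's UnityEngine.XR.iOS namespace, and the configuration field names shown should be verified against your plugin version. Step 7 itself is editor work: add the UnityARVideo component to the camera and assign the YUVMaterial.

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.iOS;

    public class MinimalARCameraManager : MonoBehaviour
    {
        public Camera arCamera; // the scene camera that should track the device

        private UnityARSessionNativeInterface m_session;

        void Start()
        {
            // Step 3: obtain the native session interface
            m_session = UnityARSessionNativeInterface.GetARSessionNativeInterface();

            // Step 4: run a world-tracking (6DOF) session
            ARKitWorldTrackingSessionConfiguration config = new ARKitWorldTrackingSessionConfiguration();
            config.alignment = UnityARAlignment.UnityARAlignmentGravity;
            config.planeDetection = UnityARPlaneDetection.Horizontal;
            config.getPointCloudData = true;     // needed for step 11
            config.enableLightEstimation = true; // needed for step 13
            m_session.RunWithConfig(config);
        }

        void Update()
        {
            // Step 5: drive the Unity camera from the ARKit camera pose
            Matrix4x4 pose = m_session.GetCameraPose();
            arCamera.transform.localPosition = UnityARMatrixOps.GetPosition(pose);
            arCamera.transform.localRotation = UnityARMatrixOps.GetRotation(pose);

            // Step 6: match the ARKit camera projection
            arCamera.projectionMatrix = m_session.GetCameraProjection();
        }
    }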
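
    For step 9, here is a minimal hit-test sketch modeled on UnityARHitTestExample.cs; the ARPoint and ARHitTestResult usage follows the plugin source, but treat the details as assumptions to double-check.

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.iOS;

    public class MinimalHitTest : MonoBehaviour
    {
        public Transform objectToPlace; // virtual content to position on a hit

        void Update()
        {
            if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
                return;

            // ARKit expects the tap in normalized viewport coordinates
            Vector3 viewportPoint = Camera.main.ScreenToViewportPoint(Input.GetTouch(0).position);
            ARPoint point = new ARPoint { x = viewportPoint.x, y = viewportPoint.y };

            var session = UnityARSessionNativeInterface.GetARSessionNativeInterface();
            List<ARHitTestResult> results = session.HitTest(point,
                ARHitTestResultType.ARHitTestResultTypeExistingPlaneUsingExtent);

            foreach (ARHitTestResult result in results)
            {
                // Place the object at the first hit on a detected plane
                objectToPlace.position = UnityARMatrixOps.GetPosition(result.worldTransform);
                objectToPlace.rotation = UnityARMatrixOps.GetRotation(result.worldTransform);
                break;
            }
        }
    }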
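
    For step 10, subscribing and unsubscribing to the plane anchor events looks roughly like the following (the identifier field is taken from the plugin's ARPlaneAnchor; verify against your version).

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.iOS;

    public class MinimalAnchorListener : MonoBehaviour
    {
        void OnEnable()
        {
            // Step 10: plane anchor lifecycle events
            UnityARSessionNativeInterface.ARAnchorAddedEvent += OnAnchorAdded;
            UnityARSessionNativeInterface.ARAnchorUpdatedEvent += OnAnchorUpdated;
            UnityARSessionNativeInterface.ARAnchorRemovedEvent += OnAnchorRemoved;
        }

        void OnDisable()
        {
            UnityARSessionNativeInterface.ARAnchorAddedEvent -= OnAnchorAdded;
            UnityARSessionNativeInterface.ARAnchorUpdatedEvent -= OnAnchorUpdated;
            UnityARSessionNativeInterface.ARAnchorRemovedEvent -= OnAnchorRemoved;
        }

        void OnAnchorAdded(ARPlaneAnchor anchor)
        {
            // Spawn a plane visualization here, or just "anchor" content to it
            Debug.Log("Plane added: " + anchor.identifier);
        }

        void OnAnchorUpdated(ARPlaneAnchor anchor)
        {
            Debug.Log("Plane updated: " + anchor.identifier);
        }

        void OnAnchorRemoved(ARPlaneAnchor anchor)
        {
            Debug.Log("Plane removed: " + anchor.identifier);
        }
    }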
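
    And for steps 11 through 13, a frame-update sketch along the lines of PointCloudParticleExample.cs and UnityARAmbient.cs. Point cloud data only arrives if getPointCloudData was enabled in the configuration, and the divide-by-1000 light scale is an assumption based on ARKit reporting roughly 1000 lumens in a well-lit scene.

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.iOS;

    public class MinimalFrameListener : MonoBehaviour
    {
        public Light sceneLight; // light to scale by the ambient estimate

        void OnEnable()
        {
            // Step 11: this callback fires at ARKit's frame rate, not Unity's (step 12)
            UnityARSessionNativeInterface.ARFrameUpdatedEvent += OnFrameUpdated;
        }

        void OnDisable()
        {
            UnityARSessionNativeInterface.ARFrameUpdatedEvent -= OnFrameUpdated;
        }

        void OnFrameUpdated(UnityARCamera unityARCamera)
        {
            // Step 11: raw feature points detected in the current frame
            Vector3[] points = unityARCamera.pointCloudData;
            if (points != null)
                Debug.Log("Point cloud size: " + points.Length);
        }

        void Update()
        {
            // Step 13: ambient light estimate; ~1000 corresponds to a well-lit scene
            float intensity = UnityARSessionNativeInterface.GetARSessionNativeInterface().GetARAmbientIntensity();
            if (sceneLight != null)
                sceneLight.intensity = intensity / 1000.0f;
        }
    }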
     
  2. jalemanyf

    jalemanyf

    Joined:
    Jun 16, 2013
    Posts:
    25
    Hello,

    I have completed the tutorial (thanks for making it!), but now I want to customize it with my own avatar. Where can I download more faces? Is there any tutorial for adding all the blend shapes to a 3D face model so it can be used with ARKit face tracking?

    Thanks
     
  3. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,983
    Edit: fixed
     
    Last edited: Mar 29, 2018
  4. farazs

    farazs

    Joined:
    Dec 7, 2016
    Posts:
    10
    Hi,
    The field of view of the camera is not correct on the 2018 iPad (9.7"): the camera resolution is 720p even though I am using iOS 11.4, and the camera looks zoomed in. It works fine on my iPhone 7 Plus, where the camera resolution is full HD.
     
  5. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
    Hi Farazs,

    The camera resolutions used in an ARSession are lower than the maximum resolution that the camera can provide outside of the ARSession.

    For instance, I am running iOS 12 beta on an iPhone X.

    The supported video formats available (varying based on ARConfiguration type) are:
    • ARWorldTrackingConfiguration.supportedVideoFormats
      • 1920 x 1440 @ 60 fps
      • 1920 x 1080 @ 60 fps
      • 1280 x 720 @ 60 fps
    • ARFaceTrackingConfiguration.supportedVideoFormats
      • 1280 x 720 @ 60 fps
      • 1280 x 720 @ 30 fps
    For more information on setting the preferred video format for the ARSession, please reference this documentation.
    https://developer.apple.com/documentation/arkit/arconfiguration/2942260-videoformat?language=objc
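
    If you are using the Unity plugin, you can select one of these formats before running the session. Here is a rough sketch assuming the UnityARVideoFormat helper that shipped with the plugin's ARKit 1.5 support (used by the ARVideoFormats example scene); the field names below are assumptions to verify against your plugin version.

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.iOS;

    public class HighestResolutionFormat : MonoBehaviour
    {
        void Start()
        {
            // Enumerate the video formats this device supports (assumed helper)
            List<UnityARVideoFormat> formats = UnityARVideoFormat.SupportedVideoFormats();
            UnityARVideoFormat best = formats[0];
            foreach (UnityARVideoFormat f in formats)
                if (f.imageResolutionWidth > best.imageResolutionWidth)
                    best = f;

            // Run the session with the chosen format
            ARKitWorldTrackingSessionConfiguration config = new ARKitWorldTrackingSessionConfiguration();
            config.videoFormat = best.videoFormatPtr; // native handle to the ARVideoFormat
            UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfig(config);
        }
    }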

    Todd
     
  6. JoMaHo

    JoMaHo

    Joined:
    Apr 2, 2017
    Posts:
    94
    Hi!
    I've tried using the provided ARVideoFormats example in ARKit 1.5, but it seems that my phone will only record video in 720p, and I need 1080p. Using Unity 2018.2.13, iPhone 7, Xcode 10.1, iOS 12.1 beta 5.
     
  7. JoMaHo

    JoMaHo

    Joined:
    Apr 2, 2017
    Posts:
    94
    Bump - anyone with an idea why I cannot record an ARKit scene in 1080p?
     
  8. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
    Is there something like this for ARKit 3?