Official AR Foundation support for iPhone 12 Pro with LiDAR

Discussion in 'AR' started by todds_unity, Oct 13, 2020.

  1. todds_unity

    todds_unity

    Joined:
    Aug 1, 2018
    Posts:
    324
    AR Foundation supports the new iPhone 12 Pro's LiDAR features in its 4.0 (verified) and 4.1 (preview) versions.
    • AR Foundation 4.0 introduced ARKit scene mesh reconstruction support which is now available on the new iPhone 12 Pro through the ARMeshManager.
    • AR Foundation 4.1-preview introduced environment depth textures and automatic occlusion. These features also work on the new iPhone 12 Pro through the AROcclusionManager (see the sketch after this list).
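    As a rough illustration of the second bullet, the sketch below requests environment depth and reads the depth texture each frame. It assumes AR Foundation 4.1-preview's AROcclusionManager API (requestedEnvironmentDepthMode and environmentDepthTexture) and is meant as a starting point, not the official sample code.
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    // Minimal sketch (not the official sample): request LiDAR-based environment
    // depth and log the texture size once depth frames start arriving.
    public class EnvironmentDepthSketch : MonoBehaviour
    {
        [SerializeField] AROcclusionManager occlusionManager; // lives on the AR Camera

        void Start()
        {
            // Ask for the best available environment depth (LiDAR devices only).
            occlusionManager.requestedEnvironmentDepthMode = EnvironmentDepthMode.Best;
        }

        void Update()
        {
            // Null on devices without depth support or before the first depth frame.
            Texture2D depth = occlusionManager.environmentDepthTexture;
            if (depth != null)
                Debug.Log($"Environment depth texture: {depth.width}x{depth.height}");
        }
    }
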
    With the addition of a LiDAR scanner, AR developers need to consider the thermal effects that come with the new hardware. iOS exposes four levels of thermal state, as described in the thermal state documentation.

    The “ThermalState” scene in the arfoundation-samples project provides a basic example of how to react to thermal state changes by disabling nonessential features as the thermal state increases. AR developers should note that ARKit on devices in the Serious thermal state is automatically throttled to a maximum of 30 FPS, and devices in the Critical thermal state stop all ARKit functionality. These measures protect the device and allow AR experiences to adapt as the device’s temperature changes.

    In the arfoundation-samples project, two files provide an interface for querying the current thermal state and for subscribing to thermal state change events.
    The sample “ThermalState” scene demonstrates how to use this thermal state information to disable less critical AR features as the thermal state increases, and ThermalStateHandling.cs illustrates how to query the current thermal state and how to subscribe to the thermal state change event.
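    The exact class names in the sample may differ, but the response pattern looks roughly like the sketch below. ThermalLevel and the change event here are hypothetical stand-ins for whatever the sample's native wrapper around NSProcessInfo.thermalState exposes; ARMeshManager and AROcclusionManager are the actual AR Foundation components being toggled.
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Hypothetical stand-in for the four iOS thermal states surfaced by a native
    // plugin (Unity itself has no built-in thermal state API).
    public enum ThermalLevel { Nominal, Fair, Serious, Critical }

    // Sketch of the pattern the ThermalState scene demonstrates: disable
    // nonessential AR features as the device heats up.
    public class ThermalStateResponder : MonoBehaviour
    {
        [SerializeField] ARMeshManager meshManager;           // nonessential: scene meshing
        [SerializeField] AROcclusionManager occlusionManager; // nonessential: depth occlusion

        // Wire this up to the thermal state change event exposed by the wrapper.
        public void OnThermalStateChanged(ThermalLevel level)
        {
            // Shed the most expensive features first as the state worsens.
            meshManager.enabled = level < ThermalLevel.Serious;
            occlusionManager.enabled = level < ThermalLevel.Critical;

            if (level == ThermalLevel.Critical)
                Debug.LogWarning("Critical thermal state: ARKit stops running; pause the AR experience.");
        }
    }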

    For the purposes of this sample, the chosen sets of AR features simply demonstrate how to respond to thermal state changes. When developing an AR app, developers must decide which AR features are critical to the functioning of the app and which are nonessential. Understanding an app’s thermal impact is crucial to providing a robust experience for users.

    Finally, testing an app’s response to thermal state changes is easy in Xcode. With an iOS device connected, open Window > Devices and Simulators from the Xcode main menu. With the connected iOS device selected, setting a thermal state device condition simulates that thermal state on the device.

    thermal-state-2020-09-15.png

    When running an iOS app, the Energy Impact tab on the Xcode Debug Navigator illustrates the current and historical thermal state as well as any simulated device condition.

    thermal-state-report-2020-09-15.png

    More information on using Xcode thermal state conditions is available in the video Designing for Adverse Network and Temperature Conditions.
     
    Last edited: Nov 5, 2020
  2. SYEDALIHASSAN

    SYEDALIHASSAN

    Joined:
    Sep 12, 2016
    Posts:
    4
    I have developed an AR mobile application with object detection using TensorFlow. The app runs perfectly on the iPhone 12 mini and other iPhones, but when I test on the iPhone 12 Pro models and the iPad Pro, the app does not show the 3D model when the camera is far from the detected object. Whenever the app detects the trained object, it is supposed to show the 3D model and place it near that object, but on the iPhone 12 Pro models it only shows the 3D object when the camera is near the detected object. I think the LiDAR may be causing the problem? If so, how do I disable the LiDAR using C# code, since I developed the project in Unity using AR Foundation and TensorFlow?
    I am using AR Foundation 1.0.
     
  3. eco_bach

    eco_bach

    Joined:
    Jul 8, 2013
    Posts:
    1,601
    How does the LiDAR scanner on the iPhone 12 Pro compare with the one on the iPad Pro? Resolution, range, etc.
     
  4. LT23Live

    LT23Live

    Joined:
    Jul 8, 2014
    Posts:
    97
    AR Foundation image tracking is not working on the iPhone 12 Pro. The iPhone XR, X, and 8 work fine, and Android works fine, but the 12 Pro is problematic. The camera feed works, but it refuses to track image targets. Does this have anything to do with the LiDAR tech?
     
  5. adammpolak

    adammpolak

    Joined:
    Sep 9, 2018
    Posts:
    450
  6. kjyv

    kjyv

    Joined:
    Feb 20, 2018
    Posts:
    53
    Works fine for us, so you probably need to give more details about the cases or configuration in which it does not work.
     
  7. IAmBurtBot

    IAmBurtBot

    Joined:
    Oct 3, 2014
    Posts:
    7
    Any updates on this issue? I'm still unable to get image tracking to work on the iPhone 12 Pro.
     
  8. t_lith85

    t_lith85

    Joined:
    Nov 9, 2020
    Posts:
    3
    How do you get the intrinsics for the depth image?
     
  9. danUnity

    danUnity

    Joined:
    Apr 28, 2015
    Posts:
    229
    Does the iPhone 12 become more accurate when it comes to matching the dimensions of the virtual world to the real world? Do I have to do anything differently to get that increased accuracy on iPhones that have LiDAR?
     
  10. Rickmc3280

    Rickmc3280

    Joined:
    Jun 28, 2014
    Posts:
    189
    Is there a simple way to get just the vertices from the iPhone's ToF sensor?
     
  11. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    1,062
  12. Mia_white

    Mia_white

    Joined:
    Apr 15, 2022
    Posts:
    13
    Any reply on this?
     
  13. Rickmc3280

    Rickmc3280

    Joined:
    Jun 28, 2014
    Posts:
    189
    That's not really a Unity question. As far as I can tell, though, it's the same sensor, so it should be identical. Take it with a grain of salt, though, because I'm not really sure.
     
  14. IR_Andrew

    IR_Andrew

    Joined:
    Jun 4, 2021
    Posts:
    5
  15. Rickmc3280

    Rickmc3280

    Joined:
    Jun 28, 2014
    Posts:
    189
    I am assuming you figured this out already. I haven't been able to get this up and running, mostly due to lack of time, but all you really need is the MeshFilter from the object being rendered. From that you get the Mesh, then the vertices, and then the corresponding colors. For example:

    Vector3[] vertices = myMesh.vertices;   // one entry per vertex
    Color32[] colors = myMesh.colors32;     // per-vertex colors, if the mesh has them

    I'm not familiar with the output, but I would assume something like vertices[0] corresponds with colors[0] and that the meshes are rendered using those colors. Assuming so, the same approach should work for a “vertex cloud”, “point cloud”, etc.
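    For anyone pulling those vertices from the LiDAR meshes specifically, something along these lines should work. It assumes an ARMeshManager in the scene and relies on its meshes property (the MeshFilters it has generated); vertex colors will only be there if the meshes actually carry them.
    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Sketch: gather world-space vertices from every mesh the ARMeshManager has
    // generated so far. ARMeshManager.meshes is assumed to return the MeshFilters
    // it manages.
    public class MeshVertexCollector : MonoBehaviour
    {
        [SerializeField] ARMeshManager meshManager;

        public List<Vector3> CollectVertices()
        {
            var points = new List<Vector3>();
            foreach (MeshFilter filter in meshManager.meshes)
            {
                Mesh mesh = filter.sharedMesh;
                if (mesh == null)
                    continue;

                foreach (Vector3 v in mesh.vertices)
                    points.Add(filter.transform.TransformPoint(v)); // local -> world space
            }
            return points;
        }
    }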

    Anyone who is doing this: do you all have Macs and Xcode? I assume that is the only way?
     
    Last edited: Sep 23, 2022