Question Does device ALWAYS track no matter what? How to make model stay put in real world?

Discussion in 'AR' started by newguy123, Jan 12, 2023.

  1. newguy123

    Joined:
    Aug 22, 2018
    Posts:
    1,234
    Hi guys

    So after placing my prefab in the scene, from a raycast hit or whatever, can I then disable all trackables? At that point, are coordinates worked out from the device gyro and other sensors only? How does the AR camera know where my device camera is then? Does it need to constantly track something in the camera feed to know where the device is?

    The reason I'm asking is that I'm testing various AR Foundation samples, in particular the Anchor samples. When I place an anchor on a plane or on a feature point, for the most part it appears to stay put. I can walk away and come back and things seem to stay put, mostly since the ground is always in view. However, as soon as I look at the sky and back down, everything goes haywire, as if the entire scene drifted somewhere else. After a few seconds it does seem to snap back into position, but the point is that this totally destroys the experience.

    If I look up at the sky and then down again, I would expect the virtual camera to know where the device camera is from its sensors, etc. Why is this happening?

    If my placed prefab is a 50-meter Christmas tree and I'm right next to its base, I want to be able to look up at the star at the top of the tree. However, like I say, it seems to lose tracking when I look up and only snaps back into place when I look down or closer to eye level, which means I can only ever see the base of the tree unless I step back 20 or 30 meters to get the whole tree in view.

    Do I need ARWorldMap to make things stay put better, or won't that make much difference in my use case?

    Should I be looking at geolocation anchors, or perhaps the LiDAR scanner without visible meshing?

    My question relates to the iPad Pro M2 and an AR experience the size of a football field. The user won't walk further than maybe 20 meters or so, but there will be various LARGE scale prefabs in the scene.
     
    Last edited: Jan 12, 2023
  2. davidmo_unity

    Unity Technologies

    Joined:
    Jun 18, 2019
    Posts:
    99
    The device is recalculating its understanding of the world. Anchors are fundamental AR primitives that are constantly updated as the device understands the space around it, so when you point the device up it attempts to re-establish its understanding of the world, hence the drift (the device lost its location for a second and needs to recalculate). If you are more interested in the object staying at a single point on screen, you may consider disabling its ARAnchor component when it's in view and re-enabling it when it's out of view. This way it stays static when on screen, then resumes updating when not in view.
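
    Something along these lines, as an untested sketch of that suggestion (it assumes the anchored GameObject also has a Renderer, since Unity only calls OnBecameVisible/OnBecameInvisible on objects that have one):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Freezes the anchor while the object is on screen so pose corrections
    // don't visibly move it, then lets it update again once off screen.
    [RequireComponent(typeof(ARAnchor))]
    public class FreezeAnchorWhileVisible : MonoBehaviour
    {
        ARAnchor m_Anchor;

        void Awake()
        {
            m_Anchor = GetComponent<ARAnchor>();
        }

        // Unity calls this when any camera starts rendering the object.
        void OnBecameVisible()
        {
            m_Anchor.enabled = false; // stop applying pose corrections
        }

        // Unity calls this when no camera renders the object anymore.
        void OnBecameInvisible()
        {
            m_Anchor.enabled = true; // resume tracking updates
        }
    }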
     
  3. newguy123

    Joined:
    Aug 22, 2018
    Posts:
    1,234
    No, I'm NOT interested in the object staying in a single position on screen. I am interested in the object staying in a single position in the real world, i.e., once placed, it should stay put and not drift anywhere. It should be like I'm looking at a real object in the real world.

    Real objects in the real world do not drift when you move your camera around. So I'm interested in how to best achieve that with AR, in a way that lets me look anywhere without losing tracking or having the placed object drift.

    I already have these tracking features on in an attempt to remove drift:
    plane tracking
    point cloud
    image tracking
    occlusion

    What else can I add for the tracking to be more accurate and reduce drift?

    You'd think an M2 iPad Pro would have a better understanding or more accurate tracking; however, it's producing the same results as an old Android phone.
     
  4. Cery_

    Joined:
    Aug 17, 2012
    Posts:
    47
    I'm not an AR Foundation or ARCore/ARKit dev, but you should reduce the tracking features as much as possible. If you don't need planes, disable them; if you don't need a point cloud, disable it. These features only drain performance from the underlying AR engine while the Unity side is busy updating them. The AR engine uses all the data it can get and updates the object location to stay as stable as possible.
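
    For example, a minimal sketch (the manager references and the OnContentPlaced entry point are assumptions; call it from your own placement code once the prefab is anchored):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Turns off tracking features that are no longer needed after placement,
    // so the underlying AR engine can spend its budget on pose stability.
    public class DisableUnusedTracking : MonoBehaviour
    {
        [SerializeField] ARPlaneManager m_PlaneManager;
        [SerializeField] ARPointCloudManager m_PointCloudManager;

        public void OnContentPlaced()
        {
            // Stop detecting new planes and hide the ones already found.
            m_PlaneManager.enabled = false;
            foreach (var plane in m_PlaneManager.trackables)
                plane.gameObject.SetActive(false);

            // The point cloud is only useful while choosing a placement spot.
            m_PointCloudManager.enabled = false;
        }
    }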
    What you are describing is a limitation of AR systems in general. A lot of the tracking is done visually. The iPad even has a LiDAR sensor, which is basically a depth camera. The gyro, magnetometer, etc. are only used to improve the data from the camera feed. If you look at the sky, there are no distinguishable features to orient from. The sky looks the same regardless of where you stand on the field, so the device is not able to orient itself.

    While the camera feed doesn't supply useful data, your scene drifts away due to the inaccuracy of the gyro and other sensors; they are mostly built to detect whether the device is upright or in landscape orientation, and they are not very precise. If you then focus on something distinguishable, the system rediscovers known points and compares its virtual point cloud/map with the current camera feed to correct all the drift that happened while you looked at the sky. If successful, it snaps back into place.
     
    andyb-unity likes this.
  5. andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    793
    ARKit device tracking is based on Apple's scene understanding and tracking technologies, which create a world map by tracking feature points in your environment. To learn more, I recommend this WWDC talk which summarizes the basics of ARKit: https://developer.apple.com/videos/play/wwdc2018/602.

    Scene understanding is enhanced within certain depth ranges on devices with LiDAR capabilities; however, both LiDAR and Apple's camera-based tracking are useless if the device is pointed at the sky. ARKit will perform poorly in this condition.

    If you require tracking while the device is pointed at the sky, your best bet is to use a geospatial tracking solution. There is no way to configure simple plane detection or anchor tracking to work reliably if your device is pointed at the sky.

    Some places you could start:

    Google Cloud Anchors via Google's ARCore Extensions for AR Foundation: https://developers.google.com/ar/develop/unity-arf/getting-started-extensions

    Our ARKit GeoAnchors sample:
    https://github.com/Unity-Technologies/arfoundation-samples#geo-anchors
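
    To give a feel for the Cloud Anchors route, a hedged sketch (method names follow the ARCore Extensions 1.x API from the first link; newer releases replace them with async variants, so check the current docs):

    Code (CSharp):
    using Google.XR.ARCoreExtensions;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Hosts a local anchor with Google's Cloud Anchor service and resolves it
    // back, re-creating the same real-world position in a later session.
    public class CloudAnchorSketch : MonoBehaviour
    {
        [SerializeField] ARAnchorManager m_AnchorManager;
        ARCloudAnchor m_CloudAnchor;

        public void Host(ARAnchor localAnchor)
        {
            // Uploads visual feature data around the anchor for later resolving.
            m_CloudAnchor = m_AnchorManager.HostCloudAnchor(localAnchor);
        }

        public void Resolve(string cloudAnchorId)
        {
            m_CloudAnchor = m_AnchorManager.ResolveCloudAnchorId(cloudAnchorId);
        }

        void Update()
        {
            if (m_CloudAnchor != null &&
                m_CloudAnchor.cloudAnchorState == CloudAnchorState.Success)
            {
                // Anchor is ready; parent content under m_CloudAnchor.transform.
            }
        }
    }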
     
    ArmanUnity likes this.