Question Creating a persistent AR experience in a static real world location.

Discussion in 'AR' started by merrythieves, Jul 20, 2021.

  1. merrythieves

    Joined:
    Apr 28, 2021
    Posts:
    14
    I want to take a known, static, real world location and use it to build out an AR experience that will work across devices. This is turning out to be more difficult than I anticipated. A few requirements and wishes:
    • Must be able to re-localize to accuracy within a few inches
    • Must work on iOS and Android
    • Must work outdoors, in different lighting conditions, or if people or other objects are in the tracking frame
    • Should use sensor fusion/SLAM to maintain elements' positions
    • Preferably isn't a pay-per-call API/service, but I'll concede on this if it's the best solution
    I can't tell if I am overthinking this... What is the most straightforward way to align an AR scene to the real world (accuracy ~a few inches?) at the scale of a room, public square, or courtyard with non-static lighting?

    I have an iPad Pro with a 3D scanner and can create scanned environments and point clouds in various formats. Ideally I'd pull an environment into Unity, place my objects all around, hit build, and then you'd have an uncle and his name would surely be Bob.

    Do I use a point cloud trackable in AR Foundation? Do I need to reverse engineer Google Cloud Anchors? Can I just create an AR Foundation anchor, save it, and bake it into my app somehow? Or should I just be using image targets and only spawning objects near them?
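    For context, a plain AR Foundation anchor is easy to create at runtime (rough sketch below, using the 2021-era `ARAnchorManager.AddAnchor` API; `contentPrefab` and the component wiring are placeholder assumptions) — the catch is exactly what you'd expect: there's no built-in call to serialize it for a later session.

    ```csharp
    // Sketch: creating a plain AR Foundation anchor at a raycast hit pose.
    // Assumes ARRaycastManager + ARAnchorManager on the AR Session Origin.
    // The anchor lives only for the current session -- it cannot be saved.
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    public class TapToAnchor : MonoBehaviour
    {
        public ARRaycastManager raycastManager;
        public ARAnchorManager anchorManager;
        public GameObject contentPrefab;

        static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

        void Update()
        {
            if (Input.touchCount == 0) return;
            var touch = Input.GetTouch(0);
            if (touch.phase != TouchPhase.Began) return;

            if (raycastManager.Raycast(touch.position, s_Hits, TrackableType.PlaneWithinPolygon))
            {
                // The session keeps the anchor's pose updated via SLAM,
                // but there is no API here to persist or share it.
                var anchor = anchorManager.AddAnchor(s_Hits[0].pose);
                if (anchor != null)
                    Instantiate(contentPrefab, anchor.transform);
            }
        }
    }
    ```
    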

    Vuforia Area Targets basically provide the workflow I am looking for, but the problem is they don't work well outdoors or in scenes with non-static lighting. So I'm looking for a similar solution that solves this issue.
    Thanks.

    A few options I have explored:
    1. AR Foundation basic Anchors. Can't be saved or shared(?)
    2. Vuforia Area Targets. Almost perfect, except they don't work if lighting changes (i.e. outdoors)
    3. AR Foundation Image Targets/other image target solutions. Good for experiences local to that image, but for the larger environment there is too much drift.
    4. Google Cloud Anchors. Promising but difficult to use. Unfriendly workflow. Non existent support. Poor documentation.
    5. Unity MARS. Unclear how well this use case is supported currently. Uses image targets (drift).
    6. ARWorldMap. Only works on iOS.
    7. Stardust SDK. Looks promising, but relocalization has not proven reliable in my testing.
    8. 8th Wall. Complicated, expensive, and unclear how well it plays with Unity. Primarily for web. Also apparently just does image tracking underneath as well(?)
    9. Wikitude. Is this pricing real life? Tops out around $5000/yr USD, and that doesn't even include being able to use it with Unity yet, for which you will need to "contact them." I will pay for a solution, but probably can't take out a 2nd mortgage for it.
    10. ARWay. Inconsistent relocalization in testing. Also doesn't constantly re-localize ($?) and therefore drifts.
    11. Maxst+. Looks promising.
    12. Azure Spatial Anchors. Looks promising.
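    Re: option 6 — ARWorldMap does give a real save/load workflow, it's just iOS-only. A rough sketch of the save side via the ARKit plugin for AR Foundation (the file name/path are placeholders; assumes a device that supports world maps):

    ```csharp
    // Sketch: saving an ARKit world map to disk through AR Foundation's
    // ARKit plugin (UnityEngine.XR.ARKit). iOS-only, per option 6 above.
    using System.Collections;
    using System.IO;
    using Unity.Collections;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    #if UNITY_IOS
    using UnityEngine.XR.ARKit;
    #endif

    public class WorldMapSaver : MonoBehaviour
    {
        public ARSession session;

    #if UNITY_IOS
        public IEnumerator Save()
        {
            var subsystem = session.subsystem as ARKitSessionSubsystem;
            if (subsystem == null || !ARKitSessionSubsystem.worldMapSupported)
                yield break;

            // Asynchronously captures the current session's world map.
            var request = subsystem.GetARWorldMapAsync();
            while (!request.status.IsDone())
                yield return null;

            if (request.status.IsError())
                yield break;

            using (var map = request.GetWorldMap())
            using (var bytes = map.Serialize(Allocator.Temp))
            {
                File.WriteAllBytes(
                    Path.Combine(Application.persistentDataPath, "session.worldmap"),
                    bytes.ToArray());
            }
        }
    #endif
    }
    ```

    Loading is the mirror image (deserialize the bytes and apply the map to the subsystem), but it obviously does nothing for the Android half of the requirement.
    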
     
  2. KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,145
    In my experience, Google Cloud Anchors work great both on iOS and Android. Google updates their repo frequently and there are two large example projects to study.
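    For what it's worth, the hosting side through ARCore Extensions for AR Foundation is only a few calls (sketch below using the 2021-era `HostCloudAnchor` extension method; assumes the ARCore Extensions package is installed and API-key or keyless auth is configured, and all field names are placeholders):

    ```csharp
    // Sketch: hosting a Google Cloud Anchor via ARCore Extensions for
    // AR Foundation. Works on both iOS and Android. The hosted id can be
    // stored and later passed to ResolveCloudAnchorId on any device.
    using Google.XR.ARCoreExtensions;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public class CloudAnchorHost : MonoBehaviour
    {
        public ARAnchorManager anchorManager;
        ARCloudAnchor cloudAnchor;

        public void Host(ARAnchor localAnchor)
        {
            // Uploads visual feature data around the anchor; the call
            // returns immediately and completion is polled below.
            cloudAnchor = anchorManager.HostCloudAnchor(localAnchor);
        }

        void Update()
        {
            if (cloudAnchor == null) return;
            if (cloudAnchor.cloudAnchorState == CloudAnchorState.Success)
            {
                // Persist this id (backend, or baked into the app), then
                // other devices re-localize by resolving it.
                Debug.Log("Hosted cloud anchor: " + cloudAnchor.cloudAnchorId);
                cloudAnchor = null;
            }
        }
    }
    ```
    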
     