
Official Introducing Meta Quest Support in AR Foundation

Discussion in 'AR' started by KevinXR, Jun 20, 2023.

  1. KevinXR

    KevinXR

    Unity Technologies

    Joined:
    Jan 19, 2023
    Posts:
    23
    MetaOpenXRPackage_UPM.png
    Introduction

    AR Foundation enables you to create multi-platform augmented reality apps with Unity. In an AR Foundation project, you choose which features to enable by adding the corresponding components to your scene. AR Foundation enables these features using the platform's native SDK, so you can create once and deploy to multiple platforms (mobile and XR headsets).

    We are introducing support for Meta Quest 3, Meta Quest 2 and Meta Quest Pro to AR Foundation through a preview of a new Meta OpenXR Feature package. This package is currently in an experimental state and depends on the Unity AR Foundation package and the Unity OpenXR Plugin Package.
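    To make the component-per-feature model concrete, here is a minimal sketch (AR Foundation 5.x; the class name is just an example, and you would normally add these managers to the XR Origin in the Inspector rather than in code):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Example only: each AR Foundation feature is enabled by adding its manager
// component. On Quest these map to Meta's OpenXR runtime; on mobile they map
// to ARCore/ARKit, so the same scene can target multiple platforms.
public class EnableArFeatures : MonoBehaviour
{
    void Awake()
    {
        // Plane detection
        gameObject.AddComponent<ARPlaneManager>();

        // Anchors
        gameObject.AddComponent<ARAnchorManager>();

        // Raycasts against detected trackables
        gameObject.AddComponent<ARRaycastManager>();
    }
}
```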


    Feedback

    We want to hear from you. We're especially interested in:
    • Is the documentation helpful?
    • Are any workflows unclear?
    • Which features do you want to see supported next?
    Please feel free to post your feedback in this thread or in this sub-forum.

    Installation
    The experimental Meta OpenXR package is currently available in the Unity Package Manager (UPM). Since it's an experimental package, it will not show up in UPM search results. You will need to add it by entering its name directly in UPM.

    To download the Meta OpenXR package, open the Unity Package Manager from inside the Unity Editor, click the plus (➕) symbol in the top left, select “Add package by name”, and type com.unity.xr.meta-openxr. Once added, it will automatically pull in its required dependencies, such as the OpenXR Plugin and AR Foundation packages. For sample content, check out Simple AR and Anchors on GitHub.
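    If you prefer to script the installation (for example when bootstrapping a project), the same thing can be done with the Package Manager scripting API. A minimal editor-only sketch, assuming you want a menu item for it (the menu path is just an example):

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEditor.PackageManager;
using UnityEditor.PackageManager.Requests;
using UnityEngine;

// Editor-only helper: adds the experimental package by name, equivalent to
// using "Add package by name" in the Package Manager window.
public static class AddMetaOpenXRPackage
{
    static AddRequest s_Request;

    [MenuItem("Tools/Add Meta OpenXR Package")]
    static void Add()
    {
        s_Request = Client.Add("com.unity.xr.meta-openxr");
        EditorApplication.update += Progress;
    }

    static void Progress()
    {
        if (s_Request == null || !s_Request.IsCompleted)
            return;

        if (s_Request.Status == StatusCode.Success)
            Debug.Log($"Installed {s_Request.Result.packageId}");
        else
            Debug.LogError(s_Request.Error.message);

        EditorApplication.update -= Progress;
    }
}
#endif
```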

    Note: AR Foundation on Quest relies on Meta's Scene feature for plane data. That means you must perform a Scene Capture via Room Setup on your Quest to see planes. See Room Setup instructions for details.

    How to report bugs
    Ideally we'd like any bugs reported through the built-in Bug Reporter tool, as it automatically provides us with relevant context.
     
    Last edited: Jun 27, 2023
  2. makaka-org

    makaka-org

    Joined:
    Dec 1, 2013
    Posts:
    979
    Will Apple Vision Pro also be supported through the AR Foundation?
     
  3. mfuad

    mfuad

    Joined:
    Jun 12, 2018
    Posts:
    334
    Yes, you can expect features like AR Foundation and the XR Interaction Toolkit to be well integrated in Unity's support for visionOS.
     
  4. makaka-org

    makaka-org

    Joined:
    Dec 1, 2013
    Posts:
    979
    Does that mean an AR Foundation (ARKit) app will work on visionOS without any changes?

    Plane Detection, Face Tracking, Image Tracking
     
  5. mfuad

    mfuad

    Joined:
    Jun 12, 2018
    Posts:
    334
    Bringing an app that uses AR Foundation / ARKit to visionOS will certainly require additional work -- input and interactions is just one example area. But let's keep this thread focused on the topic at hand. We will keep the community updated on visionOS support when we have more to share.
     
  6. Needle0

    Needle0

    Joined:
    Jul 20, 2013
    Posts:
    11
    Wait a sec, I was under the impression that the Quest 2 and Quest Pro have no depth sensors and you can't obtain the raw camera feed, so you could only use environment data defined by having the user move the motion controller and manually trace objects like desks and chairs in their room. Is the version of AR Foundation that runs on Quest 2/Pro somehow capable of automatic, algorithmic plane detection, without requiring the user to manually map planes out?
     
  7. WayneVenter

    WayneVenter

    Joined:
    May 8, 2019
    Posts:
    56
    So I got everything set up using a Quest Pro and 2022.3.3f1 (also tested 2023.1.0b20) with Meta OpenXR 0.1.1 and AR Foundation 5.1.0-pre.6. I also added hand tracking using XR Hands 1.2.1, but for the life of me I can't get AR Plane Manager to visualize planes on the Quest Pro. Am I missing something?

    Or does it not support AR plane visualization and detection like ARCore does?

    This is as vanilla as it gets:
    upload_2023-6-22_22-51-4.png
     
  8. KevinXR

    KevinXR

    Unity Technologies

    Joined:
    Jan 19, 2023
    Posts:
    23
    You will still need to manually map out planes.
     
  9. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    989
    > for the life of me I can't get AR Plane Manager to visualize planes on the Quest Pro. Am I missing something?

    @WayneVenter great question. AR Foundation plane detection on Quest is powered by Meta's new Scene feature: https://developer.oculus.com/documentation/unity/unity-scene-overview/

    We are working on improving our documentation for this, but on the most recent versions of the Meta Quest software, what you need to do is go to Settings > Experimental > Room Setup and select the Set up button. This will take you into Meta's Scene Capture flow. From there, if you add features to your room that include plane components (at the time of writing, Desk and Couch reliably produce planes, but Meta is also iterating on their backend and this could change), then AR Foundation will be able to access the pre-stored plane information from your Scene and use it as the backend of plane detection.

    To be clear, AR Foundation does not perform any raycasts for real-time plane detection on Quest as you might expect from ARCore or ARKit.

    For other Project Setup troubleshooting, I recommend taking a look at our Project Setup docs page: https://docs.unity3d.com/Packages/com.unity.xr.meta-openxr@0.1/manual/project-setup.html. This page currently does not mention Room Setup but we'll add this in a future release.
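    To make the plane data flow described above concrete, here is a minimal sketch (AR Foundation 5.x API) that logs the planes the plane subsystem reports. On Quest these planes come from the pre-captured Scene rather than live detection, but the script itself is the same code you would use on ARCore or ARKit:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Attach next to the ARPlaneManager on the XR Origin. On Quest, the planes
// reported here originate from the Scene Model captured during Room Setup.
[RequireComponent(typeof(ARPlaneManager))]
public class LogScenePlanes : MonoBehaviour
{
    ARPlaneManager m_PlaneManager;

    void OnEnable()
    {
        m_PlaneManager = GetComponent<ARPlaneManager>();
        m_PlaneManager.planesChanged += OnPlanesChanged;
    }

    void OnDisable()
    {
        m_PlaneManager.planesChanged -= OnPlanesChanged;
    }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.added)
            Debug.Log($"Plane {plane.trackableId} added (classification: {plane.classification})");
    }
}
```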
     
    KevinXR likes this.
  10. WayneVenter

    WayneVenter

    Joined:
    May 8, 2019
    Posts:
    56
    Awesome, I thought it was something like that, i.e. from the Room Setup, but I assumed you could do the setup from inside the Unity app. I was hoping for the ability to raycast to the floor and then draw my own "room setup", storing those planes/meshes for future persistent use. I guess the feature split between Meta's backend and AR Foundation is going to take some time to stabilize.

    Now my next big ask: will I eventually be able to remote play from the Editor to the device like with the Oculus SDK? That would speed up development time.

    And lastly, assuming the Quest 3 has a depth sensor, are there plans for an ARCore-like SLAM/raycast hit implementation? Or, maybe answering my own question, will it still rely on Meta's Scene feature, with AR Foundation being a wrapper in effect?
     
  11. FarmerInATechStack

    FarmerInATechStack

    Joined:
    Dec 28, 2020
    Posts:
    55
    Thanks @KevinXR @mfuad @andyb-unity and team! I'm excited to try this out and it's perfectly timed for a VR Jam that just started today too. I'll be giving this a shot as part of that jam. Thank you again and cheers!
     
  12. pieroo87

    pieroo87

    Joined:
    Mar 4, 2014
    Posts:
    3
    Hi, I'm very interested in the Meta OpenXR Plugin, but I have two questions:
    • Is there a size limit for the Scene Model feature? How can I create experiences in huge indoor environments (e.g., museums or palaces)?
    • When do you plan to make image tracking (or QR code scanner) available?
    Thank you!
     
  13. Fangh

    Fangh

    Joined:
    Apr 19, 2013
    Posts:
    268
    The blog post states this :
    But that seems to be contradicted for Anchors when I look at the documentation:
    Capture d’écran 2023-06-27 à 11.23.08.png

    So: do Anchors work on Quest?
     

    Attached Files:

    Last edited: Jun 27, 2023
  14. Fangh

    Fangh

    Joined:
    Apr 19, 2013
    Posts:
    268
    What is the Scene Model feature? I don't see anything about it on the documentation page.
     
    Last edited: Jun 27, 2023
  15. Fangh

    Fangh

    Joined:
    Apr 19, 2013
    Posts:
    268
    So you are saying that Plane Detection on Quest does not actually detect planes?!
    It only takes the planes from the Room Setup done by the user beforehand and makes them accessible to developers?
    What a shame! (Not your fault, only Meta's fault for not being able to detect meshes.)
     
  16. KevinXR

    KevinXR

    Unity Technologies

    Joined:
    Jan 19, 2023
    Posts:
    23
    We're looking forward to hearing about your experience!


    We are actively working on Quest Link support to speed up development iteration time. Stay tuned, it's coming soon!

    We're going to hold off on discussing what Quest 3 may or may not have and wait for Meta to officially announce more information.
     
    Gruguir and FarmerInATechStack like this.
  17. KevinXR

    KevinXR

    Unity Technologies

    Joined:
    Jan 19, 2023
    Posts:
    23
    Hey, I looked through Meta's Scene documentation and I don't see mention of a Scene Model size limit. To create an experience for a large indoor environment, you'd have to run Room Setup on your Quest in that environment to identify the walls, surfaces, and other planes.

    We're using this preview to help us understand which features people would like to see next. How would you like to use image tracking/QR scanning on Quest? What other features would you like to see next?
     
    unity_qdktnygEzYPZmg likes this.
  18. KevinXR

    KevinXR

    Unity Technologies

    Joined:
    Jan 19, 2023
    Posts:
    23
    Anchors are supported! Thanks for catching that. We'll update the documentation.

    Yes, AR Foundation on Quest takes the planes from the Room Setup done beforehand.
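    For anyone experimenting with anchors in the meantime, here is a minimal sketch (AR Foundation 5.x; the field names and controller transform are just placeholders) that attaches an anchor to a plane hit by a controller ray:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Example only: raycast from a controller against detected planes and attach
// an anchor to the plane that was hit. Wire TryPlaceAnchor to a button press.
public class PlaceAnchorOnPlane : MonoBehaviour
{
    [SerializeField] ARRaycastManager m_RaycastManager;
    [SerializeField] ARPlaneManager m_PlaneManager;
    [SerializeField] ARAnchorManager m_AnchorManager;
    [SerializeField] Transform m_RayOrigin; // e.g. a controller transform

    static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

    public void TryPlaceAnchor()
    {
        var ray = new Ray(m_RayOrigin.position, m_RayOrigin.forward);
        if (!m_RaycastManager.Raycast(ray, s_Hits, TrackableType.PlaneWithinPolygon))
            return;

        var hit = s_Hits[0];
        var plane = m_PlaneManager.GetPlane(hit.trackableId);
        if (plane != null)
            m_AnchorManager.AttachAnchor(plane, hit.pose);
    }
}
```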
     
  19. mr0ng

    mr0ng

    Joined:
    Jun 22, 2017
    Posts:
    8
    Hi @KevinXR Here's a video capture of the Anchor scene. My planes successfully appear as yellow planes, but I'm unable to create any anchors no matter which buttons I press on my controllers. I did move the UI from screen space to world space so that I could see the logs. I was planning on tweaking the anchor scene further, but wanted to check here first to make sure I wasn't doing something incorrectly.
     

    Attached Files:

  20. mr0ng

    mr0ng

    Joined:
    Jun 22, 2017
    Posts:
    8
  21. Fangh

    Fangh

    Joined:
    Apr 19, 2013
    Posts:
    268
    Do you have a sample that we can build on our Quest to test that?
     
  22. pieroo87

    pieroo87

    Joined:
    Mar 4, 2014
    Posts:
    3
    Thanks for the reply!
    I want to create an AR app for Meta Quest Pro using the passthrough feature.
    The user moves within a large environment, and the app displays information in augmented reality when they frame a marker/QR code near a point of interest.
     
    unity_qdktnygEzYPZmg and Febays like this.
  23. KevinXR

    KevinXR

    Unity Technologies

    Joined:
    Jan 19, 2023
    Posts:
    23
    We're going to push an update shortly that should fix this.


    Saving anchors is not currently supported in the preview.

    As I mentioned to mr0ng, we're working on an update to this Anchor sample so that it works properly on Quest.
     
  24. KevinXR

    KevinXR

    Unity Technologies

    Joined:
    Jan 19, 2023
    Posts:
    23
    A Scene Model is a Meta Quest concept: "a single, comprehensive, up-to-date representation of the real physical world that is easy to index and query." It's explained in Meta's Scene Overview docs.
     
  25. medinafb01

    medinafb01

    Joined:
    Feb 1, 2021
    Posts:
    14
    Hello,
    First of all, thank you for sharing.
    I was able to run it on my Quest 2 without any problems:
    - Passthrough works straight away
    - So does the "plane detection", in big quotes since it uses the Room Setup
    Image2.jpg

    IMG_7768.PNG

    I know it's early stage, but it's definitely a great advance.

    Feedback:

    1. The documentation is short but clear
    2. The workflow is clear, maybe because I am already used to working with AR Foundation.
    3. Meshing

    Thanks for your great work.
     

    Attached Files:

    Last edited: Jun 28, 2023
    andyb-unity and KevinXR like this.
  26. cspinger80

    cspinger80

    Joined:
    Jul 20, 2020
    Posts:
    1
    Is the passthrough working for you in Editor mode? For me it only works in builds at the moment.
     
  27. YUAN_YAO

    YUAN_YAO

    Joined:
    Feb 6, 2022
    Posts:
    1
    I am using Unity 2022.3.1 LTS. However, when I type "com.unity.xr.meta-openxr" into the "Add package by name" field of the Package Manager, I get the error shown in this picture. upload_2023-6-29_20-56-36.png
     
  28. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    989
    @cspinger80 We are working on Quest Link support but not quite ready yet. Will share as soon as we have it!
     
  29. taimex

    taimex

    Joined:
    Feb 20, 2022
    Posts:
    13
    I've spent hours and hours trying to get this to work with 2022.3.3f1.

    I started with an already working project and added the meta-openxr package and the additional setup steps to it.
    I did the same with the AR Samples' Simple AR as a base.
    I also started from scratch with the AR template.

    I never got passthrough and proper AR tracking working.

    I'm starting to wonder if something is wrong with my Quest 2.

    @medinafb01 has it working, so the release is good. I'm just missing a key step or a good Quest.

    If I had a known working example, at least I could see what differs, or learn that my Quest 2 is the issue.

    Could Unity or someone else just post a package or zip that JUST works?

    Please post!!!
     
  30. Fangh

    Fangh

    Joined:
    Apr 19, 2013
    Posts:
    268
    I made it work (only tested passthrough) with Unity 2021.3 without importing any AR Sample
     
  31. taimex

    taimex

    Joined:
    Feb 20, 2022
    Posts:
    13
    Finally got AR Foundation working.

    Found a couple of issues that were killing it:
    1) Had to turn on Passthrough in the Meta Quick Settings panel.
    2) I'm using the Hands package. If you start your app with hands, it will crash. You must start with controllers; then you can switch to hands.
    3) You must have created the Room Setup first. If you've never created one, passthrough didn't work for me.

    Hope this helps others. BUT a working asset would have helped me and others.
     
  32. Lontis

    Lontis

    Joined:
    Sep 10, 2017
    Posts:
    5
    I would like to create an app that uses the camera to take snapshots in an outdoor area that isn't already predefined via the Scene setup. Will this ever be possible? Is there any way to access the camera feed?
     
  33. MarkJeffc

    MarkJeffc

    Joined:
    Feb 14, 2022
    Posts:
    5
    I created an app from scratch, but can't get passthrough working.
    I've followed taimex's advice to turn on Passthrough in the Meta Quick Settings panel, and I've created a Room Setup (which I know works, as it shows in "The World Beyond"). But my app just shows the model I'm trying to display and a black background. I'm presuming my issue is in my camera setup - any suggestions? upload_2023-7-3_17-28-39.png
     
  34. rportelli

    rportelli

    Joined:
    Apr 26, 2020
    Posts:
    3
    I had a similar issue; to fix it I had to move to the standard (Built-in) render pipeline.
    I think the Universal Render Pipeline is not supported yet.
     
  35. olivia_unity391

    olivia_unity391

    Joined:
    Feb 1, 2023
    Posts:
    1
    Is it true that URP is not supported yet? Because I followed the exact basic settings, but passthrough doesn't work with my built APK file.
     
  36. MarkJeffc

    MarkJeffc

    Joined:
    Feb 14, 2022
    Posts:
    5
    I took a closer look at the AR Foundation documentation and worked through the steps in Universal Render Pipeline | AR Foundation | 5.0.6 (unity3d.com), including adding the 'AR Background Render Feature', but it made no difference to my project.

    So I started again with a normal Unity 3D (core) template and used the older Built-in Render Pipeline, re-importing the assets I'd used for my starter project. This time passthrough worked (following the other instructions above).

    Could a Unity developer please comment on whether the Meta OpenXR Feature is compatible with AR Foundation's URP support or not?
     
  37. CiaranWills

    CiaranWills

    Unity Technologies

    Joined:
    Apr 24, 2020
    Posts:
    198
    FYI version 0.1.2 of the package is now available.
     
  38. CiaranWills

    CiaranWills

    Unity Technologies

    Joined:
    Apr 24, 2020
    Posts:
    198
    No, the Quest doesn't make the camera feed available to applications.
     
  39. CiaranWills

    CiaranWills

    Unity Technologies

    Joined:
    Apr 24, 2020
    Posts:
    198
    With URP make sure you turn off HDR, Terrain Holes and any post-processing passes. URP has many possible configurations, some of which don't work well with passthrough, and we are working with the URP team on this.
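    If it helps with debugging, here is a small read-only sketch that logs those settings at runtime so you can verify what a build actually shipped with (it doesn't change anything; adjust the values on the URP assets themselves):

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Logs the URP settings that commonly interfere with passthrough.
public class LogUrpPassthroughSettings : MonoBehaviour
{
    void Start()
    {
        if (GraphicsSettings.currentRenderPipeline is UniversalRenderPipelineAsset urp)
        {
            Debug.Log($"URP HDR: {urp.supportsHDR}");
            Debug.Log($"URP Terrain Holes: {urp.supportsTerrainHoles}");
        }

        var cam = Camera.main;
        if (cam != null)
        {
            var camData = cam.GetUniversalAdditionalCameraData();
            Debug.Log($"Camera post-processing: {camData.renderPostProcessing}");
        }
    }
}
```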
     
    FarmerInATechStack likes this.
  40. endasil_unity

    endasil_unity

    Joined:
    Jun 28, 2018
    Posts:
    19
    Thanks for the update. With this information I was able to get URP working. To save time for anyone who doesn't immediately know where to find these settings: when starting from a URP template they are under the Settings folder, in the Universal Render Pipeline Assets and their Universal Renderer Data assets. Select all the assets without "- Renderer" in the name; in the Inspector you can turn off Terrain Holes and HDR. Next, select the assets ending with "- Renderer" (one at a time) and disable post-processing there. I tried just turning off post-processing on the camera, but that did not help; it had to be done on the asset.
     

    Attached Files:

  41. MarkJeffc

    MarkJeffc

    Joined:
    Feb 14, 2022
    Posts:
    5
    Note for early adopters.
    On page: Project setup | Meta OpenXR Feature | 0.1.2 (unity3d.com)
    Project Setup step 4 has changed from v0.1.1 to v0.1.2:

    Previously: 4. In the Android tab, under Interaction Profiles, add Meta Quest Touch Pro Controller Profile.
    Now: 4. In the Android tab, under Interaction Profiles, add Oculus Touch Controller Profile.

    I think that should have been highlighted in the ChangeLog.
    I worked on a Quest 2 with the previously suggested 'Meta Quest Touch Pro Controller Profile' and couldn't get controller tracking working. I'm back to a working state now that I've seen that documentation change.
     
    andyb-unity likes this.
  42. mr0ng

    mr0ng

    Joined:
    Jun 22, 2017
    Posts:
    8
    @CiaranWills any updates regarding the anchor sample scene working? Last time I tried the anchor scene, it did not work; see my message above in the thread. @KevinXR
     
  43. KevinXR

    KevinXR

    Unity Technologies

    Joined:
    Jan 19, 2023
    Posts:
    23
    We've got a fix that's currently going through QA. We'll post here when the repo is updated.
     
    mr0ng likes this.
  44. DudoTheStampede

    DudoTheStampede

    Joined:
    Mar 16, 2015
    Posts:
    81
    Hi! With this limitation, is it impossible to get Image Tracking working?
    We would like to use it in XR so that you can look at a marker (such as a painting) and then see some animations or 3D objects (like you do with standard AR on mobile now).
     
    unity_qdktnygEzYPZmg likes this.
  45. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    989
  46. DudoTheStampede

    DudoTheStampede

    Joined:
    Mar 16, 2015
    Posts:
    81
    Yeah, I get that. I was asking if it will ever be possible, or if it's a hard limitation due to the unavailability of the camera feed (as I would assume).

    Actually, @KevinXR asked about image tracking (here), so I hoped there would be some possibility (I think you guys are better connected with Meta's developers and might have some more info).
     
  47. KevinXR

    KevinXR

    Unity Technologies

    Joined:
    Jan 19, 2023
    Posts:
    23
    We don't have anything to report on image tracking support. We're constantly sharing what we hear from the community with Meta so thanks for sharing your feedback.
     
    andyb-unity likes this.
  48. dilmerval

    dilmerval

    Joined:
    Jun 15, 2013
    Posts:
    232
    Guys, I recently released a new YouTube video to cover Unity's new Meta OpenXR Package, where I go over all of the new features including:
    • Device Tracking (HMD + Controllers)
    • Camera component (for Passthrough features)
    • Plane Detection
    • Plane Classification
    • Raycasts
    • Anchors
    Take a look at my write-up of these new features and the steps to set them up here, or you can watch the full video below. Thank you.

     
  49. KevinXR

    KevinXR

    Unity Technologies

    Joined:
    Jan 19, 2023
    Posts:
    23
  50. jtran

    jtran

    Joined:
    Jun 20, 2013
    Posts:
    7
    Is there any way to call the Meta Quest room setup from OpenXR or AR Foundation, in case the player hasn't done a room setup previously?