Official Introducing Meta Quest Support in AR Foundation

Discussion in 'AR' started by KevinXR, Jun 20, 2023.

  1. jtran

    jtran

    Joined:
    Jun 20, 2013
    Posts:
    7
  2. davidresolution

    davidresolution

    Joined:
    Apr 29, 2021
    Posts:
    6
I followed this, set my main camera background to black with 0 alpha, and activated passthrough in the Meta Quick Settings, but I still can't see passthrough. Is there anything that I might be missing?
     
  3. davidresolution

    davidresolution

    Joined:
    Apr 29, 2021
    Posts:
    6
I managed to make it work right after posting by adding the "AR Camera Background" component to my main camera, even though its documentation says "This component has no effect on Meta Quest devices".
     

    Attached Files:

  4. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    989
    @jtran Meta has an API to allow users to call room setup, but we haven't integrated it into the Meta OpenXR package yet. Thanks for this question-- feedback like this helps us prioritize which features would be most useful for our users as we consider what to work on next.
     
    ericprovencher and jtran like this.
  5. CiaranWills

    CiaranWills

    Unity Technologies

    Joined:
    Apr 24, 2020
    Posts:
    198
Yes, unfortunately when you create a new default URP Pipeline Asset it does not work on Quest until you make these changes. We are working with the URP team on this, as well as providing samples to get you going right out of the box.
     
    andyb-unity likes this.
  6. Aupuma

    Aupuma

    Joined:
    Feb 15, 2017
    Posts:
    42
    I want to use this package to get Passthrough working on Quest without using the Oculus integration package. When setting up the scene, do I need to add the AR Session prefab with the AR Session and AR Input Manager? Or that isn't required for simple passthrough?
     
  7. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    989
    @Aupuma AR Session and AR Input Manager are still required. More setup is required if you want to use input from the Quest controllers. Our latest release of AR Foundation Samples contains Quest-compatible sample scenes and a new XR Origin prefab you can use as reference: https://github.com/Unity-Technologies/arfoundation-samples
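To illustrate the requirement above, here is a minimal sketch that checks at startup that a scene contains the two required components. It assumes AR Foundation 5.x, where these are the `ARSession` and `ARInputManager` components in `UnityEngine.XR.ARFoundation`; the class name is my own, not from the samples.

```csharp
// Sketch: verify the scene has the pieces described above
// (AR Session + AR Input Manager). Assumes AR Foundation 5.x.
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class SessionBootstrap : MonoBehaviour
{
    void Awake()
    {
        // ARSession drives the AR lifecycle; ARInputManager feeds device tracking input.
        if (FindObjectOfType<ARSession>() == null)
            Debug.LogWarning("No ARSession in scene -- AR features (e.g. passthrough) will not start.");
        if (FindObjectOfType<ARInputManager>() == null)
            Debug.LogWarning("No ARInputManager in scene -- tracking input will be missing.");
    }
}
```

In practice the XR Origin prefab from the AR Foundation Samples repo already wires these up, so a checker like this is only useful when assembling a scene by hand.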
     
  8. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    989
    @davidresolution I investigated your claim that our docs are incorrect and ARCameraBackground actually does play a role in Passthrough. I'm unable to reproduce this. I downloaded the latest AR Foundation Samples and removed all the ARCameraBackground components, and passthrough still worked as intended. Perhaps something else was different between your builds?
     
  9. mgerritsen

    mgerritsen

    Joined:
    Jan 28, 2018
    Posts:
    1
    Hi Unity Team,

    Great work on all the tools you have made available for making XR apps for Quest headsets. Since I am already developing an app with XR in mind for Quest 2, Pro, and 3 using the XR Interaction Toolkit, I have a question. I am also considering transitioning to AR Foundation because of the support and features like plane detection and camera passthrough.

    So the questions I have are:
    - The app I am making uses XR Hands, XR Interaction Toolkit, and Oculus Integration (only for passthrough and room setup). Would it be advisable to start using AR Foundation for XR headset apps in the future (especially with the release of Quest 3 and Vision Pro in mind)?
    - If AR Foundation is missing a feature XRI does have, can I use AR Foundation and XRI at the same time?
     
  10. KevinXR

    KevinXR

    Unity Technologies

    Joined:
    Jan 19, 2023
    Posts:
    23
    @mgerritsen If your goal is to build cross platform XR apps, we advise using AR Foundation. When it comes to room setup @andyb-unity mentioned earlier, Meta has an API to allow users to call room setup, but we haven't integrated it into the Meta OpenXR package yet. Your end users will have to run room setup before running your app right now. Also, passthrough customizations (posterization, colour tints, etc) that are available with the Oculus Integration Asset are not currently supported.

    AR Foundation is intended to be used with XRI. If you have issues using the two together, please let us know.
     
    FarmerInATechStack likes this.
  11. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    989
    To add to @KevinXR's comment, you should be able to use all of these packages at the same time. AR Foundation + XRI + XR Hands are all designed to be used together. The Oculus Integration Asset is not designed to be used with AR Foundation per se, but they are expected to be compatible if you want to roll both.

    Your mileage may vary if you try to mix API calls exposed by both XRI and the Oculus Integration Asset, where they would both call the same OpenXR APIs under the hood. We don't test this configuration very much, and I think you would want to stick primarily to one package's implementation per feature anyway.
     
    Last edited: Aug 3, 2023
    FarmerInATechStack likes this.
  12. MarkJeffc

    MarkJeffc

    Joined:
    Feb 14, 2022
    Posts:
    5
    Here's a link to a short video demonstrating a simple project I wrote as training material, using only the Meta OpenXR Feature, AR Foundation & OpenXR Toolkit. It's an example of showing off a museum exhibit, with the ability to pick up the item and show extra information to a visitor.
     
    FarmerInATechStack likes this.
  13. FarmerInATechStack

    FarmerInATechStack

    Joined:
    Dec 28, 2020
    Posts:
    55
    I'm really excited for this @andyb-unity ! I'd understand if you can't share an official date, but do you know if that support for Passthrough Over Link will be shared on the order of days, weeks, months, or something else? Also thank you for all of the great communication and support around this.
     
  14. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    989
    @FarmerInATechStack Hard to say. We are working with Meta to define expected behavior in various cases that may result in necessary fixes for either or both of us. Trying to land in Q3 but that's not a guarantee at this point.
     
    FarmerInATechStack and Qleenie like this.
  15. FarmerInATechStack

    FarmerInATechStack

    Joined:
    Dec 28, 2020
    Posts:
    55
    @andyb-unity got it, and thank you for the fast response! For now, I believe that means our fastest workflows for testing OpenXR with these new Quest features would be either (1) building and testing on devices or (2) learning how to use the in-Editor simulators. Is that correct, or do you have any other recommendations?

    For context, I'm specifically trying to test prototypes with Passthrough and Hands using OpenXR, XRI, etc. I want to target devices such as Quest 3 and Vision Pro.
     
  16. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    989
    I'm not familiar with your options for in-Editor simulation of hand tracking, but for the AR Foundation features yes you might be able to get some mileage out of our XR Simulation feature (in AR Foundation 5.0 and newer). XRI 2.5 (coming later this year as well) will add XRI support for XR Simulation, but right now it is difficult to combine XRI into the in-Editor workflow.

    Testing on devices for sure works, yes. Quest 2 has black-and-white passthrough, which isn't the best experience, but all the OpenXR APIs are supported on Quest 2. Many of us are using Quest 2 to test our own OpenXR packages.
     
  17. FarmerInATechStack

    FarmerInATechStack

    Joined:
    Dec 28, 2020
    Posts:
    55
    ORIGINAL POST
    @andyb-unity and @KevinXR are you aware of available project samples that combine AR Foundation, Meta Quest Support, XRI, and XR Hands via OpenXR? I see from your notes that this should be possible, but I'm having trouble getting that set up.

    Current Blocker - Scripts Go "Missing"
    • One of the builds worked with Passthrough and Hands.
    • When I closed and reopened the editor, I see the errors in the attached image. Some XRI scripts "go missing."
    Please share if you have any input on missing XRI scripts or samples that combine the packages above. I'd appreciate any tips and thanks again for your work!

    UPDATES
    • UPDATE #1: in case it helps anyone, I'm starting over from scratch and trying to put together a basic sample. Here's a GitHub repo: https://github.com/farmerinatechstack/HandsAndPassthroughTests/tree/main. It's still not working (I just get a black screen even in a build), but now that's because I believe I'm missing something on the passthrough configuration (not the issue above on missing scripts).
    • UPDATE #2: basic passthrough on a build now works in that project. I realized my URP settings were incorrect. The repo is updated, and anyone can check the previous comments in this thread for more details on the fix (turn off HDR, turn off Terrain Holes, etc.). The next step is to put Passthrough and Hands into a scene together.
    • SUMMARY: the repo linked above has a sample that works. I never resolved the "missing scripts" issues because that didn't appear again when I started from scratch.
     

    Attached Files:

    Last edited: Aug 24, 2023
    MarkJeffc likes this.
  18. MarkJeffc

    MarkJeffc

    Joined:
    Feb 14, 2022
    Posts:
    5
    Started adding hands to my demo project. Visualization working ok, using 'pinch' to pick up objects. The problem with not being able to rotate a grabbed object (as seen in the video) has been fixed in 'OpenXR Plugin 1.8.2'.
     
    FarmerInATechStack likes this.
  19. WarpBubble

    WarpBubble

    Joined:
    Dec 4, 2013
    Posts:
    33
    Hello! When you say "We are working on Quest Link support", do you mean strictly for editor play testing purposes, or actually supporting PC VR over Link? We have a PC VR application that would hugely benefit from shared spatial anchors / room setup to allow multiple users in the same space. I think Meta now supports this, but we are currently using OpenXR and would love to be able to continue supporting other headsets.
     
  20. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    989
  21. WarpBubble

    WarpBubble

    Joined:
    Dec 4, 2013
    Posts:
    33
    Hi Andy, sorry for my slow response. Maybe my question was unclear; let me rephrase it.

    The link you posted in your reply says, at the top of the page:

    "The Meta OpenXR Feature package enables Meta Quest device support for your AR Foundation projects."

    What I'm asking is, are you only supporting Meta Quest for on-device (Android) projects, or will you also support Meta-Quest for PC VR projects over Link, specifically in the area of local / shared Spatial Anchors?

    Essentially I would like to be able to use AR Foundation spatial anchors on a PC VR project using both Meta and Vive headsets.

    Thanks!
     
    Qleenie likes this.
  22. MisterMan123

    MisterMan123

    Joined:
    Jan 6, 2018
    Posts:
    5
    Is lighting estimation available for Meta Quest devices? I see on the AR Camera Manager component it's listed under "Properties for Other Build Targets." If it isn't, are there any alternatives that could be used instead?
     
  23. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    989
    @WarpBubble We're still working through some Link stability issues with Meta, but the intent is that, much like the existing Oculus Integration asset, if you press Play in the Editor you will be able to run your project over Link (including all AR Foundation features such as Anchors, although we don't yet support shared Anchors).

    @MisterMan123 Good question! At this time, no. I'm not aware of any alternatives from Meta. If you want to follow along with everything that Meta can possibly offer, this is the link to their SDK docs: https://developer.oculus.com/documentation/native/android/mobile-intro/. We are working closely with Meta to develop C# APIs that call into their OpenXR Mobile SDK under the hood.
     
  24. tylerkealy

    tylerkealy

    Joined:
    Oct 8, 2018
    Posts:
    5
    Hi! I think I have an issue relevant to this thread. I believe my Plane Detection has stopped working due to the new v57 Meta update for my Quest Pro. Unsure if there is any insight the Unity Dev Team has on your end in fixing the issue -- perhaps not since it seems like an issue through the Meta end (from what I gather). I wrote a more detailed post on the meta forums.

    I'm using the project setup outlined in this post/thread using OpenXR, the Meta OpenXR Feature Group and AR Foundation. It seems others who responded to my meta forums post have found workarounds using Oculus XR Plugin and the Oculus Integration Package, but no mentions of this specific layout. I noticed earlier @andyb-unity mentioned that "The Oculus Integration Asset is not designed to be used with AR Foundation per se." So I'm assuming there's a different way to fix my problem.

    I give details in my post there, but it appears that "Room Setup" has been rebranded to "Space Setup" and has been moved from an experimental feature to a supported one under the Physical Space settings tab on the Quest. Since this change, my plane detection broke; I assume I'm not getting the room's spatial data through Unity. Strangely, this isn't reflected in the changelog for v57.

    Any help at all would be greatly appreciated! Thanks so much for your time. I hope I didn't derail this thread!
     
  25. ankur-unity

    ankur-unity

    Unity Technologies

    Joined:
    Aug 12, 2021
    Posts:
    34
    @tylerkealy - Starting with v57, Quest apps are required to ask users for the special permission com.oculus.permission.USE_SCENE in order to access scene data. We are working on a patch of com.unity.xr.meta-openxr with a fix that will resolve the plane issue.
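Alongside declaring the permission in the manifest, a runtime request can be issued with Unity's Android permission API. This is a sketch using `UnityEngine.Android.Permission`; whether the v57 runtime actually surfaces a prompt for this particular permission this way is an assumption on my part.

```csharp
// Sketch: requesting the Quest scene permission at runtime, in addition to
// declaring it in AndroidManifest.xml. The permission string is the one
// named in the post above; the class name here is hypothetical.
using UnityEngine;
using UnityEngine.Android;

public class ScenePermissionRequester : MonoBehaviour
{
    const string UseScene = "com.oculus.permission.USE_SCENE";

    void Start()
    {
        if (!Permission.HasUserAuthorizedPermission(UseScene))
            Permission.RequestUserPermission(UseScene); // shows the system permission prompt
    }
}
```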
     
    tylerkealy likes this.
  26. tylerkealy

    tylerkealy

    Joined:
    Oct 8, 2018
    Posts:
    5
    Thanks so much! - this worked

    I just had to add this line to my manifest (for anyone else who has this issue):

    <uses-permission android:name="com.oculus.permission.USE_SCENE"/>
     
    NigeAnThat, andyb-unity and mekin like this.
  27. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    989
    FYI @tylerkealy we recently released com.unity.xr.meta-openxr version 0.2.1 which does the same.
     
    tylerkealy and mekin like this.
  28. jtran

    jtran

    Joined:
    Jun 20, 2013
    Posts:
    7
    Does anybody know how to get game objects that are created when the Room Scene is generated?
     
  29. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    989
  30. cubaschi

    cubaschi

    Joined:
    Oct 23, 2012
    Posts:
    51
    Do we have access to the actual bounds of a trackable? I'm using the plane detection example, and it generates planes for some objects (e.g. a table that I set up as a box reaching to the floor). I really need to create boxes for our MR playground in Tentacular. Any reason why some types don't get tracked by ARPlaneManager ("bed" seems to get assigned to "other")?
    https://docs.unity3d.com/Packages/com.unity.xr.meta-openxr@0.2/manual/plane-detection.html
     
  31. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    989
    @cubaschi AR Foundation 5.1 does not have a means to provide you the 3D bounding box data produced by the Meta OpenXR runtime. We are working with Meta on a new API design for AR Foundation that can handle bounding box data, and we'll share more about that sometime in the future. In the meantime, if your app requires this data, you may be better off developing with Meta's Oculus Integration Asset for now.

    To your question about "bed", Meta's API does not produce planes for this object type. Here is a link to their docs that shows which classifications produce planes (or "2D" data, as they call it):
    https://developer.oculus.com/documentation/unity/unity-scene-supported-semantic-labels/

    We are working on an update to our documentation that links to this table as well.
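A sketch of how plane classifications surface in AR Foundation, assuming 5.x API names (`ARPlaneManager.planesChanged`, `ARPlane.classification`): labels that only produce "3D" data per the table linked above (such as "bed") simply never appear in this callback.

```csharp
// Sketch: log the classification of each newly detected plane.
// Assumes AR Foundation 5.x; the class name is hypothetical.
using UnityEngine;
using UnityEngine.XR.ARFoundation;

[RequireComponent(typeof(ARPlaneManager))]
public class PlaneClassificationLogger : MonoBehaviour
{
    ARPlaneManager m_Manager;

    void OnEnable()
    {
        m_Manager = GetComponent<ARPlaneManager>();
        m_Manager.planesChanged += OnPlanesChanged;
    }

    void OnDisable() => m_Manager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // args.added/updated/removed list the planes that changed this frame.
        foreach (var plane in args.added)
            Debug.Log($"Plane {plane.trackableId}: {plane.classification}");
    }
}
```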
     
  32. cubaschi

    cubaschi

    Joined:
    Oct 23, 2012
    Posts:
    51
    Thanks for the answer. We went full OpenXR to have a unified XR rig (Tracked Pose Driver) for all platforms. Is it possible to use the Oculus Integration (OVR) and access the Presence Platform API with that setup? Full disclosure: we shipped a VR title on 5 platforms, but I still don't fully understand the XR plugin architecture/ecosystem.
     
  33. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    989
    Yes. https://developer.oculus.com/documentation/unity/unity-scene-overview/

    OVR is currently further along in their integration of the Presence Platform APIs (which you might expect, as it's a Meta product). The value add of AR Foundation is being able to access the Presence Platform via an API that also works on other platforms (ARCore and ARKit currently).

    We're wrapping up the 1.0.0 release of this Meta OpenXR plugin with Passthrough, Plane detection, Anchors, and Raycasts, but plenty more to come in future versions.
     
  34. Ilikeit_ChristianB

    Ilikeit_ChristianB

    Joined:
    Apr 20, 2022
    Posts:
    2
  35. fcascinelli

    fcascinelli

    Joined:
    Aug 12, 2017
    Posts:
    7
  36. TahneeSmith

    TahneeSmith

    Joined:
    Oct 27, 2016
    Posts:
    5
    When using the OpenXR Meta package 0.2.1, there is a compiler error:
    Code (CSharp):
    Library/PackageCache/com.unity.xr.arfoundation@5.1.0-pre.6/Runtime/VisualScripting/Units/Events/EventUnits/SessionStateChangedEventUnit.cs(50,30): error CS0506: 'SessionStateChangedEventUnit.StopListening(GraphStack)': cannot override inherited member 'EventUnit<ARSessionState>.StopListening(GraphStack)' because it is not marked virtual, abstract, or override
    Will this be fixed soon @andyb-unity ?
     
    heartingNinja likes this.
  37. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    989
    @Ilikeit_ChristianB Probably, but I don't know when! We just shipped 1.0.0 which will appear in the next patch of the 2022 LTS. We are just now starting our next phase of work with Meta, so there's nothing to announce yet in terms of new minor or major package versions, features, or release dates.
     
  38. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    989
    fcascinelli likes this.
  39. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    989
    @TahneeSmith The issue here is that the Unity Visual Scripting team inappropriately allowed breaking changes in their minor version release 1.9.0. They have since fixed the issue in 1.9.1. If your project doesn't use Visual Scripting, you can remove the package from your project (it's installed by default): https://docs.unity3d.com/Manual/upm-ui-remove.html. Otherwise, upgrade your Visual Scripting version to 1.9.1.

    The next patch of the 2022 LTS will also use 1.9.1 instead of 1.9.0, but unfortunately the current Editor was built containing the breaking change.
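One way to apply the suggested upgrade is to pin the package version in `Packages/manifest.json`. A sketch of the relevant dependency entries, using the versions mentioned in this thread (your project's full dependency list will differ):

```json
{
  "dependencies": {
    "com.unity.visualscripting": "1.9.1",
    "com.unity.xr.arfoundation": "5.1.0-pre.6",
    "com.unity.xr.meta-openxr": "0.2.1"
  }
}
```

After editing the manifest, Unity resolves the pinned versions the next time the Editor gains focus or the Package Manager refreshes.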
     
    heartingNinja and MagiJedi like this.
  40. Extrys

    Extrys

    Joined:
    Oct 25, 2017
    Posts:
    341

    Can you tell me then: are you using passthrough and hand tracking in a totally platform-agnostic way, so you don't have to use OVR for that anymore? Am I correct?
     
  41. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    989
    @Extrys In AR Foundation we surface Passthrough to you via the ARCameraManager component. In this sense our implementation is platform agnostic-- you can use the same ARCameraManager component with ARCore and ARKit as well.

    Note that under the hood we call the same Meta OpenXR extensions that the OVR plugin calls, so we aren't doing anything unique on the backend compared to OVR.

    The benefit of using AR Foundation instead of OVR is the multi-platform aspect, but the cost is that we currently support fewer features of the Meta Presence Platform than the OVR plugin does. So you must decide what's best for each app you want to make. (Or if you choose, you can also roll both AR Foundation and OVR in the same project.)

    Finally note that AR Foundation does not support hand tracking. This comes from the XR Hands package, which our team doesn't work on, but it's certainly designed to be platform-agnostic as well.
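Putting the camera setup discussed earlier in this thread into code: a sketch of the configuration for Quest passthrough, a solid-color background with zero alpha so the compositor shows the camera feed behind rendered content. It assumes an `ARCameraManager` is already on the camera; the class name is hypothetical.

```csharp
// Sketch: camera background settings for Quest passthrough, as described
// in this thread (black clear color with alpha 0). Assumes AR Foundation.
using UnityEngine;
using UnityEngine.XR.ARFoundation;

[RequireComponent(typeof(Camera), typeof(ARCameraManager))]
public class PassthroughCameraSetup : MonoBehaviour
{
    void Start()
    {
        var cam = GetComponent<Camera>();
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = new Color(0f, 0f, 0f, 0f); // black, alpha 0
    }
}
```

The same values can of course be set directly on the Camera component in the Inspector; a script is only needed if you toggle passthrough at runtime.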
     
    Extrys likes this.
  42. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    989
  43. Qleenie

    Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    835
    Is there any chance that Link support for Passthrough will work? And another question: would AR Foundation and Passthrough in theory also work with HDRP? Only URP is mentioned, so I wonder if this is a hard requirement.
     
  44. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    989
    I believe that no, it does not work with HDRP. URP or the built-in render pipeline work, and we're working with the URP team to improve the experience: https://docs.unity3d.com/Packages/c.../project-setup.html#universal-render-pipeline

    As for Link: yes, this will work in the future. We're working with Meta on Link stability issues before we ship support for it in AR Foundation.
     
  45. AiDev123

    AiDev123

    Joined:
    Sep 7, 2016
    Posts:
    17
    Can the room scanning (seen in the new quest 3 "First Encounter" demo) be done yet? If not, when would we be looking at something like that making use of the quest 3's depth sensors?
     
  46. jtran

    jtran

    Joined:
    Jun 20, 2013
    Posts:
    7
    I am also curious when we can get the mesh data from the Quest 3's depth sensor. Is there a roadmap for when we will get this feature in AR Foundation?
     
    GameFinder likes this.
  47. KevinXR

    KevinXR

    Unity Technologies

    Joined:
    Jan 19, 2023
    Posts:
    23
    Hey @AiDev123 and @jtran, how do you plan to use Quest 3's mesh data? Adding support for Quest 3 mesh data in AR Foundation is on our roadmap along with some other features like persistent anchors, shared anchors, and more, that have been much requested.

    I'm in the process of updating our public roadmap page, which allows you to provide direct feedback and influence when these features ship. I'll share a link here when it's up to date.
     
  48. cubaschi

    cubaschi

    Joined:
    Oct 23, 2012
    Posts:
    51
    I'd love to have a way to use the room mesh as a collider for physics interaction
     
    KevinXR and Qleenie like this.
  49. Qleenie

    Qleenie

    Joined:
    Jan 27, 2019
    Posts:
    835
    Dynamic occlusion would be another great feature and 100% necessary for believable AR.
     
    KevinXR likes this.
  50. cubaschi

    cubaschi

    Joined:
    Oct 23, 2012
    Posts:
    51
    I've made a build using plane detection that works on my Quest 2. When I run it on my Quest 3 there are no planes at all, even though the room has been set up. I checked Player Settings > XR Plug-in Management > OpenXR (1.8.2) > Meta Quest Support settings cog, and it seems there is no option for Quest 3 in version 1.0. Could this be the reason for my issue?

    MetaQuest.jpg