Question: What's the difference between Unity OpenXR and Meta OpenXR? Can I use both?

Discussion in 'AR' started by CMDR_DISCO, May 5, 2024.



  1. CMDR_DISCO

    Feb 10, 2019

    This is probably a lame question. When I go through most tutorials, I end up in the Package Manager installing XR Plug-in Management. I am guessing this is Unity's OpenXR.

    I am looking for some functionality that Meta Open XR supports. In particular, I was told this:

    Meta's OpenXR runtime does support projecting passthrough onto a given mesh. You can also manipulate color via compositor layers. More information here:

    Our Unity OpenXR: Meta package does not integrate these features though. We basically just implement a toggle to turn passthrough on or off. If you'd like to see these deeper integrations in Unity, please leave a note on our road map:
    I believe this means that I can't support passthrough mesh projection with Unity's OpenXR packages. Is there a way to have both implementations of OpenXR at the same time so that I can support mesh projection?

    What does Unity OpenXR support that Meta doesn't? For example, can it run on more devices, while Meta's only runs on Quest?

  2. andyb-unity


    Unity Technologies

    Feb 10, 2022
    Let's begin with a definition of terms.

    XR Plug-in Management: This package facilitates the Unity XR lifecycle and allows you to enable XR support for various platforms such as Android, iOS, and the Universal Windows Platform. Its code has nothing to do with OpenXR.
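
    To make that lifecycle concrete, here is a minimal sketch of driving it manually from a script, following the pattern shown in the XR Plug-in Management manual. It assumes the package is installed, a loader (such as OpenXR) is enabled in Project Settings, and automatic initialization is turned off:

    ```csharp
    using System.Collections;
    using UnityEngine;
    using UnityEngine.XR.Management;

    // Manually initializes the active XR loader on startup and
    // tears it down on destroy, mirroring what XR Plug-in Management
    // does automatically when "Initialize XR on Startup" is enabled.
    public class ManualXRControl : MonoBehaviour
    {
        IEnumerator Start()
        {
            yield return XRGeneralSettings.Instance.Manager.InitializeLoader();

            if (XRGeneralSettings.Instance.Manager.activeLoader != null)
                XRGeneralSettings.Instance.Manager.StartSubsystems();
            else
                Debug.LogError("Initializing XR failed. Check that a loader is enabled.");
        }

        void OnDestroy()
        {
            if (XRGeneralSettings.Instance.Manager.isInitializationComplete)
            {
                XRGeneralSettings.Instance.Manager.StopSubsystems();
                XRGeneralSettings.Instance.Manager.DeinitializeLoader();
            }
        }
    }
    ```

    This is a sketch of the lifecycle only; in most projects you leave automatic initialization on and never write this code yourself.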

    OpenXR Plug-in: This package "plugs in" to XR Plug-in Management, and enables your Unity app to run on OpenXR runtimes. When this package is installed, you see an OpenXR checkbox in XR Plug-in Management.

    Unity OpenXR: Meta: This package depends on both the OpenXR Plug-in and AR Foundation. It builds on the base functionality of the OpenXR Plug-in, allowing you to build AR Foundation apps for the Meta Quest OpenXR runtime. It defines AR Foundation subsystem implementations for various AR features as explained in its documentation.
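
    To make the package relationships concrete, a project using all three layers might declare them in Packages/manifest.json roughly like this. The package names are the real registry identifiers; the version numbers are illustrative assumptions, not recommendations:

    ```json
    {
      "dependencies": {
        "com.unity.xr.management": "4.4.0",
        "com.unity.xr.openxr": "1.10.0",
        "com.unity.xr.arfoundation": "5.1.0",
        "com.unity.xr.meta-openxr": "1.0.1"
      }
    }
    ```

    In practice you only need to install Unity OpenXR: Meta from the Package Manager; because it depends on the OpenXR Plug-in and AR Foundation, the Package Manager resolves those dependencies for you.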

    With this cleared up, we can answer your questions.

    Exactly. Unity OpenXR: Meta is one of many implementations of AR Foundation. When you build an app with AR Foundation, you enable yourself to deploy to ARCore, ARKit, visionOS, Meta Quest, HoloLens, and other platforms as explained in the AR Foundation manual.
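
    For example, a feature written against AR Foundation's abstractions, such as plane detection, runs unchanged on any platform whose provider plug-in implements the corresponding subsystem. A minimal sketch, assuming AR Foundation 5.x (where ARPlaneManager exposes a planesChanged event):

    ```csharp
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Logs planes as the active provider (ARCore, ARKit, Meta OpenXR, ...)
    // detects them. Note the script never references a platform-specific API.
    [RequireComponent(typeof(ARPlaneManager))]
    public class PlaneLogger : MonoBehaviour
    {
        ARPlaneManager m_PlaneManager;

        void OnEnable()
        {
            m_PlaneManager = GetComponent<ARPlaneManager>();
            m_PlaneManager.planesChanged += OnPlanesChanged;
        }

        void OnDisable() => m_PlaneManager.planesChanged -= OnPlanesChanged;

        void OnPlanesChanged(ARPlanesChangedEventArgs args)
        {
            foreach (var plane in args.added)
                Debug.Log($"Plane added: {plane.trackableId}");
        }
    }
    ```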

    For this reason, our first priorities with this package have been to enable support for features that are also available on other platforms, allowing AR Foundation users to port their apps across platforms.

    Unity OpenXR: Meta does not integrate passthrough mesh projection, but Meta's OpenXR runtime (C++ code that you don't have access to in Unity) is capable of this feature.

    To access passthrough mesh projection in Unity today, you would need to use some other plug-in entirely, potentially replacing both Unity OpenXR: Meta and the OpenXR Plug-in in your project. There may be a solution for this in the Meta Quest Developer Hub, which is where you should start for advanced Meta-specific features.

    It may be possible to mix and match Meta Quest Developer Hub solutions along with our Unity OpenXR: Meta package. I don't know, and we don't test this, so I can't advise you. But it may be possible. Otherwise your options are to use a Meta Quest-specific solution such as Meta Quest Developer Hub, or make a feature request that we support passthrough mesh projection in the Unity OpenXR: Meta package.
  3. bindon


    May 25, 2017
    Could you explain how any of this relates (or doesn't relate) to the "Meta XR All-in-One SDK"?

    [Just as an aside, people at Unity and Meta might want to take note that coming to XR development in Unity from the outside right now is monumentally confusing.

    The number of different approaches / sdks / versions of sdks / versions of unity / plugins / versions of plugins / misinformation / information overload / toolkits / samples / samples that don't work / sdks that are buggy / incomplete documentation / out of date documentation / documentation that turns out to apply to something other than the thing you have been trying to get to work / samples that turn out to be for something similar but different / samples that relate to older versions that are now deprecated / samples that relate to different approaches than the one you have been taking / components that depend on other components to work without any easy way of finding out these recipes / etc. etc.

    Far as I can tell, there is simply no single place I can go to get the answers that I need. I just have to somehow unpick the last five to ten years of continuously evolving and overlapping approaches and versions and frameworks and sdks and plugins and samples and documentation, and somehow understand ALL OF IT before I am able to make any kind of sensible choices about how to proceed.]
    Last edited: Jun 12, 2024
  4. DevDunk


    Feb 13, 2020
    If you want to do it the Unity way, Unity Learn has a solid path to get you started using cross-platform tools :)
    All on one site

    The Meta SDK is, well, an SDK specifically for Meta devices. If you use it, it might not always work on other headsets. See these as Meta-specific features/code/workflows. If you only target Meta, this is fine. If not, there might be issues.

    The Meta/Oculus XR plug-in is a plug-in that allows you to run Unity projects on Meta headsets. It can be abstracted by the XR Interaction Toolkit or other systems (they can use the new Input System to manage all input from different hardware) to actually build your features.