Question Strategy for getting project working on both Meta Quest 3 *and* iOS?

Discussion in 'AR' started by iamthatis, Nov 5, 2023.

  1. iamthatis


    Oct 29, 2023
    To preface, I'm as nooby as they come, but playing around with Unity to learn some new skills and having a great time!

    I want to play around with AR/MR (as opposed to VR). In this particular case I want a turtle that walks around my apartment floor. I can get the basics of this working with the base AR template in Unity and running it on iOS; however, I cannot seem to get it working at all on my Meta Quest 3. Running the project just results in a black screen.

    On the flip side, if I use the MR project template, I can get it running on Meta Quest 3, but on iOS there's no passthrough camera feed and no way to select any of the options.

    Am I doing something really silly here with an obvious fix? I don't really need input controls per se (hand tracking or anything like that); if anything I might use an external game controller. I just want a core base that can detect the floor of my apartment and spawn things on it, for both iOS and Meta Quest 3.
  2. DevDunk


    Feb 13, 2020
    If the features you need are available in the Meta OpenXR plugin for AR Foundation, I suggest using that. Then you develop against AR Foundation and should be able to deploy to different devices.
    If you want to do more complex input handling, the XR Interaction Toolkit (XRI) could help.
    Did you follow a guide for the setup you currently have? What have you already tried?

    The templates are not always up to date, so I usually start from a blank project and build from there
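
    To illustrate the AR Foundation route: the same script works on both iOS (ARKit) and Quest (Meta OpenXR) because it only talks to AR Foundation. A minimal sketch of "detect the floor and spawn something on it" might look like this — it assumes an XR Origin with an ARRaycastManager in the scene, and `turtlePrefab` is a placeholder for whatever you want to spawn:

    ```csharp
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    // Sketch: raycast against detected planes and spawn a prefab
    // where the ray hits the floor. Platform-agnostic via AR Foundation.
    public class SpawnOnPlane : MonoBehaviour
    {
        [SerializeField] ARRaycastManager raycastManager; // assign in Inspector
        [SerializeField] GameObject turtlePrefab;         // placeholder prefab

        static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

        void Update()
        {
            // Raycast from the screen centre each frame until a plane is hit.
            var screenCentre = new Vector2(Screen.width / 2f, Screen.height / 2f);
            if (raycastManager.Raycast(screenCentre, hits, TrackableType.PlaneWithinPolygon))
            {
                var hitPose = hits[0].pose;
                Instantiate(turtlePrefab, hitPose.position, hitPose.rotation);
                enabled = false; // spawn only once
            }
        }
    }
    ```

    The point is that nothing here is ARKit- or Meta-specific; the per-platform part lives entirely in Project Settings and the installed provider plugins.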
  3. jakem-unity


    Unity Technologies

    Feb 14, 2023
    When you change platforms, be sure to also update the plug-in provider for that platform under Project Settings > XR Plug-in Management. For Meta Quest (on the Android build target) you need to select OpenXR, NOT Google ARCore.