After reading the recently released XR Platform blog post, I, like many other users, had a lot of questions that it left unanswered. I commented and got a direct reply from Unity, which still leaves questions open. So this forum thread is to follow up on that!

Please note that I really don't want to argue about which features are supported and working in theory versus in practice. I just want to get an understanding of how all the different New And Shiny Things are supposed to align with each other. Ideally, I'll figure out what I'm doing wrong. Also, please don't reply with stuff like "this and that is in preview" or "you can keep using the deprecated systems for so and so long". Previews aren't really previews if they can't even be tested, and starting new projects on officially deprecated tech isn't great either.

1) Quest support with the new XR Plugin System

Currently, neither the latest released URP (7.1.8) nor the latest URP master (as of today) works on Vulkan in VR at all. I would really like to understand why Unity keeps saying it does on 2019.3 - what am I missing here?

This is a screenshot of Unity's sample URP scene (with postprocessing disabled, updated to the latest SRP master (same on 7.1.8), updated to XR Management, using the Oculus Loader, using Vulkan, stereo mode set to "multiview") on Quest:

One thing I noted is that LogCat outputs

Unity: Multi pass stereo mode doesn't support Camera Stacking. Overlay cameras will skip rendering.

despite the rendering mode being set to "multiview" and there not being any stacked cameras in this scene.

On OpenGLES3 it looks slightly better - the left eye renders properly. The right eye is still black.

So my conclusion is that Quest does not work with URP on XR Management yet, and I really hope I'm just forgetting a stupid toggle here.
Bug Reports that are blockers:
Case 1214807 [URP] Quest with Vulkan is broken on new XR Plugin System
Case 1214827 [URP] Quest with OpenGLES3 is broken on new XR Plugin System
(note: I also tested this today with the latest SRP master branch and the latest 2019.3 branch. Same issues.)
Case 1215369 Quest Vulkan built-in render pipeline broken

2) Quest support with "deprecated" VR

The only URP version I know of that supports FFR, does not break ShaderGraph, and works on Quest is my own fork, which is pretty old by now and still does not have GPU Instancing. Newer official versions always break one feature or another.

Bug Reports that are blockers:
Case 1201706 [ShaderGraph] Compilation Errors on shader that uses IsFrontFacing and Texture Array
(note: Texture Arrays are pretty much needed on Quest for performance reasons. They worked in an older version of SRP and broke somewhere around the time GPU instancing was fixed.)
https://issuetracker.unity3d.com/is...-each-others-shadows-when-close-to-each-other

3) Input System support (this is mixed between VR / AR)

URP is throwing continuous errors with the Input System. This bug has been known since at least August: https://issuetracker.unity3d.com/is...input-handling-is-set-to-input-system-package and QA told me today that "there's no timeline for this".

AR Foundation does support the Input System; however, the AR Foundation Samples don't. It's not that hard to change that (it is for beginners, though!), but it gives me the feeling that the AR Foundation team isn't really testing with the new Input System.

The XR Interaction Toolkit does not support the Input System.

The Touch Samples for the Input System work, but as soon as you try to use them outside the sample project, all the "Composite Bindings" are missing.
The Input System examples only work with "New Input System" exclusively; they don't work if "Both" (old & new) is active.

In an ideal world, someone at Unity would sit down and actually try to create real-world, cutting-edge projects - combining packages, really - that compile in the Editor, build to a device, and still have the existing samples functioning:

A) Starting a new Handheld AR project right now and wanting to use the latest and greatest features: AR Foundation + XR Management + ARCore XR Plugin + ARKit XR Plugin + Input System + XR Interaction Toolkit. Bonus points: URP, ShaderGraph, and not using the "Both" mode for Old and New Input but solely "New".

B) Starting a new VR project right now for Oculus Quest: URP + ShaderGraph + XR Management + Oculus XR Plugin + Input System + XR Interaction Toolkit.

C) Bonus points: starting a new VR project right now that is cross-platform between Rift, Vive, and Quest (disclaimer: I know this isn't really possible right now since we're all waiting for a reaction from Valve).

@LeonhardP it would be great if you could get Matt Fuad to see this and ideally reply here - I'd really like to follow up with him about the remaining questions after he replied to my initial batch of questions. Thanks!