
Official XR Plugins and Subsystems

Discussion in 'AR/VR (XR) Discussion' started by matthewr_unity, Jun 12, 2019.

  1. lz7cjc

    lz7cjc

    Joined:
    Sep 10, 2019
    Posts:
    538
    a436t4ataf likes this.
  2. rinkymehra

    rinkymehra

    Joined:
    Mar 20, 2020
    Posts:
    4
    I start a brand-new LWRP 2019.2.1 project. I download the XR Management package (2.0.0-preview26), and upgrade the XR Legacy Input Helpers to 2.0.6. I go into the Project Settings for XR, download the Oculus loader, and add the Oculus plugin. I add a pose tracker to the main camera in the sample scene, set it to be the center eye, and hit play.
     
    Last edited: Jun 18, 2020
  3. G33RT

    G33RT

    Joined:
    Dec 27, 2013
    Posts:
    52
    I have a project with 3 XR loaders: WMR, Oculus and the preview of OpenVR loader. Based on *what* will the correct loader be used?

    With Oculus and WMR all seems fine, the correct loader is used depending on the headset connected. But when I add the OpenVR loader it will be activated even though there is no SteamVR on the machine (but instead a WMR headset).

    I suppose behind the scenes the loaders are asked one by one through the API if a headset is present ... and the first to say "yes" wins? And I suppose the OpenVR loader always says "yes" (just to screw the others). Or is that too far fetched ;)

    Anyway, is there a way to control the "order" in which the loaders are used/queried?
     
    Last edited: Jun 10, 2020
  4. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,933
    The best we know so far:

    http://snapandplug.com/xr-input-too...ly-start-VR-after-the-app-is-already-running?

    TL;DR - use the other API to read the .loaders var and overwrite it with a re-ordered list. Docs say this works - but several people say it doesn't (and right now I don't have multiple vendor hardware to test with :()

    And a bunch of us trying to reverse-engineer the incorrect / broken Unity docs:

    https://forum.unity.com/threads/how-to-start-the-game-without-vr-with-the-new-xr-stack.904976/
    https://forum.unity.com/threads/que...nt-initialize-on-startup.819798/#post-5958632

    (I've reported bugs on the incorrect docs, and I've been asking for a correct / official response for about a month now, but no reply yet from Unity)
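    For anyone who wants to try it, here's a minimal sketch of the approach the docs describe - assuming XR Plug-in Management 3.x, where XRGeneralSettings.Instance.Manager.loaders is a plain List<XRLoader> that can be reordered before XR initializes. The helper name and the match-by-type-name logic are my own invention, and as noted above, several people report this doesn't actually work:

    Code (CSharp):
    using System.Linq;
    using UnityEngine.XR.Management;

    public static class LoaderOrderTweak
    {
        // Hypothetical helper: move any loader whose type name contains
        // preferredName to the front of the list, so it is queried first.
        // Must run before XR initializes (e.g. with "Initialize XR on
        // Startup" disabled, before you call InitializeLoader yourself).
        public static void PreferLoader(string preferredName)
        {
            var manager = XRGeneralSettings.Instance.Manager;
            var reordered = manager.loaders
                .OrderBy(l => l.GetType().Name.Contains(preferredName) ? 0 : 1)
                .ToList();
            manager.loaders.Clear();
            manager.loaders.AddRange(reordered);
        }
    }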
     
    fherbst, ROBYER1 and andybak like this.
  5. bugrock

    bugrock

    Joined:
    Sep 29, 2018
    Posts:
    8
    I just tried the new XR Management plugin stuff yesterday but there seems to be a problem with rendering to the right eye when using multipass.
    Right-eye rendering only works for me if "Camera Target Eye" is set to "Both". If I set the target to "Right", it won't render to the right eye. This is important to me because I use two cameras, one with its target eye set to Left and the other set to Right. I need to be able to control each eye independently and display different things in each eye. That may sound weird, but I develop eye-training apps.

    Everything works fine if I don't install the XR management system.

    I should also point out that if I use the OVRCameraRig with the XR Management system, it renders to both eyes even when I set the targets to Left and Right, but the display is double vision and funhouse-wacky. Again, OVRCameraRig works fine with legacy VR.

    Could someone please try two cameras with the targets set to the left and right eyes and let me know if their right eye renders? Or could someone try the OVRCameraRig with "use per eye cameras" and let me know if it renders without double vision? Either scenario with the new XR plugin stuff - there appear to be no issues with legacy.

    This is all using multipass. Single pass can't be used for independent eye rendering.

    I'm using Unity 2019.4.1f1
    Oculus Rift (not S)

    Thanks
     
  6. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,933
    My guess: what you describe is not supported and never will be. Based on the previous responses from Unity staff (and e.g. what they implemented re: render-to-texture in VR) their intention is that the eye-and-camera rendering is all automatic and lets the system from engine > OS > hardware > LCDs do its own work.

    c.f. their statement that FOV can't be changed (by design) because FOV should be dictated by hardware (and they have a good point: changing the FOV in-engine would sort of work, but can screw up assumptions made by the hardware in its own internal optimizations).

    You might be able to achieve something by doing a custom XR Plugin - I haven't tried it, but since it's providing the hardware-abstraction layer, in theory I think you should be allowed to do what you want there with simulating your own custom eye hardware. But really you're hoping to proxy/wrap/facade one Plugin with your own Plugin, and I have no idea if that's possible (in theory: yes; but the SDK may in practice make that hard/impossible).

    NB: it's obviously better to get a response from the Unity team, but for that you should log a bug report from inside Unity, and it takes a couple of months to get a reply - so I'm offering my "guess" here as something that might help in the short term :).
     
    gjf and ROBYER1 like this.
  7. bugrock

    bugrock

    Joined:
    Sep 29, 2018
    Posts:
    8
    Thanks for the reply!
    Surely this should be supported. It's a feature exposed right in the Camera component in the editor. Could you or anybody else please open a new project, set the Camera Target Eye to "Right", and tell me what happens? (Using Oculus Rift, XR Management/plugins and multipass.)

    camera.png camera2.png
     
    Last edited: Jun 23, 2020
  8. bugrock

    bugrock

    Joined:
    Sep 29, 2018
    Posts:
    8
    So I solved the above problem by disabling HDR on the camera. That was a trial-and-error fluke, but it worked. It's still a bug, but I don't need HDR at the moment (I will in the future).
    This workaround exposed a second bug...
    The projection matrix for each eye is incorrectly the same when using Left and Right eye targets. The value of m02 should be positive for the right eye and negative for the left, or else you get double vision and other weirdness. For some reason they were both negative. Maybe the left eye is getting copied to the right, or maybe it's just a typo in the code. Anyway, I worked around the bug by calling SetStereoProjectionMatrix with the correct values. I had to set them manually because I can't seem to get the values from GetStereoProjectionMatrix until after one iteration of Update().

    Code (CSharp):
    void Start()
    {
        // Hard-coded projection values captured from a working run.
        // The right eye differs from the left only in the sign of m02.
        Matrix4x4 leftmatrix = Matrix4x4.zero;
        leftmatrix.m00 = 1.200088f;
        leftmatrix.m02 = -0.1470005f;
        leftmatrix.m11 = 1.008082f;
        leftmatrix.m12 = -0.1125376f;
        leftmatrix.m22 = -1.0006f;
        leftmatrix.m23 = -0.60018f;
        leftmatrix.m32 = -1f;

        Matrix4x4 rightmatrix = leftmatrix;
        rightmatrix.m02 = -leftmatrix.m02;

        // CameraL / CameraR are the two per-eye cameras described above.
        CameraL.SetStereoProjectionMatrix(Camera.StereoscopicEye.Left, leftmatrix);
        CameraR.SetStereoProjectionMatrix(Camera.StereoscopicEye.Right, rightmatrix);
    }
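    If you'd rather not hard-code the numbers, a variant of this workaround might read the matrix back after the first frame and just flip the sign. A sketch, assuming (as described above) that GetStereoProjectionMatrix only returns valid values once one frame has rendered:

    Code (CSharp):
    using System.Collections;
    using UnityEngine;

    public class StereoMatrixFix : MonoBehaviour
    {
        public Camera CameraL;
        public Camera CameraR;

        IEnumerator Start()
        {
            // The matrices appear to be invalid until one frame has rendered.
            yield return null;

            Matrix4x4 left = CameraL.GetStereoProjectionMatrix(Camera.StereoscopicEye.Left);
            left.m02 = -Mathf.Abs(left.m02);   // left eye: m02 negative

            Matrix4x4 right = left;
            right.m02 = -left.m02;             // right eye: m02 positive

            CameraL.SetStereoProjectionMatrix(Camera.StereoscopicEye.Left, left);
            CameraR.SetStereoProjectionMatrix(Camera.StereoscopicEye.Right, right);
        }
    }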
     
    fherbst and a436t4ataf like this.
  9. colinleet

    colinleet

    Joined:
    Nov 20, 2019
    Posts:
    189
    I've found the exact same problem using the most recent 2019.4.1f1 LTS using XR Plugin Management 3.2.10 and Oculus XR Plugin 1.3.4. The issue isn't just with the IndexTouch, but is happening with the GripTouch feature too for me.
     
    Last edited: Jun 26, 2020
  10. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,933
    colinleet likes this.
  11. lz7cjc

    lz7cjc

    Joined:
    Sep 10, 2019
    Posts:
    538
    After installing XR management and removing the old GoogleVR for cardboard, how are we meant to control the scene whilst in editor/game mode on a Mac?
    Kind of important!
     
  12. mbarnes-mv

    mbarnes-mv

    Joined:
    Aug 21, 2019
    Posts:
    8
    @mfuad

    Hi Matt,

    Can you point me to the documentation and examples for implementing an XR display provider plugin or subsystem?

    Thanks,
    Mark
     
  13. holo-krzysztof

    holo-krzysztof

    Joined:
    Apr 5, 2017
    Posts:
    77
    For anyone trying to implement their own XR SDK provider:

    If you've created a package from scratch and are wondering why Unity never seems to recognize your UnitySubsystemsManifest.json, you need the following lines in your package.json:

    Code (JavaScript):
    "keywords": [
        "xreditorsubsystem"
    ]
    The documentation doesn't mention this anywhere, and I just wasted two days tracking down the cause of this bug.

    Anyone from Unity, please fix this. It is ridiculous and I'm exhausted.
     
    gjf likes this.
  14. mbarnes-mv

    mbarnes-mv

    Joined:
    Aug 21, 2019
    Posts:
    8
    Thanks for the info.
    Where did you get the XR SDK provider documentation?
     
  15. holo-krzysztof

    holo-krzysztof

    Joined:
    Apr 5, 2017
    Posts:
    77
    It's included in the .zip file you can download after signing up for the XR SDK here: https://create.unity3d.com/vsp-signup-form

    I think it was already posted in this thread, a few pages back.

    Anyway, after unpacking that there is a Documentation/Manual folder.
     
  16. mbarnes-mv

    mbarnes-mv

    Joined:
    Aug 21, 2019
    Posts:
    8
    Thanks I got it (from Matt).
     
  17. trzy

    trzy

    Joined:
    Jul 2, 2016
    Posts:
    128
    It looks to me like Camera.main.transform as well as the left, right, and "center eye" anchor transforms are all one frame delayed when used in Update() or LateUpdate(). Is there a fix for this? I'm trying to position some virtual cameras for portal rendering and the one frame delay causes jitter.

    When running on a PC, this is not a problem. Update() sees values for Camera.main position and rotation consistent with what will actually be rendered. When I introduce a fake one-frame delay (by caching the previous frame's values), I can see the same sort of jitter that I always see in VR.

    This is on Oculus Quest. Is this a Unity XR bug or an Oculus bug?
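    One thing that might be worth trying (a sketch, not a confirmed fix): reposition the portal cameras from an Application.onBeforeRender callback instead of Update()/LateUpdate(). On most XR runtimes the final head pose for the frame is applied just before rendering, so by onBeforeRender the transforms should no longer be one frame stale:

    Code (CSharp):
    using UnityEngine;

    public class PortalCameraSync : MonoBehaviour
    {
        public Transform portalCamera; // hypothetical: the virtual camera you position

        void OnEnable()  { Application.onBeforeRender += SyncToHeadPose; }
        void OnDisable() { Application.onBeforeRender -= SyncToHeadPose; }

        void SyncToHeadPose()
        {
            // By onBeforeRender, Camera.main's transform should hold the pose
            // actually used to render this frame, not the previous one.
            var head = Camera.main.transform;
            portalCamera.SetPositionAndRotation(head.position, head.rotation);
        }
    }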
     
  18. holo-krzysztof

    holo-krzysztof

    Joined:
    Apr 5, 2017
    Posts:
    77
    Is there a way to tell Unity to do single pass rendering with XRSDK?

    The Windows MR XRSDK plugin does it automagically; however, my own plugin always gets UnityXRAppSetup.singlePassRendering == false in the PopulateNextFrameDesc call.

    It's the same for the included Display sample project. The following code reports that rendering does not use single pass, neither in the editor nor in a standalone build:
    Code (CSharp):
    private void LogXRSettings()
    {
        var stereoMode = XRSettings.stereoRenderingMode;
        Debug.Log($"Stereo mode: {stereoMode}");

        var displays = new List<XRDisplaySubsystem>();
        SubsystemManager.GetInstances<XRDisplaySubsystem>(displays);

        if (displays.Count > 0)
        {
            bool singlePassDisabled = displays[0].singlePassRenderingDisabled;
            bool legacyRendererDisabled = displays[0].disableLegacyRenderer;

            Debug.Log($"Single pass disabled: {singlePassDisabled}, legacy renderer disabled: {legacyRendererDisabled}");
        }
    }
    I've found nothing in the docs so far.
     
    fherbst likes this.
  19. Hunters-i

    Hunters-i

    Joined:
    Jun 28, 2020
    Posts:
    1
    Hello everyone, I am Cris.
    I get this error:
    "No active UnityEngine.XR.ARSubsystems.XRSessionSubsystem is available. Please ensure that a valid loader configuration exists in the XR project settings."

    How can I fix this? I need help.
     
    Last edited: Jul 15, 2020
    Dalton-Lima likes this.
  20. Dalton-Lima

    Dalton-Lima

    Joined:
    Dec 21, 2016
    Posts:
    19
    I am having these same errors using AR Foundation 4.1.0-preview.5 in 2019.4.4f1 when trying to run the app in the Editor; on Android and iOS the app works fine. I suppose we need a 'mock' plugin for the Editor.
     
  21. mikewarren

    mikewarren

    Joined:
    Apr 21, 2014
    Posts:
    109
     
    daftechnologylab likes this.
  22. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,933
    That was 6 weeks ago. Still no reply - i.e. it's been 2.5 months waiting for a reply to the bug reports, and to the forum posts, from the Unity XR team(s). This kind of radio silence doesn't make the XR/XRIT tech look good.
     
    Shizola likes this.
  23. noemis

    noemis

    Joined:
    Jan 27, 2014
    Posts:
    76
    Hey there,
    so let me add one more post to this overflooded forum full of countless unanswered requests:

    Could you please add support for Pico VR headsets in your XR system? I've been developing VR apps for 6 years, and this XR idea would help a lot - I'm already using it for a prototype. Support for the big players like Oculus makes sense, I know, but they have shown very limited support for smaller companies and independent developers. And their dropping of hardware after less than 2 years is just one example. We completely switched to Pico headsets for business requests, but can't use the XR framework, because they are not supported.

    So I'd like to hear why not - or what can I do to support this idea?

    Thank you!
     
  24. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,933
    http://snapandplug.com/xr-input-too...Plugin-make-a-new-plugin-for-custom-hardware?

    One of the key points of Unity XR is that Unity *isn't* creating these things, they've made it so anyone can. I haven't tried it myself, but the Unity team seems pretty keen to make it easy/open for you to add your own support for unusual/niche/custom hardware.
     
    noemis likes this.
  25. noemis

    noemis

    Joined:
    Jan 27, 2014
    Posts:
    76
    OK, cool. Thank you for this hint. I'll contact Pico and ask what the current state is, or whether they've started trying.
     
    Last edited: Jul 28, 2020
  26. aserobinnorth

    aserobinnorth

    Joined:
    Jan 2, 2019
    Posts:
    1
    @Alexees @mfuad I encountered the same issue when building multiple targets using SuperUnityBuild. The problem is that the Oculus XR Plugin's OnPreprocessBuild callback assumes the active Editor target group is the target being built when it checks the Player graphics API settings. If the two don't match (e.g. the current Editor target group is Standalone, but you're building for Android), the target group is only switched after the OnPreprocessBuild callbacks have run, resulting in the error you're seeing.

    I fixed this by updating the Oculus XR Plugin's callback to be as follows, replacing instances of EditorUserBuildSettings.activeBuildTarget with report.summary.platform:

    Code (CSharp):
    public void OnPreprocessBuild(BuildReport report)
    {
        if (!OculusBuildTools.OculusLoaderPresentInSettingsForBuildTarget(report.summary.platformGroup))
            return;

        if (report.summary.platformGroup == BuildTargetGroup.Android)
        {
            GraphicsDeviceType firstGfxType = PlayerSettings.GetGraphicsAPIs(report.summary.platform)[0];
            if (firstGfxType != GraphicsDeviceType.OpenGLES3 && firstGfxType != GraphicsDeviceType.Vulkan && firstGfxType != GraphicsDeviceType.OpenGLES2)
            {
                throw new BuildFailedException("OpenGLES2, OpenGLES3, and Vulkan are currently the only graphics APIs compatible with the Oculus XR Plugin on mobile platforms.");
            }
            if (PlayerSettings.Android.minSdkVersion < AndroidSdkVersions.AndroidApiLevel19)
            {
                throw new BuildFailedException("Minimum API must be set to 19 or higher for Oculus XR Plugin.");
            }
        }

        if (report.summary.platformGroup == BuildTargetGroup.Standalone)
        {
            if (PlayerSettings.GetGraphicsAPIs(report.summary.platform)[0] != GraphicsDeviceType.Direct3D11)
            {
                throw new BuildFailedException("D3D11 is currently the only graphics API compatible with the Oculus XR Plugin on desktop platforms. Please change the Graphics API setting in Player Settings.");
            }
        }
    }
    I have also reported this as a bug against the Oculus XR Plugin package.
     
    Last edited: Jul 31, 2020
  27. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    569
    Can I suggest that one of the reasons so many questions go unanswered is that this thread is 7 pages long? And that's small by comparison to some I see.

    What happened to the idea of "1 question per post"? Surely we want a single clear post title and a single on-topic discussion for each question? Why does everyone jump into these messy mega-threads?
     
    gjf and a436t4ataf like this.
  28. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,933
    Yeah, agreed. That's why I started a FAQ using the answers I found in the main thread (which is now 11 pages long :) - this thread is short by comparison) - see my sig below for the link.

    A couple of us - e.g. @ROBYER1 and I - push people to start new threads when it's obviously a distinct new issue. But it's easier for us than for a newcomer to tell "your problem is unique and new" apart from "you are stuck on something common", so I understand that sometimes people post in the main thread(s) simply because they're not sure where to post.

    But +1 for preferring to see 1-post-per-issue.
     
    ROBYER1, gjf, andybak and 1 other person like this.
  29. Shizola

    Shizola

    Joined:
    Jun 29, 2014
    Posts:
    475
    Unity employees rarely answer any questions on XR. If you make a new thread it will drop off the first page and never be seen again within a day. At least if you post in here someone may scroll through it.
     
  30. Deleted User

    Deleted User

    Guest

    One reason for me to use existing threads is that they are the ones that I found. Others will find those too. More threads means more different titles. It's a trade off between thread length and search hits.
     
  31. Deleted User

    Deleted User

    Guest

    WMR and the new XR Unity plugin. Who else uses Windows Mixed Reality? Why can't I see the controller? I installed everything according to the manual.
     
  32. noemis

    noemis

    Joined:
    Jan 27, 2014
    Posts:
    76
    So, in case anyone else needs this info: Pico answered my request:

    :)
     
  33. noemis

    noemis

    Joined:
    Jan 27, 2014
    Posts:
    76
    Unity does not draw the controllers if you only use a "tracked pose driver" - with it you get the position and rotation. If you want to see the controllers, you can use simple models for development, or you can find the original controller models, for example, here:
    https://github.com/Microsoft/MixedRealityToolkit-Unity/
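    A minimal sketch of what that looks like in practice - assuming the TrackedPoseDriver from the XR Legacy Input Helpers package, with controllerModelPrefab being whatever placeholder model you supply yourself:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.SpatialTracking;

    public class ControllerVisual : MonoBehaviour
    {
        public GameObject controllerModelPrefab; // hypothetical placeholder model

        void Start()
        {
            // The TrackedPoseDriver only moves this transform; it draws nothing.
            // Parent a visible model under it so the controller shows up.
            var driver = gameObject.AddComponent<TrackedPoseDriver>();
            driver.SetPoseSource(TrackedPoseDriver.DeviceType.GenericXRController,
                                 TrackedPoseDriver.TrackedPose.RightPose);
            Instantiate(controllerModelPrefab, transform);
        }
    }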
     
  34. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,933
    That sounds bad :(. Q3 ends next month! XR is a long long way away from being finished. It needs at least another couple of preview versions to be stable - so many bugs right now.

    And if they do a preview release, and then "finish" in Q3, then there's going to be zero fixes between preview and live (it takes 2-4 months for them to even open the XR bug reports, let alone fix them).

    I have XR bugs that were only accepted in the last couple of weeks - on a Q3 schedule, it looks like it's 50/50 whether they'll be fixed :(.
     
    gjf likes this.
  35. Deleted User

    Deleted User

    Guest

    The problem is that even the "tracked pose driver" shown in the image is not displayed. Because of this, there is no teleportation function. The point in the center of the screen responds only to the touchpad, moving slightly and returning to its original position.
     

    Attached Files:

  36. Deleted User

    Deleted User

    Guest

    @AndreiARVR It's just a pose driver, mapping controller data to the transform it's attached to. What you want to see is up to you. As noemis said, the MRTK handles instantiating dummy or actual models, but that had to be implemented. Games usually don't even spawn the real controller but a fancy-looking gadget.
    Same goes for teleportation: that's not supported out of the box (the MRTK does support it). Unity hands you the data; what you do with it is up to you.
     
  37. Deleted User

    Deleted User

    Guest

    The tracked pose driver is not displayed. There is no reaction to the controllers in Unity. I can't even see what I attached in the picture.
     
  38. Deleted User

    Deleted User

    Guest

    In my case I had problems with the Oculus Quest because the names of the controllers changed.
    Maybe it's the same for WMR. You should debug the controller detection in the WindowsMixedRealityDeviceManager.
    Or check that you are using the right WindowsMixedRealityDeviceManager - there is one for the old XR package and one for the new.
     
    Deleted User likes this.
  39. Deleted User

    Deleted User

    Guest

    Thank you! I applied the settings shown in the screenshot and everything worked.
     

    Attached Files:

    • 12.PNG
      12.PNG
      File size:
      79.2 KB
      Views:
      382
  40. MisterMan123

    MisterMan123

    Joined:
    Jan 6, 2018
    Posts:
    5
    Is the WMR XR plugin supposed to be compatible with URP? I first encountered this a couple of months ago when learning Unity's XR system: with the WMR plugin and URP, objects only render to my left eye. The WMR plugin only supports Single Pass Instanced rendering, which I was under the impression URP supports as well, but I can't seem to find a fix for this.

    Using Unity 2020.1, XR Plugin Management 3.2.13, Windows XR Plugin 3.2.0, Universal RP 8.2.0
     
  41. abelsang9683

    abelsang9683

    Joined:
    Sep 22, 2017
    Posts:
    1
    Did you ever find a solution? :(
     
  42. RupeOxVR

    RupeOxVR

    Joined:
    Jul 18, 2018
    Posts:
    7
    An update from Pico: they are delayed and are now targeting the end of October.
     
  43. brianpkenney10

    brianpkenney10

    Joined:
    Nov 8, 2016
    Posts:
    27
    Hi,

    Did you ever get this problem solved? I am having the same problem and can't seem to find a solution.
     
  44. FloBeber

    FloBeber

    Joined:
    Jun 9, 2015
    Posts:
    166
    Hey, does someone know if it's possible to modify the XR Providers list in XR Plug-in Management by code?
     
  45. fherbst

    fherbst

    Joined:
    Jun 24, 2012
    Posts:
    802
  46. FloBeber

    FloBeber

    Joined:
    Jun 9, 2015
    Posts:
    166
    Thank you for your reply, but that's not what I meant - sorry if I was not clear!
    I simply want to allow my build scripts to check/uncheck plug-in providers:



    J.J. Hoesing from Unity talks about it in his recent Facebook Connect presentation:
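    For what it's worth, the editor-side API that appears to back those checkboxes is XRPackageMetadataStore in UnityEditor.XR.Management.Metadata (XR Plug-in Management 3.2+). A sketch - the helper name is mine, and the Oculus loader type name below is just an example:

    Code (CSharp):
    #if UNITY_EDITOR
    using UnityEditor;
    using UnityEditor.XR.Management;
    using UnityEditor.XR.Management.Metadata;
    using UnityEngine.XR.Management;

    public static class BuildScriptXRToggle
    {
        // Hypothetical build-script helper: check/uncheck a provider for a target group.
        public static void SetLoaderEnabled(BuildTargetGroup group, string loaderTypeName, bool enable)
        {
            EditorBuildSettings.TryGetConfigObject(XRGeneralSettings.k_SettingsKey,
                out XRGeneralSettingsPerBuildTarget perTarget);
            XRManagerSettings manager = perTarget.SettingsForBuildTarget(group).Manager;

            if (enable)
                XRPackageMetadataStore.AssignLoader(manager, loaderTypeName, group);
            else
                XRPackageMetadataStore.RemoveLoader(manager, loaderTypeName, group);
        }
    }
    #endif

    e.g. SetLoaderEnabled(BuildTargetGroup.Android, "UnityEngine.XR.Oculus.OculusLoader", true);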
     
  47. gjf

    gjf

    Joined:
    Feb 8, 2012
    Posts:
    53
  48. FloBeber

    FloBeber

    Joined:
    Jun 9, 2015
    Posts:
    166
    fherbst and gjf like this.
  49. jamie_xr

    jamie_xr

    Joined:
    Feb 28, 2020
    Posts:
    67
    I'm currently using the Oculus XR Plugin and I think it's great. It's really going to help when supporting other VR systems, and the API is very nice to use. I'm very glad I could avoid using the Oculus Integration package.

    Until now!
    There is no support for Oculus platform features like friends/leaderboards in the XR package, which means I need the Integration. I can already tell they are not going to play well with each other.

    Does anybody have any suggestions, or has anyone done this before? I want something elegant.
    Thanks
     
  50. fherbst

    fherbst

    Joined:
    Jun 24, 2012
    Posts:
    802
    Fortunately, they are designed to work together: the Integration just uses the Oculus XR Plugin and builds the platform-specific features on top.
    Note that the idea of XR Management is to provide the parts common to all platforms, not platform-specific details. For those you'll probably always need the vendor's platform integration in addition.

    So, tl;dr: they work well together, on 2020.1+ the Oculus XR Plugin is even a requirement for the Oculus Integration.
     
    jamie_xr likes this.