
Official XR Plugins and Subsystems

Discussion in 'AR/VR (XR) Discussion' started by matthewr_unity, Jun 12, 2019.

  1. gtk2k

    gtk2k

    Joined:
    Aug 13, 2014
    Posts:
    288
    Can I create the XR plugin with C#?
     
  2. HuiyunChen

    HuiyunChen

    Joined:
    Oct 22, 2020
    Posts:
    1
    Have you figured it out?
     
  3. Gobstar

    Gobstar

    Joined:
    Nov 12, 2015
    Posts:
    7
    Did you ever get a response on FFR?
     
  4. fherbst

    fherbst

    Joined:
    Jun 24, 2012
    Posts:
    802
    @Gobstar FFR is supported in recent Oculus XR Plugin versions. It works in the built-in render pipeline and URP 10 (and maybe earlier; I'm not sure when it was added).
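    For reference, here is a minimal sketch of turning FFR on from a script, assuming a recent Oculus XR Plugin (com.unity.xr.oculus) where Unity.XR.Oculus.Utils.SetFoveationLevel is available; the class and field names are just for illustration:

    Code (CSharp):
    using UnityEngine;

    // Sketch only: assumes the Oculus XR Plugin is the active loader and this
    // version exposes Unity.XR.Oculus.Utils.SetFoveationLevel.
    public class EnableFixedFoveatedRendering : MonoBehaviour
    {
        [Range(0, 4)]
        public int foveationLevel = 2; // 0 = off, higher values foveate more aggressively

        void Start()
        {
            // Returns false if the XR display isn't running yet or FFR is unsupported.
            if (!Unity.XR.Oculus.Utils.SetFoveationLevel(foveationLevel))
                Debug.LogWarning("Could not set foveation level (XR not initialized or FFR unsupported).");
        }
    }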
     
  5. battou

    battou

    Joined:
    Jan 25, 2011
    Posts:
    222
    How do you get trigger touch and thumb touch using the XR plugins? OculusUsages do not work!
     
  6. mikerz1985

    mikerz1985

    Joined:
    Oct 23, 2014
    Posts:
    79
    Use the legacy input system and make sure you set up all the axes for VR input, or use the new Input System with the default action mappings, or get the Oculus plugin from the Asset Store and use its methods directly.
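    For anyone who wants the feature-usage route through UnityEngine.XR instead, a minimal sketch is below. The CommonUsages touch usages are built in; OculusUsages.indexTouch / thumbTouch are Oculus-specific and assume the Oculus XR Plugin is installed and running:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR;

    // Sketch only: polls touch states on the right-hand controller every frame.
    public class TouchStatePoller : MonoBehaviour
    {
        void Update()
        {
            InputDevice rightHand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
            if (!rightHand.isValid)
                return;

            // Generic usages from UnityEngine.XR.CommonUsages.
            if (rightHand.TryGetFeatureValue(CommonUsages.primaryTouch, out bool buttonTouched))
                Debug.Log($"Primary button touched: {buttonTouched}");

            if (rightHand.TryGetFeatureValue(CommonUsages.primary2DAxisTouch, out bool stickTouched))
                Debug.Log($"Thumbstick touched: {stickTouched}");

            // Oculus-specific usages (assumes the Oculus XR Plugin is installed).
            if (rightHand.TryGetFeatureValue(Unity.XR.Oculus.OculusUsages.indexTouch, out bool triggerTouched))
                Debug.Log($"Trigger touched: {triggerTouched}");

            if (rightHand.TryGetFeatureValue(Unity.XR.Oculus.OculusUsages.thumbTouch, out bool thumbTouched))
                Debug.Log($"Thumb touched: {thumbTouched}");
        }
    }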
     
  7. battou

    battou

    Joined:
    Jan 25, 2011
    Posts:
    222
    So the XR plugin system still doesn't support touch detection? :( Could you please point me to some info on using legacy input for VR?
     
  8. jaredlandetta

    jaredlandetta

    Joined:
    Jan 5, 2021
    Posts:
    1
    I'm running into the same error. I'm using a Magic Leap 1 and one of their templates (Unity Template -0.24.0) from their website. I'm not sure what to do.
     
  9. RuneShiStorm

    RuneShiStorm

    Joined:
    Apr 28, 2017
    Posts:
    264
    What is this XR for? I have it and it gives me error messages, but I can't remember why I got it... Does it have anything to do with porting to Xbox?
     
  10. glenneroo

    glenneroo

    Joined:
    Oct 27, 2016
    Posts:
    231
    XR = eXtended Reality. VR = Virtual Reality. AR = Augmented Reality.

    Please use Google next time before spamming the forums. Or just read a couple of the forum entries.
     
  11. rob_vld

    rob_vld

    Joined:
    Jul 9, 2013
    Posts:
    191
    Setting an enum on an object in the scene Hierarchy helps my workflow, since I don't have to go into Player Settings every time to enable/disable the XR Manager.

    With the code below I am trying to achieve two things:

    When the enum is set to CLIENT_DESKTOP -- I do not want SteamVR (or any other device) to be launched at application start, in either the editor or a build...
    This works in either case.

    When the enum is set to CLIENT_VIRTUAL_REALITY -- I do want VR to be initialized.
    This only works in the editor... with a build I am getting a black screen.

    Does anyone know what I am missing, or know of a better workflow?

    Thanks

    This script has been set as the first script in the execution order.

    Code (CSharp):
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR;
    using UnityEngine.XR.Management;
    using Valve.VR;

    namespace Project
    {
        public class ProjectSettings : MonoBehaviour
        {
            public GameObject cameraDesktop;
            public GameObject cameraVirtualReality; // SteamVR CameraRig

            [SerializeField] Unity.XR.Oculus.OculusLoader oculusLoader; // Scriptable Object
            [SerializeField] Unity.XR.OpenVR.OpenVRLoader openVRLoader; // Scriptable Object

            public static ProjectSettings instance;

            public enum ClientType
            {
                CLIENT_DESKTOP,
                CLIENT_VIRTUAL_REALITY
            }

            public ClientType clientType;

            private void Awake()
            {
                if (instance == null)
                {
                    instance = this;

                    if (clientType == ClientType.CLIENT_VIRTUAL_REALITY)
                    {
                        List<XRLoader> xrLoaderList = XRGeneralSettings.Instance.Manager.loaders;
                        xrLoaderList.Clear();

                        XRGeneralSettings.Instance.Manager.loaders.Add(openVRLoader);
                        XRGeneralSettings.Instance.Manager.loaders.Add(oculusLoader);

                        XRGeneralSettings.Instance.Manager.automaticLoading = true;
                        XRGeneralSettings.Instance.Manager.automaticRunning = true;

                        StartCoroutine(EnableVirtualReality());
                    }
                    else
                    {
                        List<XRLoader> xrLoaderList = XRGeneralSettings.Instance.Manager.loaders;
                        xrLoaderList.Clear();

                        cameraDesktop.SetActive(true);
                    }

                    LoadResources();
                }
            }

            private IEnumerator EnableVirtualReality()
            {
                XRGeneralSettings.Instance.Manager.InitializeLoaderSync();
                XRGeneralSettings.Instance.Manager.activeLoader.Initialize();
                XRGeneralSettings.Instance.Manager.activeLoader.Start();
                yield return null;

                XRGeneralSettings.Instance.Manager.StartSubsystems();
                yield return null;

                cameraVirtualReality.SetActive(true);
                yield return null;
            }

            private void OnApplicationQuit()
            {
                XRGeneralSettings.Instance.Manager.automaticLoading = false;
                XRGeneralSettings.Instance.Manager.automaticRunning = false;

                if (clientType == ClientType.CLIENT_VIRTUAL_REALITY)
                {
                    XRGeneralSettings.Instance.Manager.DeinitializeLoader();

                    List<XRLoader> xrLoaderList = XRGeneralSettings.Instance.Manager.loaders;
                    xrLoaderList.Clear();
                }
            }
        }
    }
     
  12. rob_vld

    rob_vld

    Joined:
    Jul 9, 2013
    Posts:
    191
  13. ozgurozansen

    ozgurozansen

    Joined:
    Oct 24, 2013
    Posts:
    2
    I just can't stand that Unity officially pretends that such an option (Stereo Display [non-head mounted]) never existed. Unity is the #1 at breaking working things in future versions and never talking about them.

    We also use a CAVE-like system and upgraded our own framework to work with the UI Toolkit components, which means we now depend on 2020+ versions. But, surprise! Unity removed the Stereo Display option in the new XR Management system and killed legacy XR. Moreover, there is no tutorial or satisfying documentation to start developing your own plug-in for non-HMD systems (from scratch here, obviously).

    Have you been successful in your case using Unity 2020+ along with your CAVE system?
     
    cecarlsen, Gruguir, glenneroo and 2 others like this.
  14. jdh5259

    jdh5259

    Joined:
    Sep 14, 2017
    Posts:
    20
    We are in the same boat. We have been working on upgrading to Unity 2020 LTS but have no clue how to replace the Stereo Display (non-head mounted) support. Are there any resources for how to actually write a new XR plugin? The manual (https://docs.unity3d.com/Manual/xr-sdk.html) doesn't really have any examples.
     
  15. foonix

    foonix

    Joined:
    Dec 15, 2019
    Posts:
    20
    Maybe try MockHMD? It is a stereo display that does not require a device. Its main purpose is for testing correct VR render output, but it works fine in a build as the main XR camera.
     
    zezba9000 likes this.
  16. ConanB

    ConanB

    Joined:
    Jun 25, 2019
    Posts:
    18
    Not quite the same thing, as what we need for hardware stereoscopy isn't a left-right setup, but rather an API that actually implements quad-buffering / hardware stereo (i.e. OpenGL glDrawBuffer(GL_BACK_LEFT) / glDrawBuffer(GL_BACK_RIGHT)) that handles the stereoscopy on a 120 fps+ single monitor/projector.

    I gave up hoping I'd hear back from Unity about it and just wrote a plugin that forces an OpenGL window in a separate thread to override Unity's main window, stealing Unity's back buffer for use in an OpenGL window that supports quad-buffering. I'd prefer not having to do that, but it was the easiest solution short of Unity actually bringing back hardware stereoscopic support. That said, I haven't bothered checking whether Unity has changed their stance in the past couple of months, as I was tired of wasting my time trying to find an answer :(
     
    Reahreic likes this.
  17. ConanB

    ConanB

    Joined:
    Jun 25, 2019
    Posts:
    18
    Had a quick look over the updated XR plugin documentation. It's better than it was, but still not super easy to make sense of at a glance. By the looks of it, though, you could write a plugin that sets up the graphics device to enable quad-buffering, then catch the callbacks for when each eye starts to draw and set the correct draw buffer (i.e. glDrawBuffer(GL_BACK_LEFT) when the left eye starts to draw).

    I don't have time at the moment to test that out, plus I have a workaround that works for the time being, but it would be good to try. If anyone is up for testing it and letting us all know, that'd be great. Since there are a few of us needing Stereo Display (non-head mounted), perhaps we should start a separate thread for discussion and development.
     
    Last edited: Apr 29, 2021
  18. Hobodi

    Hobodi

    Joined:
    Dec 30, 2017
    Posts:
    101
    Hello, could you please help with this problem?
    How can I set the format of the eye render texture before initialization?
    Does it depend on RenderTextureFormat.Default for the platform?
    I want to reduce the size of the render texture because of the low bandwidth of the target mobile device with a tile GPU, but I have absolutely no idea where I can set the render texture descriptor for this.
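    For context, a minimal sketch of the scale factors that UnityEngine.XR.XRSettings exposes from C# (these reduce eye-buffer size and bandwidth, but do not change the texture format; the class and field names are just for illustration):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR;

    // Sketch only: trades resolution for bandwidth on a tile-based mobile GPU.
    // eyeTextureResolutionScale changes the size of the allocated eye textures;
    // renderViewportScale only changes the viewport rendered into them and is
    // cheap to vary at runtime.
    public class EyeBufferScale : MonoBehaviour
    {
        [Range(0.5f, 1f)] public float resolutionScale = 0.75f;
        [Range(0.5f, 1f)] public float viewportScale = 1f;

        void Awake()
        {
            XRSettings.eyeTextureResolutionScale = resolutionScale;
            XRSettings.renderViewportScale = viewportScale;
        }
    }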
     
  19. Chootin

    Chootin

    Joined:
    Jan 22, 2017
    Posts:
    4
    Hi there,

    I'm trying to put together a unity-xr-sdk based plugin to perform stereo rendering for simulating robotics. The goal of the plugin is just to perform the stereo rendering, then blit the result to a render texture to be retrieved at a later time.

    Having spent many hours messing with it, while I can get the stereo render to view in the Unity window as a mirror, I cannot figure out how to get the information from the UnityXRRenderTexture into a render texture I can get access to in C#.

    Any help appreciated. Cheers!
     
  20. holo-krzysztof

    holo-krzysztof

    Joined:
    Apr 5, 2017
    Posts:
    77
    I haven't tried this, but theoretically you could create a RenderTexture in C# and pass that into your native plugin, then use UnityXRRenderTextureFormat::kUnityXRRenderTextureFormatReference in your display plugin when telling Unity what to render into.
     
  21. Tuncle

    Tuncle

    Joined:
    Oct 1, 2018
    Posts:
    23
    I am trying to implement my own XR display provider. If I enable single pass in native code (textureArrayLength = 2, one render pass with two render params), the following error appears every frame and nothing gets rendered:

    07-26 12:41:00.968 16381 16444 E Unity : OPENGL NATIVE PLUG-IN ERROR: GL_INVALID_OPERATION: Operation illegal in current state

    I have checked the value "frameHints->appSetup.singlePassRendering"; it returns true every frame.

    However, if I enable the deprecated XR Settings and set the stereo rendering mode to "Single Pass" as shown below, everything works fine (with the XR plug-in enabled at the same time):
    [attached screenshot: deprecated XR Settings with Stereo Rendering Mode set to Single Pass]

    So, where is the stereo rendering mode setting in XR Plug-in Management?
     
  22. Chootin

    Chootin

    Joined:
    Jan 22, 2017
    Posts:
    4
    Cheers for the advice!

    I was not aware of UnityXRRenderTextureFormat::kUnityXRRenderTextureFormatReference; however, I have so far been unable to find a reference on how to take a RenderTexture from C# to a UnityXRRenderTextureId in the native code. My attempts so far have ended up giving me a "Color reference texture (id: xxxxx) not found" message or a MarshalDirectiveException.
     
  23. holo-krzysztof

    holo-krzysztof

    Joined:
    Apr 5, 2017
    Posts:
    77
    Ok, maybe the reference thing is only useful for textures created on the C++ side.
    I've found one more thing, would this help you?
    https://docs.unity3d.com/ScriptReference/XR.XRDisplaySubsystem.GetRenderTextureForRenderPass.html

    You can get a RenderTexture in C# that way if you know which render pass you're in.
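    A minimal sketch of that approach (the subsystem lookup call varies a little by Unity version; SubsystemManager.GetInstances is used here, and the pass index 0 and field names are just for illustration):

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR;

    // Sketch only: copies the XR display's render target for render pass 0 into a
    // RenderTexture that ordinary C# code can read from.
    public class CopyXRRenderTexture : MonoBehaviour
    {
        public RenderTexture destination; // assign or create a matching RenderTexture

        void LateUpdate()
        {
            var displays = new List<XRDisplaySubsystem>();
            SubsystemManager.GetInstances(displays);
            if (displays.Count == 0 || !displays[0].running)
                return;

            XRDisplaySubsystem display = displays[0];
            if (display.GetRenderPassCount() == 0)
                return;

            RenderTexture xrTarget = display.GetRenderTextureForRenderPass(0);
            if (xrTarget != null && destination != null)
                Graphics.Blit(xrTarget, destination); // a texture array (single-pass) may need per-slice handling
        }
    }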
     
  24. RancherosDigital

    RancherosDigital

    Joined:
    Sep 19, 2022
    Posts:
    11
    I am having a problem with WaveSDK / OpenXR for HTC Focus 3

    The app doesn't even run, and I am getting this error:
    Code (CSharp):
    2022/09/19 18:18:11.405 4746 4746 Error AndroidRuntime Caused by: java.lang.ClassNotFoundException: Didn't find class "com.htc.vr.unity.WVRUnityVRActivity" on path: DexPathList[[zip file "/data/app/com.rancherosdigital.awesomesauce-cuXwoPNgBT9y2xpLIk2u6g==/base.apk"],nativeLibraryDirectories=[/data/app/com.rancherosdigital.awesomesauce-cuXwoPNgBT9y2xpLIk2u6g==/lib/arm64, /data/app/com.rancherosdigital.awesomesauce-cuXwoPNgBT9y2xpLIk2u6g==/base.apk!/lib/arm64-v8a, /system/lib64, /product/lib64]]
     
  25. RancherosDigital

    RancherosDigital

    Joined:
    Sep 19, 2022
    Posts:
    11
    Ok, fixed this. Ran into the next problem:

    I'm porting a VR app from the Oculus Quest to the Vive Focus 3. Code-wise there are no problems: the app runs, the camera is tracked, controller actions are recognized, all fine.

    But there are random crashes where the camera view just freezes and stays fixed to the user's head.

    It doesn't matter if I use the Wave SDK or Open XR.

    There is no error log at all when this happens.

    Any ideas?
     
  26. thep3000

    thep3000

    Unity Technologies

    Joined:
    Aug 9, 2013
    Posts:
    400

    Some things to try (from hardest to easiest, I guess :) ):
    * Attach a debugger (Android Studio in native mode) and pause execution when it's frozen to see what the call stack is.
    * You can try something similar with the managed debugger if it's stuck in C# code.
    * With OpenXR, there's a runtime debugger which connects over the player connection, similar to the Unity Profiler. Getting a dump from that could tell us if it's stuck in the vendor's runtime.
    * Maybe log from a MonoBehaviour Update to see whether we're still pumping frames and the compositor is frozen, or whether the app itself is frozen (see the sketch below).
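    For the last point, a minimal heartbeat sketch along those lines (the class name is just for illustration); if these logs keep appearing in logcat while the view is frozen, the app is still pumping frames and the problem is more likely in the compositor/runtime:

    Code (CSharp):
    using UnityEngine;

    // Sketch only: logs roughly once per second from Update() so logcat shows
    // whether the application is still running frames during a freeze.
    public class FrameHeartbeat : MonoBehaviour
    {
        float nextLogTime;

        void Update()
        {
            if (Time.unscaledTime < nextLogTime)
                return;

            nextLogTime = Time.unscaledTime + 1f;
            Debug.Log($"[Heartbeat] frame {Time.frameCount} at t={Time.unscaledTime:F1}s");
        }
    }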
     
  27. zhutou

    zhutou

    Joined:
    Sep 16, 2015
    Posts:
    1
    MockHMD does not support stereoSeparation or stereoConvergence.
     
  28. cecarlsen

    cecarlsen

    Joined:
    Jun 30, 2006
    Posts:
    862