
Question: How to achieve cross-platform usability with OpenXR?

Discussion in 'XR Interaction Toolkit and Input' started by TzuriTeshuba, Jan 7, 2023.

  1. TzuriTeshuba

    Joined: Aug 6, 2019
    Posts: 185
    May sound like a silly question, since that is the point of OpenXR, but here is what confuses me. I can't just try it and see, as I don't have access to HMDs other than the Quest 2 at the moment. In the OpenXR tab of XR Plug-in Management, I can add different interaction profiles for different devices. Do I need to add all the devices I intend to target, or just the device I am testing on in the editor? From what I've understood from OpenXR talks, the idea is that an application using OpenXR should even work on future devices. That would mean I don't need to explicitly list them anywhere during development.

    For context, my game does not use any special capabilities beyond standard VR input (HMD + controller position/rotation, grip, trigger).

    Of course, everything should be tested on the devices I plan to target, but I'd like to know whether my configuration is theoretically correct in the meantime.

    [Attached screenshot: interaction profiles in the OpenXR tab of XR Plug-in Management]
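    Not from the thread, but a minimal sketch of what that standard input can look like in code, assuming the Input System package is used with OpenXR as the active loader; the binding paths below use the generic XR layouts and usages rather than any single interaction profile, and exact control names can vary by plugin version:

    Code (CSharp):

        using UnityEngine;
        using UnityEngine.InputSystem;

        // Reads HMD/controller pose, grip, and trigger through device-agnostic
        // Input System bindings; whichever OpenXR interaction profile is active
        // at runtime supplies the actual controls.
        public class BasicXRInput : MonoBehaviour
        {
            InputAction headPosition;
            InputAction handPosition;
            InputAction handRotation;
            InputAction grip;
            InputAction trigger;

            void OnEnable()
            {
                // Generic paths, not tied to a specific controller profile.
                headPosition = new InputAction(binding: "<XRHMD>/centerEyePosition");
                handPosition = new InputAction(binding: "<XRController>{RightHand}/devicePosition");
                handRotation = new InputAction(binding: "<XRController>{RightHand}/deviceRotation");
                grip         = new InputAction(binding: "<XRController>{RightHand}/{Grip}");
                trigger      = new InputAction(binding: "<XRController>{RightHand}/{Trigger}");

                headPosition.Enable();
                handPosition.Enable();
                handRotation.Enable();
                grip.Enable();
                trigger.Enable();
            }

            void Update()
            {
                Vector3 headPos     = headPosition.ReadValue<Vector3>();
                Vector3 handPos     = handPosition.ReadValue<Vector3>();
                Quaternion handRot  = handRotation.ReadValue<Quaternion>();
                float gripAmount    = grip.ReadValue<float>();
                float triggerAmount = trigger.ReadValue<float>();

                // Example use: drive this object from the tracked hand pose.
                transform.SetPositionAndRotation(handPos, handRot);
                // headPos, gripAmount and triggerAmount would feed the game's own logic.
            }

            void OnDisable()
            {
                headPosition.Disable();
                handPosition.Disable();
                handRotation.Disable();
                grip.Disable();
                trigger.Disable();
            }
        }

    Code like this does not change per device; only the list of enabled interaction profiles does.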
     
  2. DevDunk

    Joined: Feb 13, 2020
    Posts: 5,060
    List all the controllers you want to target. OpenXR means that all supported hardware will get the HMD and controllers tracked. Because input is handled differently from device to device, you just need to add each controller's interaction profile, and if you use OpenXR input bindings the controllers will mostly 'just work'.
    It's more that you don't need to change your project, just add support for each device.
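    As a hedged sketch of how to sanity-check that, the OpenXR Plugin exposes its settings at runtime, so you can log which interaction-profile features are enabled in a given build (class and method names are from the OpenXR Plugin package and may differ slightly between versions):

    Code (CSharp):

        using UnityEngine;
        using UnityEngine.XR.OpenXR;
        using UnityEngine.XR.OpenXR.Features;
        using UnityEngine.XR.OpenXR.Features.Interactions;

        // Logs every interaction-profile feature known to the OpenXR settings,
        // so you can confirm the controllers you target are actually enabled.
        public class LogInteractionProfiles : MonoBehaviour
        {
            void Start()
            {
                // NOTE: API names per the OpenXR Plugin package; verify against
                // the version installed in your project.
                var settings = OpenXRSettings.Instance;
                if (settings == null)
                {
                    Debug.LogWarning("OpenXR settings not found; is the OpenXR loader active?");
                    return;
                }

                foreach (var profile in settings.GetFeatures<OpenXRInteractionFeature>())
                {
                    Debug.Log($"Interaction profile '{profile.name}' enabled: {profile.enabled}");
                }
            }
        }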
     
    TzuriTeshuba likes this.
  3. TzuriTeshuba

    Joined: Aug 6, 2019
    Posts: 185
    Thanks, appreciate it. Cleared up a lot for me!
     
    DevDunk likes this.