
Unity XR Input: Possible to simulate input events via a fake InputDevice?

Discussion in 'AR/VR (XR) Discussion' started by trzy, Jun 5, 2020.

  1. trzy

    trzy

    Joined:
    Jul 2, 2016
    Posts:
    99
    Hi,

    I'd like to be able to simulate 6DoF-tracked controller input in the Unity Editor while in play mode. I'm not aware of any headset emulation solutions, so I'm developing my own (simple mouse-look for head motion plus WASD movement controls). To simulate the tracked controllers, I use the mouse and a script attached to the controller objects.

    However, simulating button presses (e.g., grip, trigger, etc.) would, I think, require writing an InputDevice that the system could discover. Is there a way to do this? Or is there an alternative way I could go about building a simulation environment?

    Thank you,

    Bart
     
  2. JoeStrout

    JoeStrout

    Joined:
    Jan 14, 2011
    Posts:
    9,401
    Wrap the InputDevice queries in your own little module. Make the rest of your code go through that module rather than accessing InputDevice directly. Now you can have that module work by other means (standard Input) when running on desktop.
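    JoeStrout's suggestion can be sketched roughly like this (a hypothetical module; "SimInput" and the key bindings are illustrative, not any Unity API -- only TryGetFeatureValue and CommonUsages come from UnityEngine.XR):

```csharp
// Hypothetical wrapper module: gameplay code asks SimInput for button state
// instead of querying InputDevice directly, so a desktop/editor fallback can
// be swapped in without touching callers.
using UnityEngine;
using UnityEngine.XR;

public static class SimInput
{
    // In the editor, fall back to mouse/keyboard instead of tracked hardware.
    public static bool simulate = Application.isEditor;

    public static bool GetGrip(InputDevice device)
    {
        if (simulate)
            return Input.GetKey(KeyCode.G);  // arbitrary editor binding

        bool pressed;
        device.TryGetFeatureValue(CommonUsages.gripButton, out pressed);
        return pressed;
    }

    public static bool GetTrigger(InputDevice device)
    {
        if (simulate)
            return Input.GetMouseButton(1);  // arbitrary editor binding

        bool pressed;
        device.TryGetFeatureValue(CommonUsages.triggerButton, out pressed);
        return pressed;
    }
}
```

    Gameplay code then calls SimInput.GetGrip(device) everywhere, and the editor path can be changed in one place.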
     
    Iron-Warrior likes this.
  3. trzy

    trzy

    Joined:
    Jul 2, 2016
    Posts:
    99
    Unfortunately I'm using the Unity XR Interaction Toolkit, which provides various scripts that use InputDevice. These should not be modified. I'm wondering if there is a way to extend InputDevice and then inject it into the input system. If not, this seems like a feature Unity should support.
     
  4. JoeStrout

    JoeStrout

    Joined:
    Jan 14, 2011
    Posts:
    9,401
    Oh. Well yeah, can't help you in that case. :)
     
  5. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,215
    Seems like a good idea to me.

    Unity staff on this forum have said that the interfaces for making custom VR plugins are open to anyone, for free - http://snapandplug.com/xr-input-too...Plugin-make-a-new-plugin-for-custom-hardware? - so I'd start there: opt in to the SDK for making a custom plugin and see if you can easily make a software-only InputDevice.

    (and if not ... I'd log bugs against the SDK stuff, because IMHO this use-case really should be supported, and it would be a benefit for Unity + everyone to have reference implementations / software mocks ... but I suspect that this stuff might already be something they provide if you sign up for that, because it's so useful)
     
    trzy likes this.
  6. trzy

    trzy

    Joined:
    Jul 2, 2016
    Posts:
    99
    I really hope Unity adds a way to create mock inputs for custom HMD simulators! There should be a public interface for this.

    For now, I found a kludge to accomplish what I want: reflection. My controller simulator script allows the controller to be moved around the screen using the mouse (with the left button held) and forwards/backwards along the z axis using the scroll wheel, and it can feed Unity XR interaction events into Unity's XRController script (which must also be present on the object) by poking at its internals.

    Specifically, there is an internal method named UpdateInteractionType() that is apparently used for input playback. Obviously, this implementation detail can change at any time, but it will probably be easier to maintain this approach than to edit the package scripts.

    It's not a great solution but it suffices for now.

    Code (csharp):

    using System;
    using System.Collections.Generic;
    using System.Reflection;
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    public class EditorControllerSimulator : MonoBehaviour
    {
    #if UNITY_EDITOR
      public float controllerDefaultDistance = 1;
      public float scrollWheelToDistance = 0.1f;
      public KeyCode selectKey = KeyCode.Mouse1;
      public KeyCode activateKey = KeyCode.KeypadEnter;

      private XRController m_xrController;
      private float m_distance = 0;

      // Finds a (possibly non-public) nested type by name on the given object's type
      private Type GetNestedType(object obj, string typeName)
      {
        foreach (var type in obj.GetType().GetNestedTypes(BindingFlags.NonPublic | BindingFlags.Public))
        {
          if (type.Name == typeName)
          {
            return type;
          }
        }
        return null;
      }

      // Maps enum value names to their boxed values
      private Dictionary<string, object> GetEnumValues(Type enumType)
      {
        Debug.Assert(enumType.IsEnum);
        Dictionary<string, object> enumValues = new Dictionary<string, object>();
        foreach (object value in Enum.GetValues(enumType))
        {
          enumValues[Enum.GetName(enumType, value)] = value;
        }
        return enumValues;
      }

      // Invokes XRController's internal UpdateInteractionType() via reflection
      // to inject a fake interaction state
      private void UpdateXRControllerState(string interaction, KeyCode inputKey)
      {
        bool state = Input.GetKey(inputKey);
        Type interactionTypes = GetNestedType(m_xrController, "InteractionTypes");
        Dictionary<string, object> interactionTypesEnum = GetEnumValues(interactionTypes);
        MethodInfo updateInteractionType = m_xrController.GetType().GetMethod("UpdateInteractionType", BindingFlags.NonPublic | BindingFlags.Instance);
        updateInteractionType.Invoke(m_xrController, new object[] { interactionTypesEnum[interaction], (object)state });
      }

      private void LateUpdate()
      {
        float scroll = Input.mouseScrollDelta.y;
        if (Input.GetMouseButton(0) || scroll != 0)
        {
          // Scroll wheel controls depth
          m_distance += scroll * scrollWheelToDistance;
          float depthOffset = controllerDefaultDistance + m_distance;

          // Mouse position sets position in XY plane at current depth
          Vector3 screenPos = Input.mousePosition;
          Ray ray = Camera.main.ScreenPointToRay(screenPos);
          Vector3 position = ray.origin + ray.direction * depthOffset;
          transform.position = position;
        }

        // Interaction states
        UpdateXRControllerState("select", selectKey);
        UpdateXRControllerState("activate", activateKey);
      }

      private void Awake()
      {
        m_xrController = GetComponent<XRController>();
      }
    #endif
    }
     
    Last edited: Jun 6, 2020
    P_Jong, InnoactiveDavid and hessex like this.
  7. trzy

    trzy

    Joined:
    Jul 2, 2016
    Posts:
    99
    This is a good idea, although sadly beyond the scope of what I have time for. However, filing bugs against the SDK might be more feasible.
     
  8. tobermanar

    tobermanar

    Joined:
    Aug 17, 2016
    Posts:
    8
    Hi guys,

    I'm still stuck on the same problem. For UI events, it's quite a big deal not to be able to fake inputs in the editor. If anybody manages to get this working (thanks trzy for your code, but I haven't managed to make it work yet), I'd be happy to have a workaround.

    Since I haven't seen the information here: Unity is working on a "simulated HMD" for the editor on the short-term roadmap. I'm quite sure I have seen this somewhere but I can no longer find the link. I will edit this when (if) I find it.

    Have you guys looked into the Controller Recorder? It seems to be another possible way.
     
  9. a436t4ataf

    a436t4ataf

    Joined:
    May 19, 2013
    Posts:
    1,215
  10. trzy

    trzy

    Joined:
    Jul 2, 2016
    Posts:
    99
    I don't know about UI Events -- I've actually never used Unity's UI systems -- but I can share my simulation code for the XR Interaction Toolkit. It's quite unpolished, and it works by peering into the guts of XRController and invoking internal methods directly. This is obviously brittle; if Unity changes the implementation of XRController substantially, it will have to be adjusted accordingly.

    The Unity interaction model abstracts away buttons in favor of high-level actions (e.g., "select" and "activate"). I let you map these to keyboard keys or mouse buttons. But since I also need access to the lower-level "trigger" and "grip" buttons, I have a small abstraction layer around those. It is up to you to ensure that "select"/"activate" are consistent with "trigger"/"grip" in the EditorControllerSimulator's properties.

    I've attached the scripts and below is a screenshot demonstrating how to wire them up. It's a little funky for now.

    Wiring Up the Scripts

    1. Make sure an XRController is present on both controllers.

    2. Include only *one* EditorControllerSimulator script, on one of the controllers, *not both*. Why? Because I bind a key (the "Switch Controller Key" property) that lets you toggle between controllers. The script automatically finds the next controller and, at startup, defaults to the controller it is attached to. It would probably be better to modify the script so that it lives on its own object outside the VR rig and, at startup, either searches for an initial controller to grab or exposes a property you can set.

    3. Add ControllerInput to *both* controllers. I use this to get the button state; it's just a thin wrapper over Unity's functions. EditorControllerSimulator also uses it to inject fake button presses when playing from the editor (there is no way that I'm aware of to route simulated inputs through Unity's API).

    Clipboard01.jpg

    That's it!

    Usage Instructions

    Hit 'play' and then hold the left mouse button to move the controller. You should see it respond. Use the scroll wheel to move it along the z axis. Press the right mouse button (Mouse 1) to simulate a "select". Press Enter to simulate an "activate". Likewise, these are mapped to simulate grip and trigger, respectively.

    Press the back quote (tilde) key to switch control to the other controller.
     

    Attached Files:

    dnnkeeper, Vued, hessex and 1 other person like this.
  11. tobermanar

    tobermanar

    Joined:
    Aug 17, 2016
    Posts:
    8
    @trzy Thanks a lot! That's a huge help.
    I just needed to add the UI press reference in the LateUpdate function (and the corresponding key definition) and it's working like a charm!

    Thanks to you I can avoid going too deep into the preview package's code and losing a couple of hours/days to brain overload. Here is a virtual cookie and a lot of sympathy =)

    // Class beginning
    public KeyCode activateUI = KeyCode.E;

    // LateUpdate
    UpdateXRControllerState("uiPress", activateUI);
     
  12. Polff

    Polff

    Joined:
    May 18, 2017
    Posts:
    27
    I actually released a VR Simulator for the XR Interaction Toolkit to the Asset Store in April, but it works in pretty much the same way, by updating the interaction type. So there's no need to get it if you got this solution working. ;) I'm planning on adding a couple more features though.
     
  13. Vued

    Vued

    Joined:
    Aug 2, 2020
    Posts:
    3
  14. jvetulani

    jvetulani

    Joined:
    Dec 20, 2016
    Posts:
    43
    Hi!

    Does @trzy's solution work for everyone with canvases? I can see them react to the ray passing over the buttons and activating the hover, but pressing the grips or triggers won't actually click them.
     
  15. marwi

    marwi

    Joined:
    Aug 13, 2014
    Posts:
    122
    I think I might be able to provide a solution for simulating devices soon. I've started building an XRInputSubsystem for ARSimulation that allows creating devices with usages on the managed side. If anyone is interested in testing, feel free to send me a DM (as of now I've only tested/built the plugin for Windows).

    Here is an example of what the API looks like right now if you want to create custom devices (a controller with one trigger, in this case):

    upload_2020-10-6_16-52-48.png


    Devices injected that way are discovered via InputDevices.GetDevices(devices);
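    For reference, enumerating whatever devices are currently registered (including ones injected by a plugin like this) uses the standard UnityEngine.XR API; a minimal sketch:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class ListXRDevices : MonoBehaviour
{
    void Start()
    {
        // Injected devices should appear here alongside any real hardware.
        var devices = new List<InputDevice>();
        InputDevices.GetDevices(devices);
        foreach (var device in devices)
            Debug.Log(device.name + " (" + device.characteristics + ")");
    }
}
```

    (InputDevice.characteristics assumes Unity 2019.3 or later; older versions expose a device "role" instead.)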

    So, for example, this component just works as if you had an HMD connected, except it's just another transform in the editor:
    upload_2020-10-6_16-53-21.png

    So far I tested creating controllers and headsets that way:
    upload_2020-10-6_17-0-53.png

    upload_2020-10-6_17-1-3.png
     
    Last edited: Oct 7, 2020
  16. marwi

    marwi

    Joined:
    Aug 13, 2014
    Posts:
    122
    Last edited: Oct 6, 2020
  17. Polff

    Polff

    Joined:
    May 18, 2017
    Posts:
    27
    Oh, so it is possible to create custom devices on the managed side? Interesting ... I tried that a while ago but ran into several issues.

    I am currently using Unity's XR SDK to create a native plugin for a custom XR provider. It's working quite well; you even get input signals via the Legacy Input Helpers and generic XR input bindings with the new Input System. But in the end it's basically doing the same thing: routing inputs to a custom XR device and controlling it. I'll soon transition my VR Simulator asset to this new system, so it's not as tied to the XR Interaction Toolkit anymore. I'll probably also release a stripped-down free version of my asset in the process.
    If anyone is interested in doing something similar, I can give some directions. :)
     
  18. marwi

    marwi

    Joined:
    Aug 13, 2014
    Posts:
    122
    It is (or will be) with my plugin (I also built an XRPluginSubsystem). What kind of issues did you have?

    I built it in a way that lets me create and control devices on the managed side, so I don't have to touch unmanaged code again if I need another type of device (at least, that's the goal/idea).
     
  19. Polff

    Polff

    Joined:
    May 18, 2017
    Posts:
    27
    Ah I see, so your plugin exposes the methods for creating devices with features to the managed side. That's a great way of doing things ... building those native plugins every time you update is a pain.
    Well, by "issues" I mean I managed to create devices on the pure unmanaged side but wasn't able to manipulate them in any useful way.
     
  20. marwi

    marwi

    Joined:
    Aug 13, 2014
    Posts:
    122
    Yes exactly!
    Let me know if you want to test the plugin :)
     
  21. Phanou

    Phanou

    Joined:
    Jan 4, 2017
    Posts:
    5
    Hello
    Since the last version of the XR Interaction Toolkit (v1.0.0), your script seems broken.
    m_xrController.GetType().GetNestedTypes returns nothing.
    Any ideas?


    private Type GetNestedType(object obj, string typeName)
    {
      foreach (var type in m_xrController.GetType().GetNestedTypes(BindingFlags.NonPublic | BindingFlags.Public))
      {
        if (type.Name == typeName)
        {
          return type;
        }
      }
      return null;
    }
     
    P_Jong likes this.
  22. Matt_D_work

    Matt_D_work

    Unity Technologies

    Joined:
    Nov 30, 2016
    Posts:
    189
    Yes, if you're using the Input System package. We actually use this approach to take kb/mouse data and drive VR Controllers / HMD for our simulator in the latest version of XRI / XRI Samples.
     
  23. Phanou

    Phanou

    Joined:
    Jan 4, 2017
    Posts:
    5
    Finally, everything works fine with XR Interaction Toolkit v1.0 without this script.
    With the new Input System, I just replaced some XR InputActions with Mouse or Keyboard InputActions.

    upload_2021-1-4_17-49-51.png
     