
Haptic feedback on different devices

Discussion in 'Input System' started by PixelLifetime, Sep 25, 2019.

  1. PixelLifetime

    PixelLifetime

    Joined:
    Mar 30, 2017
    Posts:
    90
What is the correct way of handling haptics on different devices? Personally, I am interested in `Gamepad`. From the link that I got from @jonas-echterhoff - https://docs.unity3d.com/Packages/c...#UnityEngine_InputSystem_Gamepad_PauseHaptics

    But I have no idea how to get the specific device.
    I have got it working with this code:

    Code (CSharp):
    private IEnumerator Start()
    {
        Gamepad gamepad = InputSystem.GetDevice<Gamepad>();

        gamepad.SetMotorSpeeds(0.6f, 0.8f);

        yield return new WaitForSecondsRealtime(3f);

        Debug.Log("Haptics Turn Off");
        gamepad.PauseHaptics();

        yield return new WaitForSecondsRealtime(2f);

        gamepad.ResumeHaptics();

        yield return new WaitForSecondsRealtime(3f);

        Debug.Log("Haptics Turn Off");
        gamepad.PauseHaptics();
    }
    Is this a correct way of handling it? Or is there some kind of higher-level API? (No problem creating one ourselves; I just don't want to be silly and reinvent the wheel.)
    Is it better to `SetMotorSpeeds(0f, 0f)` or `PauseHaptics`?
    Does `PauseHaptics` do the same thing under the hood (leaving aside the caching of previous values)?

    From
    https://docs.unity3d.com/Packages/c...ystem_InputSystem_GetDeviceById_System_Int32_

    `GetDeviceById` is useful for getting specific devices. My devices are automatically connected to input in the Editor UI, so out of the box I don't control which devices are used by `Actions`. Thus I also don't have references to those devices stored somewhere to tell which one is which. I can get devices from `InputSystem.devices` (`public static ReadOnlyArray<InputDevice> devices { get; }`) and query it by type to get the devices I need. But after I do that, how do I know which one controls player one, for example? Also, what is the point of `GetDeviceById` if I can get a reference from this query and cache it?
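    For reference, the querying-by-type idea above can be sketched roughly like this. This is only one possible approach, not an official pattern; the `GamepadRegistry` class name is made up, and only `InputSystem.devices` and `InputSystem.onDeviceChange` are actual Input System APIs:

    ```csharp
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.InputSystem;

    // Sketch: collect all connected gamepads by filtering
    // InputSystem.devices by type, and cache the references yourself.
    public class GamepadRegistry : MonoBehaviour
    {
        private readonly List<Gamepad> _gamepads = new List<Gamepad>();

        private void OnEnable()
        {
            foreach (InputDevice device in InputSystem.devices)
            {
                if (device is Gamepad gamepad)
                    _gamepads.Add(gamepad);
            }

            // Keep the cache up to date as devices come and go.
            InputSystem.onDeviceChange += OnDeviceChange;
        }

        private void OnDisable() => InputSystem.onDeviceChange -= OnDeviceChange;

        private void OnDeviceChange(InputDevice device, InputDeviceChange change)
        {
            if (device is Gamepad gamepad)
            {
                if (change == InputDeviceChange.Added)
                    _gamepads.Add(gamepad);
                else if (change == InputDeviceChange.Removed)
                    _gamepads.Remove(gamepad);
            }
        }
    }
    ```

    Note that this still doesn't answer which gamepad belongs to which player; it only gives you stable references to query or pair up yourself.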

    Sorry for so many questions; they might have been asked before. If you could answer them or point me to threads that do, and assuming the API won't change much, this would be a great source of information for YouTubers creating video tutorials, to ease the confusion and repeated questions.
     
  2. jonas-echterhoff

    jonas-echterhoff

    Unity Technologies

    Joined:
    Aug 18, 2005
    Posts:
    1,666
    The main difference with using `PauseHaptics` is that you can call `ResumeHaptics` to restore the previous values.

    To get the device used by a player - it depends. Are you using PlayerInput?
     
  3. PixelLifetime

    PixelLifetime

    Joined:
    Mar 30, 2017
    Posts:
    90
    But can I also call `SetMotorSpeeds(0f, 0f)` and have it stop the motors from working, or does it not matter because they are always in a running state, just at 0 speed? I mean, would it turn them off completely, or would they just run at 0 speed? Or do they run at 0 by default whenever another speed isn't set? (This is not that important, I am just curious whether it would consume power or not; if it depends on the device, then there is no need to answer this.)

    I am using `InputActionReference`s which I drag from an Input Actions asset (Input Action Importer). I think I have more control over input if I do it this way - is my assumption correct? PlayerInput seems to be responsible for only a few actions. Also, it's very convenient to use events and asset references this way for designing.
     
    Last edited: Sep 25, 2019
  4. jonas-echterhoff

    jonas-echterhoff

    Unity Technologies

    Joined:
    Aug 18, 2005
    Posts:
    1,666
    `SetMotorSpeeds(0f, 0f)` should turn off the left and right motors, and they will not use power. However, it is possible for a device to implement more haptics. For instance, the Xbox One gamepad has four motors which can be controlled separately (which we only support on the Xbox One console). `PauseHaptics` would disable everything in such a case, while `SetMotorSpeeds(0f, 0f)` would only disable the left and right motors. For gamepads currently supported on desktop PCs we don't support more than the left and right motors, so the effect would be identical.

    So, if you are using InputActions, you can get the currently active device from the `CallbackContext` received with your callbacks.
     
  5. PixelLifetime

    PixelLifetime

    Joined:
    Mar 30, 2017
    Posts:
    90
    Thanks, `callbackContext.control.device` seems to return what is expected.

    From the looks of it we can't do more complex stuff in the Editor. I was trying to find a way to have events called for a specific device on a specific action (this might be hard to implement natively, and not worth the complexity). So I generated a script which has quite a few explanations. I will have a look at the source code of `PlayerInput` and how it handles multiple devices when a new one is added. Do you have any other references or examples I could look up?

    My goal is to assign haptic feedback for supported devices, even when there are multiple devices. But I want to do it event-based, for every specific `Action` called by a specific device, instead of having `if (this is gamepad) gamepad.shake`, `if (this is mobile) mobile.shake`... That would transform a simple action-binding callback into a switch. Of course, I could have it all stored in a dictionary and, for each device name/type/deviceId, call the appropriate feedback - some kind of interface between the haptics and the device in use.
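    That dictionary-plus-interface idea could look roughly like this. All of these names (`IHapticFeedback`, `GamepadHaptics`, `HapticRouter`) are hypothetical; only `InputAction.CallbackContext`, `InputDevice`, and `Gamepad.SetMotorSpeeds` are real Input System APIs:

    ```csharp
    using System;
    using System.Collections.Generic;
    using UnityEngine.InputSystem;

    // Hypothetical interface: one implementation per device family.
    public interface IHapticFeedback
    {
        void Shake();
    }

    public class GamepadHaptics : IHapticFeedback
    {
        private readonly Gamepad _gamepad;
        public GamepadHaptics(Gamepad gamepad) => _gamepad = gamepad;
        public void Shake() => _gamepad.SetMotorSpeeds(0.5f, 0.5f);
    }

    public class HapticRouter
    {
        // Map a device type to a factory for its feedback handler, so the
        // action callback never needs a switch on device type.
        private readonly Dictionary<Type, Func<InputDevice, IHapticFeedback>> _factories =
            new Dictionary<Type, Func<InputDevice, IHapticFeedback>>
            {
                { typeof(Gamepad), device => new GamepadHaptics((Gamepad)device) },
            };

        // Subscribe this to action.performed.
        public void OnActionPerformed(InputAction.CallbackContext context)
        {
            InputDevice device = context.control.device;

            // Walk up the inheritance chain so derived device types
            // (e.g. platform-specific gamepad subclasses) still match.
            for (Type type = device.GetType(); type != null; type = type.BaseType)
            {
                if (_factories.TryGetValue(type, out var factory))
                {
                    factory(device).Shake();
                    return;
                }
            }
        }
    }
    ```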

    In the end, I want to create an extension, something like `device.TriggerHaptic(modes, settings...)`. The only concern I have is whether it's going to be supported out of the box later on, because it would be a waste to work on it for quite some time only for it to end up in the system anyway. Is this kind of behaviour on the roadmap for the Input System?
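    A minimal sketch of what such an extension could look like, assuming the real `IDualMotorRumble` interface (the `TriggerHaptic` name and `HapticSettings` struct are made up for illustration):

    ```csharp
    using UnityEngine.InputSystem;
    using UnityEngine.InputSystem.Haptics;

    // Hypothetical settings bag; could grow modes, curves, durations, etc.
    public struct HapticSettings
    {
        public float LowFrequency;
        public float HighFrequency;
    }

    public static class HapticExtensions
    {
        // Works for any device implementing IDualMotorRumble (e.g. Gamepad);
        // silently does nothing for devices without rumble support.
        public static void TriggerHaptic(this InputDevice device, HapticSettings settings)
        {
            if (device is IDualMotorRumble rumble)
                rumble.SetMotorSpeeds(settings.LowFrequency, settings.HighFrequency);
        }
    }
    ```

    Checking for the haptics interface rather than for concrete device types keeps the extension open to any current or future device that supports rumble.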

    Thanks for the answers! So far, they have helped me a lot.
     
  6. PixelLifetime

    PixelLifetime

    Joined:
    Mar 30, 2017
    Posts:
    90
    I see that most of the stuff is already documented. I was looking at the docs a few months ago and they didn't have much (they stated the documentation was 10% complete), so it's awesome that I can now look most of this up myself. Do you also have a hint of where I could see upcoming/roadmap features? Is there a page for that?
     
  7. jonas-echterhoff

    jonas-echterhoff

    Unity Technologies

    Joined:
    Aug 18, 2005
    Posts:
    1,666
    There is no public roadmap atm, because we have to figure out how to allocate team resources post 1.0. But roughly, the highest priorities for now are:

    - Get 1.0 into the 2020.1 verified packages set
    - Build a path to deprecate the old system
    - Figure out how to deal with input in the DOTS world.
     
  8. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Should an Oculus Quest developer rely on the new Input System? Haptics are vital for us.
     
  9. Rene-Damm

    Rene-Damm

    Joined:
    Sep 15, 2012
    Posts:
    1,779
    Quick general comment re: haptics. What we have ATM is what I'd consider the barest basics, in that there's at least a way to produce haptic effects.

    We are in the process of thinking about much higher-level support for haptics that would solve things such as cross-device consistent playback (two different devices yielding roughly equivalent playback of the same haptic effect) as well as easier authoring. However, the key word here is "thinking". We're really no further than that, and all that's certain at this point is that there will not be a big jump in the short term.

    So for now, yes, it's down to writing custom code to control haptic effects over time.

    Sidenote on the current setup: the haptics support we have so far generally relies on interfaces that are meant to shield you from having to write code paths for specific device types. Example:

    Code (CSharp):
    myAction.performed +=
        context =>
        {
            var device = context.control.device;
            if (device is IDualMotorRumble rumble)
                rumble.SetMotorSpeeds(0.1f, 0.2f);
        };
    @StayTalm_Unity ^^
     