
Normalization of input device values?

Discussion in 'Input System' started by jwvanderbeck, Nov 9, 2019.

  1. jwvanderbeck

    jwvanderbeck

    Joined:
    Dec 4, 2014
    Posts:
    825
    One of the goals of a good input system is to abstract the actual input device. My code shouldn't care if the action value it is getting comes from a mouse, or a game pad, or a keyboard, or even some custom thingy. It just receives a value and I act upon it.

    The problem, however, is what to do when the input values are very different from one device to another. For example, I am trying to implement zooming on a map, and I currently have a Vector1 action bound to the mouse wheel, plus a composite Vector1 using two keys on the keyboard. However, the min/max values from the mouse are FAR FAR higher than those of the keyboard, which means the only way I can handle this is to treat each device individually as a special case - which in my mind defeats the whole point.

    Now I've laid out the problem, but I don't necessarily know the solution from the Input System side. What I was thinking, though I don't know if it's doable, is that the Input System could somehow know what the maximum values are for the individual devices it works with and then normalize the values from the different devices that are all mapped to the same action.
     
  2. Twin-Stick

    Twin-Stick

    Joined:
    Mar 9, 2016
    Posts:
    111
    I just tackled something very similar for my project. I needed the ability to move a cursor on the screen for both gamepad/console and mouse/PC - and I wanted to keep the control code completely unaware of what was sending input to it.

    So in my very specific case, I had an input controller class which sends the parsed input to the object controllers. I did a control scheme check: if it was PC, I sent the mouse input; if it was console, I sent the last screen pos + delta (direction of movement from the stick) * whatever sensitivity value felt nice.

    So similar to your query, I wanted to have a controller class completely unaware of the input device yet behave exactly the same regardless.
    And for my project, it works perfectly.
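
    For anyone curious, the shape of it is roughly this (class names, field names, and the scheme string are made up for illustration, not my actual code):

    ```csharp
    using UnityEngine;
    using UnityEngine.InputSystem;

    // Illustrative sketch only -- names here are invented for the example.
    public class CursorInputController : MonoBehaviour
    {
        public PlayerInput playerInput;       // the PlayerInput on this player
        public float stickSensitivity = 600f; // pixels/second, tuned by feel

        private Vector2 lastScreenPos;

        // Object controllers only ever see a Vector2; they never know
        // which device produced it.
        public Vector2 GetCursorPosition()
        {
            if (playerInput.currentControlScheme == "KeyboardMouse")
            {
                // PC: forward the mouse position as-is.
                lastScreenPos = Mouse.current.position.ReadValue();
            }
            else
            {
                // Console: last screen pos + stick delta * sensitivity.
                Vector2 stick = Gamepad.current.leftStick.ReadValue();
                lastScreenPos += stick * stickSensitivity * Time.deltaTime;
            }
            return lastScreenPos;
        }
    }
    ```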
     
  3. jwvanderbeck

    jwvanderbeck

    Joined:
    Dec 4, 2014
    Posts:
    825
    That's basically what I am doing now, but it seems to me that the Input System package should be handling that for me, so my own game's logic doesn't need to care where the input is coming from.
     
  4. SomeGuy22

    SomeGuy22

    Joined:
    Jun 3, 2011
    Posts:
    722
    That is the traditional way to do it, and a valid one at that. But you're right, the point of the input system is to just get the value without worrying about hardware specifics. And that's exactly why they added the Processors per-binding.

    Go to your Action and check one of the bindings. See that under the "Processors" tab there are many processors that will actually modify the value they pass to the action. One of which is "Normalize". You also have "Normalize Vector2", and even "Scale". So you can either Normalize all bindings to the -1 to 1 range, or you can simply scale your keyboard composite binding to match that of your mouse wheel.
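
    If you'd rather set this up from code than in the editor, the same processors can be attached as strings on the binding. A rough sketch (the scale factor and key choices are just example values, not something the docs prescribe):

    ```csharp
    using UnityEngine;
    using UnityEngine.InputSystem;

    public class ZoomSetup : MonoBehaviour
    {
        private InputAction zoom;

        void Awake()
        {
            zoom = new InputAction("Zoom");
            // "scale(factor=...)" is the code form of the editor's Scale
            // processor; 0.01 is just an example to bring ~120-per-tick near 1.
            zoom.AddBinding("<Mouse>/scroll/y", processors: "scale(factor=0.01)");
            // The keyboard composite already lands in the -1..1 range on its own.
            zoom.AddCompositeBinding("1DAxis")
                .With("Negative", "<Keyboard>/s")
                .With("Positive", "<Keyboard>/w");
            zoom.Enable();
        }

        void OnDestroy() => zoom.Dispose();
    }
    ```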
     
  5. jwvanderbeck

    jwvanderbeck

    Joined:
    Dec 4, 2014
    Posts:
    825
    Heh, figures the feature would already be there and I just didn't see it!
     
  6. jwvanderbeck

    jwvanderbeck

    Joined:
    Dec 4, 2014
    Posts:
    825
    So I just tried this, but it doesn't appear to be working. Not sure what I might be doing wrong. On the "Scroll/Y [Mouse]" binding, I added a Normalize processor and set min to 0, max to 1, and zero to 0.

    But in my code I'm still getting values of 0-120 from the scroll wheel.

    EDIT: OK, after looking at the code for this, it actually works the other way: you put in the raw values for min/max and it normalizes those values to -1..1.

    Unfortunately this isn't very helpful, as it just gets you back to being tied to device specifics. Yeah, it pushes the problem back a bit, but in order to use this Normalize processor I need to know the input range of the device, and that can vary from device to device. My mouse seems to have a scroll range of -240 to 120 (yeah, not sure why the asymmetry there), but another mouse would probably be different.
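
    For reference, the behavior I saw in the code is roughly this remap (simplified; the real processor also takes a "zero" parameter and a signed mode, both ignored here):

    ```csharp
    using UnityEngine;

    // Simplified illustration: you supply the RAW min/max and get back a
    // fixed 0..1 range. This is a sketch, not the processor's actual source.
    static class NormalizeSketch
    {
        public static float NormalizeRaw(float value, float min, float max)
        {
            // Remap [min, max] -> [0, 1]
            return Mathf.Clamp01((value - min) / (max - min));
        }
        // e.g. NormalizeRaw(120f, -240f, 120f) == 1f (top of the raw range)
    }
    ```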
     
    Last edited: Nov 11, 2019
  7. jwvanderbeck

    jwvanderbeck

    Joined:
    Dec 4, 2014
    Posts:
    825
    I guess for now I'll just use something like -100 to 100 as a sort of default range and then allow setting a zoom speed. It's still far better than before, when the mouse was roughly 120 times faster than the keyboard.

    I guess if I need anything more precise for any function I'll have to let the player do a calibration first, but that seems so old school. I thought it was possible to get that information directly from the device or OS or something.
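
    Rough sketch of what I mean, in case it helps anyone later (the 100 range and the speed default are just guesses, not queried from anywhere):

    ```csharp
    using UnityEngine;
    using UnityEngine.InputSystem;

    // Assume a +/-100 raw range and expose a separate zoom speed.
    // Both numbers are guesses, not read from the hardware.
    public class MapZoom : MonoBehaviour
    {
        public InputActionReference zoomAction; // scroll + keyboard composite
        public float zoomSpeed = 5f;            // player-tunable

        const float assumedRawRange = 100f;

        void Update()
        {
            float raw = zoomAction.action.ReadValue<float>();
            float amount = Mathf.Clamp(raw / assumedRawRange, -1f, 1f);
            Camera.main.orthographicSize -= amount * zoomSpeed * Time.deltaTime;
        }
    }
    ```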
     
  8. SomeGuy22

    SomeGuy22

    Joined:
    Jun 3, 2011
    Posts:
    722
    Well... yeah, that's how it has to work, because there's nothing to "normalize" against in a 1D vector. Let's say you have a scroll on a scale of 0-60 and you want to normalize it to 0-1. This is accomplished by dividing by 60. But the function has no clue what range to use, because it doesn't know what values to compare against--it doesn't know what number to divide by unless you tell it. Normalization only works "automatically" in 2D or beyond, because the vector carries its own reference: each axis can be divided by the overall magnitude, bringing the vector's length down to 1.
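
    To make the difference concrete (numbers picked arbitrarily):

    ```csharp
    using UnityEngine;

    static class NormalizeDemo
    {
        static void Example()
        {
            // A 2D vector carries its own reference: its magnitude.
            Vector2 v = new Vector2(3f, 4f);   // magnitude is 5
            Vector2 unit = v.normalized;       // (0.6, 0.8) -- each axis / 5

            // A lone float has nothing intrinsic to divide by; the range
            // (here, 60) has to be supplied from outside.
            float raw = 45f;
            float norm = raw / 60f;            // 0.75, only because we knew "60"
        }
    }
    ```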

    Typically scroll wheel hardware applies the inputs in "ticks". My best guess is that the 120 value refers to 120 pixels of motion across the screen in a single tick, or some other measurement. I'd also wager that something like an Apple mouse or touchpad has more degrees of motion than your standard 5-button mouse, so those might register input events at lower values. However, the key takeaway is that the reason these values are different is to accurately approximate the amount of physical motion from your device. Even though an Apple mouse might come in with a 10 value instead of 120, it's likely because the scrolling "event" happens more frequently due to hardware mechanics, and so it actually scrolls at the same speed. So really, if you want to emulate the effect of scrolling like through a web browser, you should just leave the value alone and manually scale the entire Action's value to match the speed of your game. Doing this means that people with lower mouse scroll values will likely have a higher frequency of input events, meaning the resulting motion should be about the same as those with high scroll values if you compare over the course of several seconds.

    I should also mention that this hardware-agnostic value scaling likely happens through the OS. On Windows you can set your scroll speed in the mouse options, so I'd presume it affects the value read by Unity. That means that no matter what, users will scroll slower or faster depending on their personal OS preferences; there's no avoiding it. Unless... (and this is where it gets tricky...)

    If you're just looking to consider each scroll event as a single "tick", you can Clamp the value instead of Normalizing it. That means that no matter how big the value is for each mouse, the scroll value will always come in at either 1 or -1. Then you can speed up or slow down according to your game speed so that every single input (scroll and keyboard) makes the zoom happen at the same rate. However, be warned: while this may make things "technically" the same, it completely ignores the point of the dynamic values in the first place, which is to make the physical motion of scrolling universal. Essentially, if you use this method with the same Apple mouse example I had before, people with the Apple mouse have their scroll events happen more frequently. People without the Apple mouse have events that happen less frequently. Since everything is scaled to 1 regardless of the frequency, you will see people with the Apple mouse scroll more than people without one.
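
    Something like this, if you go the tick route (the processor string uses the same syntax as the editor UI; the setup shape is just an example):

    ```csharp
    using UnityEngine.InputSystem;

    static class TickZoom
    {
        // Collapse any large raw scroll value (e.g. +/-120) to +/-1 per event
        // by clamping instead of normalizing.
        public static InputAction Create()
        {
            var zoom = new InputAction("Zoom");
            zoom.AddBinding("<Mouse>/scroll/y", processors: "clamp(min=-1,max=1)");
            zoom.Enable();
            return zoom;
        }
    }
    ```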

    With that in mind, my recommended solution is to just leave the mouse binding values unchanged, so that your map can account for the frequency of scroll events and just have people scroll using their personal preferences. Then I would say just scale the keyboard binding to match that of a standard mouse, say 100 or so. Then if you're still unhappy you can always add a user defined parameter for the speed. I hope this has helped in some way. You're right to question every corner case about potential input devices, it shows that you've really thought about how things will end up working for the user. So hopefully this example will show that doing these thought experiments and understanding the data you're working with can go a long way towards making better programs.
     
  9. jwvanderbeck

    jwvanderbeck

    Joined:
    Dec 4, 2014
    Posts:
    825
    Sure, I understand how it works. It's essentially just a remap from x..y to 0..1.

    However, once upon a time games required you to "calibrate" your input devices so they could "see" the maximum and minimum values from your device. Games almost never do that anymore. I assumed that was because hardware devices essentially make that information (min/max) available via a query rather than requiring input - either from the device or from the OS.

    So what I was hoping the normalization processor did was get the min/max from the hardware/OS and then use that data so that no matter what hardware was hooked up, it would always be 0-1.
     
  10. SomeGuy22

    SomeGuy22

    Joined:
    Jun 3, 2011
    Posts:
    722
    Nope, as far as I'm aware there is no OS-agnostic way to easily retrieve that information, and there never has been. A quick search shows that deep in the .NET API there is a SystemInformation property called MouseWheelScrollDelta, which supposedly gives the amount of movement by the wheel, probably taking the user preference into account. And there's also MouseWheelScrollLines, which probably does the same but in a different measurement. You might be able to factor those values in and perform some approximation by converting them to what's used by the InputSystem. But both of those live under System.Windows.Forms, which means I don't think they will work properly on Mac.
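
    If you want to poke at those yourself, it's just this (Windows-only, and whether these values map cleanly onto what the Input System reports is a guess on my part):

    ```csharp
    using System;
    using System.Windows.Forms; // Windows-only, as noted above

    static class WheelInfo
    {
        static void Print()
        {
            // Typically 120 (the standard per-notch wheel delta).
            Console.WriteLine(SystemInformation.MouseWheelScrollDelta);
            // The user's "lines per notch" preference from the OS settings.
            Console.WriteLine(SystemInformation.MouseWheelScrollLines);
        }
    }
    ```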

    I'm pretty sure what most games that use the scroll wheel do is exactly what I said in my last post--just leave the value as is, because it's already supposed to approximate the amount of motion done by the physical device. Plus you'd want it to reflect each user's OS settings as well.