
Input System Update

Discussion in 'Input System' started by Rene-Damm, Dec 12, 2017.

Thread Status:
Not open for further replies.
  1. Rene-Damm

    Rene-Damm

    Joined:
    Sep 15, 2012
    Posts:
    1,779
    Sounds buggy. Could you give a few more details regarding the setup? (Unity version, gamepad you're using, anything else noteworthy)

    There's a lot of lag with backports ATM with fixes landing out of sync a lot. We're trying to address that.

    What gamepad and what OS?

    Thank you for the report and the details. There's been a number of changes in 0.2-preview to the guts of the InputAction machinery and from what we're seeing so far, they're causing a number of issues such as the ones you're seeing. We're looking into it.

    What keyboard layout are you using? Note that the input system does NOT respect the keyboard layout when identifying keys. I.e. all keys are identified by physical location and named according to the US keyboard layout. "A" is always the key to the right of caps lock, even if the current keyboard layout assigns a different character to that key. So, I'm wondering if this is what's causing the confusion here.

    We'll probably need a mechanism to eventually also allow binding by "display name" so that you can choose whether to bind according to keyboard layout instead. Would be easy to add.
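    To make the distinction concrete, here is a rough sketch of the two lookup modes. All names and layout tables below are made up for illustration; this is not the package's actual API.

```python
# Sketch: physical key identifiers are fixed; display names vary by layout.
# All names here are illustrative, not the real Input System API.

# Per-layout mapping from physical key (US-reference name) to displayed character.
LAYOUTS = {
    "qwerty": {"a": "A", "q": "Q", "w": "W", "z": "Z"},
    "azerty": {"a": "Q", "q": "A", "w": "Z", "z": "W"},
}

def display_name(physical_key, layout):
    """What is printed on this physical key under the given layout."""
    return LAYOUTS[layout][physical_key]

def find_by_display_name(char, layout):
    """Reverse lookup: which physical key produces this character?"""
    for key, shown in LAYOUTS[layout].items():
        if shown == char:
            return key
    return None

# The physical "w" key shows "Z" on an AZERTY keyboard...
assert display_name("w", "azerty") == "Z"
# ...and binding "by display name" to "W" on AZERTY lands on the physical "z" key.
assert find_by_display_name("W", "azerty") == "z"
```

    Binding by physical key gives consistent WASD-style defaults across layouts; binding by display name matches what the user reads on the key cap. Both lookups are cheap once a layout table exists.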
     
  2. ZoidbergForPresident

    ZoidbergForPresident

    Joined:
    Dec 15, 2015
    Posts:
    157
    Uuuuh...

    You are joking, right? If not then your whole input system is utterly useless...

    When one pushes a key on a keyboard, it should be the code associated with the character that's forwarded, not the "physical" key.

    Because when the code is executed it binds the wrong position back on the keyboard...

    If I bind "Z" to go forward and then in the game I have to push "W" for the same effect, that is totally stupid, sorry to be blunt.

    This is a silly architectural implementation and, as I said, utterly useless and I sincerely hope this will be fixed... because this is a big issue.
     
    _slash_ and frankadimcosta like this.
  3. Rene-Damm

    Rene-Damm

    Joined:
    Sep 15, 2012
    Posts:
    1,779
    Hehe, don't be too quick to jump to conclusions :) The design here is intentional and actually useful :) Not having reliable WASD bindings in games, and instead requiring devs to create locale-specific keyboard mappings or requiring users of other keyboard layouts to rebind the default bindings before playing, has been one of the pain points with the old input system.

    That said, no question that both ways are important and both are fully supported by the Keyboard device. Each key can be looked up either way. It's just that the binding system used by actions so far doesn't have a means of creating paths that look up controls by their display names. But, as said, that's not hard to add. Just stuff to do.
     
    Cataire and FROS7 like this.
  4. ZoidbergForPresident

    ZoidbergForPresident

    Joined:
    Dec 15, 2015
    Posts:
    157
    I'm sorry, but the way I see it: if I bind movement to "zqsd", I expect to use "zqsd" for moving... and that isn't what appears to happen here.

    Now, being able to have "default bindings" adapt to the currently active user keyboard layout would be great, for instance, and I don't know if it'd be possible here, but the fact is that the main feature does not appear to work for me.

    If I bind actions like this (in code or through code):
    Z: go up
    Q: go left
    S: go down
    D: go right

    Well I don't expect to only be able to go left by pressing A...
     
  5. Rene-Damm

    Rene-Damm

    Joined:
    Sep 15, 2012
    Posts:
    1,779
    Yup, agree that in addition to adding the ability to bind either way, we need the UI to make it pretty clear which way the bindings you choose work.

    This is something that probably extends to other devices, too. The system's overall approach is to not care what's written on a control and to consider that a display issue. E.g. on gamepads, the individual buttons are called all kinds of different things on different pieces of hardware and platforms, but the system just assigns names according to physical location on the device. However, in no case is the discrepancy more visible than with the keyboard.
     
    FROS7 likes this.
  6. ZoidbergForPresident

    ZoidbergForPresident

    Joined:
    Dec 15, 2015
    Posts:
    157
    I see, maybe that's why non-American layouts are so poorly supported in most games...

    But as a user of a non-American layout, I hope you can understand how that lack of support can feel bad, or even despicable, over time...
     
  7. Rene-Damm

    Rene-Damm

    Joined:
    Sep 15, 2012
    Posts:
    1,779
    @ZoidbergForPresident Looking at your use case, it seems that you're actually better off with a language-*neutral* approach. Based on your bindings, I assume you're using an AZERTY layout in a French locale. Binding specifically to that layout will mean that whoever uses your app with a different layout will *not* get the expected bindings.

    This is why the system goes by physical location by default. If you bind to WASD, you'll get proper controls no matter what keyboard layout you're using. So AZERTY, QWERTZ, QWERTY, and whatever else will work just fine. And by using display names in the UI, the right key names will be displayed, too. E.g. internally the system will have a binding for the "a" key, but it will display "Q" in the UI when the layout is actually AZERTY.

    Just to emphasize that, the US layout is only used as a reference for naming but otherwise plays no role in how keyboard support works.
     
    FROS7 likes this.
  8. ZoidbergForPresident

    ZoidbergForPresident

    Joined:
    Dec 15, 2015
    Posts:
    157
    That is really hard to understand... wouldn't it be possible to have a default development layout that could be changed per dev machine (and automatically detected for the end user), with everything adapted automatically?

    A sort of auto-mapping, plus not using QWERTY as the default layout, would be the idea?
     
  9. Rene-Damm

    Rene-Damm

    Joined:
    Sep 15, 2012
    Posts:
    1,779
    Good questions. I think there has to be some reference layout but the editor UI should probably take measures to hide that for the most part. Talking it over with the UX guys to find a good solution :)
     
    FROS7 likes this.
  10. optimise

    optimise

    Joined:
    Jan 22, 2014
    Posts:
    2,129
    Hi @Rene-Damm. Has this new input system been improved to achieve native-level performance, as if you were using the low-level platform API to develop for the target platform? For example, on the Android and iOS platforms, I believe the old input system has input lag.
     
  11. ZoidbergForPresident

    ZoidbergForPresident

    Joined:
    Dec 15, 2015
    Posts:
    157
    Cool, thanks! :)
     
  12. hurleybird

    hurleybird

    Joined:
    Mar 4, 2013
    Posts:
    258
    360 and Windows 10.
     
  13. Baste

    Baste

    Joined:
    Jan 24, 2013
    Posts:
    6,338
    You could have different views for the keyboard that give the same information, in somewhat the same way that controllers have "south", "Xbox A", and "PlayStation X" targeting the same physical button.

    I'm not convinced that would be worth it, especially with the huge number of different layouts out there. I read through all the variants of QWERTY on Wikipedia and my head hurts now.
     
  14. recursive

    recursive

    Joined:
    Jul 12, 2012
    Posts:
    669
    @Baste - it goes much deeper, DVORAK and COLEMAK and QFMLWY and the dreaded TNWMLC.

    And then you have non-US layouts for DVORAK/COLEMAK, plus the Programmer's versions.

    There's a sea of keyboard configurations.
     
  15. ZoidbergForPresident

    ZoidbergForPresident

    Joined:
    Dec 15, 2015
    Posts:
    157
    OK sorry to insist on this but I keep thinking about how things are done technically on your side.

    You say that it's the physical positions of the keys that you work with... yet you use keycodes for bindings.
    That sounds a bit silly and really prone to annoying workflows, as we've seen.

    Bear with me, I know I don't know much about how it works under the hood but...
    Would it be possible, as far as keyboards are concerned, for keyboard bindings NOT to be character keycodes but some ID that is independent of any layout (say IDs 0 to 103, as 104 is the highest key count possible on a keyboard)? That would solve the problem of the disparity in keyboard layouts. AND, at the same time, feed the input system a keyboard layout, the one used by the user (dev or player), so that the correct key character is displayed in the UI, both on the dev side in Unity (because in the shortcut manager it's always the QWERTY layout too) and in the game UI.

    Sorry, just throwing this out there... trying to find the best solution. If this is possible then it's basically done: if you want to support keyboards with more keys, just add IDs, and if you want to support more keyboard layouts, just add more physical-ID-to-keyboard-layout mappings...

    What do you guys think? Would that be possible to implement?
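    Sketched very roughly (all IDs, names, and tables below are made up purely for illustration), the proposal amounts to something like:

```python
# Illustration of the proposal: bindings reference layout-independent
# physical key IDs; a layout table is consulted only when displaying names.

# Physical IDs for a few key positions (arbitrary numbering for the sketch).
PHYSICAL = {"row2_key1": 0, "row2_key2": 1, "row3_key1": 2, "row3_key2": 3}

# Per-layout mapping from physical ID to the character shown on the key cap.
LAYOUT_NAMES = {
    "qwerty": {0: "Q", 1: "W", 2: "A", 3: "S"},
    "azerty": {0: "A", 1: "Z", 2: "Q", 3: "S"},
}

# The binding stores a position, not a character.
bindings = {"move_forward": PHYSICAL["row2_key2"]}

def label_for(action, layout):
    """Key name to show in a rebinding UI under the active layout."""
    return LAYOUT_NAMES[layout][bindings[action]]

# Same binding, different label depending on the active layout:
assert label_for("move_forward", "qwerty") == "W"
assert label_for("move_forward", "azerty") == "Z"
```

    Supporting a new layout then only means adding one more ID-to-name table.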
     
  16. JamesThornton

    JamesThornton

    Joined:
    Jun 26, 2015
    Posts:
    52
    Thanks for adding pen pressure! Is there a simple way to interact with UI buttons and sliders with a pen? It no longer simulates a mouse click by default.

    I'm sure it's possible with IPointer, but I wasn't sure if there was some way to enable it automatically (since interacting with the UI is expected behavior). Thanks for any advice!

    EDIT: As expected, buttons were easy to invoke. Will tackle sliders next. But I'd still be interested to know if there will (eventually) be default pen interaction with the UI. Cheers!

    Also, using Intuos5 on Windows 10
     
    Last edited: Mar 3, 2019
  17. Heimlink

    Heimlink

    Joined:
    Feb 17, 2015
    Posts:
    29
    I've reported a bug on github, with my sample scripts. Thanks for your help.
     
    Rene-Damm likes this.
  18. Heimlink

    Heimlink

    Joined:
    Feb 17, 2015
    Posts:
    29
    Another query...

    When I enable an Action Map via a Press action, a "Performed" phase is fired from the same button if it's bound in the Map being enabled. Is this the intentional behaviour?

    In my example, I'm trying to implement a pause menu, toggled from the start button. I have two Maps defined:
    • One which binds the start button, to allow the player to pause the game during play.
    • One which contains the pause menu interaction bindings, as well as the start button bound to an action which will resume play.
    When I press the start button, I disable the calling Map and enable the other. However, the other fires a performed action on the start button as soon as it's enabled.
     
  19. gegagome

    gegagome

    Joined:
    Oct 11, 2012
    Posts:
    392
    hi guys

    When I generate a C# class it doesn't generate all of the bindings like all tutorials show. It does generate the class but it is missing all of the bindings.

    Any ideas?
    Thanks
     
  20. BlackPete

    BlackPete

    Joined:
    Nov 16, 2016
    Posts:
    970
    My laptop doesn't even have a numeric keypad. What happens if you try to bind to, say, NumPad Enter? Also, on my laptop, Delete and Insert are on the same physical key, so what happens if you bind Delete and Insert to separate functions? I'm also struggling to see the benefit of automatically remapping to some physical location.

    Is it possible that mapping based on physical locations is overthinking things a bit?
     
  21. ZoidbergForPresident

    ZoidbergForPresident

    Joined:
    Dec 15, 2015
    Posts:
    157
    Well, I'd agree it's not the best architecture choice. :p
     
  22. ZoidbergForPresident

    ZoidbergForPresident

    Joined:
    Dec 15, 2015
    Posts:
    157
    Checked this one?
     
  23. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,792
    It ensures AZERTY users don't get shafted in cases where the developer hard coded WASD controls. Which is pretty cool.
     
  24. ZoidbergForPresident

    ZoidbergForPresident

    Joined:
    Dec 15, 2015
    Posts:
    157
    No, it doesn't.

    It would be useful if you could use "physical positions" instead of keycodes for bindings... but you don't, so... it's taking things the wrong way I'd say...
     
  25. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,792
    No?
     
  26. ZoidbergForPresident

    ZoidbergForPresident

    Joined:
    Dec 15, 2015
    Posts:
    157
    If I have to keep a QWERTY layout in mind while deciding on bindings (whether as a dev or as an end user), I really can't see the point of it.

    You bind X to an action, you push X in game for that action, end of story. You're supposed to be able to change your bindings in the Unity launcher anyway, so even if the devs are lazy, you still have that.
     
  27. Innovine

    Innovine

    Joined:
    Aug 6, 2017
    Posts:
    522
    And how would you define the physical positions? Not all of us use a standard 101-key PC keyboard, or a 104-key Windows one. Are you suggesting Unity adds geometry files which identify the physical layout first? And what about programmable keyboards like a Kinesis, which can remap itself in its firmware? Say the user switches to Dvorak or Colemak; now WASD is ",aoe" or "wars". I don't think it's such a good idea to rely on physical positioning as a definition. But it gets worse. How do you propose to define the physical layout on this: https://simplyian.com/assets/maltron-lefthand-keyboard.jpg or this: https://images.pcworld.com/reviews/graphics/161402-alphagrip_final_original.jpg
     
  28. AndrewKaninchen

    AndrewKaninchen

    Joined:
    Oct 30, 2016
    Posts:
    149
    You know, this code generation based on an editor-created asset that the Action Maps do should really be incorporated in more places. All the places (Animator, I'm looking at you, as always). It feels really good to have my statically typed references and not be using strings. The only thing I miss is having the generated scripts be sub-assets of their defining object. I realize the editor doesn't really support that (the gods know I've tried); just saying it would be cool if it could. With Visual Scripting coming sometime, it might become more relevant anyway.
     
    noio likes this.
  29. ZoidbergForPresident

    ZoidbergForPresident

    Joined:
    Dec 15, 2015
    Posts:
    157
    That is why it should be clear by now that using keycodes is way better.

    If they really want to normalize and use physical keys, then they should use IDs instead of badly naming the keys: IDs from 0 to the maximum number of keys on a keyboard (or even just 101, who needs more, really?). With that list of IDs and a layout, you need nothing else. You bind to the IDs and, depending on the user's active layout, display the correct key names/characters.
     
  30. Baste

    Baste

    Joined:
    Jan 24, 2013
    Posts:
    6,338
    Are you guys really not realizing that with the old keycode format, you had exactly the same problem? If you made a layout for AZERTY, then most users would not be able to play your game without rebinding.

    This new system makes things better for Dvorak and AZERTY players, because games made for different keyboards will actually work as expected on their keyboards by default. It also makes things better for (at least novice) Dvorak and AZERTY developers, because you're not shipping games with ZQSD movement that won't work for most of your players.

    Yes, you have to think about it if you're hard-coding key codes, but that's not very hard. Handling different keyboard layouts just got a whole lot easier for everyone.

    Also, with the new system, you're probably never writing "KeyCode.A" anywhere. Instead, you'll be making an input binding file where you define keys. That window has keypress detection built in, so you're going to define the "move up" button, go to that window, and press the key on your keyboard you want bound. So KeyCodes are probably never visible in your game unless it's a super-hacky prototype.

    Finally, integer IDs are not a solution! You're suggesting that since it's a bit harder for you, it should be harder for everyone. That's not an improvement! That's just petty!

    All that being said, there are still two things I think @Rene-Damm and the rest of the input team could consider to make things easier for non-QWERTY keyboards:
    - Add code paths that allow users to query the physical position of common kinds of keyboards:

    Code (csharp):
    Assert.AreEqual(Keyboard.current.Q, Keyboard.current_azerty.A);
    Assert.AreEqual(Keyboard.current.Q, Keyboard.current_dvorak.D);
    Assert.AreEqual(Keyboard.current.Y, Keyboard.current_qwertz.Z);
    // ... etc
    - Make the input binding window (optionally) show the display keys for the developer's keyboard.

    I don't think the values of e.g. "KeyCode.Q" should change based on the developer's keyboard, because that'd make code not portable at all.
     
    noio, Aeroxima, foxnne and 2 others like this.
  31. vecima

    vecima

    Joined:
    Jun 6, 2017
    Posts:
    16
    I just updated my project from 2018.3.6f1 to 2018.3.8f1 and now when I open my project I have this error:

    Library\PackageCache\com.unity.inputsystem@0.2.0-preview\InputSystem\NativeInputRuntime.cs(130,25): error CS1593: Delegate 'NativeUpdateCallback' does not take 3 arguments
     
  32. ZoidbergForPresident

    ZoidbergForPresident

    Joined:
    Dec 15, 2015
    Posts:
    157
    And where is it in the code? The input system or your side?
     
  33. vecima

    vecima

    Joined:
    Jun 6, 2017
    Posts:
    16
    I didn't call NativeUpdateCallback anywhere in my code. I assumed it was in the system. I finally completely removed the new Input System so that I could continue with development. I'll revisit the system later if/when it's more stable.
     
  34. gegagome

    gegagome

    Joined:
    Oct 11, 2012
    Posts:
    392
    He (Infallible Code) didn't address Xbox or other controllers in the video.
     
  35. dougpunity3d

    dougpunity3d

    Unity Technologies

    Joined:
    Jul 11, 2018
    Posts:
    16
    A fix is on its way to the Package Manager; when it lands, it will be 0.2.1-preview. We apologize for this; it looks sloppy from a user standpoint. The reason this has happened multiple times is that a package is separate from the Unity engine code. It's rare that we have to make a native API change in the editor/engine that breaks the package. When we do, Unity releases are usually 2-3 weeks apart, so we have to wait for the release and then patch it on the managed side in the package after the Unity release, causing a day or two of the package not compiling. This time, making things a little worse, there was the zero-day security patch in Unity that the entire world was forced to upgrade to.
     
    vecima likes this.
  36. YuriyPopov

    YuriyPopov

    Joined:
    Sep 5, 2017
    Posts:
    237
    Hi there. Recently we started using the new input system in our project, and so far we are using it for player movement and camera control. However, I'm interested in trying it out for our combat system as well. Is there a way to set up buffers for the keys with timestamps that I can use to detect combos, or is the system not ready or designed for this?
    Thanks in advance
     
  37. recursive

    recursive

    Joined:
    Jul 12, 2012
    Posts:
    669
    @YuriyPopov - I'm doing something like this, you'll have to roll your own currently.
     
  38. YuriyPopov

    YuriyPopov

    Joined:
    Sep 5, 2017
    Posts:
    237
    So how are you approaching this?
     
  39. recursive

    recursive

    Joined:
    Jul 12, 2012
    Posts:
    669
    I'm using ECS, with the InputTrace class to grab action events and transform them into event data. I then stick them into IBufferElementData DynamicBuffers and scan them to do things like rendering.

    My goal for this week is to finish the combo scan definitions system, after I get around to upgrading to the latest ECS packages and switching to the new conversion pipeline. I already have debug rendering of the state buffers, but I want to clean that up. I already have a solid spec on paper for the combo + state system. Basically, each state that can handle combos would have a list of combos to scan for and a target state to transition to if the combo is evaluated. Doing this in a general way should help with interrupts/cancels.
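    In rough pseudocode, the scan idea looks something like this. The combo names and the 0.5 s window are arbitrary choices for the illustration, not anything from the actual project.

```python
# Minimal sketch of combo detection over a timestamped input buffer:
# buffer events, then scan for the combo as a timed subsequence.

COMBO = ["down", "down-forward", "forward", "punch"]
WINDOW = 0.5  # max seconds between consecutive matched combo inputs

def scan_combo(events, combo=COMBO, window=WINDOW):
    """events: list of (timestamp, input_name), oldest first.
    Returns True if the combo appears as a subsequence, with each
    step within `window` seconds of the previously matched step."""
    idx = 0
    last_t = None
    for t, name in events:
        if last_t is not None and t - last_t > window:
            idx = 0          # too slow: restart the scan
            last_t = None
        if name == combo[idx]:
            idx += 1
            last_t = t
            if idx == len(combo):
                return True
    return False

buffer = [(0.00, "down"), (0.10, "down-forward"), (0.22, "forward"), (0.30, "punch")]
assert scan_combo(buffer)
assert not scan_combo([(0.0, "down"), (1.0, "punch")])  # too slow between steps
```

    Keeping the scan separate from the buffer is what makes interrupts/cancels tractable: each state can run its own scan over the same shared event history.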
     
  40. DavidNLN

    DavidNLN

    Joined:
    Sep 27, 2018
    Posts:
    90
    Hi, is there some way to give a binding more than one path?
    As in giving the same binding the 'W' key and another key, without duplicating the binding.
     
  41. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,792
    Maybe it has been discussed before, but why does the new Input System have a dependency on the Unity Analytics module, and is that dependency going to stay in the final release?
     
  42. Rene-Damm

    Rene-Damm

    Joined:
    Sep 15, 2012
    Posts:
    1,779
    @Baste and @ZoidbergForPresident

    IMO you are both right :) I think binding (or looking up) by physical key location and binding by the character a key generates are both valid scenarios. For games I'd argue that binding by physical location is usually more desirable but even there you may well have situations where you want to bind by what's on the key instead.

    Also, even if you choose to bind by physical location, it could be very helpful to show keys in the picker UI with their names according to the current layout instead of with the "standardized" names.

    So, I think it would be great if the system enabled both. And it's easy to do. Tricky part is figuring out how to present this in the UI such that it's easy to understand without having to know much about the system.

    As for the API, looking up using the Key enum would be unaffected. That would still all go by physical location. And in the current API, you *can* already look up by the character a key produces (just search for the one with the right displayName).

    I think it's not necessary to show every possible layout; rather, it's more about relating keyboard information shown in the UI to the current keyboard layout you have, so that no matter what, you easily know which key is being referred to. From my understanding, this has been the biggest source of confusion for @ZoidbergForPresident: seeing "A" in the UI with no indication that it actually refers to what on his keyboard is the Q key.

    There's still some work left to do here but replacing StandaloneInputModule with UIActionInputModule should get you going and give some level of working UI support.

    Yes, that's by design. The idea is that when you enable a map, the actions pick up on the current state of controls so that it's not necessary to first release a control and then re-press it for the action to trigger.

    However, I see your dilemma there, and I think that's a pretty common situation. I think in the desire to solve one problem, we created another. Give me a moment to bounce this off a few people and gather some input.
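    In the meantime, one common workaround is to have the newly enabled map swallow controls that were already held at enable time, until they are released once. A sketch of the idea (hypothetical structure, not the actual API):

```python
# Sketch of the pause-toggle workaround: when a map is enabled, remember
# which bound controls are already held and ignore them until released once.

class ActionMap:
    def __init__(self, bound_keys):
        self.bound_keys = set(bound_keys)
        self.suppressed = set()

    def enable(self, currently_held):
        # Keys already held at enable time must be released before they
        # can trigger an action in this map.
        self.suppressed = self.bound_keys & set(currently_held)

    def on_key_down(self, key):
        """Return True if this press should fire the bound action."""
        return key in self.bound_keys and key not in self.suppressed

    def on_key_up(self, key):
        self.suppressed.discard(key)

menu = ActionMap({"start", "confirm"})
menu.enable(currently_held={"start"})   # start is still held from the toggle
assert not menu.on_key_down("start")    # swallowed: held since before enable
menu.on_key_up("start")
assert menu.on_key_down("start")        # after a release, presses fire again
```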

    Missing all bindings how? The class will not contain bindings itself. The generated C# class is just a wrapper that simplifies working with the asset.

    Converting the entire asset to code is actually an interesting idea, though! Could indeed be a useful thing. Get rid of the entire asset and end up with only code.

    The binding will not trigger.

    Same physical key, but they generate two distinct key codes, most likely based on Fn or some other modifier key.

    It solves a real and common problem that devs have been facing with the old input system. For example, I remember a studio manually adding support for a whole roster of keyboard layouts (plus all the code to detect that reliably at runtime) just to have consistent WASD keys in their game.

    So IMO this isn't overthinking it. But at the same time, completely agree that approaching it by physical location *alone* is too restrictive. See my thoughts above. Both avenues should be available.

    Agree. There's some new stuff in the works that may actually make this possible in the future. Let's see :)

    ATM, as @recursive says, it's down to custom code. One way is to use the timestamps you get from actions.

    One thing that's still on the list, though it's uncertain when it will actually come online, is InputStateHistory and the idea of making a history of inputs automatically available. But yeah, it's little more than an idea ATM.
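    Until then, a hand-rolled history is straightforward. Roughly the idea (illustrative names only, not a planned API):

```python
from collections import deque

# Sketch of a hand-rolled input history: a bounded buffer of timestamped
# control samples that supports time-windowed queries.

class InputHistory:
    def __init__(self, capacity=64):
        # Oldest entries are dropped automatically once full.
        self.samples = deque(maxlen=capacity)

    def record(self, timestamp, control, value):
        self.samples.append((timestamp, control, value))

    def since(self, t):
        """All samples recorded at or after time t, oldest first."""
        return [s for s in self.samples if s[0] >= t]

history = InputHistory(capacity=3)
for i in range(5):
    history.record(float(i), "buttonSouth", i % 2)

# Only the 3 most recent samples are retained...
assert len(history.samples) == 3
# ...and time-windowed queries are straightforward.
assert history.since(3.0) == [(3.0, "buttonSouth", 1), (4.0, "buttonSouth", 0)]
```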

    ATM the single path of a binding can match multiple controls but a single binding cannot have multiple paths. So no, if you want to bind to, say, the W and the P key both, you need two bindings.

    For us as devs, it's a huge help to have insight into what the system is *actually* doing out in the field (as opposed to what we *think* it is doing). However, the intention is that if analytics is disabled, all the code related to input analytics gets #if'd out. We still have to give that a proper check.
     
  43. hidingspot

    hidingspot

    Joined:
    Apr 27, 2011
    Posts:
    87
    I'm running into this issue as well. As a workaround, I'm just commenting out the bodies of the NativeInputRuntime.RegisterAnalyticsEvent and NativeInputRuntime.SendAnalyticsEvent functions (leaving the method signatures to avoid having to make further changes). It's obviously not ideal, but I'm assuming/hoping the issue will be fixed in the next release. The real bummer is that the package manager seems to know about my changes and overwrites them every time I launch Unity. But yeah, I know it's not a good fix; I just need it up and running without compile errors.

    On a separate note, my only real issue with the new Input System so far is that in builds of my game (Windows 10 using Unity 2019.1.0b7), the cursor appears as the spinning circle as if it's hung up. Things work normally though... just the wrong system cursor. Could this be something I'm doing wrong?

    All that being said, I'm super happy with the new Input System. It's such a better experience.
     
  44. saneangel8

    saneangel8

    Joined:
    May 1, 2017
    Posts:
    4
    I was also waiting for that nugget. Really great video! Hope there is more on the way.
     
  45. Juixa

    Juixa

    Unity Technologies

    Joined:
    Sep 6, 2017
    Posts:
    7
    @WizByteGames does this issue still happen with 0.2.1-preview on 2019.1?
    If so, can you provide the code you are using for this or submit a bug report through BugReporter in Editor. We don't see this behavior on our end.
    Thanks.
     
  46. recursive

    recursive

    Joined:
    Jul 12, 2012
    Posts:
    669
    I reported this on the github as well: https://github.com/Unity-Technologies/InputSystem/issues/470

    Happens with 2019.1b6 on both 0.2.0 and 0.2.1, Win10 64bits. On a laptop and a desktop, with both PS4 and XB1 controllers.

    I'll see if I can make a minimal repro. It happens when I turn the continuous flag off. I'll see if beta 8 makes any difference.
     
  47. recursive

    recursive

    Joined:
    Jul 12, 2012
    Posts:
    669
  48. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,792
    I understand.

    I really hope that the final release does not depend on analytics, since otherwise we will probably never use it.
     
    Ferazel likes this.
  49. WizByteGames

    WizByteGames

    Joined:
    Mar 28, 2013
    Posts:
    70
    I am using Unity 2018.3.8f1, so I'm not sure about 2019.1 at all. But I can confirm that the issue still exists in preview 0.2.1.
     
    Last edited: Mar 22, 2019
  50. WizByteGames

    WizByteGames

    Joined:
    Mar 28, 2013
    Posts:
    70
    I have a question about multiplayer support for the Input System. Not sure if it was stated here or not. Is there a way to check if a controller belongs to Player 1 or Player 2, or is that not set up yet? I remember someone asking a while back, but I'm not sure if a response was given.
     