[Official] Input System Improvements

Discussion in 'General Discussion' started by bibbinator, May 26, 2014.

  1. bibbinator


    Unity Technologies

    Nov 20, 2009
    Hi all,

    The Input system needs some Unity love. This one is in many ways simpler than other systems, but it's always good to make sure we have thought through the use-cases. We also need to think about the scope of how far we go with it.

    Is it better to have a simple, modular system with an API you use to roll-your-own way of supporting different input controllers, platforms and bindings? Or are you more concerned with how multiple systems work together on one platform? Or multiple platforms supported in one code path? Or? Of course the answer is "we want it all, now!"...

    You get the idea; we would love to collect some use-cases we might not have thought of, and to hear what troubles you the most when supporting input on the huge array of platforms we support.

  2. Deleted User

    Deleted User


    Multiple systems working together on one platform, and modular at the same time. Think of a desktop game and how many types of controllers one can plug in. The main problem here is with axes, so we'd need some system in place where you can define a "batch" for each controller you want to support, so you can just interface with the inputs that particular controller needs.

    Also events to detect, for example, whether you are using the mouse wheel for key binding (think of a weapon-switch system in an FPS). Surely it can already be done in many ways, but it's kinda hacky at the moment.

    Just for the hell of it, I'll post my convoluted key binding system:

    Code (csharp):
    using UnityEngine;
    using System.Collections.Generic;

    [AddComponentMenu("")]
    public class NMInput : MonoBehaviour {

        static private bool isInitialized;
        static private List<string> kName = new List<string>();                               // Key input names
        static private List<KeyCode> kPrimary = new List<KeyCode>();                          // Primary key inputs
        static private List<KeyCode> kSecondary = new List<KeyCode>();                        // Secondary key inputs

        static private List<string> aName = new List<string>();                               // Axis name strings
        static public List<NMAxisInput> axis = new List<NMAxisInput>();                       // Axis class list
        static private int aIndex = 0;                                                        // Index for fast axis lookups

        static private float aSensitivity = 3;                                                // Default axis sensitivity
        static private float aGravity = 3;                                                    // Default axis gravity

        #region INPUT_FUNCTIONS

        // Create the instance if necessary
        public static void Init() { NMInputInit(); }

        // Create the instance
        private static void NMInputInit() {
            if (!isInitialized) {
                GameObject o;

                if (GameObject.Find("NMInput"))
                    o = GameObject.Find("NMInput");
                else
                    o = new GameObject("NMInput");

                if (GameObject.Find("NMSystem"))
                    o.transform.parent = GameObject.Find("NMSystem").transform;

                if (!o.GetComponent<NMInput>())
                    o.AddComponent<NMInput>();

                isInitialized = true;
            }
        }

        // Check whether the instance exists
        static public bool Exist() { return isInitialized; }

        // Returns true when the key was released this frame
        static public bool GetKeyUp (string k) {
            return Input.GetKeyUp(kPrimary[kName.IndexOf(k)]) || Input.GetKeyUp(kSecondary[kName.IndexOf(k)]);
        }

        // Returns true when the key was pressed this frame
        static public bool GetKeyDown (string k) {
            return Input.GetKeyDown(kPrimary[kName.IndexOf(k)]) || Input.GetKeyDown(kSecondary[kName.IndexOf(k)]);
        }

        // Returns true while the key is held down
        static public bool GetKey (string k) {
            return Input.GetKey(kPrimary[kName.IndexOf(k)]) || Input.GetKey(kSecondary[kName.IndexOf(k)]);
        }

        // Returns the axis value
        static public float GetAxis (string a) {
            int index = 0;

            for (int i = 0; i < aName.Count; i++)
                if (axis[i].aName == a)
                    index = i;

            return axis[index].GetAxisInput(a);
        }

        // Registers a key input
        static public void SetKeyInput (string k, KeyCode p, KeyCode s) { NMInputInit(); kName.Add(k); kPrimary.Add(p); kSecondary.Add(s); }

        // Registers an axis with the default sensitivity and gravity
        static public void SetAxisInput (string a, string p, string n) {
            NMInputInit();
            aName.Add(a);
            axis.Add(new NMAxisInput());
            axis[aIndex].SetAxisInput(a, GetKeyPrimary(p), GetKeySecondary(p), GetKeyPrimary(n), GetKeySecondary(n), aSensitivity, aGravity);
            aIndex++;
        }

        // Registers an axis with specific sensitivity and gravity
        static public void SetAxisInput (string a, string p, string n, float s, float g) {
            NMInputInit();
            aName.Add(a);
            axis.Add(new NMAxisInput());
            axis[aIndex].SetAxisInput(a, GetKeyPrimary(p), GetKeySecondary(p), GetKeyPrimary(n), GetKeySecondary(n), s, g);
            aIndex++;
        }

        // Looks up a key input by name (primary)
        static private KeyCode GetKeyPrimary (string c) {
            KeyCode v = KeyCode.None;

            if (kName.Contains(c))
                v = kPrimary[kName.IndexOf(c)];

            return v;
        }

        // Looks up a key input by name (secondary)
        static private KeyCode GetKeySecondary (string c) {
            KeyCode v = KeyCode.None;

            if (kName.Contains(c))
                v = kSecondary[kName.IndexOf(c)];

            return v;
        }

        // Returns the axis index from its name
        static private int GetAxisIndex (string n) {
            int a = 0;

            for (int i = 0; i < aName.Count; i++)
                if (aName[i].Contains(n))
                    a = i;

            return a;
        }

        #endregion

        // Axis-specific class
        [System.Serializable]
        public class NMAxisInput {
            public string aName;                // Axis name
            public float positive;              // Positive magnitude
            public float negative;              // Negative magnitude
            public float magnitude;             // Current magnitude

            public float gravity = 3;           // Gravity
            public float sensitivity = 3;       // Sensitivity

            private KeyCode posPrimaryKey;      // Positive primary key input
            private KeyCode posSecondaryKey;    // Positive secondary key input
            private KeyCode negPrimaryKey;      // Negative primary key input
            private KeyCode negSecondaryKey;    // Negative secondary key input

            private bool posInput;              // Whether either positive key is pressed
            private bool negInput;              // Whether either negative key is pressed
            private bool posPrimaryInput;       // Positive primary key pressed
            private bool posSecondaryInput;     // Positive secondary key pressed
            private bool negPrimaryInput;       // Negative primary key pressed
            private bool negSecondaryInput;     // Negative secondary key pressed

            // Configures the axis
            public void SetAxisInput (string n, KeyCode p1, KeyCode p2, KeyCode n1, KeyCode n2, float s, float g) {
                aName = n;
                posPrimaryKey = p1; posSecondaryKey = p2;
                negPrimaryKey = n1; negSecondaryKey = n2;
                sensitivity = s;
                gravity = g;
            }

            // Returns the axis magnitude
            public float GetAxisInput (string a) {
                if (aName == a) {
                    if (Input.GetKey(posPrimaryKey) || Input.GetKey(posSecondaryKey)) {
                        posPrimaryInput = Input.GetKey(posPrimaryKey);
                        posSecondaryInput = Input.GetKey(posSecondaryKey);
                        posInput = posPrimaryInput || posSecondaryInput;
                        return GetPositiveAxis();
                    }
                    else if (Input.GetKey(negPrimaryKey) || Input.GetKey(negSecondaryKey)) {
                        negPrimaryInput = Input.GetKey(negPrimaryKey);
                        negSecondaryInput = Input.GetKey(negSecondaryKey);
                        negInput = negPrimaryInput || negSecondaryInput;
                        return GetNegativeAxis();
                    }
                    else {
                        posInput = negInput = false;
                        return GetGravityAxis();
                    }
                }
                else {
                    return 0;
                }
            }

            // Returns the positive axis magnitude
            float GetPositiveAxis () {
                if (posInput) {
                    positive += sensitivity * Time.deltaTime;

                    if (positive > 1)
                        positive = 1;

                    magnitude = positive;
                }
                return magnitude;
            }

            // Returns the negative axis magnitude
            float GetNegativeAxis () {
                if (negInput) {
                    negative -= sensitivity * Time.deltaTime;

                    if (negative < -1)
                        negative = -1;

                    magnitude = negative;
                }
                return magnitude;
            }

            // When nothing is pressed, eases the magnitude back toward 0
            float GetGravityAxis () {
                if (magnitude < 0) {
                    negative += gravity * Time.deltaTime;

                    if (negative > 0)
                        negative = 0;

                    magnitude = negative;
                }

                if (magnitude > 0) {
                    positive -= gravity * Time.deltaTime;

                    if (positive < 0)
                        positive = 0;

                    magnitude = positive;
                }
                return magnitude;
            }
        }
    }
  3. Tiny-Tree


    Dec 26, 2012
    What I miss from the current Input system: simulated input.

    I would like to be able to call something like "Input.Simulate.GetKey(KeyCode.Space)" in a function,
    and then all my if (Input.GetKey(KeyCode.Space)) checks would return true.
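    A minimal sketch of what such a simulation layer could look like, as a script-side wrapper used in place of Unity's Input class. The `SimInput` name and API are invented for illustration; this is not an existing Unity feature:

    Code (csharp):
    ```csharp
    using System.Collections.Generic;
    using UnityEngine;

    // Hypothetical wrapper: call SimInput.GetKey everywhere instead of Input.GetKey.
    public static class SimInput {
        static readonly HashSet<KeyCode> simulatedHeld = new HashSet<KeyCode>();

        // Pretend this key is held until ReleaseKey is called.
        public static void SimulateKey(KeyCode key) { simulatedHeld.Add(key); }
        public static void ReleaseKey(KeyCode key)  { simulatedHeld.Remove(key); }

        // Real input OR simulated input.
        public static bool GetKey(KeyCode key) {
            return simulatedHeld.Contains(key) || Input.GetKey(key);
        }
    }
    ```

    The obvious limitation of a wrapper like this is that it only affects code that goes through it; engine-level consumers (GUI, editor) still see only real input, which is why posters here want it built in.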
  4. Dameon_


    Apr 11, 2014
    I love the fact that, with the current input system, I can call GetAxis and get platform-independent results. What I don't love is that I can't modify those axes at runtime. We need access to the settings, if only to allow players to change input setups in-game, without having to exit, change settings, and then restart. Script access to key definitions is crucial.
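    Absent engine support, the usual workaround is a script-side binding table that an options menu can mutate at runtime. A rough sketch; the `Bindings` class and action names here are made up for illustration:

    Code (csharp):
    ```csharp
    using System.Collections.Generic;
    using UnityEngine;

    // Hypothetical runtime-rebindable action map.
    public class Bindings {
        readonly Dictionary<string, KeyCode> map = new Dictionary<string, KeyCode> {
            { "Jump", KeyCode.Space },
            { "Fire", KeyCode.Mouse0 },
        };

        // Called from an in-game options menu; no restart required.
        public void Rebind(string action, KeyCode key) { map[action] = key; }

        // Gameplay code polls by action name, never by raw key.
        public bool GetAction(string action) { return Input.GetKey(map[action]); }
    }
    ```

    This covers keys but not axes, which is exactly the gap the post describes: GetAxis settings live in the editor-only Input Manager.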
  5. Bladesfist


    Jan 3, 2012
    I would like the event input system to pick up right shift and alt gr.
  6. Woodlauncher


    Nov 23, 2012
  7. Carpe-Denius


    May 17, 2013
    Make it changeable from the API.
    Almost every game lets you change keys in-game in its own menu, instead of Unity's I-will-show-you-the-input-manager-first dialog.

    I don't know if the inputs are usable before I've played the game, so I have to start the game, try it, quit the game, start a new one, change a key, play it, quit it... Other than that, it is enough for me.

    I don't know how to use several controllers in one game, but I am not affected by it.
  8. Dreamora


    Apr 5, 2008
    I agree with part of the above points:
    * Please make the input assignments runtime-changeable so we can use them with options menus, external config files, etc. Nothing against the startup window, but its look and lack of any configurability make it an absolute no-go.
    * Add the capability to simulate input, or raise input events in general (useful for unit testing and for mapping other input devices without OS-level event raising - we did that for TUIO and joysticks and it's definitely no fun just to get the UI to respond...)
    * Add the capability for the input system to globally ignore/disable specific hotkey combos, so games can decide to ignore the Windows key and Alt-Tab altogether, for example.
  9. Rodolfo-Rubens


    Nov 17, 2012
    That would be cool, it would make touch screen input much more easy to implement!
  10. Tiles


    Feb 5, 2010
    I guess we have two groups here. One, like me, loves it super easy - it cannot be easy enough. The other loves to have full control over everything. It's hard to put this under one hat. I fear you will need two solutions here: one super-easy solution at the surface, and an advanced solution in an advanced setup.

    Suggestion for the input manager in the project settings: I was always hunting hard for what term can be used in the edit boxes. Is it escape, Escape, esc, Esc? This issue becomes even more interesting with the less common keys like the F buttons, or the normal values vs. the numpad values. And of course for the joypad - what button is what? And then there is the American QWERTY keyboard and the European QWERTZ keyboard. Happy guessing. The American keyboard does not have umlauts.

    What about a "Please press key" feature, like in the hotkey manager of Blender? No more hunting for what goes into the edit box...
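    A "press any key to bind" capture can be sketched today by polling every KeyCode value in Update; this is a rough illustration (slow but simple, and the class name is invented):

    Code (csharp):
    ```csharp
    using System;
    using UnityEngine;

    // Hypothetical "please press key" capture: poll every KeyCode once per
    // frame while capturing, and record the first one that goes down.
    public class KeyCapture : MonoBehaviour {
        public KeyCode captured = KeyCode.None;
        public bool capturing;

        void Update() {
            if (!capturing) return;
            foreach (KeyCode code in Enum.GetValues(typeof(KeyCode))) {
                if (Input.GetKeyDown(code)) {
                    captured = code;   // hand this to the binding table / edit box
                    capturing = false;
                    break;
                }
            }
        }
    }
    ```

    Because the capture yields a KeyCode rather than a typed string, it sidesteps the escape/Escape/esc guessing game described above.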
  11. Wahooney


    Mar 8, 2010

    1. COMPLETE script access to input manager. Editor and Runtime.
    2. Simulated input events
    3. Better location/mapping of connected devices (multiple joysticks, mice, etc.)

    Please and thank you :)
  12. eskimojoe


    Jun 4, 2012
    Make official input templates for iOS joystick, Android Bluetooth Joystick, Ouya, PS3 joystick, XBOX joystick.

    Right now, I have to code dozens of input entries for each joystick.
  13. eskimojoe


    Jun 4, 2012
    Aggregation system.

    Each MonoBehaviour has an Update() which checks Input.Key or Input.Touch.

    Make it so that one MonoBehaviour, for the whole scene, can check for input and delegate it to the rest of the GUI, character controller, NPC controller, etc.
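    That aggregation idea can be approximated today with a single polling MonoBehaviour that raises C# events the rest of the scene subscribes to. A sketch, with all names invented for illustration:

    Code (csharp):
    ```csharp
    using System;
    using UnityEngine;

    // Hypothetical central dispatcher: one Update polls, everyone else subscribes.
    public class InputHub : MonoBehaviour {
        public static event Action<KeyCode> KeyPressed;

        static readonly KeyCode[] watched = { KeyCode.Space, KeyCode.W, KeyCode.S };

        void Update() {
            foreach (var key in watched)
                if (Input.GetKeyDown(key) && KeyPressed != null)
                    KeyPressed(key);   // GUI, character controller, NPCs all listen here
        }
    }

    // Subscriber example, e.g. in a character controller:
    // void OnEnable()  { InputHub.KeyPressed += OnKey; }
    // void OnDisable() { InputHub.KeyPressed -= OnKey; }
    ```

    The point of centralizing is that only one script pays the polling cost per frame, and consumers can be enabled/disabled without each one re-implementing input checks.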
  14. ZJP


    Jan 22, 2010
    Last edited: May 26, 2014
  15. Woodlauncher


    Nov 23, 2012
  16. ZJP


    Jan 22, 2010
    Because I love the simplicity, and the demo webplayer works well with ALL the joysticks/joypads I tried. This is not the case with cInput. I bought it, but I plan to try jInput this week.
  17. TheDMan


    Feb 23, 2014
    Totally agree as well!!
  18. Murgilod


    Nov 12, 2013
    The fact that there's no way to natively change inputs at runtime is absolutely ridiculous for a game engine made after 1996. On top of that, there should be an input detection system - callable both from scripts and in the main editor during setup - that allows the user to press a key to set up an input.
  19. SpreadcampShay


    Dec 9, 2010
    * We need to be able to read and write to it from scripting. It's hard to set up the keys and their behavior in the Inspector and impossible to do from scripting. To achieve an in-game control menu I ended up using cInput, which is an amazing asset, but it's impossible to do out of the box without a major effort as far as the normal Input Manager is concerned.

    * Vibration support! It's such a bummer that it is missing.

    * Simplified settings. I don't actually need to set gravity and sensitivity over and over again, I only need it once. I can do this with cInput, so in my Initialize function I have something like cInput.sensitivity = 3f * Managers.Settings.userSensitivity; - so simplify it, but still offer these individual settings as optional for those that need them.

    * Better support for common gamepads. You need to guess which KeyCode responds to what button on the pad. I ended up having to make a cheatsheet for the 360 pad through trial and error. The KeyCode class is nice, but for pads the variables are far too ambiguous. Even for keyboard and mouse they sometimes are. But primarily for gamepads I'd like to see naming representative of the buttons on the pad, considering how the 360 gamepad in particular is a PC standard. Or, at the very least, provide an official cheatsheet so we don't have to make one of our own or google for one (be aware that many of the Google results are outdated, at least at the time I made my cheatsheet). We could write our own scripts to help with that, but it's still annoying.
    Last edited: May 26, 2014
  20. Bladesfist


    Jan 3, 2012
    That would be easy enough to implement if they gave you access to the actions at runtime. I am doing it already with Rebind; however, getting input in the editor is not very accurate.
  21. Ferazel


    Apr 18, 2010
    I agree that Input would be better served by removing the legacy OnMouseButton() reflection methods and having a singleton/static InputActionManager class that is more event-based. I think this class should be sub-classable, with a very well-documented process allowing for low-level access to input. Maybe make scripts require an interface to receive actions from this manager, if subscribing to events is too high a barrier to entry.

    Another thing that might be useful for this manager is to manage object action zones. What I mean by this is that often you have on-screen UI or other interactions that will block camera interactions or interpret input differently. For example, if I define an area to accept a certain input schema (camera controls) but I also have another area that doesn't respond the same way (GUI). Also being able to do things like creating touch zones, such as if I'm dragging and moving a camera around, I don't want the unit to receive touch-up events (similar to a drag threshold but must be dpi independent).

    I definitely think that runtime action mapping should be a necessity that is accessible from the programming level so that we can use it in options screens and other dynamic mapping capabilities.

    Better built-in touchscreen gestures. Things like twist/pinch/swipes/doubletaps should all be built into the system as inputs that the action system should parse for us automatically. These gestures could have different thresholds that could be altered or modified into the action allowing us to create dead zones depending on the game.
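    The interface-based receiver idea above could be sketched like this; everything here (the `IActionReceiver` interface, the manager, the action names) is invented for illustration, not an existing Unity API:

    Code (csharp):
    ```csharp
    using System.Collections.Generic;
    using UnityEngine;

    // Scripts implement IActionReceiver and register with a static manager
    // instead of reading Input directly.
    public interface IActionReceiver {
        void OnAction(string action);   // e.g. "Fire", "Pinch", "Swipe"
    }

    public class InputActionManager : MonoBehaviour {
        static readonly List<IActionReceiver> receivers = new List<IActionReceiver>();
        public static void Register(IActionReceiver r)   { receivers.Add(r); }
        public static void Unregister(IActionReceiver r) { receivers.Remove(r); }

        void Update() {
            // Low-level polling translated into named actions and fanned out.
            if (Input.GetButtonDown("Fire1"))
                foreach (var r in receivers) r.OnAction("Fire");
        }
    }
    ```

    Action zones and gesture thresholds would then be extra filtering applied inside the manager before it notifies receivers, which is what makes the sub-classable design attractive.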
  22. AndrewGrayGames


    Nov 19, 2009
    So, one of the problems that I find myself solving with my custom wrappers, is that Unity does not natively support in-game rebinding of keys, which is a big deal - people like customizing their controls so they can play more comfortably. Unity's Input system isn't bad, but rebindability has really become a much higher-priority requirement in recent years for games on most non-mobile platforms.

    While I haven't implemented a GUI for it yet (and I have some changes I need to commit...), here's my code for how I'm going about doing that:

    Control Axis - based upon Unity's own axes, only I use Input methods.

    Accelerometer Binding - Untested accelerometer implementation. Based on a similar concept to the axes, only intended to be used in mobile deploys only.

    Control Manager - The heart and soul of my control rewrite. A GUI allows the user to modify what keys go to what axis directly on this. The player's avatar refers to this manager to get input, basically.
  23. shkar-noori


    Jun 10, 2013
    Thank you for opening these official threads. what I need for the input system is:

    * Runtime API.
    * Per-Player Controls.
    * Better KeyCode naming for Gamepads.
    * Input simulation.

    I'm currently working on this stuff in my project. Yes, it can be done now, but it's time-consuming.
  24. AnomalusUndrdog


    Jul 3, 2009
    I'd like to point out:

    1. if you do add an input simulation feature please make sure it works properly with uGUI and the old GUI
    2. please also do simulation for mouse movement/clicks/drags.

    Those are small steps towards making a full-featured replay system that I'd very much like our game to have.


    I'm trying to create my own visual editor in Unity (an editor script), and one of the things I'd like to be able to do, is let the user use spacebar to pan, like in Photoshop. Currently, it's not possible because the Editor window is eating up the spacebar input, and it never reaches my scripts (at least, last I checked).

    Also stuff like be able to detect "hold alt + hold spacebar + left click" for zoom in, and "hold ctrl + hold spacebar + left click" for zoom out. Currently, spacebar can't act as if it's a modifier key, but that's what I need. In fact, it'd be great if any key can be queried if they are being held, just like in the Input class.

    (For those unfamiliar with writing editor scripts, you can't use Input, e.g. Input.GetKeyDown, because Input only works while the game is playing. For editors, you have to use Event.)

    It could be worthwhile to unify the input system for editors and during play mode, but I know that may become a mess; you'd have to deprecate a lot of things.


    I chatted with a guy who was disappointed that Unity can only detect up to 20 joystick buttons. It seemed he needed more, but I never bothered asking why. It must be some sort of special project. It could be worth looking at to not hardcode the amount of buttons detected.
    Last edited: May 26, 2014
  25. goat


    Aug 24, 2009
    I second this. For example, it would make it easier to convert human-controlled characters to NavMesh-Agent-controlled ones if you like.

    The approach I would like is to take the potentially most complex input system (iPad and other tablet gestures) and create GUI templates, with elements defaulted and assigned to a GUI element, and translate those (if used) to PC mouse movements or game controller axis movements - or vice versa: PC mouse/keyboard and game controllers clearly assigned and translated into equivalent virtual controllers on tablet and phone devices. Also the gyroscope/accelerometer of tablets translated somehow into traditional game controls (perhaps by the speed of consecutive control strikes in a certain direction?).

    The above paragraph highlights the major divide in controller styles: touch vs. sticks and buttons.

    The last thing to finish off the equation, and for portability, would be to take the above and create virtual tablet controls / PC keyboard equivalents for the big consoles - Nintendo, Xbox, and PS - as a small on-screen GUI consistent with the original hardware controller. Folks used to playing on a specific hardware device could simply choose to change the virtual controller used.

    Then if new hardware comes along in the future, you extend the model.

    It would also help testing, I think.
  26. Moonjump


    Apr 15, 2010
    A simple way to know what input options are available on the device (touchscreen, keyboard, mouse, joystick, etc.) and an event if that changes (a mouse plugged in for example).
  27. goat


    Aug 24, 2009
    Yes, those pictures describe better what I mean. And each major hardware input device having a default cross-mapping with the other hardware input devices, presented in a nice GUI template, makes sense.

    Of course you can reassign or extend the default functionality, and change it at runtime - so, contrary to what one might say, making a system noob-friendly isn't dumbing it down; it's showing your skills as engineers. Quite the opposite of dumbing it down.
  28. kburkhart84


    Apr 28, 2012
    I think the biggest thing in this thread is simple runtime access to the input settings - letting the player change them at runtime instead of depending on a crappy startup dialog. This would allow us to do it when we want: in the game's menu, in a menu during a "level" of the game, or really anywhere. I think the other suggestions could be useful too, but everybody posting would agree that what I'm saying here (and others have said the same) is the number 1 thing we need as far as input goes.
  29. goat


    Aug 24, 2009
    For example, you could have the controller type in the inventory, like health aids and such.
  30. steego


    Jul 15, 2010
    Do I need to say it again? Rebinding. I like the basic idea of how it is now, with setting up buttons and axis actions in the editor, assign default values, and then allow changes to this from script. The important bits however:

    • Don't limit the number of keys/axes that can be assigned to an action. Two is not enough.
    • Do allow the same key to be assigned to multiple actions. Some input devices have fewer buttons, so this might be necessary, but give us a way to check if an input is already bound for cases where this doesn't make sense (like binding the same key to move forward and back).
    • Make sure all inputs can be recognized for an action. I've come across games where my extra mouse buttons can't be bound to an action because it is looking for a keyboard key only, which is frustrating.
    • It would be nice to be able to have multiple input bindings you can switch between at runtime, like how for example Battlefield has separate controls for on foot, in vehicle, aircraft etc.
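    The switchable-binding-sets idea in the last bullet can be sketched as nested lookup tables; every name here ("OnFoot", "InVehicle", the `BindingContexts` class) is invented for illustration:

    Code (csharp):
    ```csharp
    using System.Collections.Generic;
    using UnityEngine;

    // Hypothetical switchable binding contexts: context -> action -> key list.
    public class BindingContexts {
        readonly Dictionary<string, Dictionary<string, List<KeyCode>>> contexts =
            new Dictionary<string, Dictionary<string, List<KeyCode>>>();
        string active = "OnFoot";

        public void Bind(string context, string action, KeyCode key) {
            if (!contexts.ContainsKey(context))
                contexts[context] = new Dictionary<string, List<KeyCode>>();
            if (!contexts[context].ContainsKey(action))
                contexts[context][action] = new List<KeyCode>();   // no fixed limit of two
            contexts[context][action].Add(key);
        }

        // Entering/leaving a vehicle swaps the whole control scheme at once.
        public void SwitchContext(string context) { active = context; }

        public bool GetAction(string action) {
            List<KeyCode> keys;
            if (!contexts[active].TryGetValue(action, out keys)) return false;
            foreach (var k in keys)
                if (Input.GetKey(k)) return true;
            return false;
        }
    }
    ```

    Because each context has its own table, the same key can safely mean different things on foot and in a vehicle, which is the Battlefield-style behavior described above.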
  31. hippocoder


    Digital Ape Moderator

    Apr 11, 2010

    This is pure brilliance. It works well for my needs but I have issues with unity:

    1. GetKey is fast only when a KeyCode is used. Passing a string is horribly slow for some reason, even though strings are immutable. GetAxis has no alternative to strings and is actually killing an entire millisecond. Speed this stuff up, it gets on my tits no end. A millisecond on Vita isn't funny.

    2. I'd like to be able to create as many input manager groups per device I'm deploying to. I'd like to be able to switch to a group or poll a specific group (ie for 2 players for example).

    3. I want Unity to recognise every well-known device. incontrol does this. It knows if a PS4 pad is plugged in. Right now on Windows, there's no way of knowing without using incontrol.

    4. Vibration. All the vibrations. Make those work with Unity Animation Curves.

    5. Mostly, I don't want to see a giant 5 mile long list. This is stupid. Create groups. Name groups. Poll groups.

    6. Optimise. Give us index lookups. GetButton(0, 4) << fast: group 0, button 4. If you're sure strings are fast, then just check GetKey - you'll soon find that, for some reason, they're not. At all.

    Speed and groups. Current Input Manager is perfectly fine, it's just *overwhelming* when used in cross platform situations - something Unity is known for, and it's simply not fast enough, doesn't support vibrations and doesn't support rebinding at runtime very well without some sort of weird wrapper code.
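    The string-lookup cost in point 1 can be partly worked around today by resolving strings to KeyCodes once at startup and polling by KeyCode afterwards. A small sketch (the class name and key choices are illustrative):

    Code (csharp):
    ```csharp
    using UnityEngine;

    // Resolve string names once in Awake, then poll by KeyCode every frame,
    // keeping string lookups out of the per-frame hot path.
    public class CachedInput : MonoBehaviour {
        KeyCode fire;
        KeyCode jump;

        void Awake() {
            // One-time parse; after this, no strings are touched in Update.
            fire = (KeyCode)System.Enum.Parse(typeof(KeyCode), "Mouse0");
            jump = (KeyCode)System.Enum.Parse(typeof(KeyCode), "Space");
        }

        void Update() {
            if (Input.GetKey(fire))     { /* shoot */ }
            if (Input.GetKeyDown(jump)) { /* jump */ }
        }
    }
    ```

    This doesn't help GetAxis, which only accepts strings, so it addresses the key half of the complaint but not the axis half.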
    Last edited: May 27, 2014
  32. chrisall76


    May 19, 2012
    I would love for it to work like this plugin!/content/14695

    It's able to detect when different controllers are plugged in and allows you to get input from each separate one.
    It also supports all common controllers. With the current one, figuring out multiple controllers takes a lot of work, which
    an input manager should deal with. We also need a way to change controls in-game.
  33. User340


    Feb 28, 2007
    I think you should offer an input system that's 100% bare bones. It provides direct access to all input (joysticks, keyboard, mouse, touches, accelerometer, etc) with no interpolations whatsoever. You could call this class RawInput or something like that. It would only contain calls like Input.GetKeyDown(), and no calls like GetAxis().

    Then we (or even you guys) could build higher level systems on top of it. You could build InputManager on top of it; we could build our own input manager on top of it.
  34. User340


    Feb 28, 2007
    Not sure I'm a fan of this idea. This would mean that the Input class is sometimes input and sometimes something else. Plus you can already do this kind of thing now, with interfaces. Instead of calling the Input class directly, call an interface instead for GetKey(), GetAxis(); like this:

    Code (csharp):
    // Replaces Unity's Input class
    interface IInputSource {
        bool GetKey(KeyCode key);
        float GetAxis(string name);
    }

    class UnityInput : IInputSource {
        public bool GetKey(KeyCode key) {
            return Input.GetKey(key);
        }
        public float GetAxis(string name) {
            return Input.GetAxis(name);
        }
    }
    Then in your calling code do this:
    Code (csharp):
    public class NewBehaviourScript : MonoBehaviour {
        IInputSource input;
        void Awake() {
            input = new UnityInput();
            //input = new TestingInput();
            //input = new AIInput();
        }
        void Update() {
            if (input.GetKey(KeyCode.Space)) {}
        }
    }
  35. Waz


    May 1, 2010
    Things that make the current system painful, from most to least (but all still painful):

    1. No bindings setters (i.e. can't make own Input dialog) - this is obviously the biggest - everyone just throws Input away and does it themselves.
    2. GUI focus navigation via input (this is more a question for the new GUI system) - needs to be able to work with joysticks.
    3. joystickNames is totally useless - joystickNames[0] doesn't correspond to JoystickButton0, etc., and therefore there is no way to tell if a particular joystick is really connected.
    4. Inconsistency between Input called from Update and GUI events in OnGUI. Some things, like joysticks (but also magic Android keys), never appear as GUI events.
    5. Down/Up coherence can be lost (sometimes you only get one): again more a GUI question, but TextField eats the Up but not the Down.
    6. No cross-platform joystick-based text input. eg. for SteamOS. Mostly a problem because it's very hard to roll your own given the opaqueness of the API.
    7. No KeyCode <=> text label mapping (needed when making own manager). Even better would be icons too.
    8. Inconsistency in dealing with autorepeat of keys vs. non-autorepeat of joysticks.
    9. Performance is suspect: bindings are always by string - not a very efficient model. I would prefer:
      Code (csharp):
      1. Binding fire;
      2. void Start() { fire = InputManager2.GetBinding("fire"); }
      3. void Update() { if (fire.Pressed) { ... } }
    10. Joysticks are just a pile of buttons and axes, with no logical names like "left stick". Not sure Unity can fix this, but at least we need a way to default to the standard XInput, DirectInput, and whatever Android/Linux usually has (I've hardcoded all of these).

    The above is based on what I've had to implement in my simple roguelike (WazHack) to do all the input handling that I believe is the basic level any game should have, and which certainly my customers demanded as the game grew to support all platforms.

    When that Unity fellow told us Joysticks were "Legacy" in the Steam Dev Days talk while in the same breath saying Unity supported SteamBoxes, I felt like screaming at my YouTube. When he then started talking about how "the Community" could solve all the brokenness of Unity, I finally lost my remaining faith in you. Please improve so we can care like we used to.

    P.S. As for whether we "want it all now": the above is implemented in under 2 KLOC, and most of it is repeating GUI and InputManager boilerplate. Give me your source code for a week and I'll send you a patch, if it's all too much work.
    Last edited: May 27, 2014
  36. npsf3000


    Sep 19, 2010
    In addition to all the above, give us control!

    The ability to:

    Create Input.
    Record ALL input. (And, by extension with the above, replay input!)
    Preview and Post-view Input (and cancel/consume it)

    Hippocoder mentioned it - it would be cool if we could map different keys to the same binding in different groups. Both Key A and Key B map to Up, but for player 1 and 2 respectively!
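    As a rough sketch of that per-player idea (all names here are hypothetical, not an existing Unity API), a binding group could be as small as:

    Code (csharp):
    1. using System.Collections.Generic;
    2. using UnityEngine;
    3.
    4. // Hypothetical sketch: one key map per player, all mapping to the same action names.
    5. public class BindingGroup {
    6.     private Dictionary<string, KeyCode> keys = new Dictionary<string, KeyCode>();
    7.     public void Bind(string action, KeyCode key) { keys[action] = key; }
    8.     public bool GetAction(string action) { return Input.GetKey(keys[action]); }
    9. }

    Player 1's group binds "Up" to KeyCode.A, player 2's binds "Up" to KeyCode.B, and gameplay code only ever asks its own group for "Up".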

    For me it's very important you do two things:

    1) Provide low level access; there should be *nothing* your input manager does that we can't do.
    2) Provide a comprehensive API that's easy to modify. We shouldn't have to reinvent the wheel or buy someone else's - that's what we are paying you for. Should it be needed, we should be able to tweak it to our needs.

    Think Testing.
    Think Replay Systems (obviously not going to work for every system).
    Think User-Defined Macros.

    An input class is an input class. The input does not have to come live from a physical controller to be useful input.
    Last edited: May 27, 2014
  37. angrypenguin


    Dec 29, 2011
    Edit for Unity: Oh yeah, who am I? Hobbyist indie game developer and also professional working with Unity in my day job (B2B simulation, training and visualisation) starting with 1.6 and roughly full-time since 2.4. I'm a very tools-and-workflow driven guy, and prefer to code things that extend my team's capability on top of the more immediate needs of shipping projects out the door.

    Assuming it has a decent API, how could it not be compatible? It'd be up to the coder to have their GUI make the correct calls with the correct data. The input system shouldn't give a hoot what kind of controller/GUI/network connection/psychic link the input originated from as long as it comes in via that API.

    Also, the "simulation" wouldn't be useful just for testing and Editor-side stuff. When writing touch-screen controls at the moment everything needs two code paths - one for getting input from the normal system plus one for the touch stuff. It'd be awesome if we could just write touch stuff that works via the input system - one path, less testing, more modular and reusable.

    My list, for now:
    - Runtime remapping and modification of the input data. Not just changing buttons, but also deadzones, sensitivity, and whatever else I might want to offer users. (Yeah, we can do all of that now by using raw data and doing it ourselves, but why reinvent wheels when we don't have to?)
    - Setting input values as well as getting them (i.e.: "simulation" as above).
    - Associating some metadata with an input. The use case here is loading icons and text for both the input's action ("jump") and the button ("X" on a 360 controller). Note that it might be best to do this indirectly (i.e.: let us store resource names instead of references to the resources themselves) so that we can integrate with localisation systems as necessary. At its simplest this could be a Dictionary<string, string> that exists for each input* so we can make calls like Input.GetMetaData("Jump").GetString("ActionIconResourceName");
    - Nicer methods to reference multiple joysticks/gamepads.
    - Better handling when importing packages that have InputManager settings. Seriously, when I was working on an Ouya port of my game I had to hand-update the Input list a bunch of times because every time they updated I had a choice of putting their updates in by hand (no thanks!) or having their updates completely overwrite my list. This also means that Asset Store vendors can't give us the option of simply adding a few extra inputs to our existing list, which would be really handy - right now they have to tell us to add stuff by hand.
    - Event-based input subscriptions. Polling for input makes sense a lot of the time and I certainly don't want to lose that. Other times subscribing to an event makes more sense, though. Subscribing to an event also gives us flexibilities like having a dormant object that gets activated upon receiving an input event without needing an intermediary object polling on its behalf. Not a big deal, but definitely a nice to have.
    - More events/properties for touches. There's a whole lot of stuff I've written into my own utility library like being able to check when a touch with a particular finger ID ended. It seems to me like your touch API is based around how the devices present the info rather than how we use the info. Built in stuff to look up a touch by finger ID would be neat. To put this in context, when we use input from a gamepad we don't have to first iterate through a collection of all gamepads to find a match (though I wouldn't mind being able to do that during my initialisation!).

    * Being able to extend the Input window so that we can improve the workflow based on what we're integrating with the Input system would rock somethin' hardcore, too. :)
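    To make the metadata idea above concrete, here's a minimal sketch (GetMetaData/GetString are the hypothetical names from the post, not a real Unity API):

    Code (csharp):
    1. using System.Collections.Generic;
    2.
    3. // Hypothetical sketch: per-input string metadata, storing resource *names*
    4. // rather than resource references so a localisation system can swap them later.
    5. public class InputMetaData {
    6.     private Dictionary<string, string> data = new Dictionary<string, string>();
    7.     public void SetString(string key, string value) { data[key] = value; }
    8.     public string GetString(string key) {
    9.         string value;
    10.         return data.TryGetValue(key, out value) ? value : null;
    11.     }
    12. }

    A call like Input.GetMetaData("Jump").GetString("ActionIconResourceName") would then just be a dictionary lookup.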
    Last edited: May 27, 2014
  38. Julien-Lynge


    Nov 5, 2010
    Like Dreamora, npsf3000, and others above, I would like the input to be 'mutable', meaning that we could create our own input, consume input, and change existing inputs.

    Beyond things like testing the GUI, here's a more concrete use case for an app:

    Say I'm creating an app that uses Coherent UI to overlay a JQuery-based UI on top of my app, along with NGUI (or uGUI or whatever) for embedded GUI elements in the scene. And say that I'm targeting non-traditional inputs, like multi-user touch screens (e.g. PQ Labs' stuff) or Leap or a holographic display (like zSpace).

    Step 1, I want to be able to convert the custom input to Input as Unity understands it.

    Step 2, I want to check the user input against Coherent, since that's on top. If the event is over a Coherent element (where the alpha of the pixel > 0), then I want Coherent to consume the event.

    Step 3, then I want to pass the input to NGUI / uGUI / whatever and let it do its thing. If it can use up the input (because the user is over an element), it does so.

    Step 4, finally any input that's left over I raycast out into the scene, to determine things like whether the user is hovering over any interactable objects in scene (and if so, I should highlight them).

    With this schema, an added benefit is that I could define events that mean something to me. For instance, when we started using NGUI, one of the first things I did was add OnMove and OnHold input, which NGUI was lacking, because I needed them for my particular UI. If you allow users the flexibility to change inputs, you guys don't have to try and guess everything that users may want to do.

    The way this would have to work (to use up / create inputs in a known order) would be to use the script execution order - as a developer in the example above, I'd make sure that my Coherent Update() method to handle input code runs first, then the NGUI / uGUI / whatever, then the code that raycasts into the scene.
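    That ordered consume-or-pass-on chain could look something like this sketch (IInputConsumer and InputRouter are made-up names, assuming a mouse/touch screen position as the input):

    Code (csharp):
    1. using System.Collections.Generic;
    2. using UnityEngine;
    3.
    4. // Hypothetical sketch: layers are checked in a fixed order; the first layer
    5. // that consumes the event (Coherent UI, then NGUI/uGUI, then the scene
    6. // raycast) stops propagation, so lower layers never see it.
    7. public interface IInputConsumer {
    8.     bool TryConsume(Vector2 screenPos); // true = event consumed
    9. }
    10. public class InputRouter : MonoBehaviour {
    11.     public List<IInputConsumer> layers = new List<IInputConsumer>();
    12.     void Update() {
    13.         if (!Input.GetMouseButtonDown(0)) return;
    14.         foreach (var layer in layers) {
    15.             if (layer.TryConsume(Input.mousePosition)) break;
    16.         }
    17.     }
    18. }

    Script execution order (or the order of the layers list) then becomes the one place where input priority is defined.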
  39. deram_scholzara


    Aug 26, 2005
    Yeah, that about sums up everything I've been hoping for.
  40. hippocoder


    Digital Ape Moderator

    Apr 11, 2010
    You guys should really check out InControl. It's everything Unity wants to be.
    PrimalCoder likes this.
  41. minionnz


    Jan 29, 2013
    InControl looks amazing!
  42. TylerPerry


    May 29, 2011
    I haven't used it recently, but in the past InControl didn't have an easy way of adding touch controls, so it required an extra layer on top.
  43. AnomalusUndrdog


    Jul 3, 2009
    I was referring to the fact that there's Input (meant for player controls), and there's Event (meant for GUI only). The two are separate systems afaik. What I said is please make sure it works with that as well i.e. OnGUI() (plus uGUI of course).

    And hey, I didn't say my requests are the only thing that's important.


    Anyway, I remember what devs did when testing the Kinect: it was a hassle to iterate on the Kinect controls for their game because they had to wave their hands at the Kinect, then dart back to their monitors to make sure things were working fine, so they built a tool that let them record input from the Kinect and then use that recorded stream when testing.

    I think it'll be useful for keyboard/mouse input too (e.g. integration tests concerning input).

    tl;dr: record your input to be used for playback in integration tests concerning input.

    This will be especially useful for stuff like Oculus and Kinect, but should be equally useful for touch-screen input and traditional keyboard/mouse input.
  44. angrypenguin


    Dec 29, 2011
    Ahh, yes, makes sense.

    If there's an Event system, please, for the love of God make it generic and accessible! Everyone and their dog on the Asset Store has their own version of passing events around. If Unity's getting in on that I want them to clear it up, not add to it!

    :confused: Not sure where that came from?
  45. AnomalusUndrdog


    Jul 3, 2009
    Haha, nevermind. I thought you meant something else.
  46. Cygon4


    Sep 17, 2012
    I've designed a multitude of input systems over the ages (all the way from DOS to window messages + Direct Input to XINPUT for the Xbox and later touch input for mobile devices) in .NET and C++. So I have a little bit of expertise in this area.

    What I would consider important would be:

    • Input binding as a separate layer on top of raw input device access. Developers who want to do their own binding system should be able to do so.
    • Hide input device unavailability and switching. Developers should be able to query whether an input device is attached, but retrieving input from an unattached input device should simply return neutral inputs (buttons up, axes centered).
    • Give us a base profile across all platforms. What worked for me was to always have at least 1 keyboard, 1 mouse, 4 game pads and 1 touch panel in all cases. If you query mouse input on a mobile device, you talk to the dummy mouse, which returns neutral inputs. Memory use + overhead of nearly zero. Performance freaks can check whether e.g. a mouse is attached, but the important thing is that all platforms support this base profile.
    • Simulation of inputs, in order to allow for replays, integration testing and mockups. I usually do this via a MockInputManager that can be used instead of the InputManager, but Unity's unfortunate limitations in this area may mean having to add input simulation right into the real input devices.

    After tweaking my design for years I've settled on the model below. It allows simple code like:

    Code (csharp):
    1. bool buttonDown0 = InputManager.GamePads[0].IsButtonDown(0);
    2. bool escapePressed = InputManager.Keyboards[0].WasKeyPressed(Key.Escape);

    On top of the input device model, there should be an input mapper that allows bindings to be managed by code. The current input manager, which has actions that the user can only change in Unity's settings dialog, is less than ideal...
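    A code-managed mapper along those lines might look like this sketch (ActionBinding and InputMapper are hypothetical names, not the poster's actual implementation):

    Code (csharp):
    1. using System.Collections.Generic;
    2. using UnityEngine;
    3.
    4. // Hypothetical sketch: bindings are plain objects that code can rebind at
    5. // run time, instead of entries locked into the editor-only settings dialog.
    6. public class ActionBinding {
    7.     public KeyCode Key;
    8.     public bool IsDown { get { return Input.GetKey(Key); } }
    9. }
    10. public static class InputMapper {
    11.     static Dictionary<string, ActionBinding> actions = new Dictionary<string, ActionBinding>();
    12.     public static ActionBinding Get(string name) {
    13.         ActionBinding binding;
    14.         if (!actions.TryGetValue(name, out binding)) {
    15.             binding = new ActionBinding();
    16.             actions[name] = binding;
    17.         }
    18.         return binding;
    19.     }
    20.     public static void Rebind(string name, KeyCode key) { Get(name).Key = key; }
    21. }

    A settings screen then just calls InputMapper.Rebind("fire", newKey), and gameplay code never changes.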

    I'm currently running my Unity game with this design (built on top of Unity's InputManager):


    If anyone is interested, this the exact input system described at the top implemented for .NET / XNA:

    This is the same in C++11:

    I have a full implementation of the InputMapper I describe above for Unity, too, and was considering putting this on the asset store, but if there's interest, I could maybe just make the code available somewhere.
    Last edited: Oct 26, 2014
    shkar-noori likes this.
  47. Agostino


    Nov 18, 2013
    So, do you need to make this check every frame?

    Can you also subscribe to events?
    I see you have <<event>> and EventHandler.
    Could you please elaborate more on what they are?

    Yes, please.
    Last edited: May 27, 2014
  48. bdovaz


    Dec 10, 2011

    And also, PLEASE implement Windows 8 (and 7 if you can) multitouch Input in Unity!!

    Take a look at this:

    Unity is cross-platform, so why not the Input class?
  49. goldbug


    Oct 12, 2011
    My wish list:

    1) Event based. Why am I wasting CPU polling every frame for key presses? I would love to be able to say "when key 'x' is pressed, call this function". For example:

    Code (csharp):
    1. [OnKeyUp("Menu")]
    2. public void MenuPressed()
    3. {
    4. ...
    5. }
    It can be annotations, or let me register my function with the InputManager, or whatever. Just make it event based.
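    The attribute above could equally be a plain registration call; a hypothetical event-based shape (none of these names exist in Unity today) might be:

    Code (csharp):
    1. using System;
    2. using System.Collections.Generic;
    3.
    4. // Hypothetical sketch: callers register a callback per action name and the
    5. // input system invokes it on key-up, so nobody polls every frame.
    6. public static class InputEvents {
    7.     static Dictionary<string, Action> keyUpHandlers = new Dictionary<string, Action>();
    8.     public static void OnKeyUp(string action, Action handler) {
    9.         keyUpHandlers[action] = handler;
    10.     }
    11.     // Called internally by the input system when the bound key is released.
    12.     public static void RaiseKeyUp(string action) {
    13.         Action handler;
    14.         if (keyUpHandlers.TryGetValue(action, out handler)) handler();
    15.     }
    16. }
    17. // Usage: InputEvents.OnKeyUp("Menu", MenuPressed);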

    2) Allow me to make my own input drivers. Say I want to integrate with controller X, but it is not hid, let me make my own driver that generates input events.

    3) Allow me to detect input devices. Is there a joystick? Is there a mouse available? Is there a touchscreen? Ideally, raise events when these devices are plugged in or unplugged.

    4) Make the darned mouse emulation optional. Right now it is not easy to treat mouse events and touch screen events differently. Mouse emulation is always on, and there is no way to tell if you actually have a mouse or only a touchscreen.

    5) Allow me to change input mapping at run time. I would love to have a settings page where the user can decide what keys to use for different actions.
  50. thelackey3326


    Sep 17, 2010
    Hope you guys are checking out the Feedback.

    Linked above is my suggestion in Unity Feedback from a few years ago when the company I worked for switched from Delta3D (an open-source engine) to Unity3D. I really liked Delta3D's idea of a completely generic input device. We were able to integrate pretty much anything as an input "device" with Delta3D, because an Input Device is just a container of buttons (booleans) and axes (floats). You add the buttons and axes you need, and then update the values from your input source in code. Because it's so generic, anything can be an input source: 3D mouse, serial port, IR remote control, generic USB device, chunk of network code, an AI, or even a combination of those, etc. Out-of-the-box support for keyboard, mouse, gamepad, touch, accelerometer, gyro, or whatever could be included with Unity to meet basic needs.

    For ease of use, an input device could be created in the editor and serialized as an asset in your project. But, they could also be created and changed in code.

    I've also worked in systems that treated everything as an axis. That way you could have pressure sensitive buttons, or give them a threshold and treat them as two-state.
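    A minimal sketch of that generic device (names invented here, not Delta3D's actual API):

    Code (csharp):
    1. using System.Collections.Generic;
    2.
    3. // Hypothetical sketch: a device is just named booleans and floats; any
    4. // source (serial port, network code, AI, gyro) writes into it each frame.
    5. public class GenericInputDevice {
    6.     public Dictionary<string, bool> Buttons = new Dictionary<string, bool>();
    7.     public Dictionary<string, float> Axes = new Dictionary<string, float>();
    8. }
    9. // An AI and a network packet can feed the same device the gameplay code reads:
    10. // device.Buttons["fire"] = packet.fireHeld;
    11. // device.Axes["steer"] = ai.ComputeSteering();
    12. // And everything-as-an-axis just needs a threshold to act two-state:
    13. // bool pressed = device.Axes["trigger"] > 0.5f;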

    Last thing. Would it be possible to get rid of string-addressed inputs?