Rewired - Advanced Input for Unity

Discussion in 'Assets and Asset Store' started by guavaman, Sep 25, 2014.

  1. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    What mouse is being used by the user?
     
  2. f0ff886f

    f0ff886f

    Joined:
    Nov 1, 2015
    Posts:
    201
    A Logitech Performance MX.
     
  3. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    Thanks. I'll fix this cosmetic issue in the next update.
     
  4. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    Thanks. I don't think there would be anything special about this mouse.

    All mice are going to have different DPI values. The values provided by Raw Input are extremely simple X and Y delta values. These are the lowest-level values you can get from the mouse on Windows and do not include any of Windows' pointer ballistics. If your users have extremely high-DPI mice but have the Windows sensitivity set so the cursor moves slowly on the screen to counteract the resolution of the mouse, this sensitivity is not going to be passed along to Raw Input and the mouse will be incredibly sensitive.

    You just need a lower minimum multiplier value.
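    For illustration, something along these lines (a minimal sketch, not part of Rewired's API; the clamp range is an arbitrary example):

    Code (csharp):
    // Scale the raw mouse delta by a user-configurable multiplier so
    // high-DPI mice can be turned way down.
    float sensitivity = 1f; // set from an options slider

    Vector2 ScaleMouseDelta(Vector2 rawDelta) {
        return rawDelta * Mathf.Clamp(sensitivity, 0.01f, 10f);
    }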
     
    f0ff886f likes this.
  5. Djaydino

    Djaydino

    Joined:
    Aug 19, 2012
    Posts:
    48
    Hi.
    I am using Playmaker Actions to get the Keycodes.
    But for the axis I can get the positive but not the negative value (or at least I don't know how :) )

    [screenshots of the PlayMaker action setup]
    Is there another action to get the negative id?

    Kind regards,
    djaydino
     
  6. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    There is no negative id. And setting "Element Type" to Axis for a keyboard key is not correct. You are thinking that means the Action type but it doesn't. It means the physical controller element type that you want to search for. And it does nothing anyway unless you enable "By Element Type."

    Element Maps can only bind to one element at a time. By definition, they bind one Action to one Element. If you're binding a keyboard key to an Action, that will by definition be a split binding, because a button can only bind to one pole of an Action: Key to Positive Axis or Key to Negative Axis.

    In order to make key bindings to two poles of an Axis, you have to create two bindings in the editor. You can't do this in a single binding. Therefore, you also cannot get both bindings by calling "Get First Element Map Id With Action". This will only ever get the first binding to an Action. If you have two and you call it twice, you'll get the first binding twice.

    This "Get First Element Map Id With Action" PlayMaker Action is a wrapper for the same GetFirstElementMapWithAction in the C# API. This function allows you to find the first Element Map that matches certain criteria. This does not include matching on the pole of the Action to which it is bound. In the C# API, there are many other functions to use to find Element Maps, including the ability to iterate over all the Element Maps. This is not available in PlayMaker.

    The PlayMaker Actions do not include most of the Controller Map system due to the sheer size and complexity of those systems and limitations of PlayMaker's ability to work with and store objects, structs, and arrays. PlayMaker is not intended to be a complete replacement for code in Rewired. You cannot, for example, build a controller remapping system in PlayMaker.

    There is no way you would be able to do this without making changes to the PlayMaker Actions to get multiple bindings to the same Action.
     
    Last edited: Apr 22, 2020
  7. Djaydino

    Djaydino

    Joined:
    Aug 19, 2012
    Posts:
    48
    Hi
    Actually it still gives the positive id even when not enabled.

    If enabled, it has to be set as Button when the controller type is set to Keyboard.

    After some testing I noticed that the ID (unique id) is always in the same sequence as the map's Elements,
    so I used that to get the keys.
    For example, when element 0 (up / axis positive) is id 2121, element 1 will be 2122.
    So I set 'down' (axis negative) on element 1 and then I know that its id will be 2121+1.

    I know this is probably not the best way to do this, but it works and I need to get this done ASAP :)

    Before, this might have been very hard to do with PlayMaker, but I do think it is possible now.

    At the moment I do not have time to try this, but I will definitely try to build a remapping system later.
    (by making custom actions, that is)
     
  8. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    It is being totally ignored when not enabled. It is not factoring into the selection criteria so everything else you have that is enabled is being used, but not that option.

    "if enabled it has to be set as button when controller type is set to keyboard." Keyboards do not have axes. There is no reason to enable it in this case.

    No, that is a very bad way to do it because there is no guarantee your mappings will be in that order, especially if saved data is loaded or the user changes the bindings somehow.

    Replace the Rewired/Integration/PlayMaker folder with the contents of this zip. You will now have the By Axis Contribution selection criteria option. This is the best that can be done with the "GetFirst" type of method.
     


    Last edited: Apr 22, 2020
  9. Djaydino

    Djaydino

    Joined:
    Aug 19, 2012
    Posts:
    48
    Hi.
    Thanks for the quick reply and action update!
    I will test right away.
    Thanks a lot!
    I will let you know if it worked
     
  10. Undertaker-Infinity

    Undertaker-Infinity

    Joined:
    May 2, 2014
    Posts:
    112
    Hi
    A few years have gone by and I'm still using Rewired because it's awesome and your support is amazing.

    I set up a touchpad with its custom controller and it's working well. I need to rotate the touchpad during runtime, but when doing so, the input is still aligned to the screen, not the RectTransform. Is there a way to have the behavior of the touchpad align to the RectTransform? I'd love for it to be usable WHILE rotating.
    If this is not possible, I thought of just swapping and/or inverting the touchpad's axes according to the final position, which is 90deg to either side, but I couldn't find how to do that by code in the docs.

    Basically I'm doing portrait/landscape mode manually, and the touchpad refuses to acknowledge its rotation.
    Thanks!
     
  11. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    Thanks! I'm glad you like it and are still using it.

    The Touch Pad control does not consider any Transform values. The value is simply a vector drawn from the starting touch point to the current touch point on the screen. You also cannot swap X and Y, but you could do that in code after the fact. You could also transform the vector it returns in code after the fact to make it rotate with the Transform. You won't be able to do what you are trying to do directly with the Touch Pad component.
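    For example, a minimal sketch of transforming the value after the fact (plain Unity math, not a Rewired API):

    Code (csharp):
    // Rotate the Touch Pad's output vector by the control's Z angle so the
    // input follows the RectTransform's orientation.
    Vector2 RotateInput(Vector2 value, float degrees) {
        float rad = degrees * Mathf.Deg2Rad;
        float cos = Mathf.Cos(rad);
        float sin = Mathf.Sin(rad);
        return new Vector2(value.x * cos - value.y * sin, value.x * sin + value.y * cos);
    }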
     
  12. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    Actually, you could swap the axes by swapping the Custom Controller Elements they're assigned to:
    https://guavaman.com/projects/rewir...Pad_horizontalAxisCustomControllerElement.htm
    https://guavaman.com/projects/rewir...chPad_verticalAxisCustomControllerElement.htm

    You can also invert the axes through Axis Calibration:
    https://guavaman.com/projects/rewir...ntrols_TouchPad_horizontalAxisCalibration.htm
    https://guavaman.com/projects/rewir...Controls_TouchPad_verticalAxisCalibration.htm
     
  13. Undertaker-Infinity

    Undertaker-Infinity

    Joined:
    May 2, 2014
    Posts:
    112
    Ah! Those are exactly what I needed, but they are read only :(
    I'll have to roll out my own touchpad with calls to the custom controller, right?

    edit:
    I can write to touchPad.horizontalAxisCalibration.invert, so that's good.
    Rewired.ComponentControls.TouchPad.horizontalAxisCustomControllerElement and horizontalAxisCustomControllerElement.target are both read only, so I can't replace the element or change the target.
     
  14. Djaydino

    Djaydino

    Joined:
    Aug 19, 2012
    Posts:
    48
    Hi.
    It works great!
    The only thing that might be useful is to have a 'not found' or 'failed' event (when the result is -1)
    on the "Rewired Player Get First Element Map Id With Action".
    For now I used an int compare, as I have some mouse actions as well, so if it's -1 I check the Mouse controller type.

    Thanks a lot!

    I will send a Steam key of our game as soon as it's released (probably mid-May).
    That's the least I can do for your support :)
     
    guavaman likes this.
  15. MatthewDickson

    MatthewDickson

    Joined:
    Jan 8, 2019
    Posts:
    2
    Hi all,
    Quick question: how do I set the default Input Behavior mouse axis sensitivity via code?
     
  16. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    They are class objects. You don't replace them, you modify properties on them. Start typing in your IDE and you will see properties appear that you can change. Nothing in any component is ever modifiable only from the inspector. There is always a code interface to change everything also.

    Code (csharp):
    touchPad.horizontalAxisCustomControllerElement.target.element.selectorType = ComponentControls.Data.CustomControllerElementSelector.SelectorType.Id;
    touchPad.horizontalAxisCustomControllerElement.target.element.elementId = 3;

    https://guavaman.com/projects/rewir..._CustomControllerElementTargetSetForFloat.htm
    https://guavaman.com/projects/rewir...ntrols_Data_CustomControllerElementTarget.htm
    https://guavaman.com/projects/rewir...rols_Data_CustomControllerElementSelector.htm
     
    Last edited: Apr 23, 2020
  17. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    MatthewDickson likes this.
  18. Undertaker-Infinity

    Undertaker-Infinity

    Joined:
    May 2, 2014
    Posts:
    112
    Awesome. For some reason, the first time I looked at the docs I couldn't click or find the CustomControllerElementTargetSetForFloat class and VS wouldn't show me properties either. Now everything works and I have my rotating touchpad.
    Thanks!
     
  19. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    The reason the web docs didn't show it was that the XML for those methods was excluded from my exported documentation for some reason. I fixed that on the web, but that wouldn't affect the IDE.

    Glad it works.
     
  20. Darkon_Who

    Darkon_Who

    Joined:
    Jul 22, 2017
    Posts:
    14
    Hello @guavaman. Firstly, thank you for the amazing asset, it is working really well so far!

    A few things I'd like to ask about, if you have a moment!

    What is the best way you recommend to handle sensitivity settings for multiple input types (touch, mouse, controller)? I need to support all 3 but would prefer having a single sensitivity option.

    Also, one last thing: I need the right half of the screen to be a touchpad which is used for aiming, but I also need a shoot button to be there. The shoot button should be static, but aiming should still work if the user moves their finger across the screen. Any tips around this, please (PUBG Mobile is a good example)?

    Thank you again!
     
  21. vladimir-fs

    vladimir-fs

    Joined:
    Nov 28, 2019
    Posts:
    23
    Hi,

    After solving the Start button issue I had (or avoiding it, rather) I poked around Android controls for a bit. I'm still using an Android 10 device and a standard XBox One controller. We're using Rewired 1.1.19.9.U2018.

    As a preface, I'm running the code from https://guavaman.com/projects/rewir...l#debug-information-diagnosing-input-problems to check what is assigned and mapped on Rewired's side, and checking GetKeyDown for all codes in Enum.GetValues(typeof(KeyCode)) to see what is being triggered on Unity's side.
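    Roughly, the Unity-side check looks like this:

    Code (csharp):
    // Log every key code Unity reports as pressed this frame.
    foreach (KeyCode code in System.Enum.GetValues(typeof(KeyCode))) {
        if (UnityEngine.Input.GetKeyDown(code)) Debug.Log("Key down: " + code);
    }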

    Regarding the Start button.
    It's reading as the following keycodes: Return, JoystickButton10 and Joystick1Button10. Rewired is returning what's mapped to the A button, probably because of the Return keycode from Unity. Now that I know what's going on, I can detect this specific combo of Android + Return + Start button and handle the input as if the only button pressed is Button10. Do you agree with that approach?

    I'm having another problem on the same setup.
    PlayerID 0 is assigned 1 joystick (a DS4).
    PlayerID 1 is assigned 0 joysticks.
    Pressing R1 (happens only on R1) on Player ID 0's DS4 results in both of these lines returning true:

    ReInput.players.GetPlayer(0).GetAnyButtonDown()
    ReInput.players.GetPlayer(1).GetAnyButtonDown()

    Unity reports Keycodes JoystickButton5 and Joystick1Button5 being pressed. Any idea what might be causing this?
     
  22. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    I don't have any recommendations except to tell you not to try to lump all control types into one sensitivity slider. That's not going to work with the wide variety of devices out there. Consider this for example:
    https://forum.unity.com/threads/rewired-advanced-input-for-unity.270693/page-124#post-5745322
    https://forum.unity.com/threads/rewired-advanced-input-for-unity.270693/page-125#post-5745877

    Input Behaviors can be used for mouse, joysticks, and custom controllers, but only support linear sensitivity and apply to all axes on all devices for that Action. Joystick Calibration is far more versatile for joysticks and can be applied independently to individual axes.

    If you're saying you want to be able to press the shoot button and then start sliding your finger around the screen while it's held down and have that start aiming while you're still shooting, you can't do that with the Touch Controls included in Rewired. If you want that, you have to build your own Touch Controls.

    Rewired's Touch Controls are built on Unity UI. Unity UI works by raycasting and sending events to the top-most element in the stack. With a button on top of a touch area, the button will capture the raycast, and the touch pad underneath will have no knowledge of the touch until the finger moves outside the raycast-blocking box. Building something like you're describing requires either a specialized control -- a button that also tracks finger position -- or else changing Unity UI's system at a low level to use Raycast All instead of Raycast, but that would most likely lead to a lot of other unwanted issues.
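    A rough sketch of that specialized-control idea, using plain Unity UI event interfaces (not something included in Rewired):

    Code (csharp):
    using UnityEngine;
    using UnityEngine.EventSystems;

    // A button that also tracks finger movement while it is held down.
    public class ShootAimButton : MonoBehaviour, IPointerDownHandler, IDragHandler, IPointerUpHandler {
        public bool isShooting { get; private set; }
        public Vector2 aimDelta { get; private set; }
        private Vector2 lastPosition;

        public void OnPointerDown(PointerEventData eventData) {
            isShooting = true;
            lastPosition = eventData.position;
        }

        public void OnDrag(PointerEventData eventData) {
            aimDelta = eventData.position - lastPosition; // feed this to your aiming code
            lastPosition = eventData.position;
        }

        public void OnPointerUp(PointerEventData eventData) {
            isShooting = false;
            aimDelta = Vector2.zero;
        }
    }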
     
    Last edited: Apr 24, 2020
  23. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    That's from 9/2018. I can't support that beyond general Rewired usage. Any specific issues you have with it, it's far too old for me to deal with. You really should consider updating.

    Have you seen this?

    https://guavaman.com/projects/rewired/docs/Troubleshooting.html#android-joystick-keyboard-keys

    No, I have no idea what's happening. The only way player.GetAnyButtonDown() will return true is if the controller on which the button is pressed 1) is assigned to that Player and 2) has an enabled Action mapped to the button you're pressing in that Player's Controller Maps. There's no possibility of cross-over of devices in Rewired's code.

    I suggest you use the Debug Information prefab mentioned in that same link you sent and expand and watch the data both the Joysticks are returning and the Actions returned by each Player as you press that button. If a button or action is activating on Player 1 when you press Player 0's joystick and you verify that joystick is not assigned to the Player, then there is no other possibility but that something is happening at the Unity level.

    Use the Rewired/DevTools/JoystickElementIdentifier to watch what happens on the joysticks. You will have to plug in a keyboard in order to be able to cycle through the connected joysticks to see what's being returned by UnityEngine.Input for both.

    Additionally, it could also be a key code being returned by Unity. Are you certain you're logging all possible key codes?
     
  24. IceBeamGames

    IceBeamGames

    Joined:
    Feb 9, 2014
    Posts:
    170
    Hey @guavaman

    I was wondering what the best way is to handle controller input for a couch co-op style four-player game. Specifically, how do I handle menu control / input before I have established how many players are playing and which controllers they are using? Is it best to assign all controllers to the "system" and only have menu control maps on the System Player? Or is it better to try to use Player 1 for all the menus? I couldn't see this specified in the "how to" guides you have.

    Tom.
     
  25. minad2017

    minad2017

    Joined:
    Dec 1, 2016
    Posts:
    50
    hi,
    In the game tutorial scene,
    I would like to display a message saying "Please press the 'E' button to open the door."
    Is there a way to get the name of the assigned key, the name of a gamepad button, etc. from the action name?
    I made a terse script based on the documentation, but nothing was stored in the string variable and there was no error.
    Code (CSharp):
    Text _text;
    Player player = Rewired.ReInput.players.GetPlayer(0);

    void Start()
    {
        _text = GetComponent<Text>();
        bool skipDisabledMaps = true;
        _text.text = player.controllers.maps.GetFirstElementMapWithAction("Choice", skipDisabledMaps).elementIdentifierName;
    }
     
  26. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    There is no "best" way. This question really applies to any multiplayer game made with any input system, not just Rewired, so it's not a question I can give you a definitive answer to. Every multiplayer game is going to have to make decisions on how to handle controllers. It also would depend on the target platforms; for example, do you have keyboard/mouse available or not?

    There are many ways to approach it, all with their own advantages and disadvantages. And a lot of it simply depends on how you want it to work. For example, should your game's UI's be controllable by any player or just one player? That's up to you and affects how you would approach it.

    I happen to like joystick auto-assignment, where it assigns controllers to all Players from those found on the system. Then the users can change their assignments from some menu if they don't like their current assignments. Many people seem to like to dynamically assign joysticks based on user interaction (press-start-to-join). This really is up to personal choice.

    Most likely, if you're planning on publishing on a console, you don't want to make your UI only controllable by the System Player because once you assign the controllers to Players, nobody will be able to control the UI anymore. You will have to allow at least one Player to control the UI if not all of them. That's up to you. If you want to initially assign all the controllers to the System Player before any assignments have been made, you can certainly do that and that is one of the suggestions for how to handle a press-start-to-join system. If you're using joystick auto-assignment, I do not see the need to do this as you won't get any benefit out of it.
     
  27. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    It's in the documentation:
    https://guavaman.com/projects/rewired/docs/HowTos.html#get-element-name-for-action
    https://guavaman.com/projects/rewired/docs/HowTos.html#display-glyph-for-action

    ActionElementMap.elementIdentifierName definitely does contain the string name of the element identifier. The only way it would be blank is if you have an empty ActionElementMap in your Controller Map that isn't assigned to a controller element. These are filtered out and not created when Controller Maps are made from the Rewired Input Manager, but you can certainly create one of these yourself if you're doing some kind of map manipulation through scripting. Use Debug Information to find your issue:
    https://guavaman.com/projects/rewired/docs/Troubleshooting.html#debug-information
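    To narrow it down in a script like the one above, guard against a missing binding (a debugging sketch; it assumes the method returns null when no enabled binding matches):

    Code (csharp):
    ActionElementMap aem = player.controllers.maps.GetFirstElementMapWithAction("Choice", true);
    if (aem != null) {
        _text.text = aem.elementIdentifierName;
    } else {
        Debug.LogWarning("No enabled binding found for the Choice Action.");
    }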
     
    Last edited: Apr 25, 2020
  28. CurryKitten

    CurryKitten

    Joined:
    Dec 22, 2016
    Posts:
    36
    Hi there,

    I was asked by a user about the deadzone - right now it's taking the default deadzone, which for an unknown controller looks to be 0.1. He was noticing the deadzone and wanted to turn it off, which seems reasonable given that the controller is very accurate (unlike a 360 joystick, which I notice can drift a bit even with a deadzone of 0.25).

    My first thought here was to have a deadzone on/off option and then set the deadzone value to 0 on the particular axis of the calibration map. But then I thought: is there a way to reset this to the default? I couldn't see an answer to that.

    Possibly more sensible is to present a deadzone slider from 0 to something. Is there a reasonable figure for what that something should be? Presumably this is a 0-1 value, in which case 0.25 already feels pretty big and still drifts.
     
  29. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    It's better to just allow the user to set the dead zone to whatever they want, though you do need to prevent them from going to 1.0 or their joystick will stop functioning. If this is a console, they'd have no way to recover.

    You can't reset just the deadzone on the AxisCalibration but you can reset all values:
    https://guavaman.com/projects/rewired/docs/api-reference/html/M_Rewired_AxisCalibration_Reset.htm
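    For example (a sketch; 'joystick', 'axisIndex' and 'userValue' are placeholders, and you should check the API reference for the exact CalibrationMap accessors):

    Code (csharp):
    // Let the user set the dead zone directly, clamped below 1.0 so the
    // axis can never be disabled entirely.
    AxisCalibration calibration = joystick.calibrationMap.GetAxis(axisIndex);
    calibration.deadZone = Mathf.Clamp(userValue, 0f, 0.9f);

    // Or restore all of the axis's calibration values to defaults:
    calibration.Reset();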
     
  30. IceBeamGames

    IceBeamGames

    Joined:
    Feb 9, 2014
    Posts:
    170
    Ok, I want to assign controllers using "press start to join" once through the splash screen and into the game's main menu (I've already implemented a press-button-to-join system based on your examples). I want the press-to-start to be somewhat agnostic of the menu flow, similar to how it's done in Overcooked 2, so players can sign in at any time. It's going to be on Xbox, PS4, Switch and PC, and I'd like mouse and keyboard to be one of the controller options (on Steam). I want the game's menu to be usable by all the controllers even if they haven't been assigned to players via the "press to start" way of establishing them.

    How should I go about doing this? Should I assign all controllers to the system as soon as they are connected and then hand them over to the player when they join? Can the System Player have more than one controller assigned?
     
  31. f1chris

    f1chris

    Joined:
    Sep 21, 2013
    Posts:
    335
    Hi @guavaman

    I had a strange issue in my game project. Rewired has been integrated for a while, but recently the left axis stopped reading any values (I even tried to Debug.Log at the source).

    After hours trying to find out what I screwed up, I decided to create a fresh project with only Rewired imported (latest version) and created an iOS build of your Gamepad UI example. To my surprise, same thing: no input for "move horizontal", but "move vertical" still working fine.

    I'm on OSX (latest version) and Unity 2019.3 (latest version), tested with Nimbus & Horipad MFi controllers, but I'm also getting the same results with a DualShock 4.

    This is happening on an iOS build only (didn't try Android); when running on desktop all is working fine.

    Any idea? I just have no clue where to look.

    Thanks
    Chris
     
  32. Grumpy-Dot

    Grumpy-Dot

    Joined:
    Feb 29, 2016
    Posts:
    93
    @guavaman Hello, sorry if this was asked before. I'm wondering if you have any plans to add support for the Thrustmaster TMX Racing Wheel. If not, I would really appreciate any pointers before I dive in and implement it as a custom controller. Thanks!
     
  33. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    That is one of the exact scenarios described in the documentation:
    https://guavaman.com/projects/rewired/docs/HowTos.html#press-start-to-join-detect-start-manually

    1. (Optional) disable Joystick auto-assignment in the Rewired Editor.
    2. Assign all Joysticks to the System Player on start, and whenever one is connected, so the System Player can detect Actions from unassigned controllers.
    3. Check for the value of the "Start" Action you've created and assigned to the Start button or other appropriate button in your Joystick Maps.
    4. Get a list of contributing input sources for the "Start" Action and get the target Joystick from InputActionSourceData.controller.
    5. Assign that Joystick to Player 1 and deassign it from System.
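    A rough sketch of steps 3-5 (assuming a "Start" Action exists in your setup):

    Code (csharp):
    Player system = ReInput.players.GetSystemPlayer();
    if (system.GetButtonDown("Start")) {
        // Find the Joystick that contributed to the Action this frame.
        foreach (InputActionSourceData source in system.GetCurrentInputSources("Start")) {
            if (source.controller.type != ControllerType.Joystick) continue;
            // Assign it to Player 0 and remove it from the System Player.
            ReInput.players.GetPlayer(0).controllers.AddController(source.controller, true);
            break;
        }
    }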
    There are no limitations on controller assignment in Rewired. You can have all controllers on the system assigned to all Players at the same time if you want.

    You would have to set up Unity UI to be controllable by the System Player and at the very least Player 0, but probably all of them.

    You also will have to manage what happens when Players quit the game and re-assign their controllers back to the System Player.
     
  34. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    Yes.

    https://guavaman.com/projects/rewir...#ios-tvos-13-gamepad-left-stick-x-not-working
     
    f1chris likes this.
  35. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    I don't have any plans.

    I would not implement this as a Custom Controller. Those are for a different purpose. This should be added as a new controller definition:

    https://guavaman.com/projects/rewired/docs/HowTos.html#new-controller-definitions
    https://guavaman.com/projects/rewir...dd-controller-to-existing-controller-template
     
    Last edited: Apr 26, 2020
    Grumpy-Dot likes this.
  36. vladimir-fs

    vladimir-fs

    Joined:
    Nov 28, 2019
    Posts:
    23
    Yeah, we're super close to a release now but will update ASAP.

    I have not! I'll give this a spin.

    I'm pretty certain one player is assigned per controller. I'll check again though.

    I'll try the JoystickElementIdentifier and see what comes up.

    I'm pretty sure all key codes are in Enum.GetValues(typeof(KeyCode)) and that's what I'm using to run through all inputs.


    Thanks for the quick reply. It's great info!
     
  37. danbg

    danbg

    Joined:
    May 1, 2017
    Posts:
    64
    Multidisplay issues with UI

    Hello guavaman, I'm trying to use 4 displays, each one with its own UI and its own cursor controlled with a gamepad, and using the mouse as a "master cursor" that can use every display's UI. I managed to make it work using 4 different game clients, but to increase performance I'm trying to get all the displays working in the same game client using multidisplay. Fortunately, multidisplay works much better in Unity 2019.3 and I managed to make it work just with the mouse, but with Rewired it gives me a lot of trouble.

    Right now each screen has its own set of scenes, and there is also a "main" GameObject within DontDestroyOnLoad, but I'm not sure which would be the optimal way to distribute the RewiredInputManager and RewiredStandaloneInputModule, because the OnEnableStateChanged and OnScreenPositionChanged events must point to PlayerMice cursors in other scenes. I'd like to have each cursor in its own scene so each canvas could render with its own camera for postprocessing, but maybe the only way is to put all the cursors in the same scene and use canvas overlays for each display? (I tested multiple canvases with just one EventSystem, RewiredInputManager and RewiredStandaloneInputModule, and although I can see and move the cursors on each display, only the UI on the first display responds.)

    I have tried the new InputSystem in Unity (it has a MultiplayerEventSystem) but I'm not sure if it's compatible with Rewired, because I got many errors in my first tests.

    Thank you.
     
    Last edited: Apr 27, 2020
  38. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    First, I have not ever attempted the kind of setup you are describing, so I cannot tell you how to achieve it without trying to replicate it myself and work out all the issues.

    1. Rewired is not compatible with the new input system:
    https://guavaman.com/projects/rewired/docs/KnownIssues.html#not-compatible-unity-new-input-system

    That does not mean you can't use the code from the MultiplayerEventSystem. I have never looked at that, so I have no idea how it works or whether it could be used with Rewired.

    2. You do not have to use the PlayerMouse component and the Unity events. The PlayerMouse component is nothing more than a wrapper around the PlayerMouse class. This class can do every single thing the component version can do. There are C# events exposed on it equivalent to the Unity events exposed on the component version. You can also use the PlayerMouse class with the RewiredStandaloneInputModule by adding the mice to it through code. (RewiredPointerInputModule.AddMouseInputSource). This would eliminate any cross-scene inspector linking issues.

    3. If you are trying to use multiple EventSystems, you can't do it without hacks. The problem here isn't the RewiredStandaloneInputModule but Unity's EventSystem because that is what is responsible for driving the Rewired Standalone Input Module. The exact same issues would come up if you were trying to use the StandaloneInputModule with the normal EventSystem. I didn't write the EventSystem, so all I can tell you about it is what I have discovered through trial and error and searching through Unity's UI source code trying to help you and others make Unity UI do things it wasn't designed to do.

    Multiple EventSystems will not work because the EventSystem was designed by Unity as a singleton, which is the reason they had to create the MultiplayerEventSystem for the new input system. EventSystem.current is the current active EventSystem. That's a singleton pattern and means there is one EventSystem running: the one at .current. Accessing EventSystem.current always gives you the same EventSystem. While you can change the current EventSystem by setting that value, only one EventSystem per frame will update when the Update function is called on its MonoBehaviour. This obviously does not work here because you need all your EventSystems to be running every frame.

    This is caused by Unity's code in the EventSystem:
    https://bitbucket.org/Unity-Technologies/ui/src/2019.1/UnityEngine.UI/EventSystem/EventSystem.cs

    Code (csharp):
    // Line 340
    protected virtual void Update()
    {
        if (current != this)
            return;
        // ...rest of the method...
    }
    That code prevents anything but the current EventSystem from updating. One EventSystem at a time: only the singleton updates, the rest do nothing. Of course, you could extend the EventSystem class, override the Update method and remove that line of code. Without doing that, the only thing I can think of is to hack it and force each EventSystem to run by calling their Update methods through reflection.

    This is the best I could come up with through trial and error and searching through Unity's source code and testing workarounds in scripts. It seems to work testing with two Canvases, two EventSystems, and two columns of Buttons controlled by different sets of keyboard keys.

    Code (csharp):
    using UnityEngine;
    using UnityEngine.EventSystems;
    using System.Reflection;

    public class MultipleEventSystems : MonoBehaviour
    {
        public EventSystem[] eventSystems;

        private MethodInfo eventSystemUpdateMethod;

        private void Awake() {
            eventSystemUpdateMethod = typeof(EventSystem).GetMethod("Update", BindingFlags.Instance | BindingFlags.NonPublic);
        }

        void Update()
        {
            // Update each EventSystem
            for(int i = 0; i < eventSystems.Length; i++) {
                EventSystem eventSystem = eventSystems[i];
                if(eventSystem == null) continue;

                // Get the input module for this EventSystem
                BaseInputModule inputModule = eventSystem.GetComponent<BaseInputModule>();

                // Enable the InputModule first
                inputModule.enabled = true;

                // Set the current EventSystem or Update will not execute.
                EventSystem.current = eventSystem;

                // Manually run Update() on the EventSystem so it runs the InputModule.
                eventSystemUpdateMethod.Invoke(eventSystem, null);

                // Disable InputModule to prevent it from being updated during the EventSystem.Update
                // function that Unity calls or you will get double events per frame.
                inputModule.enabled = false;
            }
        }
    }
     
    Last edited: Apr 28, 2020
  39. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    It is also possible that only the first screen's UI is responding because PlayerMouse.screenPosition is clamped to the screen with the default Movement Area Unit setting of Screen. The mouse has no knowledge of which screen it is on, but the PointerInputModule might expect to see an absolute pixel position from the mouse that spans all the screens, while PlayerMouse will be returning a value clamped to UnityEngine.Screen.width and UnityEngine.Screen.height.
     
    Last edited: Apr 27, 2020
  40. virgiliu

    virgiliu

    Joined:
    Apr 4, 2015
    Posts:
    46
    I'm trying to create a system that records and replays user input in order to animate an NPC. Is it possible to store all user input data and events in a certain frame for later use? My goal is to record and replay about 4 minutes of input.
     
  41. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    Rewired provides nothing for you to do this. You can certainly manually record all the Action values you want. The only way you can play them back is through a Custom Controller.
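    A minimal sketch of the idea ("MoveHorizontal" is a hypothetical Action, and the Custom Controller must be created in the Rewired Input Manager with an axis at index 0):

    Code (csharp):
    using System.Collections.Generic;
    using Rewired;
    using UnityEngine;

    public class InputRecorder : MonoBehaviour {
        private CustomController customController; // obtain from the Player in code
        private readonly List<float> recorded = new List<float>();

        void Update() {
            // Record: capture the Action's value every frame.
            recorded.Add(ReInput.players.GetPlayer(0).GetAxis("MoveHorizontal"));
        }

        // Replay: feed a stored value back through the Custom Controller axis.
        public void Replay(int frame) {
            customController.SetAxisValue(0, recorded[frame]);
        }
    }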
     
    virgiliu likes this.
  42. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    @danbg

    Yep. This is exactly what's happening. PlayerMouse will not work on any display except the primary if the Movement Area is clamped to the screen because it does not know anything about multiple displays and calculates its screen position from Action delta values. The screen position returned can be used to move the UI cursor on the non-primary display because that value is being translated into canvas pixels when moving the UI cursor and that canvas exists in its own space with 0,0 in the lower-left corner. PointerInputModule expects to receive an absolute position value that ranges from 0,0 in the lower left corner of the primary display to the maximum cumulative pixel value across all screens.

    Making this work is very tricky. If you change PlayerMouse Movement Area Unit to Pixels and manually set the pixel area to the maximum total screen area of all screens, the mouse cursor will return the correct value for PointerInputModule, but it will return the wrong value when trying to drive the UI cursor with PlayerMouse.screenPosition because that expects a screen-space value for the screen the Canvas it exists on occupies. It will also not be clamped so the cursor will be allowed to move far off the screen.
     
  43. danbg

    danbg

    Joined:
    May 1, 2017
    Posts:
    64
    Hi guavaman, thank you for the great info. If it works with 2 displays, it will work with many. I'll try your code, just 2 questions:
    1. Do I need to extend and hack EventSystem to use your MultipleEventSystems code?
    2. Besides the multiple EventSystems, do I need to add multiple RewiredStandaloneInputModules? In that case, how do I link the RewiredStandaloneInputModule with each EventSystem (inspector or code)?
    By the way, what you added about the PlayerMouse could also be a factor, because checking the values in the InputSystem at runtime, I clearly see that it's using the absolute pixel position... maybe by updating the values of PlayerMouse with the display number and dimensions? Not sure how to do that... Any tips?

    This thing is driving me a bit crazy. I tried to use the new InputSystem but I'd rather use Rewired. The new InputSystem is pretty obscure in some areas and Rewired is much easier to set up. Thank you again for your support.
     
  44. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    No. That's why it uses reflection. If you're not trying to use multiple EventSystems, do not use this code. What you are trying to do can be done without multiple EventSystems. I was not suggesting that you use multiple. I was assuming you were trying to do that.

    Yes, you would. These would be completely independent EventSystems with their own InputModules.

    The problem is, you have to resolve all those issues I described. It's pretty much a total mess.

    1. PlayerMouse must return a value in absolute total screen space. In order to do that, you must change the Movement Area Unit to Pixels and set the Movement Area to the min and max bounds of the current display in total display space. Loop through all displays, get the resolution, and add them together. Add the width of all screens before the current and use that as the min X of the PlayerMouse Movement Area. Add the width of the current display to that value and use that as the max X of the PlayerMouse Movement Area. Min/Max Y should be the same for all screens, so it needs to be 0, Screen.height. (I have no idea if Unity supports multiple screens with different resolutions. If it does, this is going to be a big problem.)

    2. The UI pointer cannot directly use the PlayerMouse.screenPosition anymore to move around. You have to transform this value from total screen space to current screen space. Same concept as above. Subtract the total width of all screens before current to get the current screen space. Use this transformed value to move your pointer.
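    A rough sketch of both steps, under a big assumption (displays arranged left-to-right with equal heights):

    Code (csharp):
    // Step 1: absolute pixel bounds of the display at 'displayIndex'.
    int GetDisplayMinX(int displayIndex) {
        int minX = 0;
        for (int i = 0; i < displayIndex; i++) minX += Display.displays[i].renderingWidth;
        return minX; // Movement Area X spans [minX, minX + Display.displays[displayIndex].renderingWidth]
    }

    // Step 2: transform the absolute position back into current-screen space for the UI cursor.
    Vector2 ToCurrentScreenSpace(Vector2 absolutePosition, int displayIndex) {
        return new Vector2(absolutePosition.x - GetDisplayMinX(displayIndex), absolutePosition.y);
    }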
     
  45. danbg

    danbg

    Joined:
    May 1, 2017
    Posts:
    64
    Then, if I don't use multiple EventSystems, I just need to extend EventSystem? But how do I update it taking into account all the PlayerMice?

    In your expert opinion, what would be the better approach: hack the EventSystem or use multiple?

    Regarding the PlayerMouse coordinates, maybe the solution could be to update the values of PlayerMouse passed to PointerInputModule with every update from your new MultipleEventSystems, taking into account the display where it is located. Not sure if it works that way. Yes, Unity 2019.3 supports UI with multiple displays and different resolutions (in fact, all this works if I just use one cursor and the old and new input systems, but not with multiple).

    My setup also uses Kinects and TUIO touch systems for a total of 8 displays simultaneously... Things work with multiple Unity instances and socket.io, but the performance is HORRIBLE compared to getting everything working in just one Unity instance (each one takes around 20-30% of the PC's resources versus around 40% total if I join everything in one instance). The only issue blocking the single instance is the multiple cursors problem... Right now the alternative I'm thinking about would be using multiple PCs, but that would be a waste of resources and money (I'd rather try to make this more eco-friendly and sustainable).
     
  46. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    Okay. Let's take a step back here...

    I do not have a complete picture of your setup. It sounds incredibly complicated.

    If you are trying to run 8 independent displays all acting as if they are independent instances of Unity each with its own self-contained UI, then yes, you must have multiple EventSystems or write your own multi-user event system. There is no other way to do it because EventSystem can only handle 1 selected GameObject at a time. If you do not have multiple EventSystems or a multi-user event system, when any UI object is selected by any of the screens, it will deselect the object on the other screens. That's obviously not what you want.

    So the answer is, yes, you are going to either have to run multiple EventSystems using the hack I made above (or extend EventSystem) or make a multi-user event system.

    That's only one of the problems we are discussing here.

    The other problem is how to get the pointer coordinates in the right form for both the PointerInputModule and for being able to move the cursor in the UI on the screen you're using.

    I've explained the premise in the previous post. The PlayerMouse Movement Area needs to be clamped to the absolute screen rect in which it should be able to move. Then the resulting screen position it returns needs to be transformed into the correct space for the Canvas you're using on the other screen.

    That's the purpose of clamping the Movement Area of the PlayerMouse using the absolute coordinates. PointerInputModule should get the value it needs.

    I am still trying to make all this work. I am unable to get even a basic new scene in Unity with 2 canvases on 2 displays using the StandaloneInputModule to accept UI input on the 2nd screen. No Rewired involved, no PlayerMouse. Nothing but basic Unity UI, two Buttons, and the mouse. I am going to try using a later version of Unity and see what happens.
     
  47. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    @danbg Yep. 2019.3 works. 2019.1 does not.
     
  48. danbg

    danbg

    Joined:
    May 1, 2017
    Posts:
    64
    Sorry guavaman, I didn't mean to make it sound like an overcomplicated setup. In reality it's relatively easy dividing all of them into instances (the proof is that I managed to do it that way). I don't need to keep that same setup. Merging all into a single Unity instance, I'm just copying all my scenes to the same project, and I just need to merge all the different GameManagers and EventSystems into a main DontDestroyOnLoad and use a set of scenes for each of the 8 displays (the Kinects use the EventSystems and the TUIO uses TouchScript, but that's beyond the use of Rewired; I just mentioned them to give the whole picture of other systems also using the EventSystem).

    Did you manage to get "a basic new scene in Unity with 2 canvases on 2 displays using the StandaloneInputModule to accept UI input on the 2nd screen"? In Unity 2019.3 it works directly, but be careful with the editor: the behaviour in the editor and the build is different regarding multiple displays. I'm not sure why that is, but maybe it's because of the way Display.displays provides data. When you use the editor, only 1 display is detected (you must use multiple Game views with a different display on each one), but in the build it detects everything correctly.

    I wonder why Unity makes working with multiple displays so difficult... In case it helps, you can use both the old and new InputSystem if you change "Active Input Handling" to Both in Project Settings > Player. The MultiplayerEventSystem from the new InputSystem uses this code:

    Code (CSharp):
    namespace UnityEngine.InputSystem.UI
    {
        /// <summary>
        /// A modified EventSystem class, which allows multiple players to have their own instances of a UI,
        /// each with it's own selection.
        /// </summary>
        /// <remarks>
        /// You can use the <see cref="playerRoot"/> property to specify a part of the hierarchy belonging to the current player.
        /// Mouse selection will ignore any game objects not within this hierarchy. For gamepad/keyboard selection, you need to make sure that
        /// the navigation links stay within the player's hierarchy.
        /// </remarks>
        public class MultiplayerEventSystem : EventSystem
        {
            [Tooltip("If set, only process mouse events for any game objects which are children of this game object.")]
            [SerializeField] private GameObject m_PlayerRoot;

            public GameObject playerRoot
            {
                get => m_PlayerRoot;
                set => m_PlayerRoot = value;
            }

            protected override void Update()
            {
                var originalCurrent = current;
                current = this; // in order to avoid reimplementing half of the EventSystem class, just temporarily assign this EventSystem to be the globally current one
                try
                {
                    base.Update();
                }
                finally
                {
                    current = originalCurrent;
                }
            }
        }
    }
     
  49. guavaman

    guavaman

    Joined:
    Nov 20, 2009
    Posts:
    5,627
    @danbg Okay this is way more complicated than I hoped.

    Unity now supports all the screen arrangement features of Windows, meaning you can have many displays of all different resolutions and you can move those displays anywhere in relation to the other displays.

    Unity UI needs to know a pointer position based on the total absolute pixel space of all displays together to work.

    Unity doesn't seem to have a way to get the absolute location of a screen. So how do you know what region to clamp the PlayerMouse to in this absolute space?

    There is only one method that might possibly be used to determine any of this information:
    Display.RelativeMouseAt

    That returns a relative screen space coordinate and a screen index for an absolute pixel position. Problem is, there is no inverse of the function that will tell you an absolute space position for a pixel on a particular screen, so you'd end up having to do some kind of search to find all the screens. But what happens when the user starts moving screens around in Windows Display settings? There would be no non-native way to detect that and trigger a new screen search.
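    For reference, the probe looks something like this ('absX'/'absY' are a hypothetical absolute desktop position; Display.RelativeMouseAt packs the display index into z, and platform support varies):

    Code (csharp):
    // Ask which display owns an absolute desktop pixel position.
    Vector3 relative = Display.RelativeMouseAt(new Vector3(absX, absY, 0));
    int displayIndex = (int)relative.z; // index of the display the point falls on
    Vector2 positionOnDisplay = relative; // x/y are relative to that display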
     
  50. danbg

    danbg

    Joined:
    May 1, 2017
    Posts:
    64
    According to the docs:
    Controlling monitor display positions
    By default, the user’s computer sorts the relative positions of its display monitors based on its x, y virtual desktop. To override this so that your application displays without any sorting, start your application from the command line and use the -multidisplay command line flag.


    In my setup, the absolute or relative location of the screens won't be a problem, because users won't be able to change them. Everything is fixed (I also use AutoHotkey to set them up and Interception to create virtual joysticks with multiple keyboards, and a server to take control of everything with a mouse, but that's solved).

    I guess a modular and general solution for multiple cursors with multidisplay would be better, but if at least a fixed setup is possible, that would be a great victory for multidisplay developers. (I know some developers just created a single scene on a single screen with the resolution of multiple displays combined, but that also causes performance issues.)

    PS: I'm from Europe and it's getting a bit late. I'll check tomorrow if you have new findings and let you know if I get it working. Maybe it would be much easier to use raycasting directly... Thanks for your support. I wonder why Unity hasn't hired you to make their new InputSystem.