I have started using this and so far it seems to be a much better way to do things. The only thing I've always had in Update is input, which always caused problems for me. Now that this is possible I will have NOTHING in updates. This is just amazing. I would like to know how to do conditionals, though. It seems like I have to make a separate interaction for every single possible thing, which starts to create a ton of callbacks. How do you do this, for example:

Code (CSharp):
int minX = -120;
public static int maxX = 120;
int minZ = -200;
int maxZ = 110;
int multiplier = 100;
Vector3 v3Cam;

void InitWASD()
{
    v3Cam = Camera.main.transform.position;
}

float CalculateDirection(float value, float min, float max)
{
    if (Input.GetKey(KeyCode.LeftShift))
        return Mathf.Clamp(value + (multiplier * 2 * Time.deltaTime), min, max);
    else
        return Mathf.Clamp(value + (multiplier * Time.deltaTime), min, max);
}

float CalculateDirectionReverse(float value, float min, float max)
{
    if (Input.GetKey(KeyCode.LeftShift))
        return Mathf.Clamp(value - (multiplier * 2 * Time.deltaTime), min, max);
    else
        return Mathf.Clamp(value - (multiplier * Time.deltaTime), min, max);
}

void WASDUpdate()
{
    if (!BC_Singleton.Instance.SceneGlobals.InUpgrading)
    {
        if (Input.GetKey(KeyCode.D))
        {
            var tempMaxX = maxX;
            if (BC_Singleton.Instance.SceneGlobals.IsBuildingMenuOpen)
                tempMaxX = maxX + 35;
            v3Cam.x = CalculateDirection(v3Cam.x, minX, tempMaxX);
            Camera.main.transform.position = v3Cam;
        }
        if (Input.GetKey(KeyCode.A))
        {
            v3Cam.x = CalculateDirectionReverse(v3Cam.x, minX, maxX);
            Camera.main.transform.position = v3Cam;
        }
        if (Input.GetKey(KeyCode.W))
        {
            v3Cam.z = CalculateDirection(v3Cam.z, minZ, maxZ);
            Camera.main.transform.position = v3Cam;
        }
        if (Input.GetKey(KeyCode.S))
        {
            v3Cam.z = CalculateDirectionReverse(v3Cam.z, minZ, maxZ);
            Camera.main.transform.position = v3Cam;
        }
    }
}

To be more specific, how would you handle shift? Also, how many callbacks do you think can be handled before a system starts to eat bricks?
I have successfully done this. Wow, this is a better structure. Fewer methods + faster?

Code (CSharp):
void InitWASD()
{
    v3Cam = Camera.main.transform.position;
}

float CalculateDirection(float value, float min, float max, float multiplier, Keyboard keyboard)
{
    if (keyboard.leftShiftKey.isPressed) // Input.GetKey(KeyCode.LeftShift))
        return Mathf.Clamp(value + (multiplier * 2 * Time.deltaTime), min, max);
    else
        return Mathf.Clamp(value + (multiplier * Time.deltaTime), min, max);
}

private void OnWASD_performed(InputAction.CallbackContext obj)
{
    var values = obj.ReadValue<Vector2>();
    var keyboard = Keyboard.current;
    var tempMaxX = maxX;
    if (BC_Singleton.Instance.SceneGlobals.IsBuildingMenuOpen)
        tempMaxX = maxX + 35;
    v3Cam.x = CalculateDirection(v3Cam.x, minX, tempMaxX, values.x * 100, keyboard);
    v3Cam.z = CalculateDirection(v3Cam.z, minZ, maxZ, values.y * 100, keyboard);
    Camera.main.transform.position = v3Cam;
}
I've been testing the recent development branch on 2019.3 now, but I'm having issues with a Logitech G25 wheel. My Thrustmaster TX servo works fine as a HID device on the new input system, but the Logitech wheel gives me these errors when I try to use "listen" and use the G25 wheel: I also tested this on Input System 0.2.10 preview using 2019.1.8, and there the Logitech G25 didn't throw these errors, but it didn't recognize any input I gave the wheel (despite it showing the controller in the input debugger and recognizing when it was plugged in or out, although I should mention I couldn't open the input debugger's device view for it). edit: I don't really need built-in DirectInput handling from the new input system, as I'm going to replace it with an FFB wheel input lib anyway, but I'd love to use the new input system's binding API for it. So this brings me to a new question: What is the easiest entry point for 3rd party/custom input? Is there some specific class that would be a good example when implementing this on your own (modifying the input system itself for this is not an issue)?
I would be interested in this as well. For instance, how would I start to add support for Magic Leap controller bindings?
Does the input system have a way to distinguish between different gamepads/joysticks? In the old system, when setting up the buttons, you would specify which joystick to listen to.
@Rene-Damm Is there any way to set up control schemes based on XInput device SubTypes? I need to set up a separate control scheme for an XInput guitar controller. I've made a GitHub issue for this as well. https://github.com/Unity-Technologies/InputSystem/issues/738
Totally can do. The editor allows you to specify specific devices (if you so choose), and more directly, the PlayerInputManager and PlayerInput components let you establish the link between players and their controls at runtime (supporting Xbox- or PS4-style 'joining'). I set mine up to join on any button press, and then set up a behaviour that I put on the PlayerInput's prefab. From there, using UnityEvents, it's pretty straightforward to set up control (see the sketch below). That said, I did have to modify the Unity GUI module to support working with multiple controllers, but it was a relatively simple mod for my needs.
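To illustrate the "behaviour on the PlayerInput prefab" part, here's a minimal sketch. The class name and action names are hypothetical; it assumes the PlayerInput component's behaviour is set to "Invoke Unity Events" and that these methods are hooked up to the corresponding action events in the inspector.

Code (CSharp):
using UnityEngine;
using UnityEngine.InputSystem;

// Hypothetical per-player behaviour living on the same prefab as PlayerInput.
public class PlayerPawn : MonoBehaviour
{
    public float speed = 5f;
    private Vector2 move;

    // Hooked up to the "Move" action's event on the PlayerInput component.
    public void OnMove(InputAction.CallbackContext ctx)
    {
        move = ctx.ReadValue<Vector2>();
    }

    // Hooked up to the "Fire" action's event on the PlayerInput component.
    public void OnFire(InputAction.CallbackContext ctx)
    {
        if (ctx.performed)
            Debug.Log($"{name} fired");
    }

    void Update()
    {
        // Apply the cached movement value each frame.
        transform.Translate(new Vector3(move.x, 0f, move.y) * speed * Time.deltaTime);
    }
}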
Will this be implemented, and when? I love TRINE 2 and other games that allow for multiple keyboards/mice. https://github.com/Unity-Technologies/InputSystem/issues/166
Hey everyone, we pushed 0.9.1 yesterday to fix the regression with composites but turns out we ended up with a different regression in it. When you select actions in the action editor, you get exceptions instead of properties in the UI. Sorry about that. Shouldn't have made it out. We will push another package with a fix shortly.
Easiest way probably is to just create a new "Guitar" layout based on the XInputController layout.

Code (CSharp):
InputSystem.RegisterLayout(@"
    {
        ""name"" : ""Guitar"",
        ""extend"" : ""XInputController"",
        ""device"" : {
            ""interface"" : ""XInput"",
            ""capabilities"" : [
                { ""path"" : ""subType"", ""value"" : 6 }
            ]
        }
    }
");

Now you should be able to select "Guitar" as its own type of device under "XInputController" and thus create a separate control scheme for it. (Note: this only works if the code above is executed somewhere in editor code.)

For the record, there are other ways to go about it, but they are probably inferior in this case to the solution above.

Solution B relies on usages. Bindings cannot distinguish devices based on their capabilities, which is where the XInput subtype sits (it's part of "InputDevice.description", which bindings are not aware of). However, it can easily be made visible in the form of a "usage" tag.

Code (CSharp):
InputSystem.onDeviceChange += (device, change) =>
{
    // If an XInput controller is added that is a guitar,
    // tag it as such.
    if (change == InputDeviceChange.Added
        && device is XInputController ctrl
        && ctrl.subType == XInputController.DeviceSubType.Guitar)
        InputSystem.AddDeviceUsage(device, "Guitar");
};

In bindings, the usage tag can be bound to.

Code (CSharp):
<Gamepad>{Guitar}/buttonSouth

The system can also be told that this is a common thing for XInputControllers.

Code (CSharp):
InputSystem.RegisterLayoutOverride(@"
    {
        ""name"" : ""XInputControllerUsages"",
        ""extend"" : ""XInputController"",
        ""commonUsages"" : [ ""Guitar"", ""Wheel"", ""ArcadeStick"" ]
    }
");

This should make it bindable from the UI (oops, just noticed we have a regression where you can no longer add devices with specific usages to a control scheme... fixing).
So you basically have two options here.

First one is to bypass the HID fallback. We do this ourselves, too. E.g. DualShockGamepadHID comes in through the same input path as other HIDs, but we set up its own layout for it, which bypasses the logic in HID that you are getting an exception from. Looking at the code in https://github.com/Unity-Technologi...stem/Plugins/DualShock/DualShockGamepadHID.cs should be helpful in this case. https://github.com/Unity-Technologi...tem/Plugins/DualShock/DualShockSupport.cs#L38 shows how to set up the device matching to override the HID fallback.

Second option is for when you want to pick up input from a separate API altogether -- which, as far as I understand, is ultimately what you want to do. The system isn't well documented yet, but the gist of it is very simple. There is a lengthier example here. Basically, the steps are (see the sketch after this list):
1. Create a struct that represents the way you want to store and transmit input data for your device. Put IInputStateTypeInfo on it and give it a custom FourCC tag. Add fields and annotate them with [InputControl] as needed to create the kind of control layout you want for your device.
2. Create a subclass of InputDevice (or a more specific kind of device, if applicable) and put [InputControlLayout(stateType = typeof(struct_you_created))] on it.
3. Register the whole thing with the input system. Check the example I linked for a good way to do this during startup such that the editor will also see your device. Important for it to show up in the control picker.
4. Hook into your API to detect when the device is present and InputSystem.AddDevice<TYourDevice>() it as needed.
5. Put IInputUpdateCallbackReceiver on your device and, in the OnUpdate() method, read data out from the API you have and put it in an instance of the struct you created. Then queue the whole thing as an input event with InputSystem.QueueStateEvent().

This should give you a fully functional device that you can see in the input debugger, with all its controls and events coming in. You can hook devices from arbitrary sources into the system like this. It's fairly similar to what InputRemoting does to reflect InputDevices from players in the editor.
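To make the list above concrete, here is a rough, self-contained sketch for a hypothetical wheel device. MyWheelState, MyWheelDevice, and the zeroed-out values are placeholders for whatever your FFB wheel lib provides, and exact interfaces may differ slightly between preview versions.

Code (CSharp):
using System.Runtime.InteropServices;
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.Layouts;
using UnityEngine.InputSystem.LowLevel;
using UnityEngine.InputSystem.Utilities;

// Step 1: state struct with a custom FourCC tag and [InputControl] annotations.
[StructLayout(LayoutKind.Sequential)]
public struct MyWheelState : IInputStateTypeInfo
{
    public FourCC format => new FourCC('M', 'W', 'H', 'L');

    [InputControl(name = "steering", layout = "Axis")]
    public float steering;

    [InputControl(name = "throttle", layout = "Axis")]
    public float throttle;
}

// Steps 2+3: device class tied to the state struct, registered during startup
// so the editor and the control picker see it too.
[InputControlLayout(stateType = typeof(MyWheelState))]
#if UNITY_EDITOR
[UnityEditor.InitializeOnLoad]
#endif
public class MyWheelDevice : InputDevice, IInputUpdateCallbackReceiver
{
    static MyWheelDevice()
    {
        InputSystem.RegisterLayout<MyWheelDevice>();
    }

    [RuntimeInitializeOnLoadMethod]
    static void InitInPlayer() {} // Get cctor to run in players.

    // Step 5: every update, read from the external API and queue a state event.
    public void OnUpdate()
    {
        var state = new MyWheelState
        {
            steering = 0f, // placeholder: read from your wheel API here
            throttle = 0f,
        };
        InputSystem.QueueStateEvent(this, state);
    }
}

// Step 4: once your API reports the wheel as present, add it:
//     var wheel = InputSystem.AddDevice<MyWheelDevice>();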
Not ATM. At least not out of the box. "Chained bindings" are on the list for >1.0.

One thing that you can do ATM is create a custom composite for that.

Code (CSharp):
#if UNITY_EDITOR
[InitializeOnLoad]
#endif
public class ButtonWithModifierComposite : InputBindingComposite<float>
{
    [InputControl(layout = "Button")] public int modifier;
    [InputControl(layout = "Button")] public int button;

    public override float ReadValue(ref InputBindingCompositeContext context)
    {
        var modifierValue = context.ReadValue<float>(modifier);
        if (modifierValue > 0)
            return context.ReadValue<float>(button);
        return default;
    }

    static ButtonWithModifierComposite()
    {
        InputSystem.RegisterBindingComposite<ButtonWithModifierComposite>();
    }

    [RuntimeInitializeOnLoadMethod]
    static void InitInPlayer() {} // Get cctor to run.
}

In the editor, you should now see an "Add Button With Modifier" option show up when clicking the plus icon to add a binding. For example, bind "<Keyboard>/leftCtrl" to "modifier" (if you want both Ctrl keys, duplicate the "modifier" binding and bind "<Keyboard>/rightCtrl" to the duplicate) and "<Keyboard>/a" to "button", and the result should be a composite that triggers when both Ctrl and A are pressed.

This will work with any kind of button. You could, for example, bind a leftTrigger + buttonSouth combination.

////EDIT: Replaced it with a better implementation that lets the original key through as is instead of returning 1.
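If you'd rather wire the composite up from script than from the editor UI, something along these lines should work. This is just a sketch (e.g. inside some MonoBehaviour's Awake(), with using UnityEngine.InputSystem at the top); it assumes the class above registers under the default name "ButtonWithModifier".

Code (CSharp):
// Hypothetical action using the composite; equivalent to setting it up in the editor.
var fire = new InputAction(name: "FireWithCtrl");
fire.AddCompositeBinding("ButtonWithModifier")
    .With("modifier", "<Keyboard>/leftCtrl")
    .With("button", "<Keyboard>/a");
fire.performed += ctx => Debug.Log("Ctrl+A pressed");
fire.Enable();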
Awesome, thanks! Any idea about when chained bindings are to be released? I hope it's not a rude question.
No ETA at this point. Getting 1.0 out the door comes first, then it's time for new plans. BTW, I figured the "button + modifier" composites are probably generally useful, so I added a PR that adds the composite and a "button + two modifiers" twin directly to the input system.
Alright! The new Input System is very pleasant to work with. Thanks for putting your time into this, I appreciate it!
I'm loving how easy it is to set up local multiplayer using the Player Input Manager. I created a top-down tank game for my kids and me to play together in no time at all. I think I noticed a small bug though, which only seems to happen (so far) when 4 controllers are connected. I'm using Unity 2019.2.0f1, Input System 0.9.2. When we first start playing, we all instantiate just fine (see screenshot) using Join Players When Button Is Pressed. But after a few rounds of getting destroyed, eventually one of the controllers will no longer work. Instead, the Player Input Manager creates a clone of one of the active players. I wrote a Game Manager script to make sure we can't create a clone if we're active in the scene by passing around the player's Gamepad.current.id value, which works great, but now what happens is, well, nothing. The 'dead' controller is no longer being polled by the Player Input Manager until we restart, and then everything is fine again for another few rounds. I tested in both the Editor and a Windows build. Any clue what might be causing this? Thanks!
Here you go. WIP docs. Should contain all the key information at this point. Will get refined further.
The last part sounds like the actions are disabled. The first part sounds like for some reason the manager is joining a player when it shouldn't. Could you go, when trying this in the editor, and pop open the input debugger (Window >> Analysis >> Input Debugger) and do a before and after for me? Each PlayerInput will create one user that shows up under "Users" along with the user's paired devices and actions. Would be curious to see a snapshot of what it looks like before and after the action and how the after-the-fact snapshot compares to what you are expecting.
Oh, that looks nice. Thank you! I noticed that "Device Commands" are not quite there yet, is this going to happen for 1.0 or after? My use case for this is force feedback wheel support.
Input Debugger goes blank when I press a button on a controller, but I did take some screenshots before, during, and after just in case that helps.

First Startup:

During gameplay - as soon as I press a button, the Input Debugger goes blank, and I am noticing a bunch of errors now with NameAndParameters.cs:

Then, when the problem is happening, I exited play mode and took another screenshot:

Is there a way for the Input Debugger tab to stay on while I'm playing? I tried playing with the options but nothing seemed to help.

I do have some additional information which may be helpful.
1. This doesn't happen to the same controller; it does appear to be random. It happened to more than one controller during a single play session - once I figured out #2 below.
2. I can 'reactivate' the dead controller by destroying the gameObject that would've originally been cloned. Meaning, when a controller is no longer working, one of the other players is seemingly being seen as 2 controllers. To fix the issue, I simply have to destroy that player, and then all is well for a round or two, and then it may happen to the same controllers, or different ones.

I don't know what that means, but I hope this helps somewhat! Thanks, Tim
Another quick note. I pulled up a debug for each controller and then started the game. When I ran into the problem, I was able to see that the 'dead' controller was still accepting input. I uploaded an animated gif below. Thanks!
It's a relatively small tweak. I expect it for 1.0. Sorry about that one; I messed up a change to the input debugger and it slipped through unnoticed. Fixed in the next package, which should be out shortly. Yup, this is helpful. I'll do some digging. Thank you for checking. That means at the device level everything is fine; it's really just PlayerInput screwing up.
This might be a dumb issue: I have the input system working in the editor, but when I build, it no longer works. Should it? Or are preview packages stripped out?
It should work in all players. Which platform are you building for? Can you try connecting the input debugger to the player and see whether it connects? (instructions here) How does "not working" manifest itself in the player?
See this thread: https://forum.unity.com/threads/bug-report-not-workign-so-posting-webgl-stripping-error-here.727001/ Stripping does indeed stop platforms (at least WebGL) from working. Turn it off or set it to low.
I'm building for Windows PC and using a PS4 controller. The input debugger shows the controller sending input. It works in the editor; I can move my sprite around, no issue. But when I build the scene, I get no response from the standalone build.
I don't know what fixed it, but I switched from x86 to x86_64, went into the project settings, and changed the Input System package to only look for the gamepad. It also asked me to generate a file on that page. Anyway, one of those three things fixed it.
Hopefully this verbose question has a relatively simple answer. TLDR version: I'm looking for a way to read a float from any given 2D axis on my gamepad, in a way that'll allow me to use a single generic script on multiple UI game objects — but I need to be able to point to custom HID inputs.

---

I have a very simple script that takes the value of a 2D axis (eg: Left/Right Trigger) as a float, and uses it to display input values through a graphical UI. Thing is, I originally wrote this for Rewired, but I'm currently working with a 3rd-party controller and a custom HID override which Rewired doesn't currently support. So, I'm trying to re-adapt this script for the new Input System instead. For reference, the relevant parts of my original script for Rewired:

Code (CSharp):
private string button;
private float axis;
private float axisNormal;

void Awake()
{
    // Get the Rewired Player object for this player and keep it for the duration of the character's lifetime
    player = ReInput.players.GetPlayer(playerId);

    // Get the name of the GameObject, use it as the name of the button in GetInput() and ProcessInput()
    button = name;
}

private void GetInput()
{
    axis = player.GetAxis(button); // get input by name or action id
    axisNormal = Mathf.InverseLerp(0f, 1f, axis); // scale axis range to 0f - 1f
}

My problem lies in GetInput(). This script is made to be generic, so that I can apply it to single button elements (eg: Left Trigger) for multiple controller types, and tweak values as needed. Rewired allows me to use "GetAxis()" to grab the value of any axis OR button as a float — in this case, I've been using the game object's name as a string, and matching it to whatever joystick mapping or action I've set up. I can technically pass a face button as an axis if I wanted to, and it'll give me constant 0f and 1f values just fine.

The issue I've been running into here is that (as far as I can tell from the WIP documentation) the new Input System seems to expect specific names before it'll allow you to grab a value. Instead of being able to use a generic value like "button", the Input System seems to expect specific names for every element, something along the lines of "Gamepad.current.leftTrigger.ReadValue()". My controller has twelve 2D axes in total (it's a PS2 DualShock with pressure sensitivity in every button), so I want to avoid making separate scripts for every single button. What do I need to change in order to make this work with the new Input System?
I believe that you can access InputMap actions by string name rather than API name. eg: master[button].ReadValue<float>() This is in the docs, unless I'm misinterpreting the problem.
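For example, something along these lines (just a sketch; it assumes "master" is an InputActionAsset or InputActionMap rather than a device, since indexing a device yields an InputControl, which has no generic ReadValue<T>() of its own):

Code (CSharp):
// Hypothetical component: reads the action named after the GameObject,
// e.g. an action called "LeftTrigger" on an asset assigned in the inspector.
public InputActionAsset master;

private InputAction action;

void Awake()
{
    action = master[name]; // look up the action by name (or use FindAction(name))
    action.Enable();
}

void Update()
{
    float value = action.ReadValue<float>();
}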
Is that in the WIP docs on Github, or somewhere else? I'm having trouble finding it. Either way, I must be missing something. Unity Console throws me this error: 'InputControl' does not contain a definition for 'ReadValue' and no accessible extension method 'ReadValue' accepting a first argument of type 'InputControl' could be found For what it's worth, my script is using "UnityEngine.InputSystem" and "UnityEngine.InputSystem.Controls".
Maybe it's been posted here, but I didn't find anybody asking it. With the new InputSystemManager, how can I check for controller axes at runtime? My simulation will be handling 100+ different radio system controls. InputSystemManager offers me, in the inspector, to link my own ("Taranis") radio, and that probably won't work for radios of a different brand, so I would like to let users set up their controllers at runtime. Is it possible to achieve that? To use the "Listen" function from the inspector when assigning axes or buttons to a map? Best regards, Mario
My device is listed under "unsupported". Is there any way to add it to make it supported, or am I just out of luck?
My suggestion would be to put a PlayerInput next to your component (the one you've pasted the snippet of). That component itself is sort of the equivalent of a player ID in Rewired, i.e. by virtue of being instantiated, it represents a player. To connect PlayerInput to your .inputactions, assign them to PlayerInput's "Actions" property. In your script, you can do

Code (CSharp):
private InputAction button;
private float axisNormal;

// Link PlayerInput here from the inspector or look it up in Awake() using GetComponent
// or GetComponentInChildren.
public PlayerInput input;

void Awake()
{
    // Look up the action by using the GameObject's name as the action name.
    button = input.actions[name];
}

private void GetInput()
{
    // All buttons are [0..1] floats.
    axisNormal = button.ReadValue<float>();
}

Working with raw HIDs in .inputactions is a bit quirky ATM but doable. One way to do it is to connect your HID. If it's recognized, it should show up under "Joystick" in the control picker with a weird-looking label such as "HID::Blablabla" listed under "More Specific Joysticks". When you select it, you should see the buttons and axes on the device with generic names such as "button1" etc. that you can bind to. The downside is that these bindings will ONLY work with that specific HID. Which in your case may be just fine.

To make the bindings work more universally ATM does require some manual massaging of the binding strings by switching the "Path" property to text mode via the little "T" button on the right side. This way you can, for example, bind to the third button on any joystick.

Code (CSharp):
// Third button on any joystick.
<Joystick>/button2
"Listen" in the control picker is entirely based on InputActionRebindingExtensions.RebindingOperation which is a public API. See here. To easiest way to initiate an interactive rebind on an action, is to call PerformInteractiveRebinding(). The resulting object has copious options for customizing its workings. By default, it will non-destructively apply an override to bindings. Code (CSharp): var rebind = myControls.gameplay.fire.PerformInteractiveRebinding(); // Example: rebind only binding from gamepad control scheme. rebind.WithBindingGroup("Gamepad"); rebind.Start();
Depends. Just to make sure: one reason devices end up there is that they aren't compatible with what's chosen as "Supported Devices" in the "Input System Package" project settings. If that is the case, it can be as simple as adding the device as supported.

If that's not the case, then from the fact that it shows up under "Unsupported Devices" we know that at least Unity is seeing the device and probably knows how to talk to it. On Windows and Mac, you're probably looking at a HID. It ending up under "Unsupported Devices" means that it either has a HID usage we don't support by default or that we couldn't find any usable controls on the device (there are plenty of HIDs, for example, where everything on the device is marked VendorDefined). ATM there's no good way to find out which is the case using the Input Debugger. There's a change coming that will allow you to dump the device descriptor to the clipboard, but it's not yet in 0.9.3. ATM you will have to use a 3rd-party HID/USB dumper tool or simply know the device.

If it's a case of the HID usage of the device not being supported out of the box (we currently pick up GenericDesktop.Joystick, GenericDesktop.Gamepad, and GenericDesktop.MultiAxisController), you can hook into HIDSupport.shouldCreateHID and manually "greenlight" your HID (see the sketch below). If it's a case of the HID fallback not being able to work with the controls present on the device, you will have to build your own device layout for the device to work with the input system. Doing so isn't super tricky but does require some custom scripting. There's documentation here.
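For the first case, a rough sketch of greenlighting a specific HID might look like this. The vendor/product IDs are placeholders, and the exact callback signature may vary between preview versions; for devices present at startup you may need to hook in even earlier (or in editor code as well).

Code (CSharp):
using UnityEngine;
using UnityEngine.InputSystem.HID;

public static class MyHIDGreenlight
{
    [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.BeforeSceneLoad)]
    static void Init()
    {
        // Return true to force the input system to create a device for this HID;
        // return null to leave the decision to the default logic.
        HIDSupport.shouldCreateHID += descriptor =>
        {
            if (descriptor.vendorId == 0x1234 && descriptor.productId == 0x5678) // placeholder IDs
                return true;
            return null;
        };
    }
}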
This did exactly what I needed, thank you! Unfortunately I've run into another issue - I think I'm failing to initialize my HID controller correctly? I can see the controller (and my custom HID override) in the Input Debugger, and I can use the "Listen" function to bind it to Input Actions. But when I'm in Play mode, Unity won't pair this specific controller to User #0. Other controllers pair up just fine - I've tried unplugging them just in case, but no luck. I've followed the HID documentation, although I haven't added the "[RuntimeInitializeOnLoad]" lines from Step #3 to anything.

Code (csharp):
using UnityEngine;
using UnityEditor;
using System.Runtime.InteropServices;
using UnityEngine.InputSystem.Layouts;
using UnityEngine.InputSystem.LowLevel;
using UnityEngine.InputSystem.Utilities;

namespace UnityEngine.InputSystem.LowLevel
{
    [StructLayout(LayoutKind.Explicit, Size = 20)]

    [...] // 83 lines of definitions that don't need to be quoted here

    // Picking it back up at Line #95
    [InputControlLayout(stateType = typeof(DualShock2HIDInputReport))]
#if UNITY_EDITOR
    [InitializeOnLoad] // Make sure static constructor is called during startup.
#endif
    public class DualShock2GamepadHID : Gamepad
    {
        static DualShock2GamepadHID()
        {
            InputSystem.RegisterLayout<DualShock2GamepadHID>(
                matches: new InputDeviceMatcher()
                    .WithInterface("HID")
                    .WithProduct("MUSIA PS2 controller"));
        }
    }
}
Can you email me (rene AT unity3d.com) that file above in full plus your .inputactions file? Probably the quickest way to debug this.
Just sent, thank you for checking it out! I forgot to mention, Unity does pick up the HID controller when I'm not using a custom HID override. It feels like user error on my part, I just can't see where that error is.
This still happens on 0.9.3-preview; it appears that the HID structure is hitting some "not implemented" exception. Is this something I could try to solve with a custom HID setup, or is it a more structural issue in Unity's new input system?
This could be solved with a custom HID layout but a HID hitting that exception seems very suspicious to me. A sizeof>=32bit field straddling an int boundary strikes me as an extremely odd thing to encounter with a HID. I'm guessing that this is actually the HID fallback messing something up. Would you mind setting a breakpoint where it generates the exception and then check a couple frames up which input control triggers it? Also, if you could, would you check the "HID Descriptor" window to track down the control and post a screenshot of the control? (Sorry for the awkward debugging; working on something that will at least somewhat help tracking down HID issues like this)
Thank you for the files. I see some problems in the file that are likely the cause of what you are seeing. Some of the problems are likely produced by a quirk the action editor still has.

The first problem is that the control scheme you have for the controller does this:

Code (CSharp):
"controlSchemes": [
    {
        "name": "PS2",
        "basedOn": "",
        "bindingGroup": "PS2",
        "devices": []    <--- empty
    },

PlayerInput will base its device assignments on what it finds as the device requirements on individual control schemes. Adding your DualShock2GamepadHID device as a requirement here should solve the problem with PlayerInput not picking up your device.

However, there's another problem in that the "bindingGroup" of the control scheme doesn't match the name of the group used in the bindings. The scheme uses "PS2" whereas the bindings use "PS2Scheme". This is likely the result of editing/renaming the control scheme setup. ATM the action editor can easily be fooled into messing up the binding setup when renaming and deleting and re-creating control schemes. Have it on my TOFIX list.
What do you mean by "a couple frames up" in this context? Like just the call stack to that point? You can see the same call stack in the screenshot below, which lists some controls. I can't even open the Input Debugger's device dialog; if I double-click the Logitech G25's device, I get this:
Ugh, I see. Sorry about that. There's a change I intend to land in 0.9.4 that should make debugging this much simpler. To debug this on my side, I'll need the device description which with that change can be gotten to in two clicks. If it's okay, let's wait for 0.9.4 to hit (I expect that to happen within a week) and then let's debug this further.
This was the thing! God, of course it'd be a simple "gotcha", lol. Thank you so much. One last quick question: Do I need to add anything to the HID override to get it working in a build? It's working in editor, but doesn't appear to be doing anything in a stand-alone player.
Okay, I've run into a wall. For whatever reason, when setting up action binds in the Input Actions window, the "Listen" tool refuses to hear Right Trigger or Dpad/Up, even though they are definitely visible and functional in the input device debugger window. I've looked over the relevant section of the HID override, but as far as I can tell, what I have below should work just fine. Again, both buttons work correctly in the input debugger window; I can see all of the inputs working perfectly. I've even forced these binds by using the "T" button to type them in directly. The editor just doesn't seem to want to hear these two buttons.

Code (csharp):
[InputControl(name = "leftShoulder", format = "BIT", displayName = "L1", layout = "Button", bit = 0)]
[InputControl(name = "rightShoulder", format = "BIT", displayName = "R1", layout = "Button", bit = 1)]
[InputControl(name = "leftTrigger", format = "BIT", displayName = "L2", layout = "Button", bit = 2)]
[InputControl(name = "rightTrigger", format = "BIT", displayName = "R2", layout = "Button", bit = 3)]
[InputControl(name = "dpad", format = "BIT", layout = "Dpad", sizeInBits = 4, defaultState = 8)]
[InputControl(name = "dpad/up", format = "BIT", layout = "DiscreteButton", bit = 4, sizeInBits = 4, parameters = "minValue=7, maxValue=1, wrapAtValue=7, nullValue=8")]
[InputControl(name = "dpad/right", format = "BIT", layout = "DiscreteButton", bit = 4, sizeInBits = 4, parameters = "minValue=1, maxValue=3")]
[InputControl(name = "dpad/down", format = "BIT", layout = "DiscreteButton", bit = 4, sizeInBits = 4, parameters = "minValue=3, maxValue=5")]
[InputControl(name = "dpad/left", format = "BIT", layout = "DiscreteButton", bit = 4, sizeInBits = 4, parameters = "minValue=5, maxValue=7")]
[FieldOffset(2)] public byte buttons2;

There is some weirdness in how R2 and Dpad/Up behave, too. Dpad/Up seems to start working when Left or Right are also held, but it doesn't work on its own. I'm assuming it has something to do with the parameters used (00 is Up, 70 is Up+Left, 10 is Up+Right, and 80 is "rest" for the whole dpad), but this is basically how it was laid out in the documentation - and using a HID trace tool shows that this is how it functions.

If I replace the bindings for Dpad directional buttons with Dpad axes ("dpad/up" and "dpad/down" become "dpad/y", left and right = "dpad/x"), then R2 disables all axes whenever it's pressed. Weirder still, if R2 and a Dpad axis are held, pressing L1, R1, or L2 will re-enable the dpad axes. The only clue I have is that the bits for Dpad and R2 are right next to each other, so I must have not configured them correctly. But because they show up correctly in the input debugger and don't reflect any of the weird behavior that I'm experiencing in the editor, I'm at a total loss.