
Official "Meet the Devs" -- Post Your Questions

Discussion in 'Input System' started by Rene-Damm, Apr 2, 2020.

  1. Rene-Damm

    Joined:
    Sep 15, 2012
    Posts:
    1,779
    Heya,

    As part of Unite Now, there will be a webinar about the new input system at 8am PDT on April 15.

    There will be a Q&A section in the session. If you have questions that you'd like to see answered there, feel free to post them here.

    Also, there will be a sneak peek of a new and much better demo (including local multiplayer) than our current adaptation of the Tanks demo. It has been worked on by @Andy-Touch, who will also be presenting in the webinar. The demo will be publicly available in the near future.

    You can register for the webinar here.



    ////EDIT: The new demo project that was shown in the webinar can be found here.

    ////EDIT: A recording of the webinar is available here.

    ////EDIT: We're still tracking down the Q&A transcript from the Webinar to get those questions covered as well.
     
    Last edited by a moderator: Apr 23, 2020
  2. mattpwibbu

    Joined:
    Jan 16, 2019
    Posts:
    6
    Can one of the devs please speak to the problems with retrieving the current touch position on mobile? There appear to be multiple open questions about it and I have been attempting to resolve the issue as well. See this post.
     
    Spica24510 and justbb like this.
  3. GilbertoBitt

    Joined:
    May 27, 2013
    Posts:
    111
    Many people want to create local multiplayer. We know the way to do it using the Input Manager and the Player Input component, but many, like myself, want to build it using the generated C# class. How can we achieve this?
     
    kor_ likes this.
  4. dannyalgorithmic

    Joined:
    Jul 22, 2018
    Posts:
    99
    Will you be incorporating touch gesture support any time soon? Particularly highly customizable touch gestures.

    Examples: Swipe, drag, turn, flick
     
    Buzzrick_Runaway and justbb like this.
  5. Baste

    Joined:
    Jan 24, 2013
    Posts:
    6,292
    How do you go about making a rebind-controls menu, including saving and loading the rebinds between play sessions?
     
    Lurking-Ninja and hippocoder like this.
  6. hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Will XR and New Input System ever reconcile their differences? :D
     
    jiraphatK likes this.
  7. Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    3,513
    Hi @Rene-Damm, I know you have been working on pressure and tilt support for pen tablets, like a Wacom.
    I'm mainly interested in the use for editor tools, not so much for runtime.

    Questions:
    • Is Wacom pressure and tilt now working properly in the editor?
    • Do you have any new examples planned that show how to use Wacom pressure in editor tools?

    https://docs.unity3d.com/Packages/com.unity.inputsystem@0.9/manual/Pen.html

    Last time I checked, it did not work for me in the editor tools.
    https://forum.unity.com/threads/input-system-for-pen-stylus-use-with-editor-tools.679960/
     
    transat likes this.
  8. Ferazel

    Joined:
    Apr 18, 2010
    Posts:
    517
    Sorry if this is too angry, but why does it take 5 years to make a new input system? Entire game engines can be built in that time frame. Why does it take Unity so long to make packages that are still functionally incomplete compared to Asset Store alternatives built with significantly fewer resources?
     
    Edy, Alex-CG and JoNax97 like this.
  9. Alesk

    Joined:
    Jul 15, 2010
    Posts:
    339
    My questions:

    - Is it possible to assign multiple local players to a single device, e.g. two players sharing a keyboard or a gamepad? (Yes, I want to do that.)

    - Still on local multiplayer: how do you properly handle UI interactions when one of the players has priority over the others? E.g. player 1 can navigate all menus and options, while the others are restricted to certain menu sections.
     
  10. anthonov

    Joined:
    Sep 24, 2015
    Posts:
    160
    Why is version 1.0 called a preview?
     
  11. opdenis

    Joined:
    Jan 23, 2017
    Posts:
    15
    Will be glad to hear about TOUCH events on the Android platform.
    Especially a ONE-tap filter, to skip the first tap when we are trying to detect a DOUBLE tap. I cannot understand how to filter it with the new InputSystem. I also cannot catch a Long Tap (Press) on the screen.
    Thank you!
     
    Last edited: Apr 9, 2020
    Livealot and justbb like this.
  12. TJHeuvel-net

    Joined:
    Jul 31, 2012
    Posts:
    837
    I'm trying to customise the system a bit and add new interactions. I'm very lost about what a context, action, phase, and `performedAndStayX` are. Some more documentation on these concepts would be very much appreciated.
     
  13. laurentlavigne

    Joined:
    Aug 16, 2012
    Posts:
    6,203
    Upgrading a 4.7 project onward, I faced, within a short span, many generations of breaking changes and shifts in programming patterns.

    Is anything being done to slow this trend? If so, what?
     
  14. Alesk

    Joined:
    Jul 15, 2010
    Posts:
    339
    Another one:

    - a simple demo on how to create a Custom Device in order to use it as a Bot would be nice ;)
     
  15. laurentlavigne

    Joined:
    Aug 16, 2012
    Posts:
    6,203
    This ^^^ guy just gave me a genius idea:

    A group of Unity devs make a game in a few days while streaming non-stop on Twitch.

    Game idea is locked but each comment is answered.

    One account, Unity devs in the background, and all questions fielded by one experienced person who is low-key; I'm thinking someone on the "quality of life" team, preferably someone who has been with the company long enough that they don't fear repercussions :D

    Benefits:
    • People can pick your brains
    • You can see pain points
    • It's fun
     
    Alesk likes this.
  16. Sab_Rango

    Joined:
    Aug 30, 2019
    Posts:
    121
    Will you create 3D and 2D physics that run on the GPU?
     
  17. justbb

    Joined:
    Apr 24, 2019
    Posts:
    6
    Please talk about complete touch controls and the possibilities with touch inputs. Also talk about optimising 3D games for mobile platforms without having to sacrifice quality. Most importantly, please tell us how to use the Profiler for optimising mobile games.
     
  18. dino999z

    Joined:
    Jan 9, 2020
    Posts:
    1
    Hi,

    We are experiencing a strange issue with the new input system on macOS. Whenever we take a screenshot using cmd + shift + 4 while switching action maps, the new input system seems to soft-crash (no errors in the log or indication that anything is broken) until we switch the current action map again. Strangely, left shift will still work.

    To give some further detail: things work normally again once the game view loses focus in the editor (e.g. when one clicks the Console). We've experimented with filtering which inputs are accepted in a single action map and with having separate action maps for UI and Gameplay. The only thing that seems to help is switching to a dummy action map and back to "fix" the soft lock.

    Any ideas into how to prevent this?
     
  19. dev_reimu

    Joined:
    Jun 24, 2017
    Posts:
    10
    What's the best way to reduce draw calls?
     
  20. transat

    Joined:
    May 5, 2018
    Posts:
    779
    What's the state of compatibility with DOTS?
     
  21. Edy

    Joined:
    Jun 3, 2010
    Posts:
    2,495
    - When can we expect it to become a verified package in an LTS version?

    - How can we detect in code that the new InputSystem API is present, so we can ship cross-compatible components and Asset Store products without the end user facing build errors?
     
    Last edited: Apr 15, 2020
  22. Rene-Damm

    Joined:
    Sep 15, 2012
    Posts:
    1,779
    Ok, turns out there were way more questions than we could fit into the remaining time slot, but I'll post answers to every (input-related) question that was asked (here or in the webinar) here.

    BTW, thank you to everyone who attended!

    We have some open tickets for touch. We're working on addressing those.

    Answered here.

    We have three key areas we will be working on next:
    1. DOTS
    2. Gestures
    3. Various improvements to actions (most prominently, "routed" input / "action stacks").
    There's no hard timeline on these yet but these are the focus going forward.

    See the "Rebinding UI" sample for the first part. The second part ATM still needs custom code (details here). An API to easily load and save is on the list as part of key area 3) mentioned above.

    Hehe. We definitely have some asynchrony going on in our package world there ATM and for sure haven't aligned our efforts perfectly. We're working on it.

    For this one, I have some good and some not-so-good news. Or it depends how you look at it, I guess.

    RE pen input in the input system in general: We have some problems with pen input, specifically on Windows, that are currently being worked on by the desktop team. While basic pen support is/should be working as is, there are a number of issues that I hope will all get fixed as part of this.

    RE edit-mode input through the input system: We recently took another look at this and came to the conclusion that having edit-mode support was a bad idea in the first place. As the input system is a game/player-oriented system, it doesn't fit well within the larger context of editor UIs. There's some usefulness to having gamepad input work in EditorWindow code, but for something like a pen you really need proper handling of focus and proper routing of events. In practice, we found that for pretty much every use case people came to us with, just tapping the input system in edit mode was not an adequate solution.

    So, we're seriously thinking about removing dedicated edit-mode support in a future version of the input system.

    HOWEVER... the good news here is that pen support specifically for IMGUI and UI Elements in the editor is actively being worked on ATM. This will be much more useful than piggybacking off the player-focused input system in EditorWindow code. With this done, pressure and tilt information will be directly available from UnityEngine.Event, for example (pressure is already a property, but it isn't actually set properly by the Windows backend code currently).
     
    Lars-Steenhoff and hippocoder like this.
  23. opdenis

    Joined:
    Jan 23, 2017
    Posts:
    15
    Will we see it on the YouTube channel? I was not able to watch it on Zoom.
     
    PLL and Alesk like this.
  24. Alesk

    Joined:
    Jul 15, 2010
    Posts:
    339
    Same here! I missed the schedule :(
     
    PLL likes this.
  25. PLL

    Joined:
    Jul 15, 2013
    Posts:
    19
    Same here. I registered but could not attend. Any chance it was recorded?
     
  26. Rene-Damm

    Joined:
    Sep 15, 2012
    Posts:
    1,779
    It was recorded. Looks like the Zoom recording will be available once it has finished processing and a YouTube video will be uploaded later. I'll post here once I hear more.

    No simple, single answer to that one. Reality can be messy and complicated. Pretty obvious that mistakes were made. Some of them led to a rocky history. Another contributing factor is that developing an add-on on top of Unity that addresses specific needs is one thing; developing a core Unity feature, on the other hand, tends to bring with it a litany of requirements and complications that aren't necessarily readily obvious.

    Anyway. Would we wish for some things to have gone differently? You bet :) We do try to learn our lessons and to improve going forward but no matter what, I'm pretty sure we'll also find new mistakes to make. Life :)

    ATM only through code (like this, for example).

    It's on the list. I think what should happen is that the PlayerInput machinery not only takes device pairings into account here but also control schemes. I.e. if, for example, a "LeftSide" and a "RightSide" control scheme are present and both require a Keyboard device, the system is fine with joining two players on the same device, each using a different control scheme.
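
    A rough sketch of what the code route looks like today (the prefab and the "LeftSide"/"RightSide" scheme names are illustrative; the schemes would be defined in the .inputactions asset):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.InputSystem;

    public static class SharedKeyboardJoin
    {
        // Join two players on the same keyboard by giving each a different
        // control scheme.
        public static void JoinBoth(GameObject playerPrefab)
        {
            var keyboard = Keyboard.current;
            PlayerInput.Instantiate(playerPrefab, controlScheme: "LeftSide", pairWithDevice: keyboard);
            PlayerInput.Instantiate(playerPrefab, controlScheme: "RightSide", pairWithDevice: keyboard);
        }
    }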

    We haven't tried this ourselves, but *theoretically* all that should be necessary is to have one PlayerInput that is unconstrained (i.e. not tied to a separate MultiplayerEventSystem) and one that is constrained (with its MultiplayerEventSystem referring to the subtree of the UI that that player is restricted to).
     
  27. Rene-Damm

    Joined:
    Sep 15, 2012
    Posts:
    1,779
    Heh, yeah, that should never have happened. There should have been previews going up to 0.9.something, and then it should have turned into 1.0 final and non-preview, done.

    The input system was one of the first projects to fully buy into the then emerging package system. Many things were unclear and new and decided on as we went. We're still working out many aspects of that part of Unity. With us being new to package development, we made some calls that were "suboptimal". Going 1.0-preview was one of those.

    In better news, we're currently finalizing 1.0 *non-preview* (there'll be a preview.7 which will hopefully turn into 1.0 non-preview right away) so hopefully we can soon leave that mistake behind us :)

    What we have now isn't set up well to process complex, more "gesturally-natured" inputs. Gestures as a whole are on the high-priority work item list.

    It's possible to set up more complex pointer inputs using composites and to process gestural type input with them (the "Touch Samples" demo does it) but it's hardly elegant or easy.

    Another way to do it with what's there is to, for example, put a "Slow Tap" interaction on the "press" control of a touch and then query the position in the callback.

    Code (CSharp):
    void OnSlowTap(InputAction.CallbackContext context)
    {
        if (context.performed)
        {
            var control = context.control;
            if (control.parent is TouchControl touch)
                Debug.Log($"Slow-tapped at {touch.position.ReadValue()}");
        }
    }
    But... hardly elegant.
     
    opdenis and laurentlavigne like this.
  28. Shaunyowns

    Joined:
    Nov 4, 2019
    Posts:
    328
    We'll be uploading it later on, keep an eye out!
     
  29. Rene-Damm

    Joined:
    Sep 15, 2012
    Posts:
    1,779
    Noted. Added an item in the backlog (ISX-353) to improve docs on writing custom composites and interactions.
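
    Until those docs land, here is a compressed sketch of a custom interaction against the public IInputInteraction interface (the name and timing are made up for illustration):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.InputSystem;

    // Performs once a control has been actuated for 0.3 seconds; cancels on release.
    public class QuickHoldInteraction : IInputInteraction
    {
        public float duration = 0.3f; // illustrative parameter

        public void Process(ref InputInteractionContext context)
        {
            if (context.timerHasExpired)
            {
                context.Performed();
                return;
            }

            switch (context.phase)
            {
                case InputActionPhase.Waiting:
                    if (context.ControlIsActuated())
                    {
                        context.Started();
                        context.SetTimeout(duration);
                    }
                    break;

                case InputActionPhase.Started:
                    if (!context.ControlIsActuated())
                        context.Canceled();
                    break;
            }
        }

        public void Reset() { }

        // Interactions must be registered before any action references them.
        [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.BeforeSceneLoad)]
        static void Register() => InputSystem.RegisterInteraction<QuickHoldInteraction>();
    }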

    Great idea. Noted in backlog (ISX-354).

    OT. But good idea either way.

    Once we have gestures in there, I think we'll have much better news in this area. For now, we're still a long way from something like LeanTouch.

    Could you file a ticket with the Unity bug reporter? Think our QA should have a closer look. Sounds buggy.

    ATM there's no dedicated support. Recommended way ATM is to pick up input in your ComponentSystem's Update method (like from any other main-thread-constrained API) and then go from there.

    DOTS-specific input support is one of the three areas we want to be focusing on next.
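
    A minimal sketch of that interim approach (MoveInput is a hypothetical component; assumes the Entities package's ComponentSystem):

    Code (CSharp):
    using Unity.Entities;
    using UnityEngine;
    using UnityEngine.InputSystem;

    // Hypothetical ECS component holding the sampled stick value.
    public struct MoveInput : IComponentData
    {
        public Vector2 Value;
    }

    // Sample the device on the main thread in OnUpdate and copy the value
    // into ECS data for other systems/jobs to consume.
    public class GamepadSampleSystem : ComponentSystem
    {
        protected override void OnUpdate()
        {
            var gamepad = Gamepad.current;
            if (gamepad == null)
                return;

            var move = gamepad.leftStick.ReadValue();
            Entities.ForEach((ref MoveInput input) => input.Value = move);
        }
    }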

    It's imminent. The 2019 LTS release will be the one that we'll keep targeting with the 1.x version of the input system and we intend to backport suitable native-side fixes for as long as we can.

    There are two methods currently at your disposal.

    "Active Input Handling" in the Player project preferences is tied to two #defines. ENABLE_INPUT_SYSTEM is defined if it's set to "Input System Package" or "Both" and ENABLE_LEGACY_INPUT_MANAGER is defined if it's set to "Input Manager (Legacy)" or "Both".

    However, this isn't tied to whether the input system *package* is actually used by the project. For that, you can use the "Version Defines" feature of asmdef files. Simply add a version define that is tied to the "com.unity.inputsystem" resource and set the minimum version that you want to support. Then you can have code behind an #if that automatically becomes active as soon as a user adds the input system to the project.
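
    A minimal sketch of the #define half of this (the class and log messages are placeholders; remember ENABLE_INPUT_SYSTEM only reflects the project setting, not whether the package is installed):

    Code (CSharp):
    #if ENABLE_INPUT_SYSTEM
    using UnityEngine.InputSystem;
    #endif
    using UnityEngine;

    public class CrossInputReader : MonoBehaviour
    {
        void Update()
        {
    #if ENABLE_INPUT_SYSTEM
            // "Input System Package" or "Both" is active.
            if (Keyboard.current != null && Keyboard.current.spaceKey.wasPressedThisFrame)
                Debug.Log("Jump (input system package)");
    #elif ENABLE_LEGACY_INPUT_MANAGER
            // "Input Manager (Legacy)" is active.
            if (Input.GetKeyDown(KeyCode.Space))
                Debug.Log("Jump (legacy input manager)");
    #endif
        }
    }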
     
    Last edited: Apr 15, 2020
    Edy likes this.
  30. Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    3,513
    I think it's good news. Is there any place on the forum I should watch for when this is implemented for IMGUI and UI Elements? I know that on macOS the pressure has already worked for some time:

    https://docs.unity3d.com/ScriptReference/Event-pressure.html

    Good to hear it's coming to Windows too, so more tools will be able to make use of it by default.
     
  31. Alesk

    Joined:
    Jul 15, 2010
    Posts:
    339
    Thanks for all these answers :)
     
  32. albrechtjess

    Joined:
    Oct 11, 2017
    Posts:
    11
    I was wondering if you had any examples or tutorials on modifying the PlayerInputManager? I'm trying to better understand how it works, as I'm building a split-screen shooter and need more control over what happens when the PlayerInputManager spawns a player after a button is pressed to join.
     
  33. X3doll

    Joined:
    Apr 15, 2020
    Posts:
    34
    Is there a way to bind the d-pad and WASD separately? It seems that the Input System assumes "One Device --> One Player" handling, but I don't think it should be like this. You can, for example, have one controller for 2 players and so on... or one big custom input device that handles two players separately but is still a single device.
     
  34. Edy

    Joined:
    Jun 3, 2010
    Posts:
    2,495
    @Rene-Damm Thank you so much for your reply! Indeed, it was clarifying on several levels. I'll begin adopting the new Input System straight away.
     
    Rene-Damm likes this.
  35. Livealot

    Joined:
    Sep 2, 2013
    Posts:
    228
    I checked out several of the Unite Now offerings this week, and it seems like y'all are experimenting with various formats, which is cool.

    For me, the format of this session was FAR AND AWAY the BEST of the lot. Please do more like this in the future!
    • Live demo
    • Multiple Live speakers
    • Moderated live Q&A
    • Recorded to YouTube for those who missed it or want to re-watch it
    • A forum for follow up
    Wonderful!
     
  36. Livealot

    Joined:
    Sep 2, 2013
    Posts:
    228
    For mobile games, how should we think about using InputManager vs. using IDragHandler and related input interfaces in scripts?

    Will/Does InputManager replace all those interfaces? Do we use some combination?

    It would be nice to see an example of how to use it in a basic mobile game that uses all the common mobile touch inputs.
     
  37. Rene-Damm

    Joined:
    Sep 15, 2012
    Posts:
    1,779
    A recording of the webinar is available now.

    We have several open tickets in the backlog for improving PlayerInputManager -- which at this point is still a rather basic lobby implementation. One item on the list is opening up prefab spawning to allow things such as selecting a prefab dynamically. Hopefully the docs/samples will also improve as part of this work.

    Each player can have arbitrarily many devices. Players can also share devices, but PlayerInputManager isn't yet smart enough to join two players on the same device even when the players use different control schemes. It's on the list.

    Thank you :)

    Those mechanisms will stay valid and useful.

    In general, I would recommend doing UI things purely through UI mechanisms. E.g. if the input problem you're looking at is something like "did the user click/tap this 2D screen element", then IPointerClickHandler is generally a better solution than picking that up directly through InputSystem.
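
    As a tiny sketch of that recommendation (the component name is a placeholder; the scene needs an EventSystem with the input system's UI input module):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.EventSystems;

    // Attach to a UI Graphic (e.g. an Image with raycast target enabled).
    public class TapTarget : MonoBehaviour, IPointerClickHandler
    {
        public void OnPointerClick(PointerEventData eventData)
        {
            Debug.Log($"Clicked/tapped at {eventData.position}");
        }
    }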

    ATM our touch input support is still pretty basic. We hope to address that as part of the upcoming work on gestures and then have better examples, too. We do have the "Touch Samples" demo (installable from the package manager UI) but admittedly, the samples provided aren't the most straightforward ones.
     
    albrechtjess, Baste and Livealot like this.
  38. dCalle

    Joined:
    Dec 16, 2013
    Posts:
    55
    Hey guys. I'm currently working with the action asset approach, but I couldn't find a way to disable an input scheme (mouse/keyboard) or to change the default scheme and prevent switching. Especially for testing, this is pretty annoying...

    AND: it would be nice if we could change some input settings without having to stop the player, change the settings, wait for a recompile, and then restart the player. That's damn ridiculous.

    So I'd say: instead of BAKING data into the Input class, how about putting it into a JSON file, where it belongs? Obviously, if you have to generate some code based on that data, do so, but much of that baked data is just a method call, like normalizing a vector or adding a deadzone, or many other actions I haven't found yet. You don't need to regenerate a class for that and slow down the whole workflow. (I'm not sure, but you could probably externalize the whole process into another module and just add it to the (Player, maybe Editor) assembly once it has finished regenerating; that way you don't have to recompile THE WHOLE GAME. In short, I think the Input Actions window shouldn't be much more than a JSON editor that calls an external regenerator whose generated module gets added to the executing assembly.)

    If you need some help, I'm happy to ;-)
     
    Last edited: Apr 20, 2020
  39. opdenis

    Joined:
    Jan 23, 2017
    Posts:
    15
    Thank you, but I still can't catch a Press/Long Tap on the Android touch screen.
    This code is not working for me.
    P.S. How can I see Debug output in the Editor while I run the application on the device? Is the only way to see the log from Android to use Android Studio?
    I don't get any logs that way.
     
    Last edited: Apr 21, 2020
  40. Rene-Damm

    Joined:
    Sep 15, 2012
    Posts:
    1,779
    There's not currently a way to disable individual schemes.

    Select scheme as default and disable "Auto-Switch".

    Yes, room for improvement. Backlog item ISX-358.

    What's the binding?
     
    dCalle likes this.
  41. opdenis

    Joined:
    Jan 23, 2017
    Posts:
    15
    Action: "ActionLongTap"
    Properties:
    Action -> Action type: Pass Through
    Action -> Control Type: Touch

    Bindings:
    Primary Touch [Touchscreen]
    Binding: Path = <Touchscreen>/primaryTouch
    Interactions:
    Hold / Slow Tap / Press - nothing catches. Only Tap/Double Tap work this way.

    Code (CSharp):
    public void LongTapLogEvent(InputAction.CallbackContext context)
    {
        Output("...LongTapLogEvent\n"); // This is not working!!!
        if (context.performed)
        {
            var control = context.control;
            Debug.Log("Slow Tapped ****");

            if (control.parent is TouchControl touch)
            {
                Debug.Log($"Slow-tapped at {touch.position.ReadValue()}");
            }
        }
        else
        {
            Debug.Log("Long tap::Somewhere here");
        }
    }
     


  42. Rene-Damm

    Joined:
    Sep 15, 2012
    Posts:
    1,779
    Questions from the Q&A session that didn't get answered in the webinar due to running out of time.

    Writing custom devices is supported. General info here. HID-specific info here.
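
    To give a flavor of what those docs cover, a compressed sketch of a custom device (state format, names, and layout are illustrative only):

    Code (CSharp):
    using UnityEngine.InputSystem;
    using UnityEngine.InputSystem.Controls;
    using UnityEngine.InputSystem.Layouts;
    using UnityEngine.InputSystem.LowLevel;
    using UnityEngine.InputSystem.Utilities;

    // Illustrative memory layout for a custom "bot" device.
    public struct BotDeviceState : IInputStateTypeInfo
    {
        public FourCC format => new FourCC('B', 'O', 'T', 'D');

        [InputControl(name = "button", layout = "Button")]
        public byte button;
    }

    [InputControlLayout(stateType = typeof(BotDeviceState))]
    public class BotDevice : InputDevice
    {
        public ButtonControl button { get; private set; }

        protected override void FinishSetup()
        {
            base.FinishSetup();
            button = GetChildControl<ButtonControl>("button");
        }
    }

    // Usage sketch: register the layout, add a device, feed simulated input.
    //   InputSystem.RegisterLayout<BotDevice>();
    //   var bot = InputSystem.AddDevice<BotDevice>();
    //   InputSystem.QueueStateEvent(bot, new BotDeviceState { button = 1 });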

    The style of the connected controller is available via the Npad API found in the Switch-specific package.

    Support for Switch, PS4, and Xbox is available via dedicated packages made available to licensees directly via the respective forums. (NDA stuff...)

    Probably makes sense for us to generate the code with an #if by default. I've made a note.

    What you can do ATM is put the code in a custom DLL via an .asmdef and conditionalize that to the presence of the input system. See bottom of my post here.

    Touch can either be bound in actions like other input or polled via the EnhancedTouch API.
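
    A minimal sketch of the polling route via EnhancedTouch (the class name is a placeholder):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.InputSystem.EnhancedTouch;
    using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;

    public class TouchPoller : MonoBehaviour
    {
        // EnhancedTouch must be explicitly enabled before polling.
        void OnEnable() => EnhancedTouchSupport.Enable();
        void OnDisable() => EnhancedTouchSupport.Disable();

        void Update()
        {
            foreach (var touch in Touch.activeTouches)
                Debug.Log($"Touch {touch.touchId} at {touch.screenPosition}");
        }
    }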

    Messages in general have much greater overhead than UnityEvents. They need a roundtrip through the Unity runtime and expensive lookups (although there is some caching performed by the runtime), plus dispatch through reflection. UnityEvents are handled entirely in C# and do not involve reflection.

    For "Invoke C# Events", it is necessary to manually hook into PlayerInput.onActionTriggered.

    Not up-to-date on where things are there ATM. I've inquired and will update my response as soon as I know more.

    Code (CSharp):
    Gamepad.current.leftStick.ReadValue();
    Not as such. If you want to test without having a gamepad present, you can build a gamepad from the on-screen controls. Alternatively, you can easily build your own simulation support in code.

    For XInput devices on Windows, we're still limited by the 4 gamepad limit hardcoded into the XInput API. When the UWP and classic Win32 backends merge (and the XInput API backend disappears), this restriction will hopefully disappear.

    There's no automatic carrying over of players. Lifetime of PlayerInput instances is tied to the lifetime of their GameObjects. You can put the players into a separate scene or mark their GameObjects as DontDestroyOnLoad in order for the players to carry over from scene to scene.

    @StayTalm_Unity ?

    The team working on the simulator is looking into that.

    @StayTalm_Unity

    Dedicated ECS/DOTS support is going to be worked on soon. For now, it can be used in ComponentSystem Update methods like other main-thread-constrained APIs.

    The usages feature intends to solve this and already does to some extent. If you bind to the "Submit" usage, the expectation is for the controller to pick the right control for the current platform and hardware. That said, I just noticed that the Switch controller does indeed *not* set the usage correctly. We'll fix that.

    Eventually, the expectation is that usages will get consistently assigned everywhere and that you *will* get the right controls based on platform, region and user preferences.

    Automatic switching is currently restricted to single player scenarios. Device switching with multiple players currently requires explicit API calls. Allowing controlled auto-switches in multiplayer, too, is on the list.

    Essentially, yes. The setup cost is intended not to be significantly greater than with the old system. There is a zero-setup code path equivalent to the polling APIs in UnityEngine.Input and a three-click setup path for PlayerInput.

    That said, for the foreseeable future, using the old APIs will likely remain an option.

    @StayTalm_Unity ?

    Unity's own tutorials will start using the input system going forward. In general, though, we expect the transition to be gradual. The old APIs have a lot of momentum behind them and it'll take a while for the ecosystem to transition.

    You can add support for custom devices (see first question in this post) and add custom binding composites and interactions. Support for gestures (which will include defining/implement custom gesture recognition) is coming.

    As for replacing the managed part of the system wholesale with a custom input system implementation, it's theoretically possible but will require some reading through the code to understand the various undocumented (and mostly binary) formats involved.

    Yes.

    I assume you mean pen support? There is pen support on various platforms though (especially on Windows) we still have issues.

    We're looking into this as part of an effort towards deprecation of the old system. What we'd like to have is a wrapper for the old input APIs which underneath uses the input system. It'd not be 100% as behavior *will* inevitably change (the old system has lots of platform-specific idiosyncrasies) but it should be good enough for the majority of cases.
     
  43. Rene-Damm

    Joined:
    Sep 15, 2012
    Posts:
    1,779
    These interactions only work with button-type controls ATM. Binding them to an entire touch will not work correctly.
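
    In other words, point the binding at the press control rather than at the touch as a whole. A small illustrative fragment:

    Code (CSharp):
    // Bind the Hold interaction to the button-like press control of the
    // primary touch instead of to <Touchscreen>/primaryTouch itself.
    var holdAction = new InputAction(type: InputActionType.Button);
    holdAction.AddBinding("<Touchscreen>/primaryTouch/press", interactions: "hold");
    holdAction.Enable();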

    The editor should be filtering out those actions in this case. Will fix.

    Once the gesture stuff is in, setting these kinds of things up in actions should become much easier. ATM things like touch taps are painful, as the tap needs to sit on just the press/contact control while the touch input code will want the touch position, too.
     
  44. opdenis

    Joined:
    Jan 23, 2017
    Posts:
    15
    Thank you! I will wait for the announcement.
    I hope it will be soon.
     
  45. carl010010

    Joined:
    Jul 14, 2010
    Posts:
    139
    @Rene-Damm I was looking at the samples that can be downloaded from the package manager, and the editor window sample seems to be missing. It's there in the video at 11:26, but it's not in the package manager now. Do you have a direct link to it?

    And please, please tell me this isn't true. I have been waiting for years to be able to read input from a gamepad in edit mode, and now that we finally have it you're thinking about cutting it?
     
  46. Finally a demo where it is actually discussed that the default "SendMessage" method is abysmal at best! :D
     
  47. Baste

    Joined:
    Jan 24, 2013
    Posts:
    6,292
    The problem with that is that the input part of the Event system we usually have to rely on for reading input in the editor is an outdated, ill-conceived piece of trash. I mean, the whole Event thing is bad, but reading input especially so. We can't actually do useful things there.

    So no matter how poorly the new input system integrates into the editor, it would be better than what we have.
     
  48. WAYNGames

    Joined:
    Mar 16, 2019
    Posts:
    982
    Hello,

    I'm making a top-down shooter style game. Using the new input system is great and makes it easy to support both keyboard and gamepad. Great job! :)

    My question is: how would you go about making a directional input relative to the player?

    See, to aim I use the mouse to get a world position by casting a ray from the camera into the world, so I get the mouse position just fine for the keyboard setup.
    But for the gamepad I use the right stick, which just gives a -1 to 1 Vector2. For now I made a custom processor that projects that input onto the border of the screen using Screen.width and Screen.height. That works fine in a single-player scenario, but in local multiplayer the screen is not the player's camera view, so the input gets a bit distorted.

    Is there a way to access the Camera defined in the player's PlayerInput component from a custom processor, so that I can base the input projection on the player's actual "view" screen?

    Custom processor code:
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.InputSystem;
    #if UNITY_EDITOR
    using UnityEditor;
    #endif

    #if UNITY_EDITOR
    [InitializeOnLoad]
    #endif
    public class ToPointerPosition : InputProcessor<Vector2>
    {
    #if UNITY_EDITOR
        static ToPointerPosition()
        {
            Initialize();
        }
    #endif

        [RuntimeInitializeOnLoadMethod]
        static void Initialize()
        {
            InputSystem.RegisterProcessor<ToPointerPosition>();
        }

        public override Vector2 Process(Vector2 value, InputControl control)
        {
            // Map stick input from [-1, 1] to screen coordinates.
            value.x = (value.x + 1) * (Screen.width / 2);
            value.y = (value.y + 1) * (Screen.height / 2);
            return value;
        }
    }
    I read some stuff about the virtual mouse; can that help in this context?
     
    Last edited: Apr 24, 2020
  49. X3doll

    Joined:
    Apr 15, 2020
    Posts:
    34
    This is the mouse vs. gamepad stick input distortion. A mouse is way faster and more sensitive than a game stick, so you need to normalize the mouse input when its magnitude is > 1 (a controller's Vector2 is capped there, but a mouse delta isn't).

    Code (CSharp):
    if (rotation.magnitude > 1)
    {
        rotation.Normalize();
    }
    This way you get every value between 0 and 1 and also set a maximum magnitude for the input Vector2 (in the case of a mouse it will be capped at 1).

    If you want to set a global sensitivity for both mouse and gamepad, you can multiply this vector, after the check, by a variable.

    Also, for your problem, you really need to rotate the player to face the forward vector direction.
    You should not see it as "my pointer is here, so look here" but as "OK, I moved the mouse in this direction, so look in this direction".

    The input Vector2 says (in x and y) which direction you should put into the x and z of a direction vector. This vector can be retrieved from the mouse delta and/or the stick of a gamepad.

    If you need to keep the camera around the player, simply normalize this looking direction and multiply the normalized vector by a "range variable" for looking around.
     
    Last edited: Apr 24, 2020
  50. WAYNGames

    Joined:
    Mar 16, 2019
    Posts:
    982
    Maybe I was not precise enough in my description. I'm using a feature that, if I'm not mistaken, was not presented in the talk: split screen with "keep aspect ratio". So when I have 2 players in a local multiplayer game, they each have 1/4 of the full screen.

    Each player controls a tank, so they can move in a direction (no issue there, as everything is a normalized vector direction, ZQSD (yes, I'm French...) and left stick) and the tank can aim in another direction.
    For the mouse user it's much more intuitive to have the tank aim where the mouse actually is on the screen, so I take the screen coordinate and cast against the world from the player's camera. In that case there is no issue, because the player can put the mouse anywhere he wants on the screen, including the 1/4 of the screen that's his.

    On the other hand, for a gamepad user it's much more intuitive to have the tank aim in the direction the player holds the right stick. The issue is that the right stick only gives a -1 to 1 Vector2 range. I could make it relative to the player easily, but it would mean having different logic for the mouse user and the gamepad user. So I need to convert the gamepad input into a pointer position on the screen to be able to apply the same raycast logic.
    The issue then becomes that I don't have access to the player's viewport, so the input gets mapped outside the 1/4 of the player's screen.

    I did not try the logic the other way around (converting the mouse position to a -1 to 1 vector), but I suspect I'd end up with the same issue and the tank would aim at the mouse position relative to the center of the screen rather than relative to the player.

    I see no way around having a reference to the camera in the processors to be able to make inputs relative to a player's viewport.

    If you want to try it out, you can look at my full project on GitHub:
    https://github.com/WAYNGROUP/MGM
    If you press play, you can join by pressing any button on the keyboard or gamepad. You'll see the issue with the gamepad user: if he joins first, when there are two players the aiming is much more sensitive in the north-west corner, and if he joins second, it's more sensitive in the north-east corner.

    A word of caution though: it uses the DOTS stack and my own packages (all open source and free), so it's a lot of experimental work :p.