
[Resolved] Horizontal Movement with Keyboard and Touch

Discussion in 'Input System' started by ManuelRauber, Apr 4, 2022.

  1. ManuelRauber

    ManuelRauber

    Joined:
    Apr 3, 2015
    Posts:
    122
    Hi,

    I have a problem solving the following scenario with the new Input System, and I hope someone can help me here.

    I'm currently building a little prototype game where the player only has horizontal movement.

    On desktop with a keyboard, I want to use A and D.
    For touch devices, I want 0-50% of Screen.width to be considered left, while 51-100% is considered right.

    Sounds pretty easy, right? I know how to do it with the old system, and how to do it in the new system with multiple actions and some code involved, but I'm looking for a solution that works directly in the Input System (maybe with a custom processor).

    So, I've set up an action map called "PlayerControls". In it, I have an action called "HorizontalMovement":

    [Screenshot: upload_2022-4-4_16-25-47.png]
    The "AD" is setup as a composite binding:

    [Screenshot: upload_2022-4-4_16-26-16.png]
    The question is, how can I set up touch the way I described above?

    I know that I can use another action to recognize the touch contact and have some code read the touch position from the control.

    But I somehow think it should be possible to combine that directly in the Input System, as I described, right? My "HorizontalMovement" action describes the logical action, and the bindings dictate when that action is triggered.

    Just imagine someone using the input system later: they want to subscribe to HorizontalMovement (or use the generated interface) and should not have to care about touch or keyboard specifics anymore, something like the sketch below.
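
    For example, something like this (just a sketch; "GameControls" is a hypothetical generated C# wrapper for the actions asset, and Move() is a placeholder handler):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.InputSystem;

    // Sketch: the consumer only sees the logical action. "GameControls" is a
    // hypothetical generated wrapper class; Move() is a placeholder handler.
    public class PlayerMovement : MonoBehaviour
    {
        private GameControls _controls;

        void OnEnable()
        {
            _controls = new GameControls();
            _controls.PlayerControls.HorizontalMovement.performed +=
                ctx => Move(ctx.ReadValue<float>());
            _controls.PlayerControls.Enable();
        }

        void OnDisable() => _controls.PlayerControls.Disable();

        private void Move(float direction)
        {
            // -1 = left, +1 = right, regardless of the physical device.
            Debug.Log($"Move: {direction}");
        }
    }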

    Or is my thinking wrong that it should work this way, and is the correct setup via multiple actions with some code to glue them together?

    Thanks!

    Edit:

    Just as the documentation describes:

    [Screenshots from the documentation: upload_2022-4-4_16-53-9.png, upload_2022-4-4_16-52-58.png]



    This is exactly what I'm trying to do: define a logical action (HorizontalMovement) that is physically based on keyboard and touch input.
     


    Last edited: Apr 4, 2022
  2. Rene-Damm

    Rene-Damm

    Joined:
    Sep 15, 2012
    Posts:
    1,779
    One way:

    1. Add a OneModifierComposite with the modifier bound to the primary touch press and the binding bound to the primary touch position X (a code-based version of this setup is sketched after the processor below).
    2. Add a custom processor like the one below on that touch position X binding.
    Code (CSharp):
    #if UNITY_EDITOR
    using UnityEditor;
    #endif
    using UnityEngine;
    using UnityEngine.InputSystem;

    #if UNITY_EDITOR
    [InitializeOnLoad]
    #endif
    public class ScreenHalfProcessor : InputProcessor<float>
    {
        // Registers the processor when the class is loaded (in the editor,
        // [InitializeOnLoad] triggers this).
        static ScreenHalfProcessor()
        {
            InputSystem.RegisterProcessor<ScreenHalfProcessor>();
        }

        // Empty method whose attribute forces the static constructor to run
        // in player builds as well, before the first scene loads.
        [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.BeforeSceneLoad)]
        public static void Init()
        {
        }

        // Maps the touch X position to -1 (left half of the screen)
        // or +1 (right half).
        public override float Process(float value, InputControl control)
        {
            if (value > Screen.width / 2f)
                return 1;
            return -1;
        }
    }
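    For reference, here is a sketch of the same setup done entirely in code rather than in the editor UI (assuming a recent Input System version with OneModifierComposite and the processors argument on With; RegisterProcessor derives the name "ScreenHalf" from the class name by default):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.InputSystem;

    // Sketch: building the "HorizontalMovement" action entirely in code.
    public class HorizontalMovementExample : MonoBehaviour
    {
        private InputAction _horizontalMovement;

        void OnEnable()
        {
            _horizontalMovement = new InputAction("HorizontalMovement");

            // Keyboard: A/D as a 1D axis composite (-1 for A, +1 for D).
            _horizontalMovement.AddCompositeBinding("1DAxis")
                .With("Negative", "<Keyboard>/a")
                .With("Positive", "<Keyboard>/d");

            // Touch: the press acts as the modifier, position X as the value;
            // the custom processor maps X to -1/+1 based on the screen half.
            _horizontalMovement.AddCompositeBinding("OneModifier")
                .With("Modifier", "<Touchscreen>/primaryTouch/press")
                .With("Binding", "<Touchscreen>/primaryTouch/position/x",
                    processors: "ScreenHalf");

            _horizontalMovement.performed +=
                ctx => Debug.Log(ctx.ReadValue<float>());
            _horizontalMovement.Enable();
        }

        void OnDisable() => _horizontalMovement.Disable();
    }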
  3. ManuelRauber

    ManuelRauber

    Joined:
    Apr 3, 2015
    Posts:
    122
    This is it!

    Oh damn, I had not thought about the composite with a modifier, and it makes complete sense.

    Thanks!