
Getting Touch Actions with New Input System

Discussion in 'Android' started by NoarK-, May 9, 2020.

  1. NoarK-

    NoarK-

    Joined:
    Jul 20, 2019
    Posts:
    2
    Hey guys, sorry if this has been asked many times, but honestly I searched a lot and couldn't find out why it doesn't work.

    I'm just starting with Android development. The old input works fine, and I've already worked with the new Input System using keyboard, mouse, and joystick, but now I'm trying to get touch input and it doesn't work.

    Just for testing, I created a function to show the point I'm pressing, but the function never gets called.

    Am I doing something wrong?
    This is my code:

    Code (CSharp):
    // I'm setting this on Awake, so the TouchPoint action should call CallSomething()
    controller.Player.TouchPoint.performed += x => CallSomething();
    And this is the CallSomething() function:
    Code (CSharp):
    private void CallSomething()
    {
        Vector3 touch = Touchscreen.current.position.ReadValue();
        Debug.Log($"Touch Screen Position: {touch}");
        var world = Camera.main.ScreenToWorldPoint(touch);
        Debug.Log($"Touch World Position: {world}");
        Debug.DrawLine(world, world + Vector3.one, Color.magenta, 5f);
    }
    The function doesn't even get called and I don't know why.
    In the Input System window I have an action named "TouchPoint" and I've tried a lot of different bindings, but nothing works. I tried Touchscreen Press, #0, #1, #2, etc., and every type of interaction like Tap, Touch, Indirect Touch, etc., but nothing works and I don't know why!

    And yes, I enable my controller in OnEnable(), so that part is fine.

    Am I doing something wrong, or do I need to call something that I'm not calling?

    Thank you!
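    For reference, here is a minimal complete setup of the kind described above, with the two steps that are most often missed (instantiating the generated class, and enabling/disabling it) spelled out. This is a sketch: `TestController` and the `Player/TouchPoint` action names are assumptions based on this thread, not a confirmed project layout.

    ```csharp
    using UnityEngine;
    using UnityEngine.InputSystem;

    public class TouchTest : MonoBehaviour
    {
        // Generated C# class from the .inputactions asset (name assumed).
        private TestController controller;

        private void Awake()
        {
            controller = new TestController();
            // Subscribe before enabling so no callbacks are missed.
            controller.Player.TouchPoint.performed += OnTouchPoint;
        }

        // Actions fire no callbacks at all until the map is enabled.
        private void OnEnable() => controller.Enable();
        private void OnDisable() => controller.Disable();

        private void OnTouchPoint(InputAction.CallbackContext context)
        {
            Debug.Log($"Touch Screen Position: {Touchscreen.current.position.ReadValue()}");
        }
    }
    ```
    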
     
  2. kingofraytown

    kingofraytown

    Joined:
    Jul 16, 2014
    Posts:
    1
    I had the opposite issue: I could get a touch action to be called, but I didn't know how to get the position info from the touch until I saw your CallSomething method.

    In the Input System, I made an action with Action Type "Value" and Control Type "Touch".

    TouchPoint 1.png

    Then I used the "Primary Touch" binding under Touchscreen.

    TouchPoint 2.png

    I am using Input System 1.0.0 and Unity 2019.3.0f3

    I hope this helps.
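    With an action configured as Value/Touch like this, the whole touch state (position included) can be read in the callback as a `TouchState`. A sketch, assuming the same "TouchPoint" action name used elsewhere in this thread:

    ```csharp
    using UnityEngine;
    using UnityEngine.InputSystem;
    using UnityEngine.InputSystem.LowLevel; // TouchState lives here

    public class TouchValueReader : MonoBehaviour
    {
        // Assumes a "TouchPoint" action with Action Type "Value" and
        // Control Type "Touch", bound to <Touchscreen>/primaryTouch.
        public void OnTouchPoint(InputAction.CallbackContext context)
        {
            TouchState touch = context.ReadValue<TouchState>();
            Debug.Log($"Phase: {touch.phase}, Position: {touch.position}");
        }
    }
    ```
    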
     
  3. msibrava

    msibrava

    Joined:
    Mar 18, 2019
    Posts:
    7
    I set this up and got it to work once I changed the TouchPoint action to give a Vector2 value; Unity threw exceptions when I used Vector3.
    Code (CSharp):
    void Awake()
    {
        controller = new TestController();
        controller.Player.TouchPoint.performed += x => CallSomething(x.ReadValue<Vector2>());
    }

    private void CallSomething(Vector2 touch)
    {
        Debug.Log($"Touch Screen Position: {touch}");
        var world = Camera.main.ScreenToWorldPoint(touch);
        Debug.Log($"Touch World Position: {world}");
        Debug.DrawLine(world, world + Vector3.one, Color.magenta, 5f);
    }
    upload_2020-10-14_0-11-47.png

    upload_2020-10-14_0-12-28.png
    Thought I'd also ask: do you have the EventSystem set up for the new Input System, with the InputActions wired to the Actions Asset? (That got me for a bit.)

    Input System 1.0.0
    Unity 2020.1.8f1
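    On the EventSystem point: with the new Input System, the EventSystem GameObject needs an InputSystemUIInputModule instead of the legacy StandaloneInputModule (the inspector offers a one-click replacement). A sketch of doing the same swap from code, in case anyone needs it at runtime; the class name `EventSystemFixer` is made up for illustration:

    ```csharp
    using UnityEngine;
    using UnityEngine.EventSystems;
    using UnityEngine.InputSystem.UI;

    public static class EventSystemFixer
    {
        // Replaces the legacy input module on the active EventSystem, if present.
        public static void UseNewInputSystem()
        {
            var eventSystem = Object.FindObjectOfType<EventSystem>();
            if (eventSystem == null) return;

            var legacy = eventSystem.GetComponent<StandaloneInputModule>();
            if (legacy != null) Object.Destroy(legacy);

            if (eventSystem.GetComponent<InputSystemUIInputModule>() == null)
                eventSystem.gameObject.AddComponent<InputSystemUIInputModule>();
        }
    }
    ```
    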
     
  4. fkhaller

    fkhaller

    Joined:
    Nov 18, 2020
    Posts:
    3
    @kingofraytown
    I could not get the touch control type to work for me like that. I could only map the child controls of the TouchControl. https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/manual/Touch.html

    @msibrava
    I tried that as well, but that setup picks up multiple inputs when dragging your finger across the touchscreen. I found that setting the action type to Button and using the "<Touchscreen>touch*/press" binding gave me the behavior I wanted.
    touchscreen_input_1.PNG
    You can then get the parent of the control and use it to read the position. In my case I had mouse input as well, so I did this:

    Code (CSharp):
    if (context.control.parent is TouchControl control)
    {
        SetWorldDirection(control.position.ReadValue());
    }
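    Expanded into a full callback handling both mouse and touch along the lines of the snippet above — a sketch, where the class name, method name, and the extra mouse binding are assumptions:

    ```csharp
    using UnityEngine;
    using UnityEngine.InputSystem;
    using UnityEngine.InputSystem.Controls;

    public class PressHandler : MonoBehaviour
    {
        // Assumed bound to a Button action with the "<Touchscreen>touch*/press"
        // binding, plus a "<Mouse>/leftButton" binding for editor testing.
        public void OnPress(InputAction.CallbackContext context)
        {
            if (!context.performed) return;

            if (context.control.parent is TouchControl touch)
            {
                // The pressed control is .../press; its parent TouchControl
                // carries the position of that same finger.
                Debug.Log($"Touch at {touch.position.ReadValue()}");
            }
            else if (Mouse.current != null)
            {
                Debug.Log($"Mouse at {Mouse.current.position.ReadValue()}");
            }
        }
    }
    ```
    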
     
  5. Steedalion

    Steedalion

    Joined:
    Jul 6, 2020
    Posts:
    51
    I just had this problem of nothing happening: I had forgotten to enable the controller. This seems like a common mistake.

    Code (CSharp):
    protected void OnEnable()
    {
        controller.Enable();
    }
     
  6. Goatfryed

    Goatfryed

    Joined:
    Feb 26, 2023
    Posts:
    1
    This worked nicely for me:
    upload_2023-2-26_15-39-55.png

    This way, I can easily read the position as a Vector2 with ReadValue in both cases.
    The trick is to use a composite for the mouse because, for whatever reason, the mouse doesn't support click values by default, while touch does.
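    For readers without the screenshot: what is described sounds like a composite where the mouse button acts as a modifier on the mouse position, so the action only produces a Vector2 while the button is held, matching what touch gives on press. A sketch building an equivalent binding in code — the action name is made up, and the "OneModifier" composite assumes Input System 1.1 or newer:

    ```csharp
    using UnityEngine;
    using UnityEngine.InputSystem;

    public class PointerActionBuilder : MonoBehaviour
    {
        private InputAction pointAction;

        private void Awake()
        {
            pointAction = new InputAction("PointerPress", InputActionType.Value);

            // Touch already delivers a position on press.
            pointAction.AddBinding("<Touchscreen>/primaryTouch/position");

            // Mouse needs a composite: position only while the button is held.
            pointAction.AddCompositeBinding("OneModifier")
                .With("Modifier", "<Mouse>/leftButton")
                .With("Binding", "<Mouse>/position");

            pointAction.performed += ctx =>
                Debug.Log($"Pointer: {ctx.ReadValue<Vector2>()}");
        }

        private void OnEnable() => pointAction.Enable();
        private void OnDisable() => pointAction.Disable();
    }
    ```
    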
     

  7. DevTom81

    DevTom81

    Joined:
    Aug 10, 2021
    Posts:
    11
    Unfortunately that doesn't fix my problem. When I tap or drag on the screen, it immediately switches from the touch scheme to the mouse scheme, even though I'm not using the mouse. So it detects the mouse as the current device, not the touchscreen.

    Auto-Switch in PlayerInput is enabled.
     
    Last edited: Jun 30, 2023