
Touch input in the Editor

Discussion in 'Input System' started by Somnesis, Feb 27, 2020.

  1. Somnesis

    Somnesis

    Joined:
    Feb 27, 2020
    Posts:
    22
    Hi!

    I have a Surface Book 2 and would like to use its touchscreen to test touch input in the editor; however, this doesn't seem to work.

    1. Using the Input Debugger window, I can see that everything works there. When I touch the screen and move my finger around, I get deltas from the primary touch and touch0.
    2. However, when I try to bind Primary touch/Delta like this:
    [Attached screenshot: upload_2020-2-27_12-27-48.png]
    I'm not getting any response in my game when reading the value from the View action (see the sketch below this list).
    3. Using other input (like the keyboard and gamepad in the screenshot above) does work.
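
    For reference, here's roughly the kind of read involved (a minimal sketch; it assumes a PlayerInput component with a Vector2 action named "View", so the exact setup is an assumption):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.InputSystem;

    public class ViewReader : MonoBehaviour
    {
        // Assumes a PlayerInput component on the same GameObject whose
        // action map contains a Vector2 action named "View".
        private InputAction viewAction;

        void Awake()
        {
            viewAction = GetComponent<PlayerInput>().actions["View"];
        }

        void Update()
        {
            // With a Primary Touch/Delta binding this is expected to report the
            // touch delta, but it stays at (0, 0) when using the touchscreen.
            Vector2 delta = viewAction.ReadValue<Vector2>();
            if (delta != Vector2.zero)
                Debug.Log(delta);
        }
    }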

    Thanks!
     


  2. Rene-Damm

    Rene-Damm

    Unity Technologies

    Joined:
    Sep 15, 2012
    Posts:
    1,779
    What's the debugger saying for the "View" action when you are in play mode? Does it list the touch delta control under the action?
     
  3. Somnesis

    Somnesis

    Joined:
    Feb 27, 2020
    Posts:
    22
    So, interestingly, it doesn't list the delta, only the keyboard mappings:
    [Attached screenshot: upload_2020-3-6_10-26-12.png]

    I noticed that every time I try to interact with the touchscreen, the control scheme automatically switches over to "Keyboard & Mouse", *not* "Touchscreen". I tried adding the delta binding to the "Keyboard & Mouse" control scheme, but it still doesn't show up in the debugger's list.
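
    To see when the switch happens, something like this can log the active control scheme (a minimal sketch, assuming a PlayerInput component; purely for diagnosis):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.InputSystem;

    public class ControlSchemeLogger : MonoBehaviour
    {
        void OnEnable()
        {
            GetComponent<PlayerInput>().onControlsChanged += OnControlsChanged;
        }

        void OnDisable()
        {
            GetComponent<PlayerInput>().onControlsChanged -= OnControlsChanged;
        }

        void OnControlsChanged(PlayerInput input)
        {
            // Fires whenever PlayerInput switches devices/control schemes,
            // e.g. printing "Keyboard&Mouse" instead of "Touchscreen" on touch.
            Debug.Log($"Control scheme is now: {input.currentControlScheme}");
        }
    }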
     
  4. Somnesis

    Somnesis

    Joined:
    Feb 27, 2020
    Posts:
    22
    If I manually and forcibly switch the control scheme over to Touchscreen, then everything works:
    [Attached screenshot: upload_2020-3-6_10-30-37.png]

    But the touch delta still doesn't show up under Keyboard & Mouse, even though I added that control scheme to the binding.
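
    For completeness, the forced switch can also be done from code, roughly like this (a minimal sketch, assuming a PlayerInput component and that a touchscreen device is present):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.InputSystem;

    public class ForceTouchScheme : MonoBehaviour
    {
        void Start()
        {
            // Explicitly pair PlayerInput with the touchscreen and switch to the
            // "Touchscreen" control scheme instead of relying on auto-switching.
            if (Touchscreen.current != null)
            {
                GetComponent<PlayerInput>().SwitchCurrentControlScheme(
                    "Touchscreen", Touchscreen.current);
            }
        }
    }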
     
  5. Rene-Damm

    Rene-Damm

    Unity Technologies

    Joined:
    Sep 15, 2012
    Posts:
    1,779
    My guess is this is because our Windows backend isn't cleanly separating the inputs yet. Meaning, there's input on the touchscreen, and Windows in turn then feeds input from *both* the touchscreen and the mouse, and we're not correctly ignoring the latter. So there's activity on the touchscreen but also on the mouse, and thus the whole thing gets confused.

    Would you mind filing a ticket for this with the Unity bug reporter? This should get looked at.
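
    As a possible stopgap until that's sorted out, the delta could be read from the touchscreen directly, ignoring the mouse while a touch is in progress (a minimal sketch of a user-side workaround, not how the backend fix would work):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.InputSystem;

    public static class TouchFirstDelta
    {
        // Returns the primary touch delta while a touch is in progress and
        // falls back to the mouse delta otherwise, so the duplicated mouse
        // input generated from touch on Windows doesn't take over.
        public static Vector2 Read()
        {
            var touch = Touchscreen.current;
            if (touch != null && touch.primaryTouch.press.isPressed)
                return touch.primaryTouch.delta.ReadValue();

            var mouse = Mouse.current;
            return mouse != null ? mouse.delta.ReadValue() : Vector2.zero;
        }
    }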
     
  6. Morphus74

    Morphus74

    Joined:
    Jun 12, 2018
    Posts:
    174
    I already posted something similar on the forum, and logged a case for it.