Bug Touch input on standalone build also generates mouse input when window is shifted

Discussion in 'Input System' started by OrdainedCoder75, Jul 22, 2022.

    OrdainedCoder75

    Joined: Mar 10, 2020
    Posts: 2
    Hi everyone,

    I just ran into a strange and pretty specific bug and wanted to know if anyone else has encountered the same thing.

    This only happens with a touchscreen monitor on a desktop standalone build.

    I created a test scene that displays on screen the activation status of the three mouse buttons and the first five touches (I have already filed a bug report). The legacy Input version of the script is roughly the sketch below.
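
    A minimal sketch of the legacy Input variant (class name and labels are illustrative, not the exact script attached to the report):

    Code (CSharp):
    using UnityEngine;

    // Displays the activation state of the three mouse buttons and the first five touches.
    public class LegacyInputDebug : MonoBehaviour
    {
        void Awake()
        {
            // Keep the legacy module from turning touches into mouse events.
            Input.simulateMouseWithTouches = false;
        }

        void OnGUI()
        {
            for (int i = 0; i < 3; i++)
                GUILayout.Label($"Mouse button {i}: {(Input.GetMouseButton(i) ? "ACTIVE" : "inactive")}");

            for (int i = 0; i < 5; i++)
            {
                bool active = i < Input.touchCount;
                string state = active ? Input.GetTouch(i).phase.ToString() : "inactive";
                GUILayout.Label($"Touch {i}: {state}");
            }
        }
    }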

    How to reproduce
    1. Start the build
    2. Maximize the game window if it isn't already
    3. Touch the screen anywhere. Only the touch inputs become active, while the mouse inputs stay inactive.
    4. Exit fullscreen if the build is not already windowed
    5. Shift the window to the right by about 1/4 of the screen width
    6. Touch the game screen on the right half. Only the touch inputs become active.
    7. Touch the game screen near the left border of the window. The left mouse button input becomes active.

    Other things I noticed
    • Legacy Input and the Input System show no difference
    • The same happens if you move the window down
    • The portion of the game screen where the bug happens is
      • as wide, measured from the left border of the window, as the offset between the window and the left edge of the screen
      • as tall, measured from the top border of the window, as the offset between the window and the top edge of the screen
    Basically, if the top-left corner of the game window matches the top-left corner of the screen, everything works fine, no matter the size of the window. Otherwise the bug shows up.

    How I read inputs
    • "Legacy Input" build uses Input.GetMouseButton() and Input.GetTouch() with Input.simulateMouseWithTouches set to false.
    • "Input System" build uses Mouse.current.*Button.isPressed and Touch.activeTouches with EnhancedTouchSupport enabled.
    Screenshots (attached): "Ok", "Left", "Top"