Bug: Windows touch screen and SimulatedTouch not working together with control schemes

Discussion in 'Input System' started by undersun, Oct 3, 2022.

  1. undersun

    Joined: May 14, 2013
    Posts: 3
    I happen to have a touch screen on my Windows laptop, but I mainly use a mouse with a second monitor.
    I would like to use the SimulatedTouch device and the Keyboard in a single action map (some actions come from the keyboard, others from simulated touch).
    I noticed strange behavior: when I have no control schemes, everything works fine together. All three devices (simulated touch, real touch, and keyboard) send events.
    But when I create a control scheme and add all the actions to it, only the keyboard and real touch keep working; SimulatedTouch stops working completely (a rough sketch of this kind of setup is shown below).

    Is this expected behavior? Are there any workarounds?

    P.S. I can see that PlayerInput.OnEnable takes different code paths depending on whether there are no control schemes or at least one.
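
    Here is a rough, self-contained sketch of the kind of setup I mean (the names "Gameplay", "Jump", "Tap" and "KeyboardTouch" and the bindings are just illustrative, not my actual project; in the real project the asset goes through PlayerInput, which is where the control scheme is selected and devices are paired):

    using UnityEngine;
    using UnityEngine.InputSystem;

    public class SimulatedTouchRepro : MonoBehaviour
    {
        InputActionAsset asset;

        void OnEnable()
        {
            // Creates the "Simulated Touchscreen" device (a Touchscreen layout)
            // that mirrors mouse/pen input.
            TouchSimulation.Enable();

            asset = ScriptableObject.CreateInstance<InputActionAsset>();
            var map = asset.AddActionMap("Gameplay");

            // One action driven by the keyboard, one by any touchscreen
            // (real or simulated); both bindings are put in the same group.
            var jump = map.AddAction("Jump");
            jump.AddBinding("<Keyboard>/space", groups: "KeyboardTouch");

            var tap = map.AddAction("Tap");
            tap.AddBinding("<Touchscreen>/primaryTouch/tap", groups: "KeyboardTouch");

            // Control scheme that lists both device types.
            asset.AddControlScheme("KeyboardTouch")
                 .WithBindingGroup("KeyboardTouch")
                 .WithRequiredDevice("<Keyboard>")
                 .WithOptionalDevice("<Touchscreen>");

            // Roughly what selecting the control scheme does to the bindings
            // (PlayerInput additionally pairs devices to the user).
            asset.bindingMask = InputBinding.MaskByGroup("KeyboardTouch");

            tap.performed += ctx => Debug.Log($"Tap from {ctx.control.device.name}");
            asset.Enable();
        }

        void OnDisable()
        {
            if (asset != null)
                asset.Disable();
            TouchSimulation.Disable();
        }
    }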