
Bug Windows touch screen and SimulatedTouch not working together with schemas

Discussion in 'Input System' started by undersun, Oct 3, 2022.

  1. undersun

     Joined: May 14, 2013
     Posts: 3
    I happen to have a touch screen on my Windows laptop, but I mainly use a mouse with a second monitor.
    I would like to use the SimulatedTouch device and the Keyboard in a single action map (some actions come from the keyboard, others from simulated touch).
    I noticed strange behavior: when I have no control schemes, everything works fine together. All three devices (simulated touch, real touch, and keyboard) send events.
    But when I create a control scheme and add all the actions to it, only the keyboard and the real touch screen keep working. SimulatedTouch stops working completely.

    Is this expected behavior? Are there any workarounds?

    P.S. I can see in PlayerInput.OnEnable that there is a different code path for handling the no-scheme and with-scheme cases.
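
    For reference, a simplified sketch of how I enable touch simulation (the component name is made up; the PlayerInput component with the action map sits on the same GameObject):

    ```csharp
    using UnityEngine;
    using UnityEngine.InputSystem.EnhancedTouch;

    // Simplified setup: turns on the Input System's touch simulation,
    // which creates the "Simulated Touchscreen" device driven by the mouse.
    public class TouchAndKeyboardInput : MonoBehaviour
    {
        void OnEnable()
        {
            TouchSimulation.Enable();
        }

        void OnDisable()
        {
            TouchSimulation.Disable();
        }
    }
    ```

    With no control scheme defined, actions bound to the keyboard and to touch both fire from this setup; as soon as the scheme exists, only the simulated touchscreen goes silent.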