
Managing more than one touchscreen

Discussion in 'Editor & General Support' started by z_orochii, Mar 9, 2020.

  1. z_orochii

    z_orochii

    Joined:
    Mar 8, 2013
    Posts:
    20
    I'm losing my mind with this problem.

    Has anyone ever tried using more than one touchscreen? Multitouch isn't the issue; I just want to be able to use both screens independently of each other. However, Unity, even with the new Input System, can't see more than one of the touchscreens. The other one is completely disabled, except that it works as a mouse (my guess is that Windows does this, as an abstraction of sorts for touchscreen input).
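    For context, here is a minimal sketch of how I'd expect this to work with the new Input System: enumerating every `Touchscreen` in `InputSystem.devices` and using `deviceId` to tell the screens apart. In practice only one `Touchscreen` ever shows up for me, even with two screens attached.

    ```csharp
    using UnityEngine;
    using UnityEngine.InputSystem;

    // Sketch: list every Touchscreen the Input System can see and read
    // touches per device. If Windows exposed both screens as separate
    // devices, deviceId would be enough to route each touch to its user.
    public class MultiTouchscreenProbe : MonoBehaviour
    {
        void Update()
        {
            foreach (var device in InputSystem.devices)
            {
                if (device is Touchscreen screen)
                {
                    foreach (var touch in screen.touches)
                    {
                        if (touch.phase.ReadValue() == UnityEngine.InputSystem.TouchPhase.Began)
                        {
                            // Which screen did this touch come from?
                            Debug.Log($"Touch on device {screen.deviceId} at {touch.position.ReadValue()}");
                        }
                    }
                }
            }
        }
    }
    ```

    (Only one Touchscreen is listed here no matter what I do; the second screen just generates mouse events.)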

    So, the question is: has anyone ever tried this? It doesn't matter if it involves extensions, .NET libraries, or whatever. This is going to run on Windows only, so no worries about Android/Mac/iOS/whatever.

    The important thing is being able to interact with both screens and detect which touchscreen each touch event comes from.

    The reason for all this is that there will be a separate user on each screen, so it would be nice if both of them could interact without interrupting each other, which is what happens with the default mouse implementation.

    Thanks!