Multiple touchscreens: multitouch on multiple input devices

Discussion in 'Editor & General Support' started by z_orochii, Mar 5, 2020.

  1. z_orochii

    z_orochii

    Joined:
    Mar 8, 2013
    Posts:
    20
    Hi! First of all, I tried posting this on Unity Answers, but since I think it's related to features that are still in preview, and Answers is more for user-to-user help, I thought I might get more help here.

    Here is the question just for reference, though I just copy-pasted everything here:
    https://answers.unity.com/questions/1704768/multiple-touchscreens-multitouch-on-multiple-input.html

    Anyway...

    I have been trying to get input from two touchscreens at the same time and detect which screen each touch is coming from, so I can manage UI elements on two separate Canvases on two different displays, but it doesn't work properly. The platform is Windows.

    The basic premise of the application: it's an interactive app that will be used on both screens simultaneously by kids, so both screens need to respond at the same time and independently. I don't really care about multitouch as such, just that the screens respond properly and one doesn't block the other (which is what happens if you simply use the UI with, say, a vanilla UI Button component). It feels a bit unresponsive.

    I'm using the new Input System. I first tried 0.2, since I was on Unity 2018.4 (the latest LTS available, so it's more reliable). I also tried 1.0, which comes with Unity 2019.3, with the same results in both.

    Here's the code from my current test. This one comes from my tests with Input System 1.0.0-preview.

    Code (CSharp):
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.InputSystem;
    using UnityEngine.InputSystem.Controls;
    using UnityEngine.UI;

    public class TestTouch : MonoBehaviour {
        [SerializeField] int deviceId;
        [SerializeField] Text debugText;

        Image selfImage;
        Text label;
        RectTransform rt;

        bool pressed;

        void Start() {
            selfImage = GetComponent<Image>();
            label = GetComponentInChildren<Text>();
            rt = (RectTransform)transform;

            // List every Touchscreen the Input System sees.
            InputDevice[] allDevices = Touchscreen.all.ToArray();

            string devicesStr = "";
            foreach (InputDevice d in allDevices) {
                devicesStr += d.name + "::" + d.displayName + "::" + d.shortDisplayName + System.Environment.NewLine;
            }
            debugText.text = devicesStr;
        }

        void Update() {
            // Offsets for positions: each display is 1920 px wide.
            float startOffset = 1920f * deviceId;
            float endOffset = startOffset + 1920f;
            // Check touch devices.
            if (Touchscreen.current != null) {
                foreach (TouchControl tc in Touchscreen.current.touches) {
                    // Get value.
                    Vector2 touchPos = tc.position.ReadValue();
                    // Check if it's for this screen.
                    if (touchPos.x >= startOffset && touchPos.x < endOffset) {
                        touchPos.x -= startOffset;
                        RefreshPressed(true, touchPos);
                        return;
                    }
                }
            }
            // Fall back to the mouse (legacy Input API).
            bool mp = false;
            Vector2 pos = Input.mousePosition;
            if (pos.x >= startOffset && pos.x < endOffset) {
                mp = Input.GetMouseButton(0);
            }
            RefreshPressed(mp, Input.mousePosition);
        }

        void RefreshPressed(bool press, Vector2 position) {
            pressed = false;
            if (!press) {
                selfImage.color = Color.white;
                return;
            }
            // Hit-test the position against this RectTransform.
            Rect r = rt.rect;
            Vector2 start = rt.position + new Vector3(r.x, r.y);
            Vector2 end = start + new Vector2(r.width, r.height);
            if (position.x > start.x && position.x < end.x
                && position.y > start.y && position.y < end.y) {
                pressed = true;
            }
            selfImage.color = pressed ? Color.gray : Color.white;
            label.text = position.ToString();
        }
    }
    Now, the thing is, as far as I can see, the UI uses the default mouse abstraction (which I think is managed by Windows?), so any touch is interpreted as a mouse. It lets me interact with both canvases with either mouse or touch, but not at the same time.
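
    For reference, this is roughly what I'd expect to work if Windows exposed each panel as its own Touchscreen device, using the Input System's EnhancedTouch API (just a sketch, untested on my setup; `Touch.screen` should identify which Touchscreen a given touch came from):

    ```csharp
    using UnityEngine;
    using UnityEngine.InputSystem;
    using UnityEngine.InputSystem.EnhancedTouch;
    using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;

    public class PerScreenTouch : MonoBehaviour {
        void OnEnable() {
            // EnhancedTouch has to be enabled explicitly.
            EnhancedTouchSupport.Enable();
        }

        void OnDisable() {
            EnhancedTouchSupport.Disable();
        }

        void Update() {
            foreach (Touch touch in Touch.activeTouches) {
                // touch.screen is the Touchscreen device this touch belongs to,
                // so touches could be routed to the right Canvas per device.
                Touchscreen source = touch.screen;
                Debug.Log(source.name + ": " + touch.screenPosition);
            }
        }
    }
    ```

    Of course this only helps if the second panel actually shows up as a second Touchscreen, which is exactly what doesn't seem to happen.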

    I also tried printing all available InputDevices by iterating over Touchscreen.all, and it only shows one Touchscreen. I don't know if that's because Windows abstracts it or what, but the point is, only one screen seems to get touch support. I'm trying the new Input System because I read that better touch support was one of the things being developed.
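
    As a sanity check, something like this (a quick sketch, not from my actual project) would log everything the Input System registers, using `InputSystem.devices` and `InputSystem.onDeviceChange`, to confirm whether Windows really only surfaces one Touchscreen:

    ```csharp
    using UnityEngine;
    using UnityEngine.InputSystem;

    public class DeviceWatcher : MonoBehaviour {
        void OnEnable() {
            // Dump everything the Input System currently knows about.
            foreach (InputDevice device in InputSystem.devices) {
                Debug.Log("present: " + device.name + " (" + device.GetType().Name + ")");
            }
            // And log devices being added or removed afterwards.
            InputSystem.onDeviceChange += OnDeviceChange;
        }

        void OnDisable() {
            InputSystem.onDeviceChange -= OnDeviceChange;
        }

        static void OnDeviceChange(InputDevice device, InputDeviceChange change) {
            Debug.Log(change + ": " + device.name);
        }
    }
    ```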


    If you help me, I'll give you a virtual cookie. Thanks!
     
  2. Cave2137

    Cave2137

    Joined:
    Jan 23, 2018
    Posts:
    14
    Did you find a solution?