
UIElements touch events?

Discussion in 'UI Toolkit' started by unitydungeonmas, Nov 11, 2020.

  1. unitydungeonmas

    unitydungeonmas

    Joined:
    Sep 6, 2020
    Posts:
    37
    Is there a way to get touch events? I'm using MouseDownEvent, MouseUpEvent, and MouseMoveEvent, checking button presses for dragging. I see PointerDownEvent and PointerUpEvent, but PointerMoveEvent doesn't have a way to detect a drag for touch. Which is preferred?
     
  2. griendeau_unity

    griendeau_unity

    Unity Technologies

    Joined:
    Aug 25, 2020
    Posts:
    230
    Hi unitydungeonmas,

    Pointer events are the recommended way to handle touch input. To detect whether the user is dragging, you can store the pointer position in a variable on pointer down, then compare it with the position on each pointer move to get the distance dragged.
    Is that what you are trying to implement?

    You could also have a look at drag and drop events, which I believe also work with pointer events, if you're implementing a drag and drop functionality.
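    A minimal sketch of the approach described above (the class name, field names, and the 5-pixel threshold are illustrative, not from this thread):

    Code (CSharp):
    ```csharp
    using UnityEngine;
    using UnityEngine.UIElements;

    // Hypothetical manipulator: store the position on PointerDown,
    // compare against it on each PointerMove to measure the drag.
    public class DragDistanceTracker : PointerManipulator
    {
        private Vector3 _startPosition;
        private bool _tracking;

        protected override void RegisterCallbacksOnTarget()
        {
            target.RegisterCallback<PointerDownEvent>(OnPointerDown);
            target.RegisterCallback<PointerMoveEvent>(OnPointerMove);
            target.RegisterCallback<PointerUpEvent>(OnPointerUp);
        }

        protected override void UnregisterCallbacksFromTarget()
        {
            target.UnregisterCallback<PointerDownEvent>(OnPointerDown);
            target.UnregisterCallback<PointerMoveEvent>(OnPointerMove);
            target.UnregisterCallback<PointerUpEvent>(OnPointerUp);
        }

        private void OnPointerDown(PointerDownEvent evt)
        {
            _startPosition = evt.position;
            _tracking = true;
        }

        private void OnPointerMove(PointerMoveEvent evt)
        {
            if (!_tracking)
                return;
            float dragDistance = (evt.position - _startPosition).magnitude;
            // Treat it as a drag once the pointer has moved far enough.
            if (dragDistance > 5f)
                Debug.Log($"Dragging, distance = {dragDistance}");
        }

        private void OnPointerUp(PointerUpEvent evt)
        {
            _tracking = false;
        }
    }
    ```
    You would attach it with something like `myElement.AddManipulator(new DragDistanceTracker());`.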
     
  3. Chris-Trueman

    Chris-Trueman

    Joined:
    Oct 10, 2014
    Posts:
    1,256
    I cannot get those to work at runtime, and I get errors when I try to build with them being used in the project.

    error CS1069: The type name 'DragEnterEvent' could not be found in the namespace 'UnityEngine.UIElements'. This type has been forwarded to assembly 'UnityEngine.UIElementsModule, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null' Enable the built in package 'UIElements' in the Package Manager window to fix this error.

    Also as of preview 10 they have been removed from the Player.
    • Removed ability to reference drag events in the Player, since those are never sent
     
  4. Midiphony-panda

    Midiphony-panda

    Joined:
    Feb 10, 2020
    Posts:
    234
    Things to consider with pointer events: if you register for them, you might get more than one event per touch with the current event system.

    For a swiping feature I implemented, I discarded the imguiEvent and any events not corresponding to the first finger touching the screen:
    Code (CSharp):
    private void _OnPointerDown(PointerDownEvent evt)
    {
        if (evt.imguiEvent != null || evt.pointerId != 1)
            return;

        // Do your things here
    }
    (BTW, more information on pointerIds and multi-touch if needed: https://forum.unity.com/threads/does-multi-touch-work-at-runtime-for-android.956412/#post-6239945 )

    There are still some issues with touch support, though :\
    If your PanelSettings rescales the panel, the y-coordinates of your touch events will be distorted or offset (so you might not receive some touch events).


    @unity_griendeau, I think there is an issue inside the UI Toolkit event system. Could you take a look at the following code, taken directly from the package?

    There is also an issue with the delta position, which should be something like this:
    Code (CSharp):
    touch.deltaPosition = new Vector2(touch.deltaPosition.x, -touch.deltaPosition.y);
    (PS: for whatever reason, I'm not allowed to post my message if I write the method names :D)
     
  5. griendeau_unity

    griendeau_unity

    Unity Technologies

    Joined:
    Aug 25, 2020
    Posts:
    230
    That's right, touch events are sent as both mouse events and pointer events. What we typically check to get the primary touch is:
    evt.pointerType != PointerType.mouse && evt.isPrimary
    Testing evt.pointerId against 1 also works for the primary touch, but I would recommend testing against PointerId.mousePointerId or PointerId.touchPointerIdBase instead of the literal 1.

    I saw a thread about that; the bug is currently in triage and should be handled soon!
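    A short sketch of the check described above, inside a hypothetical PointerDownEvent handler (the class and method names are illustrative):

    Code (CSharp):
    ```csharp
    using UnityEngine;
    using UnityEngine.UIElements;

    public static class PrimaryTouchFilter
    {
        // React only to the primary touch, ignoring the synthesized
        // mouse events and any extra fingers on the screen.
        public static void OnPointerDown(PointerDownEvent evt)
        {
            if (evt.pointerType == PointerType.mouse || !evt.isPrimary)
                return;

            // Equivalent id-based check: the first touch has
            // pointerId == PointerId.touchPointerIdBase.
            Debug.Log($"Primary touch down, pointerId = {evt.pointerId}");
        }
    }
    ```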
     
    Last edited: Nov 17, 2020
    Orimay and Midiphony-panda like this.
  6. Midiphony-panda

    Midiphony-panda

    Joined:
    Feb 10, 2020
    Posts:
    234
    Latest UI Toolkit runtime version (preview.14), Unity 2020.2.6: touch inputs are still offset when panel.scale is different from 1.

    The bug seems(?) to still be there in DefaultEventSystem.cs, lines 204 to 207.
    I still think the "Flip Y Coordinates" step should happen at l.234, just before the position-based event is sent along the panel, since the event position depends on the panel scale.
    If the coordinates must be flipped after that, the panel height should be used, not the screen height (which does not depend on the panel scale).

    Also, I guess this:
    Code (CSharp):
    touch.deltaPosition = new Vector2(touch.deltaPosition.x, Screen.height - touch.deltaPosition.y);
    should be this:
    Code (CSharp):
    touch.deltaPosition = new Vector2(touch.deltaPosition.x, -touch.deltaPosition.y);
     
    Halfspacer likes this.
  7. MousePods

    MousePods

    Joined:
    Jul 19, 2012
    Posts:
    754
    I can confirm this. I have a UIDocument whose buttons don't work unless the panel settings use the native resolution or the same aspect ratio.
     
  8. unitydungeonmas

    unitydungeonmas

    Joined:
    Sep 6, 2020
    Posts:
    37
    I regret upgrading to the "new" input system from the days when I used Input.touches...

    I'm trying to use pointer events for both mouse and touch. Is this the way to go? If so, PointerMoveEvent has a different meaning for each: for a mouse drag we can check whether a button was pressed, but what is the equivalent for non-mouse pointers? Do we just explicitly check that it's not a mouse pointer?
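    One way the question above could be expressed in code, as a sketch (not from this thread; the helper name is illustrative): branch on the pointer type, requiring a pressed button for the mouse and falling back to the primary-touch check otherwise.

    Code (CSharp):
    ```csharp
    using UnityEngine.UIElements;

    public static class UnifiedDragCheck
    {
        // Decide whether a PointerMoveEvent should count as a drag:
        // for a mouse, require that some button is held down; for
        // touch, move events only arrive while a finger is down, so
        // filtering to the primary touch is enough.
        public static bool IsDragging(PointerMoveEvent evt)
        {
            if (evt.pointerType == PointerType.mouse)
                return evt.pressedButtons != 0; // any mouse button held
            return evt.isPrimary;               // first finger on screen
        }
    }
    ```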
     
    Last edited: Mar 20, 2021