
[NEW UPDATE!] Fingers - Touch Gestures - #1 in Quality, Support and Features : Dozens of Gestures✓

Discussion in 'Assets and Asset Store' started by jjxtra, Apr 25, 2016.

  1. KingPic

    KingPic

    Joined:
    Nov 5, 2012
    Posts:
    11
    I need to detect up and down gestures separately. Should I create two separate SwipeGestureRecognizers?
     
  2. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    You can set the allowed direction to all. The gesture has a resulting direction property, which you can check so you can ignore left and right.
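    Something along these lines (untested sketch; the Direction / EndDirection names are from memory, so double-check them in your version):

    Code (CSharp):
    // Allow any direction, then filter to up/down when the swipe ends.
    SwipeGestureRecognizer swipe = new SwipeGestureRecognizer();
    swipe.Direction = SwipeGestureRecognizerDirection.Any;
    swipe.StateUpdated += (GestureRecognizer gesture) =>
    {
        SwipeGestureRecognizer s = (SwipeGestureRecognizer)gesture;
        if (s.State == GestureRecognizerState.Ended)
        {
            if (s.EndDirection == SwipeGestureRecognizerDirection.Up)
            {
                // handle swipe up
            }
            else if (s.EndDirection == SwipeGestureRecognizerDirection.Down)
            {
                // handle swipe down
            }
            // ignore Left / Right
        }
    };
    FingersScript.Instance.AddGesture(swipe);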
     
  3. KingPic

    KingPic

    Joined:
    Nov 5, 2012
    Posts:
    11
    Thanks for the quick reply. I'm trying this now. I just bought it and will make sure to leave an excellent review.
     
  4. KingPic

    KingPic

    Joined:
    Nov 5, 2012
    Posts:
    11
    Can I have a swipe that triggers only if the user swiped with 3 fingers at the same time? Thank you.
     
  5. nkholski

    nkholski

    Joined:
    Feb 24, 2018
    Posts:
    4
    I can't get the swipe gesture to work at all, and I get no error messages. It works great in a 2019.3 project but not in a new 2020.2 project, not even the demos. StateUpdated is never called, and the "dot" for the finger does not show on screen.
    Possible causes I can think of are the Unity version, the fact that I use the new input system (I've tried disabling canvases with buttons to avoid compiler errors), or a player setting that I might have toggled by mistake or forgotten to activate. I only tried swipe, but I assume other gestures are also broken. Any clues on what it might be?
    Great asset by the way! Can't wait to get it working again :)
     
  6. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    First thing to try is the swipe demo scene in a new blank project. Let me know if it does not work for you.
     
  7. MM-Mat

    MM-Mat

    Joined:
    Dec 11, 2015
    Posts:
    13
    Hello,

    I'm trying to use GestureRecognizers to provide a drag & drop UI experience. My items are UI elements. They can, in some circumstances, intersect each other, and when that happens Fingers uses a pretty strange sorting strategy, and I'm not sure why it works that way at all.

    I'm using GestureIntersectsObject to raycast for viable targets, which relies on a private raycastResultCompare. The comparison looks like this:

    Code (CSharp):
    private static readonly System.Comparison<RaycastResult> raycastResultCompare = RaycastResultCompare;

    [...]

    private static int RaycastResultCompare(RaycastResult r1, RaycastResult r2)
    {
        SpriteRenderer rend1 = r1.gameObject.GetComponent<SpriteRenderer>();
        if (rend1 != null)
        {
            [...]
        }
        return r2.gameObject.transform.GetSiblingIndex().CompareTo(r1.gameObject.transform.GetSiblingIndex());
    }
    As far as I can see, there is no way to override this strategy through any setting. My issue is that for UI elements it is guaranteed to always fall back to the sibling-index comparison, which stops making sense the moment the two objects have different parents - a pretty normal scenario for anything even slightly more complicated. The comparison simply favors whichever object has the higher sibling index under its own parent, and if that object happens to have a collider while our GestureRecognizer sits on the other one, it's game over: GestureIntersectsObject breaks out of the foreach and returns null on the first collider it finds, so our gesture is never recognized. I don't think it makes much sense this way (but I could be wrong).

    I think it should instead consider the two items' positions in the visual tree and order them based on that (top item first). For UI, this would make a lot more sense. The sibling index should only be a shortcut for objects that share the same parent.

    I could update the logic in your code myself, but I don't want to break the asset for future updates. So I'm wondering whether you would be willing to update/fix the logic (assuming you agree my suggestion is a better approach), or whether I've missed an option that would let me change the strategy without altering your code?

    (note: this also affects your FingersDragDropComponentScript implementation)

    Thanks,
     
    Last edited: Feb 3, 2021
  8. MM-Mat

    MM-Mat

    Joined:
    Dec 11, 2015
    Posts:
    13
    Hi,

    I have another question. If I have an object with a LongPressGestureRecognizer inside a scroll rect, how can I mark a touch input (grabbed by this GestureRecognizer) as "handled" to prevent it from being propagated to the ScrollRect, or otherwise ensure that the ScrollRect won't also handle it (and treat it as scrolling) once it has been consumed by the GestureRecognizer?

    (I tried creating a PointerEventData and calling Use() on it, but that didn't do anything, as I couldn't build a valid PointerEventData from the GestureRecognizer's data.)

    thanks!
     
    Last edited: Feb 4, 2021
  9. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    I resubmitted the latest version, which adds a comparison parameter so you can supply your own method.
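    For example, a comparison like this would put the visually topmost UI element first (how you plug it in depends on the new parameter, so treat it as a sketch):

    Code (CSharp):
    // Sketch: favor the deeper (visually topmost) UI element first.
    private static int CompareTopmostFirst(RaycastResult r1, RaycastResult r2)
    {
        // RaycastResult.depth comes from the GraphicRaycaster; a higher value
        // generally means the element is drawn later (on top) within its canvas.
        return r2.depth.CompareTo(r1.depth);
    }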
     
  10. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    I'll have to think about this. Maybe something that marks the gesture as consumed for the current frame.
     
  11. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    You could set the fingers scroll view to enabled = false, then set it enabled = true when the long press gesture ends.
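    Rough sketch, assuming you hold a reference to the UnityEngine.UI.ScrollRect; depending on the state enum in your version you may also want to re-enable it when the gesture fails:

    Code (CSharp):
    private void LongPressUpdated(GestureRecognizer gesture)
    {
        if (gesture.State == GestureRecognizerState.Began)
        {
            // stop the ScrollRect from also treating the touch as a scroll
            scrollRect.enabled = false;
        }
        else if (gesture.State == GestureRecognizerState.Ended)
        {
            scrollRect.enabled = true;
        }
    }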
     
  12. MM-Mat

    MM-Mat

    Joined:
    Dec 11, 2015
    Posts:
    13
    Hi jjxtra,

    thanks for the updates and the proposed solution.

    Unfortunately that's quite hacky for me, as the items are completely decoupled from the ScrollRect, and it would require a reference on each item for this sole purpose (or at least a somewhat costly GetComponent check in the dragged and dropped states). Also, if I set scrollView.enabled to false, I assume the rest of the items in the viewport won't respond to other events (such as tooltips or state transitions) while the drag lasts. Being able to "consume" the input would be much cleaner and hopefully more lightweight.
     
  13. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    I will look into marking touches as consumed for a frame.
     
  14. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    Re-uploaded latest version with some changes.
    - GestureRecognizer has a Priority field to control order of execution/processing
    - GestureRecognizer has static ConsumeTouch and UnconsumeTouch methods, allowing your gesture to consume a touch. Probably the best place to call this is when the gesture begins; call it on the current tracked touches.
    - Make sure to call GestureRecognizer.UnconsumeTouches when the gesture ends or fails.

    Let me know if it works at all.
     
  15. MM-Mat

    MM-Mat

    Joined:
    Dec 11, 2015
    Posts:
    13
    Thanks for the quick update!

    I've added these method calls to my use case (either in State.Began / State.Ended, or just bluntly inside State.Executing and then State.Ended), but they don't seem to have any effect.

    However, there might be a misunderstanding about the root issue. Like I said, I'm trying to find a way to mark inputs as consumed for the EventSystem / standard Unity UI components when the inputs have already been treated as touches and consumed by your asset (through a GestureRecognizer), because having both kinds of components is probably a quite common scenario (it certainly is in my project). As far as I can tell, your solution only works inside your framework, i.e. between multiple GestureRecognizers - at least I didn't see any place where those consumed touches would be passed back to Unity's input and used - which is why I can't see any change.
     
  16. HalDevOne

    HalDevOne

    Joined:
    Apr 12, 2014
    Posts:
    67
    I would like some tips on how to make touch input device-independent. I'm referring to the pan gesture, where I get a different outcome when dragging the camera around on different devices. I am using this code to pan:

    Code (CSharp):
    private void PanGestureCallback(GestureRecognizer gesture)
    {
        if (gesture.State == GestureRecognizerState.Executing)
        {
            Quaternion q = mainCam.transform.rotation;
            q = Quaternion.Euler(0.0f, q.eulerAngles.y, 0.0f);
            moveVelocity += (q * Vector3.right * DeviceInfo.PixelsToUnits(gesture.DeltaX) * Time.deltaTime * panSpeed * 500.0f);
            moveVelocity += (q * Vector3.forward * DeviceInfo.PixelsToUnits(gesture.DeltaY) * Time.deltaTime * panSpeed * 500.0f);
        }
    }

    private void Update()
    {
        mainCam.transform.Translate(moveVelocity, Space.World);
        moveVelocity *= dampening;
    }
    I also wonder how to get rid of that square box that pops up when long-pressing?
     
  17. BBET

    BBET

    Joined:
    Dec 18, 2012
    Posts:
    40
    Hi jjxtra,
    Do you think it makes sense to use Fingers for more complex image recognition, i.e. recognizing quickly drawn sketches? The sketches will have more paths and touches to track. The workflow of pressing "Space" to trigger matching is fine; the solution would not need to track every touch, only the resulting image when "Space" is pressed.
    As a first step I have tried to bulk-import 20 categories with 30 images each into Fingers to test the recognition rate. I can increase the image count over time, but currently I see three problems (and have one question):

    1.) The recognition rate is not good. There are many mismatches to objects you would not expect to match. It would also be better to get a ranking of the suggested images (e.g. 1st place 80% house, 2nd place 75% tree, 3rd place 10% car, ...). Any chance of getting a ranking in Fingers?
    2.) Confusion while drawing regarding path counts, and regarding when the image currently being drawn gets deleted. I need more (ideally unlimited) paths until the image is finalized; the whole drawing should then be used for matching when Space is pressed.
    3.) Sometimes there are also path-count problems: I try to match the image with Space, but it is not deleted afterwards, and when I keep drawing, the next paths are added to the existing image and suddenly the whole thing gets deleted. I would prefer to draw something, press a button so that the system matches the drawing, then draw further and press the button again to match again. There should be another button for starting a new drawing (if this is only a UI problem we can solve it ourselves, but I assume it is a path problem).
    4.) Not a problem, but a question: what is the best approach for a good recognition rate? Currently I have all images in the "DemoSceneImageRecognition" under "gesture images" with one record, "All", and have included all 600 images (across the 20 categories) in this first record. Would it be better to split it into one record per category and "play around with the parameters" for each category? How would you do this with many more images to get the best recognition?

    My end goal would be to have about 500 categories (maybe even more) with about 100 images each to train; I plan to use 64x64 images. Do I need to set the image size somewhere, or is it taken from the image? I understand that 64x64 is the maximum - or is there any sense in trying higher (or maybe lower) resolutions to improve recognition?

    Do you think it is possible to use Fingers for this kind of drawing recognition? Are you planning any add-ons for the image recognition part in the near term?
    If Fingers does not work, I would need to use neural networks, but I would much prefer a smaller solution with Fingers (as long as the recognition is good enough).
    So far I have not checked any code and have only tested the existing functionality, but before I look into it in more detail, it would be good to know whether there is a chance with Fingers, or whether we should invest time in another approach.

    Thanks for your help!
     
    Last edited: Feb 21, 2021
  18. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    64x64 is the max size, in order to pack the image into an unsigned 64 bit integer.

    I think fingers will work for your use case, but you may have to play with the score padding property on the script.

    Ultimately, if fingers does not meet your needs, please reach out to support@digitalruby.com and I can issue a refund.

    It's probably worth spending a few hours or days on this with fingers before going with a more complex approach.
     
  19. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    Is ShowTouches turned off?
     
  20. HalDevOne

    HalDevOne

    Joined:
    Apr 12, 2014
    Posts:
    67
    FingersScript.Instance.ShowTouches = false;
    I also unchecked it on the FingersScript. It's still there. Any more tips?

    Any suggestions on my other issue?
     
  21. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    Does this happen in the DemoScene if you turn ShowTouches to false?
     
  22. HalDevOne

    HalDevOne

    Joined:
    Apr 12, 2014
    Posts:
    67
    Yes. I tested in the DemoSceneDragDrop and ticked off Show Touches on the FingersScriptPrefab. When holding to drag, it displays that square box.

    Using Fingers version 2.9.8
    Unity 2019.4.21f1
     
    Last edited: Mar 15, 2021
  23. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    Can you post a screenshot here? Just want to compare to what I am seeing in the drag drop demo scene.
     
  24. HalDevOne

    HalDevOne

    Joined:
    Apr 12, 2014
    Posts:
    67
    ScreenShot.png
     
  25. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    Is that a touch screen monitor? I see the same thing on my touch screen monitor. I don't see it if I use the mouse or a tablet.
     
  26. HalDevOne

    HalDevOne

    Joined:
    Apr 12, 2014
    Posts:
    67
    Correct i am using a touch screen monitor.
     
  27. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    Alright, does it happen with an actual build, and have you gone through the tablet setup to mark the monitor as a touch screen?
     
  28. HalDevOne

    HalDevOne

    Joined:
    Apr 12, 2014
    Posts:
    67
    OK, I noticed now that it is not appearing in my build on the phone. I mostly do rapid testing in play mode in Unity. So that is solved then, thanks. :)
    Do you have any suggestions regarding the touch drag sensitivity I also mentioned? I have tested on my laptop with a touch monitor and on 2 Samsung phones and get a different outcome when panning the camera. On the laptop, dragging makes the camera fly, while on the phones I get a slower response. Is there a way to unify the touch drag movement?
     
  29. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    The laptop DPI might be getting set to some default value. You can change the threshold units on the pan gesture to change sensitivity; other gestures have similar threshold properties you can adjust.
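    For example (exact property casing may differ in your version):

    Code (CSharp):
    // require a larger movement (in units) before the pan gesture begins; tune per device
    panGesture.ThresholdUnits = 0.35f;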
     
  30. HalDevOne

    HalDevOne

    Joined:
    Apr 12, 2014
    Posts:
    67
    I don't understand - how would I change it for every possible mobile platform? Is it all about DPI? Does Fingers detect the DPI of the platform it is running on and adjust its calculations so that the outcome is the same on every touch platform, e.g. dragging with a finger feels the same?
    I have tweaked the panSpeed, but as I said, I get a different pan speed on different devices.
    I'm using this code:
    Code (CSharp):
    if (gesture.State == GestureRecognizerState.Executing)
    {
        Quaternion q = mainCam.transform.rotation;
        q = Quaternion.Euler(0.0f, q.eulerAngles.y, 0.0f);
        moveVelocity += (q * Vector3.right * DeviceInfo.PixelsToUnits(gesture.DeltaX) * Time.deltaTime * panSpeed * 500.0f);
        moveVelocity += (q * Vector3.forward * DeviceInfo.PixelsToUnits(gesture.DeltaY) * Time.deltaTime * panSpeed * 500.0f);
    }
     
  31. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    That code looks OK. Are you logging the DPI of each device? What feels good on the monitor should not be your baseline, as it will report a default DPI that is inaccurate.
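    A quick way to check what each device reports:

    Code (CSharp):
    // Log the DPI Unity reports for the current device.
    Debug.Log("Screen.dpi = " + Screen.dpi);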
     
  32. KingPic

    KingPic

    Joined:
    Nov 5, 2012
    Posts:
    11
    Can I detect a double tap using 3 fingers? I need the user to tap with 3 fingers at once.
     
  33. HalDevOne

    HalDevOne

    Joined:
    Apr 12, 2014
    Posts:
    67
    I have tested on 2 Samsung phones, an S10 and an S21. Both report 420 DPI. The thing that differs seems to be the damping: the camera slows down quickly on the S21, but on the S10 it travels further. Any tips on what could be causing this? I'm using the code below in Update():
    Code (CSharp):
    mainCam.transform.Translate(moveVelocity, Space.World);
    moveVelocity *= dampening;
     
  34. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    Not sure; it will depend on frame rate. Maybe move it to the FixedUpdate function?
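    Another option (sketch) is to make the damping itself frame-rate independent, treating your dampening value as the per-frame factor you tuned at roughly 60 fps:

    Code (CSharp):
    private void Update()
    {
        mainCam.transform.Translate(moveVelocity, Space.World);
        // scale the per-frame damping factor by the actual frame time so the
        // decay feels the same regardless of frame rate
        moveVelocity *= Mathf.Pow(dampening, Time.deltaTime * 60.0f);
    }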
     
  35. HalDevOne

    HalDevOne

    Joined:
    Apr 12, 2014
    Posts:
    67
    Thank you, I appreciate all the help.
     
  36. fraeri

    fraeri

    Joined:
    Nov 8, 2018
    Posts:
    64
    Hi all,
    I'm using the pan/orbit script to move the camera around a GameObject. In one scene I also implemented an auto-rotation in the Y direction until the user pans the GameObject, like this in Update():

    Code (CSharp):
    if (!AlreadyTouched)
    {
        if (FingersScript != null)
        {
            if (FingersScript.PanGesture.State != GestureRecognizerState.Executing)
            {
                transform.RotateAround(point, new Vector3(0.0f, 1.0f, 0.0f), 30 * Time.deltaTime * speedMod);
            }
            else
            {
                AlreadyTouched = true;
            }
        }
    }
    I would like to change this into an initial spin when the scene is loaded, so that the part's rotation slowly comes to an end, just like it does when the user quickly drags it in one direction. Any idea how to achieve that?
     
  37. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    You can use Unity physics calls to apply angular velocity, either in response to the gesture or when the scene starts up.
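    A minimal sketch of that route, assuming the rotated object has a Rigidbody (this spins the object itself rather than orbiting the camera):

    Code (CSharp):
    using UnityEngine;

    public class InitialSpin : MonoBehaviour
    {
        public float initialAngularSpeed = 2.0f; // radians per second around Y
        public float angularDrag = 0.5f;         // higher = comes to rest sooner

        private void Start()
        {
            Rigidbody body = GetComponent<Rigidbody>();
            body.useGravity = false;
            body.angularDrag = angularDrag;
            // give it a starting spin; angular drag slows it down like a flick
            body.angularVelocity = new Vector3(0.0f, initialAngularSpeed, 0.0f);
        }
    }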
     
  38. fraeri

    fraeri

    Joined:
    Nov 8, 2018
    Posts:
    64
    Applying the force to the camera in just one direction? Does this mess with the pan/orbit script, or does the script keep the camera on the correct orbit path around the object?
     
  39. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    The pan/orbit script does not require a camera. It simply causes one object to orbit and optionally look at the other.

    Looking through the code, it appears to be entirely transform based, so some modifications would be needed to make it physics based instead of directly setting the transform position. I am not sure how long this would take.

    In the script there are Orbiter.transform.position and Orbiter.transform.RotateAround calls; these would need to be refactored to use physics.
     
  40. fraeri

    fraeri

    Joined:
    Nov 8, 2018
    Posts:
    64
    Sounds complicated... How about simulating a drag gesture and simply applying that to the script? Let's say a quick swipe from right to left with customizable speed and length?
     
  41. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    I am pleased to announce I have submitted Fingers Gestures 3.0.0 to the Asset Store with preliminary support for the new Unity input system.
     
    bibloc likes this.
  42. mylesb

    mylesb

    Joined:
    Feb 17, 2011
    Posts:
    27
    Cool asset, and all seems to be working. However, when I build to my Note 10, the gestures are all off.
     
  43. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    First thing to check is if Input.touches is returning results you expect.
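    Something like this in Update() will show whether the device reports touches at all (legacy Input Manager):

    Code (CSharp):
    foreach (Touch t in Input.touches)
    {
        Debug.Log("Touch " + t.fingerId + ": " + t.position + " " + t.phase);
    }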
     
  44. ItsAmee

    ItsAmee

    Joined:
    Nov 22, 2012
    Posts:
    11
    Just bought Fingers - all working as expected so far.

    Any way to get a %accuracy on the gesture? +/- 5% is fine.
     
  45. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    For the image gesture there is a score property.
     
  46. Elkis

    Elkis

    Joined:
    Jun 15, 2013
    Posts:
    87
    Hello!
    I am using a CameraJoystick and a FingersZoomPanCamera to move a camera around and to zoom it, respectively. However, every time I try to zoom, one of the fingers activates the joystick and I end up with a zoom plus movement.
    How can I make the zoom gesture override the joystick? That is, disable the joystick whenever the zoom gesture starts?
    Thank you!
     
  47. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    You can assign the joystick a mask area, or you can require the joystick pan gesture to have higher threshold units than the zoom gesture.
     
  48. Elkis

    Elkis

    Joined:
    Jun 15, 2013
    Posts:
    87
    Awesome! Thank you so much for the quick response - it worked really well. I was wondering if I could cancel the joystick gesture mid-execution when a second touch activates the zoom gesture?
     
  49. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    You could allow them to execute simultaneously and have the zoom start state reset the joystick pan gesture.
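    Rough sketch of that idea; zoomGesture and joystickPanGesture stand in for whatever gestures your zoom-pan camera and CameraJoystick scripts expose:

    Code (CSharp):
    private ScaleGestureRecognizer zoomGesture;
    private PanGestureRecognizer joystickPanGesture;

    private void Start()
    {
        // let the zoom execute even while the joystick pan is active
        zoomGesture.AllowSimultaneousExecution(joystickPanGesture);
        zoomGesture.StateUpdated += ZoomGestureUpdated;
    }

    private void ZoomGestureUpdated(GestureRecognizer gesture)
    {
        if (gesture.State == GestureRecognizerState.Began)
        {
            // the moment the zoom begins, cancel the joystick's pan
            joystickPanGesture.Reset();
        }
    }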
     
  50. Elkis

    Elkis

    Joined:
    Jun 15, 2013
    Posts:
    87
    I tried something like this:
    Code (CSharp):
    if (gesture.State == GestureRecognizerState.Possible || gesture.State == GestureRecognizerState.Began)
        _cameraJoystick.TapGesture.Reset();
    This is in the Gesture_Updated function of the FingersZoomPanCameraComponentScript, where _cameraJoystick is a reference to my instantiated CameraJoystick class and TapGesture is the joystick's tap gesture. However, it didn't work. Am I using the wrong Reset function?