
[NEW UPDATE!] Fingers - Touch Gestures - #1 in Quality, Support and Features : Dozens of Gestures✓

Discussion in 'Assets and Asset Store' started by jjxtra, Apr 25, 2016.

  1. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    Does this happen if the finger slides off the device? Are you able to attach a debugger or add some Debug.Log statements to output the gesture state? Let me know if you don't see an 'End' state when you log the gesture state.
     
  2. cygnusprojects

    cygnusprojects

    Joined:
    Mar 13, 2011
    Posts:
    706
    Issue resolved. It occurred when the user's action was interpreted as a Tap instead of a Pan; in that case the horizontal and vertical cross-platform values weren't reset to 0, so the motion continued. By also doing this in the End of the tap gesture, like for the Pan, the issue no longer occurs.
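    For reference, the fix described above can be sketched roughly like this (a hypothetical handler; `horizontal`/`vertical` stand in for the poster's own cached cross-platform pan values, and the exact state enum names should be checked against the installed Fingers version):

    ```csharp
    // Hypothetical sketch: when the touch ends up classified as a Tap rather
    // than a Pan, zero the cached pan values in the Tap's End state, exactly
    // as is already done for the Pan, so no leftover motion is applied.
    private void OnTap(GestureRecognizer gesture)
    {
        if (gesture.State == GestureRecognizerState.Ended)
        {
            horizontal = 0.0f; // stop any motion carried over from a near-pan
            vertical = 0.0f;
        }
    }
    ```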
     
  3. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    Easily automate your UI touches by recording them, or manually enter the touches - your choice! Fingers - Gestures for Unity is the best asset for automating and testing your game.

     
  4. fraeri

    fraeri

    Joined:
    Nov 8, 2018
    Posts:
    32
    Hey jjxtra,
    I love your asset! I've been using it for quite a while without thinking about it. It just works!
    Most of the time I use the PanOrbitView functionality, which moves the camera around the object with one finger, zooms with pinch, and pans with two fingers.

    What I would like to ask is whether it's possible to reverse that functionality, so that the camera stands still and the object rotates (zoom and pan are optional for me)? You have some examples for moving and rotating objects, but with different gestures ...

    He does it in a minimalistic way:

    But what's missing is what I would call the "smoothness": there's no inertia.

    Thanks for your help
     
  5. fraeri

    fraeri

    Joined:
    Nov 8, 2018
    Posts:
    32
  6. laurie71

    laurie71

    Joined:
    Aug 15, 2018
    Posts:
    37
    I have a question regarding differentiating touches.

    In my current project I have a PlayerController MonoBehaviour that instantiates a TapGestureRecognizer and binds the StateUpdated event to a function that Performs a Physics.Raycast against a ground plane to drive a NavMeshAgent for click-to-navigate functionality.

    I then also have other gesture recognizers, both instantiated in code and attached to game objects via ...GestureRecognizerComponentScript instances. My problem is, I want any tap that isn't on something with a gesture recognizer to pass through to the 'global' tap gesture recognizer for navigation; but any tap gesture that gets handled by some other game object to *not* get seen/handled by the 'global' recognizer.

    I looked for something I could call/set on the Gesture instance to mark it as handled (like the preventDefault/cancelBubble/etc on HTML events) so the navigation would know to ignore it, but couldn't find anything.

    Does Fingers have a way to do this? I can't add every gesture recognizer that ever gets added anywhere to the 'global' tap gesture's ignore list. I need a way for the navigation behaviour to detect if anything else already handled the gesture.
     
  7. laurie71

    laurie71

    Joined:
    Aug 15, 2018
    Posts:
    37
    Any help on this? I'm running into the same issue in multiple places in my project, and the ad-hoc solutions I'm having to use are getting messy; I've even had to modify the Fingers code in some cases. I've tried dynamically setting
    RequireGestureRecognizerToFail on gestures I don't want to fire, but they still get updated, possibly because they're updating before the gesture they're supposed to wait on gets updated? I've also tried setting the Gesture View property on ...GestureRecognizerComponentScript instances I'm using, without success; gestures still pass through to the 'global' recognizers. There is nothing in the FingersScriptPrefab instance's Pass Through Objects property.
     
  8. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    Sorry everyone, once again Unity stopped sending me emails when people post here. So I will try to respond to the questions now.
     
  9. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    Create a panel for the global gesture and set it as the platform specific view for the fallback gesture.

    For each other gesture, add game objects inside the panel and assign them as platform specific views for those gestures.
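    A rough sketch of that layering, using the AddGesture / PlatformSpecificView / StateUpdated calls shown elsewhere in this thread (the object and handler names are illustrative):

    ```csharp
    // Fallback/global tap: restricted to a full-screen panel, so it fires
    // only when nothing more specific claims the touch.
    globalTap = new TapGestureRecognizer();
    globalTap.PlatformSpecificView = backgroundPanel; // covers the whole screen
    globalTap.StateUpdated += OnNavigateTap;
    FingersScript.Instance.AddGesture(globalTap);

    // Object-specific tap: restricted to a child object inside the panel.
    objectTap = new TapGestureRecognizer();
    objectTap.PlatformSpecificView = someChildObject; // child of backgroundPanel
    objectTap.StateUpdated += OnObjectTap;
    FingersScript.Instance.AddGesture(objectTap);
    ```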
     
  10. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    Sounds like a simple pan gesture would work well for you; with a little scripting you should be able to get going. For the inertia, just multiply the velocity in the Update method by 0.995 or some other damping value each frame.
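    A minimal sketch of that damping idea, assuming the pan gesture has already written its release velocity into a `velocity` field (all names besides Update and Time.deltaTime are illustrative):

    ```csharp
    private Vector2 velocity;              // set from the pan gesture on release
    private const float Damping = 0.995f;  // per-frame decay, as suggested above

    private void Update()
    {
        if (velocity.sqrMagnitude > 0.0001f)
        {
            // Keep rotating the object with the residual velocity...
            transform.Rotate(velocity.y * Time.deltaTime,
                             -velocity.x * Time.deltaTime, 0.0f, Space.World);
            // ...and decay it each frame for a smooth, inertial stop.
            velocity *= Damping;
        }
    }
    ```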
     
  11. laurie71

    laurie71

    Joined:
    Aug 15, 2018
    Posts:
    37
    I'm not sure I follow... Do you mean a UI panel? And that for every gesture I create, I need to add a child GO to the UI and set it as the platform specific view for that gesture?

    Currently I have the platform specific view of each gesture set to the game object that should respond to the gesture, which is also the GO that the gesture recognizer component script is attached to, when I'm using that method to set up the gesture.

    I tried assigning my ground plane GO as the platform specific view for my 'global' tap-to-navigate gesture, but that didn't seem to make any difference.
     
  12. tik_tikkers

    tik_tikkers

    Joined:
    Jan 5, 2018
    Posts:
    28
    I'm using additive scenes. Fingers stops giving touch information when I load and then unload a new scene on top of the current scene.

    Any tips on how to make Fingers recognize touch on the first scene?
     
  13. tik_tikkers

    tik_tikkers

    Joined:
    Jan 5, 2018
    Posts:
    28

    On the FingersScriptPrefab set "Level unload options" to "Nothing". Then adding and removing scenes will not break touch.
     
  14. laurie71

    laurie71

    Joined:
    Aug 15, 2018
    Posts:
    37
    OK, I've tried every combination of gesture recognizer component scripts and code instantiated gesture recognizers that I can think of and nothing is working.

    I now have a PanGestureRecognizerComponentScript on a gameobject in my scene, which I make no reference to in my code; that manages orbiting the camera about the focus object. It's configured like this:

    upload_2019-11-15_18-9-27.png

    Next, I have a prefab that is instantiated at runtime, with a component on it that instantiates and initializes a TapGestureRecognizer and a PanGestureRecognizer in Awake(). Both get their platform specific view set when they're created. Here's the code that sets those up:

    Code (CSharp):
    private void Awake()
    {
        ...

        mTap = new TapGestureRecognizer();
        mPan = new PanGestureRecognizer();

        FingersScript.Instance.AddGesture(mTap);
        FingersScript.Instance.AddGesture(mPan);

        mTap.PlatformSpecificView = gameObject;
        mPan.PlatformSpecificView = gameObject;

        mTap.StateUpdated += OnTap;
        mPan.StateUpdated += OnPan;

        mTap.RequireGestureRecognizerToFail = mPan;
    }
    The instantiated object with that code on it is a Canvas set to World Space, if it makes a difference.

    The tap gesture recognizer seems to be working OK, but the pan gesture and the PanGestureRecognizerComponentScript are conflicting with each other. Both are firing regardless of where the touch originates.

    What I want to happen: if a touch and drag originates over the prefab game object, mTap.StateUpdated should fire, and the PanGestureRecognizerComponentScript should ignore that gesture; if the touch and drag originates anywhere else, the PanGestureRecognizerComponentScript should handle it and mTap.StateUpdated should not fire.

    I don't understand why I would need panels or any other additional objects just to manage touch gestures. I don't see anything like that in any of the Fingers demo scripts I've looked at either; maybe you could point to one that does what you're suggesting?

    I've been fighting this on and off for far too long. If I can't get it working soon I'll have to abandon Fingers and find another solution, as this project is now effectively blocked on this problem. Any help would be appreciated.
     
  15. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    Email notifications failed again; I have resubscribed to them, again. That's why I didn't see your post.

    Please only create gestures in the "Start" callback, the Awake callback can execute too soon and cause issues.

    If you want to restrict a gesture to a game object, that object needs a collider and needs to be set as the platform specific view for that gesture. It looks like you have assigned the platform specific view; does the object have a collider as well?

    Your pan/rotate/orbit component script most likely does not have a platform specific view for any of the gestures, so it will execute regardless of where the touch starts. So you'll need to set the planet as the platform specific view for all the gestures on that script if you want to ensure that those gestures only start when they begin on the planet.

    Does that make sense?
     
  16. laurie71

    laurie71

    Joined:
    Aug 15, 2018
    Posts:
    37
    Thanks for the response. I'll check for colliders when I'm back in the office. I didn't want the pan orbit gesture to have to start over the planet, but if that's the only way I can get it going, I'll live with it.
     
  17. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    That's probably the best bet. With so many gestures, you'll have to restrict them. Hopefully they don't zoom out too far so that the planet has enough visible area to be easily touchable.
     
  18. laurie71

    laurie71

    Joined:
    Aug 15, 2018
    Posts:
    37
    https://drive.google.com/open?id=1xP4qDqK9nTOycm_kAAG-BagCGoBmibKq

    In the latest iteration I did in fact forget the collider, although I know I've included it before. I added code to set the PlatformSpecificView property of all three of the PanGestureRecognizerComponentScript's gestures, moved the creation of the other gestures from Awake() to Start(), and added a collider to the prefab. The Planet GO did have a collider already. The problem persists (but the pan/orbit action only works when started over the planet now).

    I'm focusing on this problem in hopes that understanding what I'm doing wrong may help figure out all the other problems. To that end, I've put together a stripped to the bones example project focusing on just this setup. If you run it (in the editor is fine) you'll be able to see how the gesture recognizers are conflicting. Hit the "Place Item" button and then try dragging it; it'll follow the gesture, but the planet will rotate too. Try dragging on the planet, away from the placed item, and the planet will rotate, but the placed item will jump to the location of the touch and follow it.

    As far as I know the only gesture recognizers in this project are those created by the PanGestureRecognizerComponentScript and the TapGestureRecognizer instantiated by the Placeable script attached to the Placeable prefab.

    Here's my setup checklist:
    • Camera has physics ray caster
    • All GOs have colliders
    • All gesture recognizers have correctly set platform specific view
    • Gesture setup code is in Start() rather than Awake() (per your advice above [1])
    So we have one Fingers component that creates/manages three gesture recognizers plus one bespoke component that creates/manages one gesture recognizer, for a total of four recognizers in the project, just one of which is created/managed by my code.

    [1] The Fingers Readme.txt directly contradicts your advice regarding using Start, in multiple places. It advises instantiating gesture recognizers with field initializers (`private GestureRecognizer MyGesture = new ...`), which runs even before Awake, and advises setting gestures up in OnEnable (roughly equivalent to Awake, as it's called immediately afterwards). It says the demo scripts "all show this pattern in action", although I noticed they all seem to actually use Start instead of OnEnable...?
     
    Last edited: Nov 16, 2019
  19. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    Create gestures whenever you want; add them in OnEnable. Maybe you were already doing this. I'll take a look at the sample project.
     
  20. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    So from your scene you want to:

    - Pan to orbit the planet
    - Pinch to zoom in and out from the planet
    - Click the button to add a new placeable object
    - Move the placeable object around by panning
    - Rotate / scale the placeable?

    What is the tap gesture for?
     
  21. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    Uploaded 2.9.2, see DemoSceneOrbitAndPlaceObjects. This does not require that you restrict the orbit gestures to the planet. The place object demo script will add a cube that can be panned, rotated and scaled. While the cube is acted upon, the orbit gestures will be reset, preventing the two from executing at the same time.
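    The mutual-exclusion idea can be sketched like this (a hedged illustration, not the shipped demo code; whether GestureRecognizer exposes a Reset() method should be verified against the installed version, and DemoSceneOrbitAndPlaceObjects shows the actual approach):

    ```csharp
    // While the placeable object's pan is active, reset the orbit gestures so
    // the two never execute at the same time. orbitPanGesture/orbitScaleGesture
    // are assumed references to the orbit component's gestures.
    private void OnObjectPan(GestureRecognizer gesture)
    {
        if (gesture.State == GestureRecognizerState.Began)
        {
            orbitPanGesture.Reset();   // cancel any in-progress orbit
            orbitScaleGesture.Reset();
        }
        // ... move the placeable object here ...
    }
    ```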
     
  22. laurie71

    laurie71

    Joined:
    Aug 15, 2018
    Posts:
    37
    Yes, except rotate/scale the placeable. It needs to respond to tap / double-tap but I didn't add that in as I wanted to keep the sample project to a minimum. The only tap gesture should be the one in the Fingers pan/orbit component, which I'm also not using in this simplified sample project.

    That's terrific, thank you, I'll take a look.
     
    Last edited: Nov 17, 2019
  23. laurie71

    laurie71

    Joined:
    Aug 15, 2018
    Posts:
    37
    Missed this; earlier you said specifically that we should "only create gestures in the 'Start' callback, the Awake callback can execute too soon and cause issues." You said that in response to the code I posted earlier, where I used Awake(). So in that sample project I followed that advice and used Start().
     
  24. laurie71

    laurie71

    Joined:
    Aug 15, 2018
    Posts:
    37
    Looks like the new demo script achieves the stated goal, but the same technique didn't work for my sample project; once you hit Place Item, the placed item receives all touch input (whether it originates over the object or not) and you can't orbit the planet.

    I wonder if it makes a difference that the placed object is a world-space canvas rather than a cube as in the new demo script?

    Another possibility: the demo script doesn't set the instantiated prefab's PlatformSpecificView property, and the FingerPanRotateScaleComponentScript component on the prefab doesn't seem to have a corresponding property exposed, but it does have a "Mode" set to "Require Intersect With GameObject". Looking at the code, that seems to be more of an internal implementation detail of the gesture recognizer MonoBehaviour subsystem, though?
     
  25. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    I haven't tested with a world space canvas and 2D objects; my guess is I am doing something wrong there. Are you needing to place just 2D objects, then?
     
  26. laurie71

    laurie71

    Joined:
    Aug 15, 2018
    Posts:
    37
    Currently, yes, just canvases with UI elements representing the placeable item's type and status.
     
  27. SpiderJones

    SpiderJones

    Joined:
    Mar 29, 2014
    Posts:
    164
    I've used Fingers in a few projects and it works great. But in a current project I can't get it to function.

    I'm trying to get a double and single tap to work. Neither will. It works in the demo scene when run in my project, but when I add DemoScriptTapAndDoubleTap to my scene it doesn't work. I get no taps.

    I'm using it in an ARFoundation scene, and I've got the same method working in two other apps. But not this one.

    Do you have any tips? Have you seen this happen before?

    Thanks.
     
  28. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    Did you get this resolved? I remember emailing someone with a similar issue; it might have been you, maybe?
     
  29. SpiderJones

    SpiderJones

    Joined:
    Mar 29, 2014
    Posts:
    164
    No, I did not get it resolved. And I don't believe I've contacted you in the past. The project is pretty large and convoluted. No one on the team is blaming your asset. We just decided to not spend time trying to get it to work.
     
  30. unity_rsenoron

    unity_rsenoron

    Joined:
    Oct 24, 2019
    Posts:
    6
    Hi,

    Thanks for this great asset. I found one problem though and I am not sure why.

    I added a UI Panel. Added the 'Pan,Rotate,Scale' component.

    I was able to pan but after 3 pans, the panel disappears in the game view but not in the scene view.
     
  31. ljsanti

    ljsanti

    Joined:
    Jun 10, 2015
    Posts:
    1
    Hello, I am interested in acquiring this tool. I only have one question: can the recognition be used with characters such as Spanish letters or symbols such as Japanese, and recognize line direction? Or is there some limitation or other problem?
     
    Last edited: Dec 8, 2019
  32. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    Hard to say; did the z index change, maybe?
     
  33. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    You can try it out; if it is not up to what you want, contact me for a refund.
     
  34. francoiscoiscois

    francoiscoiscois

    Joined:
    Oct 23, 2019
    Posts:
    37
    Hi, this touch control seems to be really well made. I wonder if you would be interested in making it available as PlayMaker actions? As a new Unity user I am lost with C#, and there is currently no touch gesture asset compatible with PlayMaker.
    Thank you!
     
  35. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    This is a common request; what kinds of PlayMaker integration were you thinking of?
     
  36. olonge

    olonge

    Joined:
    Sep 22, 2014
    Posts:
    16
    Finally got around to looking into using this asset 2 years and 2 months after buying it :p .
    The truth is I'd forgotten that I bought it, and had written what I needed into my game.

    Anyway, it's definitely a great asset, so I'm looking to roll it out into all my apps.
    However, I found an issue right off the bat in FingersJoystickAnywhereComponentScript.cs.

    The TapToJump boolean is not being used, so if you disable it, taps are still recognised. o_O
    This is easily fixed by exiting the jump function if it is not set (or anywhere else you might use it).

    Sounds like a minor oversight to me, but it might confuse some who are new to Unity.
    Thanks for the asset, and the well thought out videos :)

    upload_2019-12-14_15-30-34.png
     
  37. francoiscoiscois

    francoiscoiscois

    Joined:
    Oct 23, 2019
    Posts:
    37
    Thank you for your response. I finally bought Fingers, so I would be really glad if you provided an integration with PlayMaker; that would be awesome!
    I am new to PlayMaker and programming in general, so it is a bit hard for me to have a clear idea of how to integrate both assets.
    I think we would need to detect the gesture event and access its values in PlayMaker to be able to use them in our game logic, like the movement/rotation/scale values as variables.
    The best would be to have a custom action per gesture, I guess.
    Here is a link to the standard input actions in PlayMaker and how they work: https://hutonggames.fogbugz.com/default.asp?W60

    There are some pretty basic tutorials on how to adapt a script to make an action out of it. I can send them to you via PM if you want; also, the creator of PlayMaker would certainly be more helpful than me for a good integration of both assets.

    Please let me know if I can help in any way.
     
  38. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    Face palm, will send update
     
  39. mostlyapt

    mostlyapt

    Joined:
    Nov 5, 2016
    Posts:
    8
    Hi,

    First of all many thanks for this excellent asset!

    Recently, while profiling my game, I noticed that FingersScript.Update() allocates memory on each frame; in my particular case it was allocating ~0.6 KB per frame. After deep profiling I found it was due to foreach loops in FingersScript.



    Would it be possible to convert all these "foreach" statements into "for" loops? That would likely avoid these allocations, from a performance perspective.
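    To illustrate the allocation pattern being described (this is a sketch, not Fingers source): older Unity Mono compilers box a List<T>'s struct enumerator inside foreach, generating a small heap allocation on every call, while an index-based for loop allocates nothing.

    ```csharp
    using System.Collections.Generic;

    public class LoopAllocationExample
    {
        private readonly List<int> items = new List<int>();

        public void ForeachLoop()
        {
            // On older Unity Mono compilers (and on collections typed as
            // IEnumerable<T>/IList<T>) this can box the enumerator,
            // allocating garbage on every call.
            foreach (int item in items) { Consume(item); }
        }

        public void ForLoop()
        {
            // Index-based iteration never allocates, regardless of runtime.
            for (int i = 0; i < items.Count; i++) { Consume(items[i]); }
        }

        private void Consume(int item) { /* ... */ }
    }
    ```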

    I referred to the following article:
    https://docs.unity3d.com/Manual/BestPracticeUnderstandingPerformanceInUnity4-1.html

    Thank you!
     


  40. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    Profile an actual build; the in-editor player profiler includes debug code.
     
  41. mistergreen2016

    mistergreen2016

    Joined:
    Dec 4, 2016
    Posts:
    215
    Hi,
    I have Fingers controlling models in space, and a modal window menu that pops up. The canvas is set to block raycasts and is interactable. It blocks buttons under the modal window fine, but it doesn't block interactions from Fingers on the models. How do I block Fingers raycasts?
     
  42. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    Add a transparent image set as raycast target.
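    That suggestion can be sketched as follows (illustrative only; the blocker is assumed to be a full-screen child of the modal's canvas, stretched to cover it):

    ```csharp
    using UnityEngine;
    using UnityEngine.UI;

    public class ModalBlocker : MonoBehaviour
    {
        private void Awake()
        {
            // Fully transparent image that still participates in UI raycasts,
            // so touches over the modal never reach objects behind it.
            Image blocker = gameObject.AddComponent<Image>();
            blocker.color = new Color(0f, 0f, 0f, 0f); // invisible
            blocker.raycastTarget = true;              // but blocks raycasts
        }
    }
    ```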
     
  43. mistergreen2016

    mistergreen2016

    Joined:
    Dec 4, 2016
    Posts:
    215
    Yes, I have that too, a semi-transparent image, but the raycast goes right through to the models. The models are placed at runtime and added to the touchable list, fingers.Targets. I'm using Fingers in an AR environment as well; not sure if that is the issue.
     
    Last edited: Jan 19, 2020
  44. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    Are you adding the models to the pass through list? You may need to implement the capture gesture handler on the Fingers script, as it gives ultimate control.
     
  45. mistergreen2016

    mistergreen2016

    Joined:
    Dec 4, 2016
    Posts:
    215
    No, I'm not adding any GameObject to the pass through list.
    I'm using the FingersPanARComponentScript.
    The function below gets all GOs through the raycast, even when a raycast target is true. Is there a way to find out if a GO's raycast target is true and return currentTarget = null?

    Code (CSharp):
    private Transform SelectCurrentTarget(float x, float y)
    {
        if (EventSystem.current == null)
        {
            Debug.LogError(GetType().Name + " requires an EventSystem in the scene");
            return null;
        }
        else if (currentTarget != null)
        {
            return currentTarget;
        }

        PointerEventData data = new PointerEventData(EventSystem.current);
        data.position = new Vector2(x, y);
        raycastResults.Clear();
        EventSystem.current.RaycastAll(data, raycastResults);

        foreach (RaycastResult result in raycastResults)
        {
            //Debug.Log("result: " + result.);
            foreach (Transform t in Targets)
            {
                if (result.gameObject.transform == t)
                {
                    return (currentTarget = t);
                }
            }
        }
        return null;
    }
     
  46. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    Did you mean a raycast target of false?
     
  47. mistergreen2016

    mistergreen2016

    Joined:
    Dec 4, 2016
    Posts:
    215
    The UI image's raycast target is checked, to block the raycast from reaching the game object behind it. Am I wrong?
    Either way, the raycast is reaching the model/gameobject.
    Oh, I guess I can hack it and check if the raycast hits a modal window, then return currentTarget = null.
     
    Last edited: Jan 19, 2020
  48. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    Does the main camera have Physics2D and canvas raycasters?
     
  49. mistergreen2016

    mistergreen2016

    Joined:
    Dec 4, 2016
    Posts:
    215
    Yes, it's auto-added by Fingers (or something else) at runtime.
    I added one manually too, to play with the mask.

    Oh! I just thought of a simple hack... make a background BUTTON. That blocks the raycast from reaching the models/GOs.
     
    Last edited: Jan 19, 2020
  50. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,198
    Does adding a Collider2D to the UI elements also work?
     