[NEW UPDATE!] Fingers - Touch Gestures - #1 in Quality, Support and Features : Dozens of Gestures✓

Discussion in 'Assets and Asset Store' started by jjxtra, Apr 25, 2016.

  1. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    Will add it in!
     
    Anydaytv likes this.
  2. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    Fingers Gestures for Unity - 2.3.3 update

    - Fixes for WebGL touches and work-arounds for Unity bugs where touch end and touch cancel events get lost.
    - Option to make control key optional when making a scale gesture with the mouse wheel.

    Please let me know if you see any problems with this release. Happy gesturing!

    - Jeff
     
  3. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    I added the null check in the latest asset, thanks for sharing!
     
  4. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    Fingers Gestures is currently 50% off!!! Get it while the sale lasts!
     
  5. Deleted User

    Deleted User

    Guest

    Hi all,

    I'm looking at the examples at the moment. What I basically want is the Demo3Dscene, but with separate controls for the orbit and the pan.
    Right now it disables one of the orbit axes to pan L/R or U/D. I want to be able to fully rotate on X/Y and pan from left to right.
    I'm still fairly new to Unity, so it's not yet fully clear to me how to set up these gestures & components.

    cheers!

    Rob
     
  6. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    I just sent an update that allows setting the pan speed as well as limiting the pan distance from the target.
     
  7. Coks

    Coks

    Joined:
    Jul 23, 2012
    Posts:
    28
    Hi there! I updated to the new version (2.3.6) and I'm seeing strange behavior. The previous touch position always returns the start position of the gesture. I use gesture.CurrentTrackedTouches[0].PreviousX. In version 2.3.2 it worked correctly.
     
  8. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    I will take a look.
     
  9. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    Having trouble reproducing this. I've tried Android, iOS and a Windows touch screen monitor. Can you email support@digitalruby.com and send an example project with the problem?
     
  10. DragonslashDevteam

    DragonslashDevteam

    Joined:
    Mar 28, 2018
    Posts:
    1
    Hi,

    First of all, thank you for this asset!
    I need to detect the beginning and end of a swipe without releasing the touch, so I set: swipeGesture.EndMode = SwipeGestureRecognizerEndMode.EndContinusously
    But in my gesture callback function I never see the state GestureRecognizerState.Began, only GestureRecognizerState.Possible or GestureRecognizerState.Ended.
    How can I detect swipe Began and Ended properly, without releasing the touch?
    It's for a 'Fruit Ninja'-like gameplay.

    Best regards,
     
  11. buckUnity

    buckUnity

    Joined:
    Apr 23, 2018
    Posts:
    10
    Hi,
    I was wondering if there is an easy way to make Gestures use only ids that haven't been used yet. For example, if we have two Gestures - Scale and Pan. Scale requires two fingers while Pan requires only one finger. If I start using the Scale Gesture and then release one of my fingers, the Pan gesture should not be activated since the one finger remaining has been used by the Scale Gesture.
     
  12. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    You could require the pan gesture to have the scale gesture fail. Let me know if that works.
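    Roughly, something like this (the AddGesture / AddRequiredGestureRecognizerToFail names are from memory and may differ slightly, so double-check against the demo scripts):

    Code (CSharp):
    using DigitalRubyShared;
    using UnityEngine;

    // Minimal sketch: the pan gesture will not begin until the scale gesture has failed
    // (for example, because a second finger never came down).
    public class PanAfterScaleFails : MonoBehaviour
    {
        private ScaleGestureRecognizer scaleGesture;
        private PanGestureRecognizer panGesture;

        private void OnEnable()
        {
            scaleGesture = new ScaleGestureRecognizer();
            panGesture = new PanGestureRecognizer();

            // require the scale gesture to fail before the pan gesture can start
            panGesture.AddRequiredGestureRecognizerToFail(scaleGesture);

            FingersScript.Instance.AddGesture(scaleGesture);
            FingersScript.Instance.AddGesture(panGesture);
        }
    }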
     
  13. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    Thanks for your email. I think we decided the pan gesture is much better with its velocity properties, which you can use to determine if a cut is being attempted.
     
  14. bbbenni

    bbbenni

    Joined:
    Jun 8, 2018
    Posts:
    2
    Hi.
    I don't know if this is the right place to post a problem concerning the Fingers asset. Is there a better place to post it?

    I am trying to figure out whether the Fingers script is the right asset for my application.
    My project deals with the development of a 2D user interface. It will have menus you can drag in with a pan gesture from each side, like the control center on iOS devices. Additionally, there will be other gestures on the main canvas.
    Each draggable menu has its content on a panel that is placed outside of the canvas. A small image on the canvas works as a handle (specified as a pass-through object). The attached PNG shows what one of these menus looks like. It works fine with a pan gesture (with one menu).

    Now my problem:
    My idea was to build a prefab for the menu, which includes a handler script with the GestureRecognizer and animations, and use it 4 times. This does not work, because only one recognizer gets the callbacks of both handles.

    So my questions are:
    - Is it possible to apply a recognizer to just one GameObject?
    or
    - Can I find out which GameObject (pass-through object) a gesture is performed on?

    Thanks in advance.

    Best regards.
    Benjamin
     

    Attached Files:

  15. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    The PlatformSpecificView property was made for just this case. There is a demo scene showing off how to do this, I believe it is DemoScenePlatformSpecificView.

    Make sure to add a collider / collider2d to your game object. Then make sure any Unity UI on the game object has 'Raycast Target' unchecked. You then need to add a physics / physics2d raycaster to your camera. Finally, set the PlatformSpecificView property of your gesture to the game object you want to restrict the gesture to.

    Let me know how it goes!
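    To put those steps together, here is a minimal sketch for one menu handle (the StateUpdated event name and the AddGesture call are from memory, so verify against DemoScenePlatformSpecificView):

    Code (CSharp):
    using DigitalRubyShared;
    using UnityEngine;

    // Attach one of these per menu handle. The handle needs a Collider/Collider2D,
    // and the camera needs a Physics/Physics2D raycaster, as described above.
    public class MenuHandleGesture : MonoBehaviour
    {
        private PanGestureRecognizer panGesture;

        private void OnEnable()
        {
            panGesture = new PanGestureRecognizer();

            // restrict this gesture so it can only start on this handle's game object
            panGesture.PlatformSpecificView = gameObject;

            panGesture.StateUpdated += PanGestureUpdated;
            FingersScript.Instance.AddGesture(panGesture);
        }

        private void PanGestureUpdated(DigitalRubyShared.GestureRecognizer gesture)
        {
            if (gesture.State == GestureRecognizerState.Executing)
            {
                // drag this menu using the gesture's focus / delta values
            }
        }
    }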
     
  16. mhogle

    mhogle

    Joined:
    Jan 3, 2016
    Posts:
    20
    Hi there,

    Thank you for the great asset. I want to use the mouse scroll to change the Z coordinate of the model (zoom, but adjust the model location instead of the camera). Can you please show me how to do that using your ScaleGesture with dampening?
    Also, I want to use the mouse scroll without having to hold the Ctrl button. Can I edit your script so that as soon as the user scrolls, the state is Executing?

    Thank you!
     
  17. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    The fingers script object recently got a property that makes the ctrl key optional. Let me know if you have trouble finding it.

    When the scale gesture executes, take the camera forward vector and multiply it by a speed and Time.deltaTime, then multiply that by the scale gesture. If the gesture's delta scale is > 1, multiply by the delta scale; if it is < 1, multiply by -(1 / deltaScale). Then add the result to the model's transform position. Let me know if that makes sense.
     
  18. mhogle

    mhogle

    Joined:
    Jan 3, 2016
    Posts:
    20
    Hi. For some reason, the Scale gesture keeps getting confused between rotating the model and changing the Z coordinate. Here is a snippet of my code. It only zooms in one direction right now; do you happen to know what I can improve here?
    Code (CSharp):
    private void Scale_Updated(DigitalRubyShared.GestureRecognizer gesture)
    {
        if (ScaleGesture.State != GestureRecognizerState.Executing || ScaleGesture.ScaleMultiplier == 1.0f)
        {
            return;
        }

        // invert the scale so that smaller scales actually zoom out and larger scales zoom in
        float scale = 1.0f + (1.0f - ScaleGesture.ScaleMultiplier);

        // get camera look vector
        Vector3 forward = ThreeDCamera.transform.forward;

        // set the target to the camera x,y and 0 z position
        Vector3 target = currentModel.transform.position;
        //target.z = 0.0f;

        // get distance between camera target and camera position
        float distance = Vector3.Distance(target, ThreeDCamera.transform.position);

        // come up with a new distance based on the scale gesture
        float newDistance = Mathf.Clamp(distance * scale * 0.01f, 1.0f, 100.0f);

        // set the camera position at the new distance
        //currentModel.transform.position = target - (forward * newDistance);
        Debug.Log(newDistance);
    }
     
  19. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    I would suggest not using the distance between the objects. Instead, pick a speed, multiply it by your forward vector and Time.deltaTime, and then multiply that by the scale factor. To reverse the zoom, multiply by 1.0 / scaleMultiplier and then negate; that is the correct formula.
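    Applied to your snippet, it would look roughly like this (ZoomSpeed is a new field you would add; the other names are taken from your code):

    Code (CSharp):
    private void Scale_Updated(DigitalRubyShared.GestureRecognizer gesture)
    {
        if (ScaleGesture.State != GestureRecognizerState.Executing || ScaleGesture.ScaleMultiplier == 1.0f)
        {
            return;
        }

        // scale > 1 zooms in, scale < 1 zooms out (negate the inverse)
        float scale = ScaleGesture.ScaleMultiplier;
        float zoom = (scale > 1.0f ? scale : -(1.0f / scale));

        // move the model along the camera forward vector instead of using the distance between objects
        Vector3 forward = ThreeDCamera.transform.forward;
        currentModel.transform.position += forward * ZoomSpeed * Time.deltaTime * zoom;
    }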
     
  20. toto007

    toto007

    Joined:
    Jul 18, 2014
    Posts:
    33
    Hi,
    I have an issue when there are overlapping UI objects.
    I want to drag an object, but if it is overlapping another object, the drag touch is intercepted by the latter.
    You can watch this video to better understand my issue. I am using the DemoSceneUIPlus3dElements scene. I have two UI objects. The green object is overlapping the white object, but if I tap the green object, the white object is dragged.
     
    Last edited: Sep 3, 2018
  21. toto007

    toto007

    Joined:
    Jul 18, 2014
    Posts:
    33
    I solved it by editing the RaycastResultCompare method in the class FingersPanRotateScaleComponentScript.cs.


    Code (CSharp):
    private static int RaycastResultCompare(RaycastResult r1, RaycastResult r2)
    {
        SpriteRenderer rend1 = r1.gameObject.GetComponent<SpriteRenderer>();
        if (rend1 != null)
        {
            SpriteRenderer rend2 = r2.gameObject.GetComponent<SpriteRenderer>();
            if (rend2 != null)
            {
                int comp = rend2.sortingLayerID.CompareTo(rend1.sortingLayerID);
                if (comp == 0)
                {
                    comp = rend2.sortingOrder.CompareTo(rend1.sortingOrder);
                }
                return comp;
            }
        }

        // my edit
        if (r1.gameObject.transform.GetSiblingIndex() < r2.gameObject.transform.GetSiblingIndex())
            return 1;
        else if (r1.gameObject.transform.GetSiblingIndex() > r2.gameObject.transform.GetSiblingIndex())
            return -1;
        else
            return 0;
    }
    In brief, the problem stems from the fact that the gesture conflict is not handled when the elements are not sprites, so I added a little logic to handle those cases using hierarchy-based sorting.
     
  22. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    Thanks for sharing, will incorporate that into the code if you are OK with it.
     
    toto007 likes this.
  23. toto007

    toto007

    Joined:
    Jul 18, 2014
    Posts:
    33
    For me it's OK.
     
  24. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    Fingers - Touch Gestures for Unity now has a discord channel! Please post screenshots and videos of your games and apps, I know I would love to see what you all are working on.

    Click here: https://discord.gg/QfTJbWq
     
  25. ibyte

    ibyte

    Joined:
    Aug 14, 2009
    Posts:
    1,047
  26. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    There is no extension in the asset with out-of-the-box integration for those assets.
     
  27. BloodyBeard

    BloodyBeard

    Joined:
    Oct 17, 2017
    Posts:
    4
    Hi Jeff!
    I have some weird behaviour when using your asset and loading scenes.
    I have 2 scenes: my menu sample scene and your demo scene. I can load the scene non-async and it works great, but if I try to load the scene async (in single or additive mode), gestures do not work in the DemoScene :(
    So I've used a "hack":
    Code (CSharp):
    public class testCameraChecker : MonoBehaviour {

        public Camera sceneCamera;

        public GameObject objectForEnanble;

        private IEnumerator Start()
        {
            do
            {
                yield return null;
            } while (sceneCamera != Camera.main);
            yield return null;
            objectForEnanble.SetActive(true);
        }
    }
    In
    public Camera sceneCamera;
    we put our camera in the scene.
    In
    public GameObject objectForEnanble;
    we put the object with our FingersScript and with the demo script (in the DemoScene it is called FingersDemo).
    Is it possible that the issue happens because the main camera from the previous scene (the menu scene camera in my case) is destroyed?
     
  28. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    You can change the scene load behavior on the fingers script. Right now I believe it defaults to clearing all the gestures, but if you need a different option then you can pick it from the set of enum values. Let me know if that helps.
     
    BloodyBeard likes this.
  29. BloodyBeard

    BloodyBeard

    Joined:
    Oct 17, 2017
    Posts:
    4
    Yeah, I've changed the option to "Reset GestureState" and it looks like it helped. I'll test it more, but I'm pretty sure it works. Thanks a lot.
     
  30. metaphysician

    metaphysician

    Joined:
    May 29, 2012
    Posts:
    190
    Hey Jeff - very nice asset! I'm trying to figure out how exactly to get an object dragged only when two fingers are touched and held on it, and I'm not sure how. There's the LongPressGesture approach from DemoSceneDragDrop, but I would want to adapt that so it recognizes two simultaneous touches on its collider. How would I go about doing this? MultiDrag doesn't seem quite right either. It obviously can work on one object, but I would want it to not move at all unless two touches are detected on the object. Any assistance appreciated!
     
  31. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    You can set min and max touches to 2 on the gesture. Let me know if it helps.
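    For example, something like this (property and method names are from memory and may differ slightly in the current version, so check the gesture's inspector / docs):

    Code (CSharp):
    using DigitalRubyShared;
    using UnityEngine;

    public class TwoFingerLongPressSetup : MonoBehaviour
    {
        private LongPressGestureRecognizer longPressGesture;

        private void OnEnable()
        {
            longPressGesture = new LongPressGestureRecognizer();

            // require exactly two fingers before the long press / drag begins
            longPressGesture.MinimumNumberOfTouchesToTrack = 2;
            longPressGesture.MaximumNumberOfTouchesToTrack = 2;

            FingersScript.Instance.AddGesture(longPressGesture);
        }
    }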
     
  32. metaphysician

    metaphysician

    Joined:
    May 29, 2012
    Posts:
    190
    Okay, I made a little progress. I changed the min and max touches to 2, and I also needed to shorten the delay on the long press action. That seems to work fine. But now I'm trying to copy the behavior to a 3D object and it doesn't seem to work. I notice the FingersDragDropComponent script declares a bunch of 2D variables. How would I adapt this to 3D items? I guess the movement is called a Pan, and it looks like FingersPanRotateScaleComponent can deal with either 3D or 2D objects.

    What I've done so far in the FingersDragDropComponent script is to declare the rigidBody variable as a standard 3D Rigidbody instead of 2D. Then, all the way down in the OnEnable function, I look for a Rigidbody component instead of a Rigidbody2D. But my object doesn't have a SpriteRenderer and can't change sorting order, and it seems to fail there, though there's no compilation error and no NullRefs. The object just stays where it is and doesn't respond.

    I do know that the FingersPanRotateScaleComponent works with 3D objects, as that's being used in your demo scene, but all of those objects only require single touches to move, so I didn't think it was a useful starting point.

    I think I'm getting close - I just need to figure out a way to communicate from the FingersDragDropComponent to the FingersPanRotateScaleComponent that I'm looking for a 3D object instead of a 2D one.
     
    Last edited: Feb 4, 2019
  33. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    I'm uploading an update now with drag drop support for 3D objects. You will need a physics raycaster added to your camera. Look for the update in about an hour.
     
    JayGarxP likes this.
  34. metaphysician

    metaphysician

    Joined:
    May 29, 2012
    Posts:
    190
    Wow - thanks! That was fast. I guess updates to Asset Store assets happen a lot quicker than they used to. I'll try this out and let you know how it goes. It looks like it should do the job; the only issue is whether it interferes with the slider interaction. Very much appreciated!
     
  35. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    You are welcome. I find as long as I don't change the title of my asset, it generally goes through fast.
     
  36. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    Fingers Gestures 2.6.0 now has bulk import of images for image recognition / OCR. More details are in the readme.txt file, or see this tutorial:

     
    JayGarxP likes this.
  37. nicloay

    nicloay

    Joined:
    Jul 11, 2012
    Posts:
    540
    Hello @jjxtra

    Thanks for the great asset. I'm just having one issue.
    I tried to follow your code structure and wrote a script which handles pan gestures.
    The problem is that I use drag & drop logic in another part of the scene (scale and rotation handles).

    You can see that the gesture script, by default, just ignores whether the pointer is over a GameObject (in my case it's a SpriteRenderer with a component which implements ```IPointerClickHandler, IPointerDownHandler```).

    My solution was to check whether the pointer is over a GameObject, or dragging something, when I first see a GestureTouch:
    Code (csharp):
    GameObject GetGOForTouch(GestureTouch touch)
    {
        int nativeTouchId = touch.Id;
        if (EventSystem.current.IsPointerOverGameObject(nativeTouchId))
        {
            return null;
        }

        var inputModule = EventSystem.current.currentInputModule;
        MethodInfo methodInfo = inputModule.GetType().GetMethod("GetLastPointerEventData", BindingFlags.Instance | BindingFlags.NonPublic);
        PointerEventData pointerEventData =
            (PointerEventData)methodInfo.Invoke(inputModule, new object[] { nativeTouchId });
        if (pointerEventData != null && pointerEventData.pointerDrag != null)
        {
            return null;
        }

        var ray = _camera.ScreenPointToRay(touch.ScreenPosition);
        var intersection = Physics2D.GetRayIntersection(ray, 1000, _stickerLayer);
        return intersection != null && intersection.collider != null ? intersection.transform.gameObject : null;
    }
    What I don't like here is the part where I use reflection (as Unity UI doesn't give access to the PointerEventData directly). This is also the reason why mouse simulation doesn't work with this code.

    Do you think there is a more proper way to handle this?


    Thanks.
    //Nikolay
     
  38. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    I would suggest using the one finger pan and one finger scale gestures. You can set the PlatformSpecificView property of the gesture to the button with the rotate or scale icon. This will restrict the gesture to start only on the button and no other buttons or objects. I would also suggest assigning a tap gesture to your X icon.

    Once you have the gestures created, you will need to handle multiple game objects I assume. For this, you will need to set the CustomData property of the gesture to the game object that you want scaled or rotated. When the gesture executes, you can then get a GameObject from the gesture.CustomData property and then rotate or scale it or delete it.

    DemoScriptOneFinger.cs has some good examples.

    Let me know if that makes sense. You'll need to re-download the 2.6.0 version to get the CustomData property.
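    For the X icon, a quick sketch of the CustomData part (the StateUpdated event name and AddGesture call are from memory; CustomData requires the 2.6.0 update as mentioned above):

    Code (CSharp):
    using DigitalRubyShared;
    using UnityEngine;

    public class StickerDeleteButton : MonoBehaviour
    {
        public GameObject DeleteIcon; // the X icon for this sticker
        public GameObject Sticker;    // the sticker the icon belongs to

        private void OnEnable()
        {
            var tapGesture = new TapGestureRecognizer();
            tapGesture.PlatformSpecificView = DeleteIcon; // gesture can only start on the X icon
            tapGesture.CustomData = Sticker;              // remember which sticker this gesture controls
            tapGesture.StateUpdated += TapGestureUpdated;
            FingersScript.Instance.AddGesture(tapGesture);
        }

        private void TapGestureUpdated(DigitalRubyShared.GestureRecognizer gesture)
        {
            if (gesture.State == GestureRecognizerState.Ended)
            {
                // pull the target sticker back out of CustomData and delete it
                GameObject target = gesture.CustomData as GameObject;
                Destroy(target);
            }
        }
    }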
     
    nicloay likes this.
  39. Deleted User

    Deleted User

    Guest

    Hi, can I use Rewired as input value provider for Fingers or are the two not really compatible?
     
  40. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    Fingers has a virtual touch driver, so you can easily send Rewired events as touch events; you just need to map the "phase" of the virtual touch, along with an x and y value. I suggest most people use an x and y value between 0 and 1024. Let me know if you have more questions.
     
    Deleted User likes this.
  41. jjfranzen

    jjfranzen

    Joined:
    Nov 11, 2013
    Posts:
    14
    Hello. I'm working on implementing these controls into an AR project so that I can rotate and scale a placed 3D object in an AR scene. I've been running through the tutorials and everything seems straightforward, up to the point where I add the FingersScript prefab into my project. It just won't let me. Am I missing something or is there a different way to do this now? I am using Unity 2018.2.20f1, if that helps. I've imported all prefabs, but none of the demo files. Are those required for some reason? At any rate, any assistance you can provide would be very appreciated. Cheers,

    J^2
     
  42. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    The demo files are not required. The prefab is not required either if you don't need to customize it; the fingers script will initialize the prefab for you.

    I just did a test with the demo scene and dragged the prefab in and ran it, seemed to work without issue.

    Unity 2018.2 should be fine, but as a test can you try it in 2017.4?
     
  43. jjfranzen

    jjfranzen

    Joined:
    Nov 11, 2013
    Posts:
    14
    Should it be a standalone game object or a component of something else? I have a Game Manager script that I'm implementing the code into, and it seems to have added the FingersScript as a component of that GameObject. Everything is compiling, so I'm gonna try a test build and see what goes boom. Whee!
     
  44. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    It should be a standalone game object if you drag it into the scene. Let me know how it goes!
     
  45. jjfranzen

    jjfranzen

    Joined:
    Nov 11, 2013
    Posts:
    14
    Dragging it into the scene does nothing. It won't let me. I drag and drop and it just zips back to the prefab panel...
     
  46. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    It's possible something is corrupted. The easiest way to tell is start a new blank project, import just the fingers asset and then try running the demo scene. Let me know if that works.
     
  47. pjccccc

    pjccccc

    Joined:
    Oct 7, 2015
    Posts:
    43
    Would be great to add PanX and PanY gestures.

    Think about photo editors: panX to change the effect and panY to change the strength (amount).
    Also, these gestures are exclusive; if you swipe up during panX recognition, nothing happens.
     
  48. jjfranzen

    jjfranzen

    Joined:
    Nov 11, 2013
    Posts:
    14
    The demo scene works fine in the editor. I'll try building a fresh scene and see if I still get the same behavior.
     
  49. jjxtra

    jjxtra

    Joined:
    Aug 30, 2013
    Posts:
    1,464
    The pan gesture can do x or y movement. You can use allow simultaneous execution to let them execute at the same time.
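    For example (the AllowSimultaneousExecution method name and AddGesture call are from memory; the exact signatures may differ in the current version):

    Code (CSharp):
    using DigitalRubyShared;
    using UnityEngine;

    public class PanXYSetup : MonoBehaviour
    {
        private void OnEnable()
        {
            // one pan reads x movement (effect), the other reads y movement (strength)
            var panX = new PanGestureRecognizer();
            var panY = new PanGestureRecognizer();

            // let them execute at the same time instead of being exclusive
            panX.AllowSimultaneousExecution(panY);

            FingersScript.Instance.AddGesture(panX);
            FingersScript.Instance.AddGesture(panY);
        }
    }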
     
  50. Deleted User

    Deleted User

    Guest

    Hi, the virtual touch driver was not quite enough to plug in a different input driver like Rewired or InControl:
    - I had to comment out the ProcessTouches call to disable Unity touch input processing.
    - I had to add a getter for previousTouchPositions to reimplement GestureTouchFromTouch in the VirtualTouchObjectHandler.

    It would be great to have a flag for ProcessTouches and some way to access previousTouchPositions in the next version, so I don't have to modify the asset source :)