
[RELEASED] FingerGestures - Robust input gestures at your fingertips!

Discussion in 'Assets and Asset Store' started by Bugfoot, Jul 8, 2011.

  1. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    I haven't had the opportunity to test this; however, if the touches are detected through Unity's standard Input.touches interface, then FG should work just fine with that device as well.
     
  2. Vern_Shurtz

    Vern_Shurtz

    Joined:
    Mar 6, 2009
    Posts:
    264
    Thank you for the quick response. My monitor will be here on Wednesday so I will purchase FG and give it a go. I take it the sample scenes will probably be the quickest way to test compatibility?

    I will report here my test results.
     
  3. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    Yes, the sample scenes will do the trick. Make sure that you change the "Desktop Gesture" prefab to "TouchScreen Gesture" in the "FingerGestures Initializer" prefab. This will ensure that FG uses Unity's Input.touches as its input source instead of the mouse device. Don't hesitate to PM me if you need more help with this.
     
  4. Vern_Shurtz

    Vern_Shurtz

    Joined:
    Mar 6, 2009
    Posts:
    264
    Will do. It would be good to have a definitive answer to this question. Many Macs, PCs and standalone monitors are now touchscreen capable, with much more to come in the future, such as tablet PCs like the already released ASUS Eee Slate B121-A1 Tablet PC, which is my next purchase. :)
     
  5. se7en

    se7en

    Joined:
    Dec 3, 2008
    Posts:
    232
    Thanks for the performance/response info - sounds like the best of both worlds. Keep up the great work!

    In the Javascript Scenes folder, the MoveToFinger sample's main script is actually C# : ) Also, none of the pinch and zoom samples work because of the line-break issue (on Mac) - I think that was already mentioned.
     
  6. jerotas

    jerotas

    Joined:
    Sep 4, 2011
    Posts:
    5,572
    That sounds really cool, however for my current game I'll want to go back to XZ (which I believe is the old default). I think we're still waiting on the new updated version? Anyway will you let us know (when it's out) how to change the drag plane type?

    Thanks!
     
  7. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    You can already do that via the TBInputManager's "Drag Plane Type" property. There currently are 4 options to choose from, with one of them allowing you to provide a custom collider to project against.
     
  8. ibyte

    ibyte

    Joined:
    Aug 14, 2009
    Posts:
    1,047
    Hi Spk, your examples show one gesture per object. If an object needs to respond to multiple versions of the same gesture, do we need to put multiple copies of the same gesture recognizer script (with different setups) on the game object? Single tap, double tap, triple tap, for example?

    iByte
     
  9. jerotas

    jerotas

    Joined:
    Sep 4, 2011
    Posts:
    5,572
    I can't find the TBInputManager? Where is that? I'm on the latest version. Is that a prefab or a script?
     
  10. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    In general, yes. For taps, it's a little bit different: you can specify a required tap count of 0, and the recognizer will fire on every tap the user performs, providing you with the current tap count in the sequence. The next update improves this further.

    If you want your object to react to single finger drag and 2-finger drag, you'll need to setup 2 different drag gesture recognizers (one setup for 1 finger, the other setup for 2).
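    A minimal sketch of that two-recognizer setup (the RequiredFingerCount property name follows the FingerGestures naming conventions seen elsewhere in this thread, but verify it against your version; event subscriptions are omitted):

```csharp
using UnityEngine;

// Hypothetical sketch: two drag recognizers on the same object, one per
// finger count. DragGestureRecognizer is the FingerGestures component;
// the RequiredFingerCount property name is an assumption - check it
// against your FingerGestures version.
public class DualDragSetup : MonoBehaviour
{
    void Awake()
    {
        DragGestureRecognizer oneFingerDrag = gameObject.AddComponent<DragGestureRecognizer>();
        oneFingerDrag.RequiredFingerCount = 1; // reacts to single-finger drags

        DragGestureRecognizer twoFingerDrag = gameObject.AddComponent<DragGestureRecognizer>();
        twoFingerDrag.RequiredFingerCount = 2; // reacts to two-finger drags
    }
}
```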
     
  11. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    The TBInputManager is a central piece in the toolbox scripts framework. It's responsible for subscribing to the various FingerGestures events and dispatching the calls to scene objects that can be interacted with (e.g. have a TBxxx script equipped, such as TBDrag, TBTap, etc...).

    Assuming you were talking about the toolbox scripts: if you open the Toolbox-DragDrop scene, and expand the "Toolbox Drag Drop Sample" object in the scene hierarchy, you will find the "TB InputManager" child object. If you click on it, you will be able to access its various properties, including the "Drag Plane Type" and "Drag Plane Collider" properties.
     
    Last edited: Nov 15, 2011
  12. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    Hey Joe, thanks for the small patch. I've applied the corrections and they will be included in the next update (2.2). And thanks for pointing out the issues with the Destroy() calls - I usually pay attention to this but I missed it this time around. I think I had GameObjects there in the first place, but then changed their type to Transform and didn't update the Destroy() calls ;)
     
  13. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    FingerGestures v2.2 has been submitted and is pending approval. Changes include:
    - TapGestureRecognizer: added MaxDelayBetweenTaps and RaiseEventOnEachTap properties
    - TBDrag: dragging an object no longer recenters it on finger, added DragPlaneType.Camera to drag parallel to the camera plane
    - TBDragOrbit: added support for two-finger panning
    - Fixed runtime compilation
    - Fixed line break format issue causing compilation warnings on Mac
    - Added component menu entries for the various gesture recognizers
    - Moved the MoveToFinger.cs script out of the javascript samples folder

    Additionally, I'm working on adding Playmaker support, but that is still in progress.
     
  14. Vern_Shurtz

    Vern_Shurtz

    Joined:
    Mar 6, 2009
    Posts:
    264
    Ok, I got the monitor and setup was a breeze. I did what you wrote above, but there is no response to any touch. If I leave the Desktop Gesture set to Mouse Gesture, all of the single-finger touch samples work, but the multitouch samples do not.

    I am doing some tests to see if Multitouch is working with the monitor.
     
    Last edited: Nov 16, 2011
  15. sendel76

    sendel76

    Joined:
    Mar 17, 2010
    Posts:
    36
    Having some issues using FingerGestures with Unity Remote on iOS. It recognizes just one finger, so none of the pinch/zoom stuff works with Unity Remote.

    Using:
    - FingerGestures 2.1.2
    - MacBook Pro (Core i7) with Lion
    - Unity Pro 3.4.2f2
    - Unity Remote 3

    Tested Unity Remote on the following devices:
    - iPhone 4 iOS 4.3.2
    - iPhone 3GS iOS 4.3.2
    - iPad 1 iOS 4.3.2
    - iPad 2 iOS 5.0.0

    Always the same result:
    - Pinch/zoom examples not working properly
    - One-finger examples do their job
    - Two-finger swipe, drag, tap, long tap working (so it's recognizing that there are two touches, at least? Or only when they come one after another, maybe?)

    What is the default way to use your tool cross-platform? Maybe I have to add some "touch converter" code?
    Sorry for all these questions, but I am a bit confused about how to develop my multi-touch application in a way that avoids building and rebuilding XCode projects and testing the app natively on the device all the time.

    Looking forward to hearing from you!
    Hans
     
  16. Vern_Shurtz

    Vern_Shurtz

    Joined:
    Mar 6, 2009
    Posts:
    264
    I've done some testing and found some interesting results. My monitor, an ACER T231H, is capable of detecting only 2 touch inputs at a time. I verified this with MS Paint by drawing 2 lines with 2 fingers simultaneously. When testing the FingerGestures sample scenes with the Desktop Gesture prefab set to Mouse Gestures, all of the single finger/touch samples work but none of the two-finger or multitouch samples do, EXCEPT the Pinch gesture. This two-finger gesture works as expected, rescaling objects or zooming in and out.

    Using a mouse, the pinch gesture in FingerGestures is simulated by the scroll wheel. This leads me to believe that Win 7 Touch has been implemented in Unity through the mouse functions somehow. When the Desktop Gesture prefab is set to TouchScreen Gesture, I get absolutely nothing.

    It sure would be nice if the Unity developers would chime in on this.

    Thoughts anyone?
     
    Last edited: Nov 16, 2011
  17. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    Check out the "Testing with Unity Remote" section of the user guide page at http://www.fatalfrog.com/?page_id=322. You need to set the FingerGestures Initializer's "Desktop Gesture" property to reference the "TouchScreen Gestures" instead of the default "Mouse Gestures".
     
  18. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    The mouse scroll pinch is actually of my own making - I replace the default pinch gesture recognizer with a "mouse scrollwheel" implementation when using the default Mouse Gestures.

    I'm sorry I haven't had much time to look into this touch monitor situation and whether Unity supports it yet or not. My hope was that they would expose the monitor touches through Unity's standard Input.touches API. If they are using another API that is currently available, let me know and we can create a custom input handler for it.
     
  19. Vern_Shurtz

    Vern_Shurtz

    Joined:
    Mar 6, 2009
    Posts:
    264
    Well, the fact that using 2 fingers to zoom in and out or to rescale the rectangle works, when it's actually the mouse scroll wheel that's programmed to do it, tells me that pinching with 2 fingers is the same as scrolling with the mouse wheel as far as Unity is concerned. This is what makes me wonder if Win 7 Touch is implemented through the mouse input and not through Input.touches.
     
  20. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    Just a quick note to say that FingerGestures v2.2 has been approved and is now available in the Asset Store ;)
     
  21. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    Updated the front page with the new v2.2 release info.
     
  22. SteveJ

    SteveJ

    Joined:
    Mar 26, 2010
    Posts:
    3,085
    Just a pretty general pre-sales type question. I'll admit I'm being lazy and not reading the entire thread to see if it's been asked previously.

    The game I'm working on is multi-platform; iOS, Windows, MacOS. How much work is required to make FingerGestures workable on multiple platforms? i.e. if on the iPhone the player has to drag an item from the ground to their inventory to pick it up, I want this same behaviour to occur in the MacOS version, only they'll click and drag the item using their mouse.

    Is that kind of the default behaviour, or would that require some work?
     
  23. se7en

    se7en

    Joined:
    Dec 3, 2008
    Posts:
    232
    It's very easy - you just change the singleton from Touch to Mouse. FingerGestures is a huge time saver; that's why it was my first Unity Asset Store purchase. Well worth the investment.
     
  24. SteveJ

    SteveJ

    Joined:
    Mar 26, 2010
    Posts:
    3,085
    That's what I was hoping to hear. Thanks for the quick response! :)
     
  25. ray77

    ray77

    Joined:
    Nov 22, 2011
    Posts:
    2
    Hi,
    I bought FingerGestures 2.2 from the Unity Asset Store, but I cannot find a sample in the package that drags two or more toolbox objects with two or more fingers at the same time. Is it possible to do this with the FingerGestures package?
     
  26. patch24

    patch24

    Joined:
    Mar 29, 2009
    Posts:
    120
    Hey this is a great package! Just having some trouble... I'm trying to use the camera orbit/zoom with a third person controller and I'm having some issues.

    I have it set up so that if you start translating the character around (via virtual d-pad), the TBDragOrbit script gets disabled and a separate script brings the camera in behind the character so we see forward. Then, when we stop, the orbit script gets enabled again. There seems to be some residual movement in the camera when the TBDragOrbit script gets switched back on: the camera snaps to a random spot around the character.

    I have tried a lot of ways to fix the camera-snapping offset but haven't had any luck. I suspect that as I push the virtual joystick forward (while the TBDragOrbit script is deactivated), the drag recognizer still sees my finger drags on the joystick, so when the DragOrbit script is reactivated it picks up this offset. Any ideas?

    I should probably just set it up so that all screen input goes through the TBInputManager and turn on and off touch inputs there. How do I need to set up the TBDragOrbit script differently to get it to use the input manager? Event handlers?

    Thanks for any help.
     
  27. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    Two-finger dragging/swiping is not supported yet in the toolbox, although FingerGestures supports this feature. There's a sample on how to do multi-finger swipe in the "Advanced Mode" samples folder (Multi-Finger Swipe.unity) that will show you how to set this up using a gesture recognizer. Alternatively, you can also use the convenience FingerGestures.OnTwoFingerSwipe or FingerGestures.OnTwoFingerDrag events.
     
  28. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    The TBDragOrbit doesn't actually use the TBInputManager; it doesn't require any external dependencies. Now, as to your problem with this perceived residual camera motion: it is probably due to the smooth camera motion code. Basically, the TBDragOrbit script keeps track of an Ideal and a Current value for each of the key camera parameters (yaw, pitch, distance to target). When you want to change the rotation or distance of the camera, you only modify the "Ideal" values (e.g. IdealYaw, IdealPitch, IdealDistance). This tells the script the final configuration you want to be in. Then the script smoothly interpolates from its current internal values to your ideal values over time, based on the smooth motion properties you have set.

    When you disable the TBDragOrbit script, you freeze the transition to these ideal values as well. So when you re-enable the script, it resumes its progress towards the ideal values.

    If you want to skip the transition, you'll want to force both the current and ideal values to the same value right away (e.g. Yaw = IdealYaw = 0).

    Let me know if you need more help/clarifications on this.
     
  29. patch24

    patch24

    Joined:
    Mar 29, 2009
    Posts:
    120
    Yes, that was it. Thanks a ton for the hints. I had only been resetting the 'ideal' vars before.
    I added this function:

    Code (csharp):

    public void ResetCamPos()
    {
        Vector3 angles = cam.transform.eulerAngles;
        float dist = Vector3.Distance( cam.transform.position, target.transform.position );

        distance = IdealDistance = dist;
        yaw = IdealYaw = angles.y;
        pitch = IdealPitch = clampPitchAngle ? ClampAngle( angles.x, minPitch, maxPitch ) : angles.x;
    }
    I honestly wasn't sure about the benefits of buying FingerGestures, I've coded iPhone touch functions before...but your library makes it so much more plug and play. Very easy to get pro results.

    I just have one more question. I'm still trying to work out the best way to get touches meant for specific buttons without activating the drag every time. I had figured that it might make sense to rewrite the TBDragOrbit script to register its touches with the TBInputManager rather than how it is now. Then I can try to centralize control of touch responses. Or should I go about this a different way? Thanks again.
     
  30. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    I don't have a very good system in place for that at the moment - I should add it to the roadmap. For now, you could use the CanBeginDelegate (GestureRecognizer.SetCanBeginDelegate) to provide a custom delegate that controls whether the gesture recognizer should even start recognizing or not. Alternatively, you could have a kind of manager that keeps track of your various gesture recognizers and enables or disables them depending on the game state.
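    A minimal sketch of the second suggestion, using only the fact that recognizers are components that can be toggled (the class and field names here are illustrative, not from the package):

```csharp
using UnityEngine;

// Hypothetical sketch: a manager that toggles gesture recognizers based
// on game state. GestureRecognizer is the FingerGestures base class
// mentioned above; everything else here is an illustrative example.
public class GestureStateManager : MonoBehaviour
{
    public GestureRecognizer[] gameplayGestures; // e.g. drag/orbit recognizers
    public GestureRecognizer[] menuGestures;     // e.g. tap recognizers for UI

    public void EnterMenu()
    {
        SetEnabled( gameplayGestures, false );
        SetEnabled( menuGestures, true );
    }

    public void EnterGameplay()
    {
        SetEnabled( gameplayGestures, true );
        SetEnabled( menuGestures, false );
    }

    static void SetEnabled( GestureRecognizer[] recognizers, bool state )
    {
        // Recognizers are MonoBehaviours, so disabling the component
        // stops them from processing input
        foreach( GestureRecognizer r in recognizers )
            r.enabled = state;
    }
}
```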
     
  31. Intrawebs

    Intrawebs

    Joined:
    Mar 24, 2011
    Posts:
    45
    What's the best solution for wiring up "enter" and "exit" touch events? I have a bunch of spheres in a 2D game that should react when a touch (a drag, I'm assuming) enters a sphere and again when the touch exits it. I will have about 25 spheres on the screen. What do you recommend? I'm already using this on another game in development - loving it so far.

    Sorry, to add to this: the player WON'T be tapping, they will be dragging their finger all over the bottom 2/3 of the screen, which is why I was looking for a recommendation on managing enter and exit states. It's important that I know when the player comes back into a sphere after exiting it with their drag.
     
    Last edited: Nov 29, 2011
  32. yuewah

    yuewah

    Joined:
    Sep 21, 2009
    Posts:
    98
    Is it possible to drag or swipe a rigidbody?
     
  33. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    You need to monitor the OnFingerDragBegin, OnFingerDragMove and OnFingerDragEnd events and raycast for the sphere the finger is currently over when the finger is down/moved. You will need to track the currently selected/hovered sphere so that you can tell when you enter or exit a sphere (current selection changed). In OnFingerDragEnd, you'd check whether you were over a sphere (current selection not null), fire the "exit" event there as well, and then set the selection to null.
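    A sketch of that selection-tracking idea using plain Unity raycasting; the two public methods would be called from your FingerGestures drag handlers, and the OnTouchEnter/OnTouchExit message names are illustrative:

```csharp
using UnityEngine;

// Sketch of the enter/exit tracking described above. Only standard Unity
// raycasting is used; wire the two methods to your drag event handlers.
public class SphereHoverTracker : MonoBehaviour
{
    Transform current; // sphere the finger is currently over, or null

    public void OnDragMoved( Vector2 fingerPos )
    {
        Ray ray = Camera.main.ScreenPointToRay( fingerPos );
        RaycastHit hit;
        Transform hovered = Physics.Raycast( ray, out hit ) ? hit.transform : null;

        // Selection changed: fire exit on the old sphere, enter on the new one
        if( hovered != current )
        {
            if( current != null )
                current.SendMessage( "OnTouchExit", SendMessageOptions.DontRequireReceiver );
            if( hovered != null )
                hovered.SendMessage( "OnTouchEnter", SendMessageOptions.DontRequireReceiver );
            current = hovered;
        }
    }

    public void OnDragEnded()
    {
        // Lifting the finger counts as exiting the current sphere
        if( current != null )
        {
            current.SendMessage( "OnTouchExit", SendMessageOptions.DontRequireReceiver );
            current = null;
        }
    }
}
```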
     
  34. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    Not out of the box, but it should be quite easy to modify the standard DragRigidbody script to use FingerGestures instead of the mouse. I'll include that in the next update - thanks for the suggestion ;)
     
  35. Intrawebs

    Intrawebs

    Joined:
    Mar 24, 2011
    Posts:
    45
    That's what I was thinking, but performance-wise, will it be OK to raycast constantly while the finger is moving?
     
  36. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    Shouldn't be a problem at all if it's only once per frame...
     
  37. soofaloofa

    soofaloofa

    Joined:
    Nov 18, 2011
    Posts:
    26
    Hi, I'm trying to create an object that slides with mouse movement using the Drag gesture.

    I have code as follows:

    Code (csharp):

    private void FingerGestures_OnDragMove( Vector2 fingerPos, Vector2 delta )
    {
        // update the position by converting the current screen position
        // of the finger to a world position on the Z = 0 plane
        Vector3 newPosition = _dragObject.transform.position;
        newPosition.x += delta.x;

        _dragObject.transform.position = newPosition;
    }
    It works great, but as you drag the object near the screen edge the object does not follow very precisely. Any suggestions?

    Thanks for this great tool!
     
  38. Intrawebs

    Intrawebs

    Joined:
    Mar 24, 2011
    Posts:
    45
    What do you recommend for capturing the drag, then, if I don't want to actually drag anything and only want to detect when the finger is dragged over the top of something?

    I can get TBDrag to work, but then it drags my object, and if I create a copy of that script and wire it up the same way (rename the class, etc.) nothing happens. I also tried putting the DefaultDrag prefab as a child of a game object that was already receiving the events from TBDrag, and that didn't work. I like the prefab approach because then I just get all the stubs without all the toolbox stuff, but I couldn't get it to work. I would love some documentation that breaks down how to use all this in a step-by-step way, versus parsing through the scenes to see how someone else did it (I couldn't find a scene where the prefabs are used).
     
  39. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    Don't use the toolbox scripts - use the DragGestureRecognizer directly. It will tell you when drag events happen, but it won't move things for you. The toolbox is a high-level set of scripts to help you get started with FingerGestures quickly, and it implements some of the most basic operations for you. You're after something a lot more unique/customized, so you need to use the lower-level stuff.
     
  40. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    You need to project the finger's screen space position to a world-space position. Please take a look at the drag code implementation in TBInputManager and TBDrag for examples on how to do that.
     
  41. soofaloofa

    soofaloofa

    Joined:
    Nov 18, 2011
    Posts:
    26
    Thanks, I stored the previous world position and current world position as a world position delta and used that. Everything works great now.
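    For reference, a minimal sketch of that world-position-delta approach, using only standard Unity calls (the class, field and method names are illustrative, not from the package):

```csharp
using UnityEngine;

// Sketch of the fix described above: convert finger positions to world
// space before computing the delta, so the object tracks the finger
// precisely even near the screen edges with a perspective camera.
public class WorldSpaceDrag : MonoBehaviour
{
    public Transform dragObject;
    Vector3 lastWorldPos;

    Vector3 ToWorld( Vector2 screenPos )
    {
        // Project onto a plane at the drag object's depth from the camera
        float depth = Camera.main.WorldToScreenPoint( dragObject.position ).z;
        return Camera.main.ScreenToWorldPoint( new Vector3( screenPos.x, screenPos.y, depth ) );
    }

    public void OnDragBegin( Vector2 fingerPos )
    {
        lastWorldPos = ToWorld( fingerPos );
    }

    public void OnDragMove( Vector2 fingerPos )
    {
        Vector3 worldPos = ToWorld( fingerPos );
        dragObject.position += worldPos - lastWorldPos; // world-space delta
        lastWorldPos = worldPos;
    }
}
```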
     
  42. soofaloofa

    soofaloofa

    Joined:
    Nov 18, 2011
    Posts:
    26
    I noticed in the sample code that a single "Manager" is responsible for handling the touches on multiple objects rather than having a script attached to each object that needs touch handling. Is there a reason why one method might be preferred over another?
     
  43. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    It's more performance-friendly. Imagine you're making a game where you have to tap bubbles on the screen, and there are hundreds of them. If each of them is listening for input events and then doing a raycast to test whether it was hit, that can quickly become a performance bottleneck (and it doesn't scale well, no matter what). With the manager approach, you sacrifice a little bit of flexibility for better performance and scalability, because you only do the raycast once per frame in a single place, and then dispatch the appropriate event to the bubble that was hit.
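    A sketch of that centralized pattern; the OnTap method would be wired to a FingerGestures tap event, and the OnTapped message name is illustrative:

```csharp
using UnityEngine;

// Sketch of the manager approach: one raycast per tap, regardless of how
// many tappable objects exist, dispatched to whichever object was hit.
public class TapDispatcher : MonoBehaviour
{
    public void OnTap( Vector2 fingerPos )
    {
        Ray ray = Camera.main.ScreenPointToRay( fingerPos );
        RaycastHit hit;

        // Single raycast in a single place, then notify the hit object
        if( Physics.Raycast( ray, out hit ) )
            hit.transform.SendMessage( "OnTapped", SendMessageOptions.DontRequireReceiver );
    }
}
```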
     
  44. soofaloofa

    soofaloofa

    Joined:
    Nov 18, 2011
    Posts:
    26
    Hi,

    My goal is to have the script receiving a toolbox Message access the original fingerPos, fingerIndex, etc. of the event. Is there a way to do this without editing all of the messaging code to include parameters?
     
  45. Krodil

    Krodil

    Joined:
    Jun 30, 2010
    Posts:
    141
    hey,
    great work on the package.
    Is it possible to assign more than one Raycast Cam in the TBInputManager?
    Alternatively, can another camera be assigned at runtime?
     
  46. jerotas

    jerotas

    Joined:
    Sep 4, 2011
    Posts:
    5,572
    Hi Spk,

    I just bought a Magic Trackpad for the Mac, thinking it's more like using a touchscreen (iOS) than a mouse. It would probably save me from having to deploy things to the iPad to test them.

    However, I found that tapping and dragging on the magic trackpad doesn't seem to work with FingerGestures. I just get warnings from FingerGestures, but no actual tap action.

    Please let me know if you plan to support this real soon, otherwise I need to return it. I have only a week I think.
     
  47. Joe ByDesign

    Joe ByDesign

    Joined:
    Oct 13, 2005
    Posts:
    841
    @jerotas
    Have the same setup here and it works fine.

    One thing though: touch events can be dropped when using remote (not a FingerGestures issue) and performance is low.

    Turn off Show Image on the remote and you should be ok.
     
  48. jerotas

    jerotas

    Joined:
    Sep 4, 2011
    Posts:
    5,572
    I don't use Unity Remote. I find that a useless application. I'm not sure if you're understanding me. I bought the bluetooth laptop-type touchpad to use while testing my game in the Unity Editor. But it only registers about 1 in 20 taps.
     
  49. Joe ByDesign

    Joe ByDesign

    Joined:
    Oct 13, 2005
    Posts:
    841
    @Jerotas
    Yeah, sorry, I had inferred wrong.

    If it helps, I do have the same device and FingerGestures recognizes taps very well here (with the default settings, even); does the device work otherwise (i.e. is it faulty)?

    Also, probably not related, but the Webplayer currently does not support the axis data from it (I heard this is being addressed).
     
  50. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    I have no idea if Unity supports the trackpad in the first place (it could be the case, I just don't know).

    In case it does, do you know whether the trackpad is handled as a "touch device" or a "mouse device" by Unity? Does it generate the same input messages as a mouse device would (Input.mousePosition, Input.GetMouseButton...), or does it generate touches via Unity's Input.touches? If it's the former, make sure you are using the "Mouse Gestures" implementation of FingerGestures. If it's the latter, make sure you're using the "TouchScreen Gestures" implementation. You can set this up in the FingerGestures Initializer prefab.
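    A quick way to answer that question is a diagnostic script using only standard Unity input APIs - attach it to any object, touch the device, and watch the console:

```csharp
using UnityEngine;

// Diagnostic sketch: logs whether a device reports through Unity's touch
// API (Input.touches) or the mouse API, so you know which FingerGestures
// implementation to select in the Initializer prefab.
public class InputProbe : MonoBehaviour
{
    void Update()
    {
        if( Input.touchCount > 0 )
            Debug.Log( "Touch API: " + Input.touchCount + " touch(es), first at " + Input.GetTouch( 0 ).position );

        if( Input.GetMouseButton( 0 ) )
            Debug.Log( "Mouse API: button 0 down at " + Input.mousePosition );
    }
}
```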

    Let me know how it goes.