
[RELEASED] FingerGestures - Robust input gestures at your fingertips!

Discussion in 'Assets and Asset Store' started by Bugfoot, Jul 8, 2011.

  1. Swearsoft

    Swearsoft

    Joined:
    Mar 19, 2009
    Posts:
    1,632
    Thanks for the reply, will be implementing this tomorrow.
     
  2. bobcat53

    bobcat53

    Joined:
    Jul 12, 2010
    Posts:
    26
Hi Spk, instead of using timescale = 0, I used 0.001f :D

    It's working fine for what I need and finger swipes are still just as responsive.
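
For anyone copying this workaround, a minimal sketch of the idea (a near-zero timescale effectively freezes gameplay, while input, which is sampled every frame regardless of timescale, stays responsive):

Code (csharp):
// Sketch of the near-zero timescale pause trick described above.
// Gameplay driven by scaled time effectively freezes, but Update()
// and input sampling keep running every frame.
void PauseGame()
{
    Time.timeScale = 0.001f;
}

void ResumeGame()
{
    Time.timeScale = 1.0f;
}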
     
  3. oliverdb

    oliverdb

    Joined:
    Jan 30, 2010
    Posts:
    165
    Hey

How do I get the precise swipe direction as a vector instead of this?

FingerGestures_OnFingerSwipe( int fingerIndex, Vector2 startPos, FingerGestures.SwipeDirection direction, float velocity )
     
  4. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
The swipe gesture is meant as a quick shortcut for the more complex drag gesture. The swipe gesture is designed to recognize finger motions in one of the 4 cardinal directions. If you need a general motion vector, you should use the drag gesture instead.

You could also create a modified version of the SwipeGestureRecognizer class, remove the direction constraints (so you retain only the velocity constraint), and expose the motion vector instead.
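
For reference, a minimal sketch in the spirit of that second option, built only on the low-level finger events (the OnFingerUp signature here is an assumption; adjust it to match your version of the library):

Code (csharp):
// Sketch: derive a precise motion vector from the finger down/up events
// instead of the quantized 4-way swipe direction. The OnFingerUp signature
// is assumed to mirror OnFingerDown; adjust to your version of the API.
using UnityEngine;

public class SwipeVector : MonoBehaviour
{
    Vector2 startPos;
    float startTime;

    void OnEnable()
    {
        FingerGestures.OnFingerDown += FingerGestures_OnFingerDown;
        FingerGestures.OnFingerUp += FingerGestures_OnFingerUp;
    }

    void OnDisable()
    {
        FingerGestures.OnFingerDown -= FingerGestures_OnFingerDown;
        FingerGestures.OnFingerUp -= FingerGestures_OnFingerUp;
    }

    void FingerGestures_OnFingerDown( int fingerIndex, Vector2 fingerPos )
    {
        startPos = fingerPos;
        startTime = Time.time;
    }

    void FingerGestures_OnFingerUp( int fingerIndex, Vector2 fingerPos )
    {
        Vector2 motion = fingerPos - startPos; // precise direction + length in pixels
        float duration = Time.time - startTime;

        if( duration > 0 )
            Debug.Log( "Motion: " + motion + " velocity: " + ( motion.magnitude / duration ) );
    }
}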
     
    Last edited: Jan 28, 2012
  5. markhula

    markhula

    Joined:
    Sep 3, 2011
    Posts:
    630
Here's how I grab the direction and 'force' (i.e. velocity):

Code (csharp):
player.throw_in_progress = false;
throw_end = touchController.Fingers[0].pos;

// Angle of the swipe in degrees, measured from start to end position
float angle = (Mathf.Atan2(throw_start.y - throw_end.y, throw_start.x - throw_end.x) * Mathf.Rad2Deg) + 180.0f;

// Swipe length drives the throw force, clamped to a sensible range
float distance = Vector2.Distance(throw_start, throw_end);
throw_force = distance * 2.0f;
throw_force = Mathf.Clamp(throw_force, 180.0f, 700.0f);

print("SWIPE:" + angle + " " + touchController.Fingers[0].delta + " " + throw_force);

touchController.Fingers[0].action = finger_modes.FingerNULL;

// Detach the held object and launch it along the character's facing
Rigidbody tr = player.moving.GetComponentInChildren<Rigidbody>();

player.moving.transform.parent = null;
player.moving = null;

tr.isKinematic = false;
tr.useGravity = true;
tr.AddForce(character.transform.forward * throw_force);
     
  6. SteveJ

    SteveJ

    Joined:
    Mar 26, 2010
    Posts:
    3,085
    This is probably something really easy and it has probably been answered, but I'm just going to ask it quickly to save going through this whole thread.

    I'm working on something in which I want the player to be able to move their player object on screen by drawing a path with their finger. So they would touch the object and draw a path on screen and the object would then move along that path to the point where they lifted their finger.

This is in a top-down 2D view - so movement is just on the XY plane.

Is this something that I can do easily with Finger Gestures? I was looking at the webplayer demo - Basic Mode, example 2: "Finger Move". That's essentially what I want, but I need both the start and end points, and some way to "travel" the path in between.
     
  7. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
Yeah, it should be fairly easy, although you'll have to do some extra coding where movement is concerned.

There are some FingerGestures events you'll want to listen to:
- OnFingerDown fires when the finger touches the screen for the first time
- OnFingerMove fires whenever the finger moves on the screen; use these to build up your path waypoints
- OnFingerStationary fires whenever the finger is held in place but still touching the screen
- OnFingerUp fires when the finger is lifted from the screen

Then you'll need a function to convert the finger's screen position to a world position (typically via a raycast from the camera). From there you can build a path from the waypoints obtained via OnFingerMove, and get your dude moving once the finger is lifted from the screen.

The sample included should be a good start for that.
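
To make that concrete, here's a minimal sketch of the whole loop, with the assumptions flagged in the comments (a Z=0 ground plane, Camera.main, and OnFingerMove/OnFingerUp signatures mirroring OnFingerDown):

Code (csharp):
using System.Collections.Generic;
using UnityEngine;

// Sketch of the approach described above: record waypoints while the finger
// moves, then walk the path once the finger is lifted. The OnFingerMove /
// OnFingerUp signatures are assumed to mirror OnFingerDown; adjust as needed.
public class DrawPathMover : MonoBehaviour
{
    public float moveSpeed = 5f;
    List<Vector3> waypoints = new List<Vector3>();
    bool moving;

    void OnEnable()
    {
        FingerGestures.OnFingerDown += OnFingerDown;
        FingerGestures.OnFingerMove += OnFingerMove;
        FingerGestures.OnFingerUp += OnFingerUp;
    }

    void OnDisable()
    {
        FingerGestures.OnFingerDown -= OnFingerDown;
        FingerGestures.OnFingerMove -= OnFingerMove;
        FingerGestures.OnFingerUp -= OnFingerUp;
    }

    void OnFingerDown( int fingerIndex, Vector2 fingerPos )
    {
        waypoints.Clear();
        moving = false;
        waypoints.Add( ScreenToWorld( fingerPos ) );
    }

    void OnFingerMove( int fingerIndex, Vector2 fingerPos )
    {
        waypoints.Add( ScreenToWorld( fingerPos ) );
    }

    void OnFingerUp( int fingerIndex, Vector2 fingerPos )
    {
        moving = true; // start travelling the recorded path
    }

    void Update()
    {
        if( !moving || waypoints.Count == 0 )
            return;

        // Move towards the next waypoint, popping it off once reached
        transform.position = Vector3.MoveTowards( transform.position, waypoints[0], moveSpeed * Time.deltaTime );

        if( Vector3.Distance( transform.position, waypoints[0] ) < 0.01f )
            waypoints.RemoveAt( 0 );
    }

    // Top-down XY view: project the screen point onto the Z=0 plane
    Vector3 ScreenToWorld( Vector2 screenPos )
    {
        Vector3 p = Camera.main.ScreenToWorldPoint( new Vector3( screenPos.x, screenPos.y, -Camera.main.transform.position.z ) );
        p.z = 0;
        return p;
    }
}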
     
  8. SteveJ

    SteveJ

    Joined:
    Mar 26, 2010
    Posts:
    3,085
    @Spk - that's excellent. Thanks for the comprehensive response :)
     
  9. figbash

    figbash

    Joined:
    May 13, 2008
    Posts:
    60
Has anybody been having problems with FingerGestures on Android with Unity 3.5? I have a project that was working fine before the update to 3.5, and it still continues to work fine on iOS and in the editor, but now the numbers I'm getting from FingerGestures are way off on Android.

    This simple test case yields incorrect results on the Y:

Code (csharp):
void FingerGestures_OnFingerDown( int inFingerIndex, Vector2 inFingerPos )
{
    Debug.Log( "Touch down: " + inFingerPos );
}

    As I move my finger up and down, Y will go negative towards the middle of the screen.
    We are running in landscape if that makes a difference.
     
  10. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
I don't have an Android device to test FG on at the moment :( Could you maybe try to isolate whether it's a change introduced in 3.5 that's causing this issue, or whether it also happens with the older stable version? Remember 3.5 is still in beta, so this could very well be an Android input bug as well.
     
  11. figbash

    figbash

    Joined:
    May 13, 2008
    Posts:
    60
Unfortunately we don't have 3.4 running or I'd check, but I'm 90% sure it's 3.5-related. The FG sample scene doesn't work either, so I'm guessing it's a Unity bug.
     
  12. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
You could maybe print the actual Input.touches positions to the screen to see what position Unity returns for each finger, and whether each value matches the one held by the corresponding Finger in FingerGestures.Fingers.
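
A quick throwaway component for the Unity side of that comparison (on-device you'd compare these values against what FingerGestures.Fingers holds):

Code (csharp):
// Logs the raw touch positions Unity reports, for comparison against
// the values FingerGestures holds for each finger.
using UnityEngine;

public class TouchDebug : MonoBehaviour
{
    void Update()
    {
        foreach( Touch t in Input.touches )
            Debug.Log( "Unity touch " + t.fingerId + ": " + t.position );
    }
}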
     
  13. figbash

    figbash

    Joined:
    May 13, 2008
    Posts:
    60
OK, that's off too. I reported a bug to Unity, thanks :)
     
  14. soofaloofa

    soofaloofa

    Joined:
    Nov 18, 2011
    Posts:
    26
    Hi,

When using the toolbox Drag and Drop script on iPad there is significant lag, i.e. the object that I'm dragging is always behind my finger and takes a moment to "catch up" to my finger motion.

    Is there a setting I need to change to get a one-to-one correspondence between finger movement and object motion?

    Thanks,

    Kevin
     
  15. patch24

    patch24

    Joined:
    Mar 29, 2009
    Posts:
    120
Hey William, as an example, when you force a return of 'false' using a Pinch Gesture's CanBeginDelegate, and this stops it from performing a pinch using the fingers it detected on screen, does it somehow reset the finger IDs that it had been tracking? My problem is that my pinch gesture is picking up the wrong fingers to use for the pinch. I'm trying to keep pinches happening only on a collider trigger I set up on the top part of the screen. I'm not sure that using the CanBeginDelegate will help my issue, unless returning 'false' from it will expunge the finger IDs it is trying to use for the pinch gesture. Does this make sense?
     
  16. qholmes

    qholmes

    Joined:
    Sep 17, 2010
    Posts:
    162
Hello, just purchased Finger Gestures.. I am trying to use it with PlayMaker and EZ Gui.. I have a pretty good integration between PM and EZ Gui, but now I am trying to integrate Finger Gestures..

In my situation I have a product demo project, so I need the person to be able to use the Orbit Zoom Camera script but also tap on objects for animations and information... So I set up an EZ Gui button to enable the Behaviour on the camera.. it works in the editor if I do it with a key.. but when I put it on the iPad the touches are messed up. The camera script picks up on the button touch as its first touch instead of getting the next two touches.. Is there a way I can fix that?

If it helps, I am dead in the water and need a demo out in 24 hours...

Q
     
  17. patch24

    patch24

    Joined:
    Mar 29, 2009
    Posts:
    120
qholmes - this issue has been brought up a lot (I continue to have issues with this too). It would be great if FG had a quick and easy way to include/exclude specific touches. Check out page 17 (towards the top of the page): there is discussion of manipulating the TouchFilter for this. You could also try what I did: check against a collider to determine if FingerGestures should engage or not. It has worked well for the single finger drag; my problem comes in with multiple finger touches. I think the CanBeginDelegate can be used to solve it though, i.e. if your conditions are not met, you return 'false' from the CanBeginDelegate and then set your reset mode to 'NextFrame'. I've gotta try this route later and see if it works for my pinch gesture.
From William:
"You might want to try the NextFrame reset mode instead, which will tell the GestureRecognizer to reset as soon as possible (the next frame after it failed/succeeded), allowing it to attempt to detect the gesture again."
     
  18. qholmes

    qholmes

    Joined:
    Sep 17, 2010
    Posts:
    162
Uhhh thank you so much for replying... I was worried about this. I had read quite a few of the posts trying to figure something out... I will take a look at page 17..

I don't quite get how to use this asset yet, so am a little lost. I was thinking that if I used FG for firing my buttons then the touches would be registered with the system, but it seems from the scripts that the camera script does not filter touches at all, which I find strange.. very limiting.. But I am just learning this.

I too was wondering about the collider solution.. I have the help doc open right now to try this.. but not sure how to proceed... I was thinking I could add a collider object on one part of the screen, away from where my MOVE button is located... then test against that object?? I have a TBInputManager in my scene now, which is where I would set that up..

The CanBeginDelegate.. without knowing much about it, just by the name.. In my case the person holds down a button with one hand and does the gestures with the other, so the touch filter issue may still exist?? Not sure.

Thanks again! Hope I can figure this out... I am totally bummed that I am getting nowhere. Backup plan is to have a manual zoom instead of pinch, I guess, for now.

Q
     
  19. patch24

    patch24

    Joined:
    Mar 29, 2009
    Posts:
    120
Here's some of my modified TBDragOrbit code. It checks the game object that a touch hits.

Code (csharp):
public GameObject camControlArea; // Collider placed in top screen area

void OnEnable()
{
    FingerGestures.OnFingerDown += FingerGestures_OnFingerDown;
}

void OnDisable()
{
    FingerGestures.OnFingerDown -= FingerGestures_OnFingerDown;
}

void FingerGestures_OnFingerDown( int fingerIndex, Vector2 fingerPos )
{
    GameObject selection = RayPickObject( fingerPos );

    if( selection == camControlArea && dragLatched == -1 )
    {
        dragLatched = fingerIndex;
        dragging = true;
    }
    else if( selection == camControlArea && dragLatched > -1 )
    {
        dragging = false;
        pinchLatched = fingerIndex;
    }
    else if( selection != camControlArea )
    {
        ignoreFinger = fingerIndex;
    }
}

void FingerGestures_OnFingerDragMove( int fingerIndex, Vector2 fingerPos, Vector2 delta )
{
    // if we panned recently, give a bit of time for all the fingers to lift off before we allow for one-finger drag
    if( Time.time - lastPanTime < 0.25f || pinchLatched > -1 )
    {
        dragging = false;
        return;
    }
    else if( target && dragLatched > -1 && pinchLatched == -1 )
    {
        dragging = true;
        IdealYaw += delta.x * yawSensitivity * 0.02f;
        IdealPitch -= delta.y * pitchSensitivity * 0.02f;
    }
}
And this function checks which game object a touch hits:

Code (csharp):
public GameObject RayPickObject( Vector2 screenPos )
{
    Camera rayCam = inputMngr.raycastCamera;

    Ray ray = rayCam.ScreenPointToRay( screenPos );
    RaycastHit hit;

    if( Physics.Raycast( ray, out hit ) )
        return hit.collider.gameObject;

    return null;
}
     
  20. qholmes

    qholmes

    Joined:
    Sep 17, 2010
    Posts:
    162
Hey, thanks so much for this!! Really appreciate it.

I am going to go for my backup plan... I am fried and I have to have more content. Just hope they understand the cheesy zoom function.

But I have to make this work, so after my demo is done I will start working on implementing this code... There must be a way. I have browsed through the FG code and I don't seem to see Touch ID much... I find that strange.. maybe I am just not looking in the right places. But I was thinking that it would have a better tracking / filtering system... BUT like I said I am not a super coder and I just got it, so a bit lost, and... no sleep for days does not help either.

Anyway, sorry for the rambling. I will attack this another day.

Q
     
  21. Atrixx

    Atrixx

    Joined:
    Feb 8, 2012
    Posts:
    24
If it's not too much to ask, I was wondering if anybody could help me with getting an object to emit particles when dragged? The samples that I've seen are not coded in Javascript, so I was wondering if someone could help me get started with this.


EDIT: It helps when you use the OnEnable() function.
     
    Last edited: Feb 8, 2012
  22. jerotas

    jerotas

    Joined:
    Sep 4, 2011
    Posts:
    5,572
Hi, I can't for the life of me figure out how to make it so that small drags trigger a tap instead of a drag event. I tried altering the Move Tolerance on the Default Tap and Default Drag prefabs (wildly) and it seemingly has no effect. Tiny drags are still drags.

Can anyone explain how to fine-tune these to do what I want? Or is it even possible?
     
  23. jerotas

    jerotas

    Joined:
    Sep 4, 2011
    Posts:
    5,572
    Well.....more importantly I have a separate issue with FingerGestures. I can't get EVERY click / tap to actually trigger, no matter what settings I use. On the iPhone 3s, it's missing a lot of taps. Now when I write my own 3 line code without FingerGestures, I get every hit. Has anyone else had this problem? I've changed every setting on "default tap" to hell and back and can't get every hit to register.

    So basically at this point I'm thinking I just need to write my own simple code instead of using this.
     
  24. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
This could happen for several reasons:
1) Your app is running at a low frame rate due to overall scene complexity
2) Your app is capped at 30 FPS by Unity (I think that's the default)
3) A bug introduced in a recent version of FG

If you're sure your app is running smoothly, then it has to be 3). I've checked the web player demo for the latest version of FG, and it does indeed seem to lag a little bit behind when dragging. Unless I'm mistaken, this didn't use to be the case, so I will look into it and see if I can provide a fix.
     
  25. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
If you want to make your life easier, you could use the toolbox drag script for this. Put an instance of TBInputManager in your scene, and TBDrag on the object you want to drag (it must have a collider). You can configure the TBDrag script to send OnDragBegin, OnDragMove and OnDragEnd messages to the other components on the same object, so you could simply add a new script that reacts to these by turning the particle system on when OnDragBegin arrives and off when OnDragEnd kicks in.

Here's some sample code you can paste into a new "EmitParticlesOnDrag" javascript (it's meant to run on Unity 3.5, with the new particle system):
Code (csharp):
var ps : ParticleSystem;

function Start()
{
    if( !ps )
        ps = particleSystem;

    if( ps )
        ps.enableEmission = false;
}

function OnDragBegin()
{
    if( ps )
        ps.enableEmission = true;
}

function OnDragEnd()
{
    if( ps )
        ps.enableEmission = false;
}
    Hope this helped ;)
     
  26. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    Hi Jerotas,
    I'm going to look into it this weekend. I've had a few reports of taps not registering correctly under some situations with the last version, so this could be what you're experiencing.
     
  27. aikenau

    aikenau

    Joined:
    Jan 26, 2012
    Posts:
    2
Hello, two questions I want to ask:

TBDragOrbit cannot be used with TBTap in the same scene, even when one of them is unchecked (disabled). Am I right?

And I use 2 cameras in the same scene (for a render texture menu). It shows me an error like this: "NullReferenceException UnityEngine.Camera.ScreenPointToRay(Vector3 position) (at ...C:/....etc...UnityEngineCamera.cs:267)"

Thank you for answering
     
    Last edited: Feb 12, 2012
  28. aikenau

    aikenau

    Joined:
    Jan 26, 2012
    Posts:
    2
Oh! I solved this. I added the MainCamera tag to one camera, then everything was OK! COOL!
     
  29. digitalthinker

    digitalthinker

    Joined:
    Feb 14, 2012
    Posts:
    10
    Hi Spk,

    Great plugin, and very interesting ways of managing input cross-platform. I have learned quite a bit reading your code.

I'm having trouble getting EZGui to work with Finger Gestures. I've looked through the rest of this thread, the EZGui forums, and elsewhere online, and I haven't seen a simple solution for getting Finger Gestures input to be picked up before EZGui.

I tried both the methods from the tutorial video on your site, as well as using the TB classes from the Toolbox folder. In both cases EZGui handles the input and I don't see any evidence in my console of Finger Gestures being called.

I believe I saw one post where you mentioned working with EZGui in the past and being able to get the two plugins to play well together. Is there something different you did in the way you set up EZGui's UIManager? Can I do something on the EZGui side to stop it from picking up input and only draw the GUI, while making Finger Gestures my main input processor?

    Thanks in advance!

    Vlado
     
  30. tnaseem

    tnaseem

    Joined:
    Oct 23, 2009
    Posts:
    149
I'm using both too, but haven't experienced any problems with one interfering with the other. EZGUI works as before, detecting my finger/mouse taps, etc., and other taps and drags (not over EZGUI items) are picked up by FingerGestures.

FYI, I've added the Finger Gestures Initializer to an empty GameObject and registered the OnFingerTap and OnFingerDragBegin/Move/End event handlers. I've attached a screenshot of my UIManager in case that helps.

    If you need anything else let me know!
     

    Attached Files:

    Last edited: Feb 14, 2012
  31. digitalthinker

    digitalthinker

    Joined:
    Feb 14, 2012
    Posts:
    10
    Thanks Tarique!

Were you able to get GameObjects with EZGui scripts attached to them working alongside Finger Gestures scripts on the same object?
     
  32. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    Hi Vlado,
    Thanks for the kind words.

I'm not totally clear on what you're trying to do, so I apologize in advance if my explanation is not spot on. EZGUI and FingerGestures should work independently just fine out of the box. They simply have no knowledge of each other whatsoever. EZGUI has its own input polling code, and FingerGestures also runs its own, in their respective Update() methods.

Now if you are trying to use some of FingerGestures' gestures to add new functionality to your EZGUI-based UI, that's gonna be a bit more complicated. This is a bit of a tricky problem when using other libraries that also consume user inputs, because they usually have their own built-in input management layer. On the one hand, you have EZGUI checking Unity's Input.touches for its UI interactions, and on the other you have FingerGestures also looking at Unity's Input.touches independently of EZGUI. At the moment, there is no "out of the box" way or established protocol for one library to tell the other "Hey, I've consumed this touch, so don't use it plx!". That's something that must be implemented by you at the moment (I've got some plans to do something about this soon though).

EZGUI's input polling code is quite deeply nested in its core UIManager class, and it clearly hasn't been designed to be extended/replaced by an external implementation. I think that replacing EZGUI's input handling code with a FingerGestures-based implementation is not the way to go. Instead, it's probably easier to let EZGUI do its thing, and then check whether a gesture should be processed by FingerGestures or not. EZGUI allows you to attach several delegates (callbacks) to listen for certain input events (SetNonUIHitDelegate(), AddNonUIHitDelegate(), AddMouseTouchPtrListener()).

If you set up EZGUI to run its Update() before FingerGestures by tweaking the script execution order, you can then generate a list of inputs that have been either processed or discarded by EZGUI, and use that data to prevent FingerGestures from handling those same inputs. Right now, there is no system to do that on a global scale, but you can provide a TouchFilter on each GestureRecognizer you use. I'm planning on introducing a more global way of doing this in the next update though.

If you simply want to discard input events that have already been handled by your UI system, I'd just add a quick check at the beginning of each of your gesture-handling methods to see if the input position is a UI hit or not. If the finger is on top of the UI, in most cases it means the UI system is going to handle it. That's what I usually do in my own projects.
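
For illustration, a minimal sketch of that check, assuming a hypothetical IsUIHit() helper - swap its body for whatever hit-test your UI system actually provides:

Code (csharp):
// Sketch of the "check first" pattern. IsUIHit() is a hypothetical helper;
// replace its body with your GUI system's own hit-test.
public LayerMask uiLayers; // layer(s) your UI colliders live on

void FingerGestures_OnFingerTap( int fingerIndex, Vector2 fingerPos, int tapCount )
{
    if( IsUIHit( fingerPos ) )
        return; // the UI owns this input, ignore it here

    // ... gameplay tap handling goes here ...
}

bool IsUIHit( Vector2 screenPos )
{
    // Example hit-test: raycast against colliders on a dedicated UI layer
    Ray ray = Camera.main.ScreenPointToRay( screenPos );
    return Physics.Raycast( ray, Mathf.Infinity, uiLayers );
}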

    I hope this helped a bit, let me know if you have more questions.
     
  33. digitalthinker

    digitalthinker

    Joined:
    Feb 14, 2012
    Posts:
    10
Thanks for the explanation, William. This makes a lot of sense and gives me a better understanding of how the plugins work and interact. Let me fiddle with the delegates and script execution order and see if I can get these to work as you described.

    On a similar note, I performed an experiment earlier to see if I could get the two plugins to work side by side. I did the following:

    - Added a Finger Gestures Initializer into the scene from the prefab
    - Added a new GameObject into the scene, made it a simple cube
    - Added a TapGestureRecognizer script to the cube
    - Added a sample script called TestScript to the cube

    - Inside the TestScript I have the following code:

    - In Unity, I dragged the TapGestureRecognizer script from the cube to the TestScript's tapGesture variable on the same cube

    Running the scene, the cube does not react to being clicked. Am I missing something in my setup?
     
  34. digitalthinker

    digitalthinker

    Joined:
    Feb 14, 2012
    Posts:
    10
    Found the solution to this one quickly:

    - Having Touch Gestures as my Editor Gestures in Finger Gestures Initializer will not allow me to detect the clicks in the editor for the situation above
    - Switching to Mouse Gestures works fine

The odd thing is that my build platform is Android, so I assumed that it would run the Touch Gestures code fine if I put it as the editor gestures.
     
  35. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
From your description, the behavior you should see is the "Clicked the test object" log message being displayed regardless of where you tap on the screen - that is, tapping the cube will have the same effect as not tapping the cube. The reason for this is simply that you have not added any code in your tapGesture_OnTap() method to check whether or not the tap happened on the cube (e.g. by doing a raycast into the scene from the tap position). That's not something the low-level TapGestureRecognizer does for you.

    However, FingerGestures ships with a convenience toolbox library that implements some of that higher-level logic for you. If you put a TBInputManager in your scene, and then add a TBTap script on your cube, the system will send a "OnTap()" message (by default) to your cube object when tapping it.

    If you are not seeing any log message at all when you tap on the screen, then that's a different problem, most likely something that hasn't been setup properly.

    Please let me know which of these two situations you are witnessing so that I can help you further.
     
  36. tnaseem

    tnaseem

    Joined:
    Oct 23, 2009
    Posts:
    149
    In my code I just do the following in Start():

Code (csharp):
FingerGestures.OnFingerTap += FingerGestures_OnFingerTap;
And in OnDisable():
Code (csharp):
FingerGestures.OnFingerTap -= FingerGestures_OnFingerTap;
FingerGestures_OnFingerTap() is my event handler, of course.

Code (csharp):
/// <summary>
///   Handle finger tap event
/// </summary>
private void FingerGestures_OnFingerTap( int fingerIndex, Vector2 fingerPos, int tapCount )
{
    // Check if the specific object was picked
    if( !objectHelper.HasObjectBeenPicked( fingerPos, myObject ) )
        return;

    // Do stuff with object...
}
Helper functions:

Code (csharp):
/// <summary>
///   Return the GameObject at the given screen coordinate, or null if none found
/// </summary>
public GameObject PickObject( Vector2 screenPos )
{
    // First check if the game camera has been set up
    if( gameCamera == null )
    {
        Debug.Log( "ObjectHelper: gameCamera not set up" );
        return null;
    }

    // Create a ray to be cast from the current screen position
    Ray ray = gameCamera.ScreenPointToRay( screenPos );
    RaycastHit hit;

    // Project the ray into the scene to see what it hits
    if( Physics.Raycast( ray, out hit ) )
        return hit.collider.gameObject;

    return null;
}

/// <summary>
///   Check if the specified object has been picked
/// </summary>
public bool HasObjectBeenPicked( Vector2 screenPos, GameObject obj )
{
    GameObject selection = PickObject( screenPos );

    if( selection != null && selection == obj )
        return true;

    return false;
}
I've just cut and pasted the functions as I use them in code, all split up as I use them for various things. Hope this sheds some light on it!
     
    Last edited: Feb 15, 2012
  37. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
It will do so at runtime when running your application outside of the editor, but it has a separate behaviour when run from inside the editor. If you check the FingerGestures Initializer prefab, you will see that it has a dedicated "Editor Gestures" property, so that you can specify which FingerGestures implementation to use when running inside the Editor. I set it by default to the mouse-based one, as you're usually using your mouse in the editor. However, you can set the Editor Gestures to the TouchScreen Gestures implementation if you want to test your application via the Unity Remote controller.
     
  38. tnaseem

    tnaseem

    Joined:
    Oct 23, 2009
    Posts:
    149
    That, I must admit, I haven't tried as yet.

If this is an issue, I'm wondering if adding a dummy child object with the FingerGestures scripts would work? Just off the top of my head.
     
    Last edited: Feb 15, 2012
  39. digitalthinker

    digitalthinker

    Joined:
    Feb 14, 2012
    Posts:
    10
    William, I tried the option of using the Toolbox scripts that you mentioned.

    - I added a TBInputManager to my scene to an empty game object
- In TBInputManager, everything is checked, the camera is set to the camera that renders the layer which the cube is on, Ignore layers is set to nothing, Drag plane type is XY, Drag plane collider is none, and Drag plane offset is 0
    - Added a TBTap script to my TestObject
    - Changed the name of the function in TestScript (above) to be public void OnTap() to work with the default (as it didn't work with putting in the Click function name)

    I am not receiving input.

    I should mention that I changed the script execution order so that the Toolbox scripts in my scene execute before UIManager from EZGui, if that makes any difference in this case.
     
  40. digitalthinker

    digitalthinker

    Joined:
    Feb 14, 2012
    Posts:
    10
    Tarique, I've tried the code snippets you mentioned above earlier and it appears to make no difference. I believe that was the "simple" way of connecting to input from the tutorial video.

I like your suggestion of adding a dummy object. In my experience with EZGui, I have a feeling that child objects with colliders pass their clicks on to parents with colliders (like a compound collider), meaning that a child dummy object used to detect Finger Gesture events may wind up triggering the EZGui events in the parent. To my limited knowledge, I would assume that it is possible to get things working in this manner as long as the dummy's parent doesn't have EZGui objects assigned to it, and the EZGui objects don't have collision boxes.

    I'm going to give this a try and see what happens once I get the toolbox scripts detecting :)


    EDIT: I just noticed the 2 big code snippets, with the object detection. Thanks for posting the picking code, this will certainly come in handy :)
     
    Last edited: Feb 15, 2012
  41. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
    You probably already checked, but just to make sure: does your test object/cube have a collider? It must have one for the toolbox scripts to work properly.

    Also, EZGUI should have no effect on FingerGestures whatsoever. Try this in a brand new project without EZGUI to make sure of this.
     
  42. digitalthinker

    digitalthinker

    Joined:
    Feb 14, 2012
    Posts:
    10
    Yep, the object has a collider.

    I did a couple of tests to see what happens if I follow instructions.

    Test 1:
    - New project, new scene
    - Added cube, gave it the TBTap script, my test script, and then I changed the call to be "Click" in TBTap's settings
    - Added initializer prefab, made no changes
    - Added new game object and added the TBInputManager script. Only changed the camera to be the default camera that renders the cube

    Test 2:
    - Old project where we found things weren't working
    - Went into new scene, did the same steps in Test 1

    Results:
    - In both tests everything worked perfectly fine

It appears that my scene has something different about it that's causing the calls to be missed. Is it possible that having multiple cameras in the scene has something to do with it? I use an untagged camera to draw the UI only. The cube is on the same layer as the rest of the UI.
     
  43. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
Strange. The only thing I can think of is that having no camera in your scene with the "MainCamera" tag will prevent Unity from finding your main camera when using Camera.main from code. But the camera should have no effect on FingerGestures's low-level events. Try adding a log message to the FingerGestures.OnFingerDown event, both in the FingerGestures implementation and in a new handler method in your test script, and see if they fire at all.
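
For reference, a throwaway logger along those lines (event signature as used earlier in this thread):

Code (csharp):
// Minimal component that just logs the low-level finger-down event,
// to confirm whether FingerGestures events fire at all in the scene.
using UnityEngine;

public class FingerDownLogger : MonoBehaviour
{
    void OnEnable()  { FingerGestures.OnFingerDown += LogFingerDown; }
    void OnDisable() { FingerGestures.OnFingerDown -= LogFingerDown; }

    void LogFingerDown( int fingerIndex, Vector2 fingerPos )
    {
        Debug.Log( "OnFingerDown fired: finger " + fingerIndex + " at " + fingerPos );
    }
}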
     
  44. digitalthinker

    digitalthinker

    Joined:
    Feb 14, 2012
    Posts:
    10
    Interesting find:

    I changed the layer of the cube in my scene to be one that's rendered by the main camera (which doesn't render UI). I changed the TBInputManager's camera to the main camera. Loaded up the scene again and voila, it detects input!

    Is the Finger Gestures code dependent on a camera that's tagged as the main camera?
     
  45. digitalthinker

    digitalthinker

    Joined:
    Feb 14, 2012
    Posts:
    10
    I tested with yet another camera, and it works fine. That means there's something wrong with my UI-rendering camera, and not with Finger Gestures code.
     
  46. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
The toolbox scripts might be (TBInputManager), but not the core library. Thanks for the notice, I'll take a look at it tomorrow. Time to sleep for me :)
     
  47. digitalthinker

    digitalthinker

    Joined:
    Feb 14, 2012
    Posts:
    10
    Thanks for the help, I'll keep playing with it and let you know of anything I learn that might be of help. Good night :)
     
  48. acs_andrew

    acs_andrew

    Joined:
    Aug 26, 2010
    Posts:
    26
Hi, I've packaged the demo and even tried the web demo from the site, but still have the same problem with multitouch. All single-touch gestures work, and so does any two-finger pinch. But nothing else does (e.g. in the pinch-and-rotate demo, the rotate doesn't work). Pretty much anything with multiple touches doesn't seem to be detected unless it is a pinch. This is a brand new multitouch monitor running on Windows 7. We're using Unity Free rather than Pro, but as I mentioned, it seems to be the same with the website's demo.

I've tested the functionality of Microsoft's own Touch Pack apps and they all seem to work fine (2-finger rotation etc). Any ideas what could be wrong?
     
  49. Bugfoot

    Bugfoot

    Joined:
    Jan 9, 2009
    Posts:
    533
This is not a FingerGestures issue - I think it's because Unity does not properly support multitouch screens on desktop. It seems to treat touch screen input as a regular mouse device, rather than a fully featured touch screen like on iOS. This might be something worth raising with the Unity devs.
     
  50. acs_andrew

    acs_andrew

    Joined:
    Aug 26, 2010
    Posts:
    26
Thanks for the swift response. Do you have a list of what does and doesn't work (i.e. "Tested on desktop, iOS and Android platforms")? Just so I know what to concentrate on in my own projects (at least until Unity implements multitouch fully).