
TouchScript — multi-touch library for Unity [RELEASED]

Discussion in 'Assets and Asset Store' started by valyard, Mar 6, 2013.

  1. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    290
    How exactly does it not work?
    You need to have inputs in every scene.
    There are indeed singletons, but in 5.0 it should be fine.

    If it still doesn't work you can post an issue on github.
     
  2. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    290
    Just sent version 5.2 to the Asset Store.

    Release notes:
    - Added tags to touch points.
    - Added PlayMaker actions for common gestures.
    - Removed duplicated input sources from assemblies since Unity 4.5 fixes the bug with linked DLLs.
    - Mouse or Mobile input is created automatically if there's no input in the scene.
    - Gestures now reset cached screen positions.
    - PanGesture now uses cached screen position.
    - Gesture.IsFriendly(gesture) is now public.
     
  3. snowangel912

    snowangel912

    Joined:
    Mar 13, 2014
    Posts:
    2
    Hi guys. I'm wondering if I can edit the code of a plugin component such as Simple Pan Gesture to add more functionality?
     
  4. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    290
  5. snowangel912

    snowangel912

    Joined:
    Mar 13, 2014
    Posts:
    2
    Thank you very much. I really appreciate your help :cool:
     
  6. Itmindco

    Itmindco

    Joined:
    Nov 13, 2013
    Posts:
    3
    How do I use the new FullscreenLayer?
    I need to move a map by dragging a finger (pan gesture).
    I added CameraLayer, FullscreenLayer, FullscreenTarget and PanGesture to the main camera. On pan, I get this error:

    NullReferenceException: Object reference not set to an instance of an object
    TouchScript.Gestures.Simple.Transform2DGestureBase.updateProjectionPlane ()
     
  7. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    290
    This is a known bug: https://github.com/InteractiveLab/TouchScript/issues/95
    Will be fixed soon. Sorry for that.
     
  8. rsreagan

    rsreagan

    Joined:
    Jun 7, 2014
    Posts:
    1
    Hi Valyard,

    Thanks for writing this library and making it public. I'm having a little trouble understanding how I should accomplish something.

    I'm working on a multi-touch TV screen with Windows 8. I have an architectural walkthrough, and I want to be able to pan the camera left and right using a pan gesture. Here's what I've done:

    1. Added a TouchManager to the Main Camera.
    2. Added a FullscreenLayer to the Main Camera, and set its type to MainCamera. I understand that this is basically a plane that sits right in front of the camera and receives touch events? I'm a little fuzzy on the difference between this layer and the deprecated FullScreenBackgroundTarget.
    3. Added a SimplePanGestureScript to the Main Camera. Its projection type is layer, and I am not using SendMessage. I'm not entirely sure what SendMessage is for.
    4. Added my own custom script to Main Camera. It is:

    Code (CSharp):
    using UnityEngine;
    using System.Collections;
    using TouchScript.Gestures;
    using TouchScript;
    using System;

    public class Touchscape_Pan : MonoBehaviour {

        private void Awake()
        {
        }

        private void OnEnable()
        {
            GetComponent<PressGesture>().Pressed += pressedHandler;
            GetComponent<ReleaseGesture>().Released += releasedHandler;
        }

        private void OnDisable()
        {
            GetComponent<PressGesture>().Pressed -= pressedHandler;
            GetComponent<ReleaseGesture>().Released -= releasedHandler;
        }

        private void releasedHandler(object sender, EventArgs e)
        {
            Debug.Log("Released");
        }

        private void pressedHandler(object sender, EventArgs e)
        {
            Debug.Log("Pressed");
        }
    }
    I'm getting null reference exceptions in the OnEnable and OnDisable methods.

    So my questions are:
    1. Am I going about this correctly?
    2. What am I doing wrong when attempting to hook event handlers up to the PanGesture events?

    Thanks!
     
  9. vzheng

    vzheng

    Joined:
    Dec 4, 2012
    Posts:
    34
    Hi, I just want one piece of functionality from this plugin: I have a GameObject which has some child plane objects. I put the ReleaseGesture on the parent object, and when I click one of the child planes I get the Release event.
    My question is: can I get the child object's name?

    My Code:

    GetComponent<ReleaseGesture>().Released += InputManager_Released;

    void InputManager_Released(object sender, System.EventArgs e)
    {
    }
     
  10. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    290
    Thanks for writing this library and making it public. I'm having a little trouble understanding how I should accomplish something.​

    Thanks. Don't forget to rate it in the asset store q:

    Added a FullscreenLayer to the Main Camera, and set its type to MainCamera. I understand that this is basically a plane that sits right in front of the camera and receives touch events?​

    Correct. The main difference is that it doesn't have to be on a camera: a FullscreenLayer can be attached to any object and can be a parent in a hierarchy of objects.

    Speaking of your code: you are using PressGesture and ReleaseGesture, which are separate gestures, so you need to add those components to the object first. To work with PanGesture, subscribe to its Panned event.
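    For reference, a minimal sketch of that subscription pattern (the class and handler names are illustrative; ScreenPosition comes from the Gesture base class):

```csharp
using System;
using TouchScript.Gestures;
using UnityEngine;

// Attach next to a PanGesture component on the same GameObject.
public class PanListener : MonoBehaviour
{
    private void OnEnable()
    {
        // The PanGesture component must already be on this object.
        GetComponent<PanGesture>().Panned += pannedHandler;
    }

    private void OnDisable()
    {
        GetComponent<PanGesture>().Panned -= pannedHandler;
    }

    private void pannedHandler(object sender, EventArgs e)
    {
        var gesture = (PanGesture)sender;
        Debug.Log("Panned, now at " + gesture.ScreenPosition);
    }
}
```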
     
  11. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    290
    In this case you should add a ReleaseGesture to the planes.
    Gestures work with objects, and once recognized they don't care which child in the hierarchy was actually pressed. That information stays at the touch level.
     
  12. RobDaPraia

    RobDaPraia

    Joined:
    May 18, 2014
    Posts:
    9
    Thanks for this nice library!

    I've just started using it and managed to get tap, drag and flick working on an object.

    Question: the gestures work fine when I start the gesture on the object. I'm also trying to support another scenario where the gesture does not start on the object itself; for example, you touch the screen outside the object and drag your finger until it passes over the object. At that moment, how can I detect that the object was touched?


    This works, I can drag the object to new position:

    start touch/drag on object................end drag
    ()-------------------------------------------->
    object
    [ ]

    This doesn't work, would like to drag object as well to new position:

    start touch/drag...........object....................end drag
    ()----------------------------[ ]----------------------->

    Can I do this with a gesture on the object or do I have to place for example a drag gesture on the camera and check at each drag changed event if an object was touched?
     
  13. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    290
    Hi.
    When a touch comes into the system some Transform immediately grabs it and owns it. I.e. if you touched a ball with a collider this ball will own the touch. So it's not possible to transfer touch ownership in the middle of a gesture. You will need to write a separate gesture, attach it to the ball's container and on any touch move use TouchManager.Instance.GetHitTarget method to find out an object this touch hovers: https://github.com/InteractiveLab/TouchScript/blob/master/TouchScript/ITouchManager.cs#L149
     
  14. RobDaPraia

    RobDaPraia

    Joined:
    May 18, 2014
    Posts:
    9
    Thanks for the fast reply; I'm trying to think through a solution:

    So for a simple scenario like only a ball on top of a plane, when I start dragging on an empty spot, the plane will own the gesture. When I continue dragging also over the ball, the custom gesture of the plane should detect with TouchManager.Instance.GetHitTarget if the touch is above the ball collider?

    So I could then also add SimplePanGesture to the plane, attach my code to SimplePanGesture PanStateChanged event and in my code use the TouchManager.Instance.GetHitTarget to check if I'm pointing over the ball.

    I will try this in my code.
     
  15. RobDaPraia

    RobDaPraia

    Joined:
    May 18, 2014
    Posts:
    9
    Hi,

    Yes, this works: I can now detect that I'm dragging over another object, even when the drag started outside it. For example, see the code below, added to the plane under the ball.

    I will probably extend the code so that as soon as the drag passes over the ball, drag movements are forwarded from the plane to the ball. That way the ball can piggyback on the drag movements of the plane's gesture, because the plane stays the owner of the gesture.


    Code (CSharp):
    using TouchScript;
    using TouchScript.Gestures;
    using TouchScript.Gestures.Simple;
    using TouchScript.Hit;
    using UnityEngine;

    public class PlaneController : MonoBehaviour
    {
        private void OnEnable()
        {
            if (GetComponent<SimplePanGesture>() != null)
            {
                GetComponent<SimplePanGesture>().StateChanged += PanStateChanged;
            }
        }

        private void OnDisable()
        {
            if (GetComponent<SimplePanGesture>() != null)
            {
                GetComponent<SimplePanGesture>().StateChanged -= PanStateChanged;
            }
        }

        private void PanStateChanged(object sender, GestureStateChangeEventArgs e)
        {
            switch (e.State)
            {
                case Gesture.GestureState.Began:
                    break;
                case Gesture.GestureState.Changed:
                    var gesture = (SimplePanGesture)sender;
                    ProcessDraggingChanged(gesture);
                    break;
                case Gesture.GestureState.Ended:
                case Gesture.GestureState.Cancelled:
                case Gesture.GestureState.Failed:
                    break;
            }
        }

        private void ProcessDraggingChanged(SimplePanGesture gesture)
        {
            //var dragPosition = new Vector3(gesture.WorldTransformCenter.x, 0, gesture.WorldTransformCenter.z);
            //Debug.Log(this.ToString() + ",ProcessDraggingChanged(), dragPosition==>" + dragPosition);

            ITouchHit hit;
            if (!TouchManager.Instance.GetHitTarget(gesture.ScreenPosition, out hit)) return;

            if (hit == null || hit.Transform == null) return;
            if (hit.Transform == gameObject.transform) return;

            Debug.Log(this.ToString() + ", ProcessDraggingChanged(), We hit something==>" + hit.Transform.name);
        }
    }
     
    Last edited: Jun 20, 2014
  16. oneuglyrobot

    oneuglyrobot

    Joined:
    Jun 22, 2014
    Posts:
    9
    Hi, great script.

    I've been having some trouble with Touch Priorities.
    My situation is this, if you touch a GUI element, I want to cancel all touches into the game world.

    I've been trying to understand the Layer system and the Untouchable layer, but it's not very clear.

    In Cocos2d you simply called 'swallowTouch'; is there anything like this here?

    from hitTest()
    Error = 0, /// Something happened.
    Hit = 1, /// This is a hit, object should recieve touch.
    Miss = 2, /// Object should not recieve touch.
    Discard = 3 /// Object should not recieve touch and this touch should be discarded and not tested with any other​

    Is there no HitThenDiscard?
    Apologies if this has been answered before; I've searched for half a day and still can't figure it out.

    UPDATE:
    OK, it seems that the 'touchBeganHandler' does not obey any of the layers, etc. It gets called no matter what.
    Using gestures does work, though I would like to be able to cancel things in the 'TouchesMoved' event.

    What I ended up doing was using the PressHandler and two cameras with 'Camera Layer 2D' components, one set to GUI and one set to GameObjects.
     
    Last edited: Jun 22, 2014
  17. gretty

    gretty

    Joined:
    Jun 20, 2012
    Posts:
    72
    Hello

    I have a question for Valyard and people who develop games that run on both Tablets (or Smart Phones) and Desktop.

    Do you write two different input handling scripts, i.e. one to handle mouse and keyboard input (for desktop) and one for touch input (tablets and smartphones)?

    Or do you just write one input handling script that uses TouchScript? Can TouchScript detect normal desktop input, though?
     
  18. lavz24

    lavz24

    Joined:
    Mar 14, 2013
    Posts:
    45
    Hi,

    I tried to import it into PSM Unity but it didn't work. Any clue?
     
  19. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    290
    Yes, TouchScript was developed exactly for this purpose. It has several input sources: mouse, touch, tuio, etc. Touch points come into the system via these input sources and after that it doesn't matter anymore. You write your logic once for these abstract touches.
     
  20. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    290
    Sorry, not enough information.
     
  21. IndiumIndeed

    IndiumIndeed

    Joined:
    Jun 17, 2014
    Posts:
    6
    The release notes said that there are PlayMaker actions available, but I'm having trouble locating them.

    I installed the latest version of TouchScript and Playmaker. Then I installed the PlayMaker.Examples package from TouchScript/Examples/ folder.

    The PlayMaker examples don't seem to contain any FSMs. The only PlayMaker-related object/component seems to be PlayMakerGUI, and I can't find any TouchScript-related actions in the PlayMaker actions list. Am I missing something?

    Anyway, I would like to thank the makers for this awesome asset. I will use it through C# if I can't get the PlayMaker actions to work.
     
  22. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    290
    Hi.
    It seems that my build script got messed up and some classes were not included in the package.
    You can download the fixed package from here: https://github.com/InteractiveLab/TouchScript/releases/tag/6.1
     
  23. Kscorrales

    Kscorrales

    Joined:
    May 12, 2014
    Posts:
    13
    How can I move the camera with touch gestures? Do you have a demo, for example sliding across a plane by touching with two fingers?
     
  24. Kscorrales

    Kscorrales

    Joined:
    May 12, 2014
    Posts:
    13
    Hello. I also don't understand why I can't detect two touches anymore. Before, I could detect 2 touches, and in past projects they were detected, but when I compile and run on the big multi-touch screen, two touches are not detected. It isn't the screen, because Flash applications on it do handle multi-touch.
     
  25. Kscorrales

    Kscorrales

    Joined:
    May 12, 2014
    Posts:
    13
    I already solved the touch issue, but I need to move the camera as if it were sliding. Is there any way to achieve this?

    I got it working by creating a pan gesture with a Transformer2D and parenting the camera inside the plane, but the movement is inverted: when I pan, the camera moves up when it should move down, and the left/right directions are reversed too. The same happens with the scale gesture: zoom works the wrong way around when I pinch two fingers together or apart. Is there any way to fix this?
     
  26. Kscorrales

    Kscorrales

    Joined:
    May 12, 2014
    Posts:
    13
    How can I generate inertia to throw objects with the pan gesture, i.e. grab the object, drag it, and on release have it continue moving on its own? By tweaking the Transformer2D I get an effect, but it's not very good; the object becomes very slow.
     
    Last edited: Jul 7, 2014
  27. Kscorrales

    Kscorrales

    Joined:
    May 12, 2014
    Posts:
    13
    I want to create a gallery where you drag photos, with boundary limits included, using TouchScript, but I can't get it working :(
     
  28. Kscorrales

    Kscorrales

    Joined:
    May 12, 2014
    Posts:
    13
    Also, can I set a minimum and a maximum in ScaleGesture?
     
  29. _milla_

    _milla_

    Joined:
    Aug 29, 2013
    Posts:
    2
    Great library! I tested the examples and they worked like a charm. But have you ever tried testing it with Vuforia?

    I took your Pan Example for a Vuforia Test. I replaced the MainCamera with the ARCamera and of course parented the Buttons to an ImageTarget.

    Hierarchy:

    Touch Debugger
    Image Target
    >Directional Light
    >Plane
    >...
    ARCamera
    TouchScript

    I could build the scene without errors. When I drag a button on my Android device, the button moves very fast in the direction I drag. It does not follow my finger; it speeds up and disappears.
    Any idea why this happens?
     
  30. smeagols

    smeagols

    Joined:
    Oct 5, 2013
    Posts:
    35
    Hi,

    I need some help using TouchScript from my JavaScript code.

    I have a lot of JavaScript code in my project and I don't know how to use TouchScript from it.

    I need to know whether I'm touching an object in the world. It works fine with Input.GetMouseButtonDown(0) and RaycastHit, but I don't know how to receive the touch input.

    Can anyone give me an example of how to use it from JS?

    Thanks
     
  31. smeagols

    smeagols

    Joined:
    Oct 5, 2013
    Posts:
    35
    .....
    I'm stupid..... :(

    if i don't import TouchScript......

    Sorry... XDDDDDDD
     
  32. smeagols

    smeagols

    Joined:
    Oct 5, 2013
    Posts:
    35
    Hi again...

    One simple question...

    Is it possible to use touch with GUI.Button?

    I can't find out how.

    Thanks
     
  33. Kscorrales

    Kscorrales

    Joined:
    May 12, 2014
    Posts:
    13
    How can I make PanGesture work horizontally only?
     
  34. Kscorrales

    Kscorrales

    Joined:
    May 12, 2014
    Posts:
    13
    I don't need the horizontal-only pan gesture anymore, but how can I drop objects with inertia?
     
  35. MPanknin

    MPanknin

    Joined:
    Nov 20, 2011
    Posts:
    361
    Hi guys,

    thanks for providing us with such a great plugin. Integration into our project as well as the included gesture components work perfectly.

    However, I do have a question. All gestures derived from TwoPointTransform2DGestureBase are only triggered if both touches hit the collider of the respective object. But if you want to rotate or scale a small object, it can be difficult to hit the collider with both fingers.

    I would like to do the following. The first touch hits the collider of an object and no matter where on the screen I move the second touch, it rotates the object.

    Is this possible at all? Do I need to write a custom gesture for this or is there something I'm missing?

    Thanks for your feedback, which is really appreciated.

    Cheers,
    Martin
     
  36. design&develop

    design&develop

    Joined:
    Aug 17, 2013
    Posts:
    3
    I am using the Scale gesture for zoom in/out on a Windows touchscreen monitor; however, to test it I need to use Alt + a single finger. How do I do the same thing with two fingers?
     
  37. lizardboy79

    lizardboy79

    Joined:
    Sep 3, 2013
    Posts:
    4
    Hi gabrielstuff,
    I'm trying to do something similar, wherein I click on an object that instantiates another object, and the touch is transferred to the instantiated object. I could not find the spawnCube example you refer to, any chance you can post it? Many thanks
     
  38. gretty

    gretty

    Joined:
    Jun 20, 2012
    Posts:
    72
    Hello, first off, TouchScript is an awesome asset :D

    Is it possible to use the TouchScript library to drag (pan) the camera around?

    I have been able to successfully pan/drag a GameObject (a cylinder) around, but I've had no luck getting the main camera to move.

    Any suggestions on which TouchScript layers I should use (Fullscreen, Camera, Camera 2D), which GameObject I should apply the Pan script to, etc. would be extremely helpful. How have you been able to control camera movement using TouchScript?

    Here's my setup, but it's not moving the camera:

    - The MainCamera GameObject has CameraLayer, PanGesture, Transformer2D and BoxCollider components/scripts. Note the camera also has the default camera components.
    - The TouchScript GameObject has MobileInput & MouseInput scripts.

    Inspector screen capture: (here's a larger image)

     
  39. Kscorrales

    Kscorrales

    Joined:
    May 12, 2014
    Posts:
    13
    how did you pan / drag around an object?
     
  40. zlSimon

    zlSimon

    Joined:
    Apr 11, 2013
    Posts:
    31
    Thank you valyard for making this public, it is really well made! I just looked through the examples and they are really easy to understand, and as far as I can tell they cover everything I want to do. I will have a 2D UI which can be dragged around, as well as 3D objects in the game which should receive touch events; both behaviors are covered in your examples, so I think I will not have much trouble implementing this. However, I want to have a "global" gesture in my game to get to the menu, similar to the iOS close-app gesture (5-finger grab). I am not sure if I need to implement a gesture for that, or if I can somehow "hack" it. Do you have any idea how to achieve this?
     
  41. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    290
    https://github.com/InteractiveLab/TouchScript/wiki/FAQ#2-does-touchscript-work-with-guibutton
    The short answer is "sort of". Default touch layers (i.e. CameraLayer and CameraLayer2D) work with objects which have colliders on them. GUI is not a part of 3d scene and GUI objects don't have colliders. But it is possible to implement a custom touch layer with all the functionality needed for GUI buttons.
     
  42. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    290
    This might be tricky. You see, a gesture has a target and it "owns" touches which happen to hit the target's collider (or more precisely, "when a touch layer reports that a touch hits the target or its child"). This means that the most common case is gestures working with touches on their targets.

    In your case you either have to make a global gesture and place it on the topmost container of the hierarchy, or somehow fake it:

    1. If you have a hierarchy with a global container, you can put a gesture on it which checks what object is actually hit and changes state when your small target is hit by the first finger. After that, any touch is treated as part of the scaling sequence until all touches are lifted off.
    2. Another way is to route all touches to the small object after the first touch hits it. You can implement a custom touch layer and activate it only when one finger touches the object. This layer will "tell" the system that all touches hit this small object even if they don't. In this case the standard scale gesture should work. When all touches are lifted off, you disable the layer.

    Though I'm not sure that the second one will work, but it sounds cool (8
     
  43. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    290
    Hi. You most likely need a FullscreenLayer.
    Depending on whether you want tapped objects to be draggable or not, you will need to put the layer on the topmost container of your hierarchy and add a SimplePanGesture plus a custom script listening to its Panned event. The important part is to make sure that you do everything in screen coordinates.
     
  44. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    290
    One way to do this is to have a SimplePanGesture on your topmost container with a FullscreenLayer. In this case it will get all the touches globally. But you need to start this gesture only if you have 5 fingers on the surface. To do this you'll need a class implementing IGestureDelegate, and especially its ShouldBegin method (https://github.com/InteractiveLab/T.../TouchScript/Gestures/IGestureDelegate.cs#L31), where you check whether the PanGesture in question has exactly 5 fingers. You set an instance of this new class to the gesture's Delegate property, and when the gesture is ready to begin it will check ShouldBegin first, which should return true only if the gesture "has" 5 fingers. After this is done you just subscribe to the Panned event and check what direction and how much distance the gesture traveled.
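    Under those assumptions, such a delegate might be sketched like this (the method signatures follow the linked IGestureDelegate interface; treat the exact member names, including ActiveTouches, as assumptions if your version differs):

```csharp
using TouchScript;
using TouchScript.Gestures;

// Lets a gesture begin only when exactly 5 fingers are on it.
public class FiveFingerDelegate : IGestureDelegate
{
    public bool ShouldReceiveTouch(Gesture gesture, ITouch touch)
    {
        return true; // accept all touches
    }

    public bool ShouldBegin(Gesture gesture)
    {
        // Only begin when the gesture currently "has" 5 fingers.
        return gesture.ActiveTouches.Count == 5;
    }

    public bool ShouldRecognizeSimultaneously(Gesture first, Gesture second)
    {
        return true; // don't block other gestures
    }
}
```

    You would then assign it, e.g. `GetComponent<SimplePanGesture>().Delegate = new FiveFingerDelegate();`.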
     
  45. MPanknin

    MPanknin

    Joined:
    Nov 20, 2011
    Posts:
    361
    Thanks valyard. I guess now I did something similar to what you have suggested in point 1.

    A tap on an object activates a fullscreen layer with a custom transformer script. Gestures that are performed on the fullscreenlayer are then applied via the custom transformer to the tapped object. A simple tap somewhere on the fullscreenlayer deactivates it. Works really well.

    Thanks again for the food for thought.

    Cheers,
    Martin
     
  46. Kscorrales

    Kscorrales

    Joined:
    May 12, 2014
    Posts:
    13
  47. Kscorrales

    Kscorrales

    Joined:
    May 12, 2014
    Posts:
    13
    How can I rotate an object around all 3 axes with different movements, e.g. moving up rotates around the Z axis, while a rotate gesture rotates the object around itself?
     
  48. Kscorrales

    Kscorrales

    Joined:
    May 12, 2014
    Posts:
    13
    Another question: how can I detect when PanGesture has ended? I used this code, but it does not work:

    if(sender.State == Gesture.GestureState.Ended) {
    print ("Ended");
    }
     
    Last edited: Jul 30, 2014
  49. zlSimon

    zlSimon

    Joined:
    Apr 11, 2013
    Posts:
    31
    Thanks valyard for your help. However, I have trouble dealing with multiple objects receiving touch events. I have one fullscreen touch layer for the interface, where you can pan interface elements around. The problem is that when 2 elements with colliders are at the same position, only one element receives touch events. Is there a way for both to receive touch events? I tried to befriend them, with no effect. My actual use case is 2 elements on the same spot, where one element receives a SimplePanGesture and the other one tap gestures. Unfortunately only one element receives touch events.
     
  50. kim999

    kim999

    Joined:
    Jul 13, 2014
    Posts:
    1
    Hi guys! Please help me. I'm trying to make a FlickGesture work in my project, but I don't understand the flow of this code. All I want is a simple flick on an object. Please help.


    using System;
    using System.Collections.Generic;
    using TouchScript.Utils;
    using UnityEngine;

    namespace TouchScript.Gestures
    {
        /// <summary>
        /// Recognizes fast movement before releasing touches. Doesn't care how much time touch points were on surface and how much they moved.
        /// </summary>
        [AddComponentMenu("TouchScript/Gestures/Flick Gesture")]
        public class FlickGesture : Gesture
        {
            #region Constants

            /// <summary>
            /// Message name when gesture is recognized
            /// </summary>
            public const string FLICK_MESSAGE = "OnFlick";

            /// <summary>
            /// Direction of a flick.
            /// </summary>
            public enum GestureDirection
            {
                /// <summary>
                /// Direction doesn't matter.
                /// </summary>
                Any,

                /// <summary>
                /// Only horizontal.
                /// </summary>
                Horizontal,

                /// <summary>
                /// Only vertical.
                /// </summary>
                Vertical,
            }

            #endregion

            #region Events

            /// <summary>
            /// Occurs when gesture is recognized.
            /// </summary>
            public event EventHandler<EventArgs> Flicked
            {
                add { flickedInvoker += value; }
                remove { flickedInvoker -= value; }
            }

            // iOS Events AOT hack
            private EventHandler<EventArgs> flickedInvoker;

            #endregion

            #region Public properties

            /// <summary>
            /// Gets or sets time interval in seconds in which touch points must move by <see cref="MinDistance"/> for gesture to succeed.
            /// </summary>
            /// <value>Interval in seconds in which touch points must move by <see cref="MinDistance"/> for gesture to succeed.</value>
            public float FlickTime
            {
                get { return flickTime; }
                set { flickTime = value; }
            }

            /// <summary>
            /// Gets or sets minimum distance in cm to move in <see cref="FlickTime"/> before ending gesture for it to be recognized.
            /// </summary>
            /// <value>Minimum distance in cm to move in <see cref="FlickTime"/> before ending gesture for it to be recognized.</value>
            public float MinDistance
            {
                get { return minDistance; }
                set { minDistance = value; }
            }

            /// <summary>
            /// Gets or sets minimum distance in cm touches must move to start recognizing this gesture.
            /// </summary>
            /// <value>Minimum distance in cm touches must move to start recognizing this gesture.</value>
            /// <remarks>Prevents misinterpreting taps.</remarks>
            public float MovementThreshold
            {
                get { return movementThreshold; }
                set { movementThreshold = value; }
            }

            /// <summary>
            /// Gets or sets direction to look for.
            /// </summary>
            /// <value>Direction of movement.</value>
            public GestureDirection Direction
            {
                get { return direction; }
                set { direction = value; }
            }

            /// <summary>
            /// Gets flick direction (not normalized) when gesture is recognized.
            /// </summary>
            public Vector2 ScreenFlickVector { get; private set; }

            /// <summary>
            /// Gets flick time in seconds touches moved by <see cref="ScreenFlickVector"/>.
            /// </summary>
            public float ScreenFlickTime { get; private set; }

            #endregion

            #region Private variables

            [SerializeField]
            private float flickTime = .1f;

            [SerializeField]
            private float minDistance = 1f;

            [SerializeField]
            private float movementThreshold = .5f;

            [SerializeField]
            private GestureDirection direction = GestureDirection.Any;

            private bool moving = false;
            private Vector2 movementBuffer = Vector2.zero;
            private bool isActive = false;
            private TimedSequence<Vector2> deltaSequence = new TimedSequence<Vector2>();

            #endregion

            #region Unity methods

            /// <inheritdoc />
            protected void LateUpdate()
            {
                if (!isActive) return;
                deltaSequence.Add(ScreenPosition - PreviousScreenPosition);
            }

            #endregion

            #region Gesture callbacks

            /// <inheritdoc />
            protected override void touchesBegan(IList<ITouch> touches)
            {
                base.touchesBegan(touches);
                if (activeTouches.Count == touches.Count)
                {
                    isActive = true;
                }
            }

            /// <inheritdoc />
            protected override void touchesMoved(IList<ITouch> touches)
            {
                base.touchesMoved(touches);
                if (!moving)
                {
                    movementBuffer += ScreenPosition - PreviousScreenPosition;
                    var dpiMovementThreshold = MovementThreshold*touchManager.DotsPerCentimeter;
                    if (movementBuffer.sqrMagnitude >= dpiMovementThreshold*dpiMovementThreshold)
                    {
                        moving = true;
                    }
                }
            }

            /// <inheritdoc />
            protected override void touchesEnded(IList<ITouch> touches)
            {
                base.touchesEnded(touches);
                if (activeTouches.Count == 0)
                {
                    isActive = false;
                    if (!moving)
                    {
                        setState(GestureState.Failed);
                        return;
                    }

                    deltaSequence.Add(ScreenPosition - PreviousScreenPosition);

                    float lastTime;
                    var deltas = deltaSequence.FindElementsLaterThan(Time.time - FlickTime, out lastTime);
                    var totalMovement = Vector2.zero;
                    foreach (var delta in deltas) totalMovement += delta;

                    switch (Direction)
                    {
                        case GestureDirection.Horizontal:
                            totalMovement.y = 0;
                            break;
                        case GestureDirection.Vertical:
                            totalMovement.x = 0;
                            break;
                    }

                    if (totalMovement.magnitude < MinDistance * touchManager.DotsPerCentimeter)
                    {
                        setState(GestureState.Failed);
                    } else
                    {
                        ScreenFlickVector = totalMovement;
                        ScreenFlickTime = Time.time - lastTime;
                        setState(GestureState.Recognized);
                    }
                }
            }

            /// <inheritdoc />
            protected override void touchesCancelled(IList<ITouch> touches)
            {
                base.touchesCancelled(touches);
                touchesEnded(touches);
            }

            /// <inheritdoc />
            protected override void onRecognized()
            {
                base.onRecognized();
                if (flickedInvoker != null) flickedInvoker(this, EventArgs.Empty);
                if (UseSendMessage) SendMessageTarget.SendMessage(FLICK_MESSAGE, this, SendMessageOptions.DontRequireReceiver);
            }

            /// <inheritdoc />
            protected override void reset()
            {
                base.reset();
                isActive = false;
                moving = false;
                movementBuffer = Vector2.zero;
            }

            #endregion
        }
    }
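    You don't need to change the gesture source above to use it; you only subscribe to its Flicked event. A minimal usage sketch, assuming a FlickGesture and a Rigidbody on the object (`FlickThrow` and `flickForce` are illustrative names, not part of the library):

```csharp
using System;
using TouchScript.Gestures;
using UnityEngine;

// Attach next to a FlickGesture (and a Rigidbody) on the object to flick.
public class FlickThrow : MonoBehaviour
{
    public float flickForce = 0.1f; // illustrative scaling factor; tune to your scene

    private void OnEnable()
    {
        GetComponent<FlickGesture>().Flicked += flickedHandler;
    }

    private void OnDisable()
    {
        GetComponent<FlickGesture>().Flicked -= flickedHandler;
    }

    private void flickedHandler(object sender, EventArgs e)
    {
        var gesture = (FlickGesture)sender;
        // ScreenFlickVector is in screen pixels; map it to a world-space push.
        var push = new Vector3(gesture.ScreenFlickVector.x, gesture.ScreenFlickVector.y, 0);
        GetComponent<Rigidbody>().AddForce(push * flickForce, ForceMode.Impulse);
    }
}
```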