TouchScript — multi-touch library for Unity [RELEASED]

Discussion in 'Assets and Asset Store' started by valyard, Mar 6, 2013.

  1. Fellow

    Fellow

    Joined:
    Apr 26, 2013
    Posts:
    4
    Yes, I had. I compared what I had with an example and couldn't figure out what was wrong.

    So I just abandoned that approach and did my own FullScreenBackgroundTarget, but this time horizontally aligned, with the collider box adapting to the camera movement.

    Anyway, good job valyard :)
     
  2. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    Set Direction to Horizontal and check the sign of the ScreenFlickVector property: < 0 is a left flick, > 0 a right flick.
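    A minimal sketch of that check. The handler wiring follows the event names visible elsewhere in this thread (StateChanged, GestureStateChangeEventArgs); treat the exact namespaces and signatures as assumptions for your TouchScript version:

    ```csharp
    using TouchScript.Events;
    using TouchScript.Gestures;
    using UnityEngine;

    // Sketch: reacts to horizontal flicks; assumes FlickGesture.Direction is set to Horizontal.
    public class FlickHandler : MonoBehaviour
    {
        private void Start()
        {
            GetComponent<FlickGesture>().StateChanged += onFlickStateChanged;
        }

        private void onFlickStateChanged(object sender, GestureStateChangeEventArgs e)
        {
            if (e.State != Gesture.GestureState.Recognized) return;
            var flick = (FlickGesture)sender;
            if (flick.ScreenFlickVector.x < 0) Debug.Log("Left flick");
            else if (flick.ScreenFlickVector.x > 0) Debug.Log("Right flick");
        }
    }
    ```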
     
  3. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    That's weird. It has always worked for me o_O
    Can you provide a project that reproduces this behavior?
     
  4. dyego_s

    dyego_s

    Joined:
    Mar 26, 2013
    Posts:
    24
    Oh man that works pretty nicely. Thanks so much!!
     
  5. Mikie

    Mikie

    Joined:
    Dec 27, 2011
    Posts:
    367
    Just tried with 4.1.2f1 Pro. I get the following error: "system.net.sockets". I have Unity Android Pro.
     
  6. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    This error occurs with the iOS/Android Indie license since it doesn't support native sockets.
    You should be fine. But if you get this error, delete the PRO dll, which contains the code for TUIO support that relies on sockets.
     
  7. Felipe-Brito

    Felipe-Brito

    Joined:
    May 23, 2013
    Posts:
    5
    It's an impressive tool, but unfortunately I can't use it.
    The examples run well, but when I tried to change some things it turned out to be impossible.

    Congratulations!!
     
  8. Fellow

    Fellow

    Joined:
    Apr 26, 2013
    Posts:
    4
    Unfortunately, no; I have an NDA on the project and don't really have the time to make a new one with just the issue. Maybe in my spare time; I'll keep you informed.

    And just to clarify: there is no chance of getting the lib working easily in the webplayer, I guess.
     
  9. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    Can you describe the problem?
     
  10. Lexile

    Lexile

    Joined:
    May 21, 2013
    Posts:
    1
    I'm currently also working on an NDA project and I ran into the same problems as Fellow here. I'm rotating my camera around an object and some of the controls are handled by using gestures "in the background".

    I applied the "FullScreenBackgroundTarget" behaviour script to the camera and the necessary scripts (in this case, PanGesture and my own camera control script), but the collider applied by the behaviour didn't seem to be working (the Pan Gesture wasn't triggering any events...no touches were detected). The same collider seemed to work with a Tap Gesture though (did a debug script which displays the amount of taps to the background).

    The way I fixed it was to create a plane which rotates around the object, being always on the opposite side of the object compared to the camera. I then added the Pan Gesture and my camera controls to the plane and it seems to be working just fine. I just can't see how using a plane object differs from the collider provided by the "FullScreenBackgroundTarget"-behaviour. There certainly seems to be something funky with it, or we just need step-by-step directions for the usage of the behaviour.

    PS: TouchScript is otherwise a lifesaver. Great work!
     
  11. ganesh-pingale

    ganesh-pingale

    Joined:
    Dec 19, 2012
    Posts:
    11
    How do I count touch points?
     
  12. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    Depends.

    All touch points:
    Touch points during a specific gesture:
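    The code samples appear to have been lost from this post; below is a hedged reconstruction based on the members mentioned elsewhere in this thread (TouchManager.Instance.TouchesCount and a gesture's ActiveTouches collection) — exact names may differ between TouchScript versions:

    ```csharp
    using TouchScript;
    using TouchScript.Gestures;
    using UnityEngine;

    public class TouchCounter : MonoBehaviour
    {
        private void Update()
        {
            // All touch points currently on the screen.
            int total = TouchManager.Instance.TouchesCount;

            // Touch points participating in a specific gesture on this object.
            int inGesture = GetComponent<PanGesture>().ActiveTouches.Count;

            Debug.Log(total + " touches total, " + inGesture + " in the pan gesture");
        }
    }
    ```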
     
  13. Imawizrd

    Imawizrd

    Joined:
    Jan 19, 2013
    Posts:
    20
    Hi, I finally received a windows 8 computer to test on.

    TouchScript works!

    Although sometimes a touch gets stuck and the touch texture from the touch debugger stays on the screen. I'm just about to test whether the application thinks the touch is still continuing or if it's just the texture that isn't clearing. I don't think this is a Windows 8 related issue.
     
  14. aum

    aum

    Joined:
    Dec 17, 2012
    Posts:
    20
    Hi, congrats on the plugin, it's amazing.
    Now I'm testing, and while the move and scale gestures work fine, the rotate gesture always gives me this error:
    NullReferenceException
    UnityEngine.Transform.InverseTransformDirection (Vector3 direction)
    TouchScript.Behaviors.Transformer2D.onRotateStateChanged (System.Object sender, TouchScript.Events.GestureStateChangeEventArgs e)
    TouchScript.Gestures.Gesture.set_State (GestureState value)
    TouchScript.Gestures.Gesture.setState (GestureState value)
    TouchScript.Gestures.RotateGesture.touchesMoved (IList`1 touches)
    TouchScript.Gestures.Gesture.TouchesMoved (IList`1 touches)
    TouchScript.TouchManager.updateMoved ()
    TouchScript.TouchManager.updateTouches ()
    TouchScript.TouchManager.Update ()


    I only have a scene with a plane and a box, and I'm trying to move, rotate, and scale with typical mobile interaction.
    I got move and scale working (putting them into the move gestures), but the rotate gesture doesn't work.

    I tried with only the rotate gesture and the Transformer2D in the scene, and it always gives me the error above.

    Can you help me? Surely I'm doing something wrong...

    Thanks
     
  15. eclipseav

    eclipseav

    Joined:
    Jun 3, 2013
    Posts:
    10
    Amazing plugin!

    But do any of you guys know how to get the location of a TapGesture? :rolleyes:
     
  16. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    Hi. Can you send me a test project where this error happens to v [at] lent.in?
     
  17. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    That's funny, but I just checked the code and there's no straightforward way to do it.
    What location do you need? In screen coordinates or world coordinates?
     
  18. Grahammmm

    Grahammmm

    Joined:
    Jun 4, 2013
    Posts:
    12
    Hello,

    I appreciate the work that has been put into this plug-in so far, it's been really helpful to me!

    I'm hoping someone can help me with an issue that I'm having. I would like to recognize a background gesture at the same time as a tap gesture on another object in the scene. Here is my set up:

    I have Camera1 with FullscreenBackgroundTarget attached; it has a Camera Layer with the Layer Mask set to 'backgroundLayer' and a MetaGesture script attached. I have another camera as a child of Camera1, with a Camera Layer whose Layer Mask is set to 'cube'. Lastly, I have a cube in the scene in layer 'cube' with a TapGesture script attached to it. When I start any gesture on the cube, the FullScreenBackgroundTarget doesn't receive any gesture events. Separately they both work fine.

    I have tried using the ShouldRecognizeSimultaneouslyWith function, but it has not made a difference.

    Thanks for any ideas
     
  19. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    For this to work you have to have your cube as a child of Camera1, or a common parent object containing both the camera and the cube with a gesture attached. The cube is the first to get touch points and blocks the collider on Camera1. If they are in one hierarchy, the common parent object gets touch points from its children, and gestures which are configured to work together will do so.
     
  20. eclipseav

    eclipseav

    Joined:
    Jun 3, 2013
    Posts:
    10
    My current script gets the screen coordinates from where it was last clicked, then fires a ray from the camera to any collider in the scene, and an object moves to that location. So, a simple point-and-click idea. However, I can't get this to work with TouchScript, because TouchScript requires a collider and I can't use the screen position as a collider for TouchScript (is that possible?).

    So, long story short, I want to have the screen coordinates first because I need to fire a ray from that position for a point-and-click system.

    I got this system working with a simple mouse click as a backup, but I would like to have it also working with TouchScript instead.
    I hope you can help me!
     
  21. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    TouchScript does raycasting itself. Just add TapGesture to all your objects.
     
  22. eclipseav

    eclipseav

    Joined:
    Jun 3, 2013
    Posts:
    10
    So how do I read the position from the raycast hit?
    Sorry, I'm quite new to Unity and C#, and TouchScript is quite hard as a first project. :)

    Is it possible for you to give a quick example of where I can read the raycast hit position?
     
  23. ThreeDeeZ

    ThreeDeeZ

    Joined:
    Jun 22, 2012
    Posts:
    4
    Also trying to find the position and raycast results, but no luck. I think I may have found why...
    I believe that gestures (tap and pan) are failing to report these positions correctly, if they are able to at all. Why?

    Inside the state-changed handler I try:

    Code (csharp):

    if (e.State == Gesture.GestureState.Recognized) {
        Gesture target = sender as TapGesture;
        Vector2 outOfLuck1 = target.ActiveTouches[0].Position; // out of range error: I have stopped tapping so there is nothing there...
        Vector2 outOfLuck2 = target.PreviousScreenPosition; // again missed the boat; from Clusters 76: if (length == 0) throw new InvalidOperationException("No points in cluster.");
    }
    Both of these examples do not get the position information from the event message; they go retrieve it from another entity (clusters or the manager), and by that time the touch point(s) have been updated and the data is stale at best, missing at worst.

    Other systems (Unity's default Touches) give direct access to the touch data: time, position, delta from last, etc. Same for mouse input. Is there a way to do this that I'm not seeing in TouchScript?
     
  24. Mr.ider

    Mr.ider

    Joined:
    Dec 18, 2012
    Posts:
    1
  25. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    You are exactly right. Somehow, before you brought it up, nobody needed to know the coordinates of a tap. I personally was always interested in what object was tapped, not where it was tapped.

    The system has information about where touch points hit colliders: TouchManager has a Touches collection, which is a list of TouchPoints. These have a Hit property, which is the result of a successful raycast. But TapGesture was made to work on clusters, and it doesn't matter if you used one finger or 10 fingers to tap on an object. Of course it's easier to assume a one-finger tap, but TouchScript was made for large touch screens, which are not precise. Sometimes you get phantom touches, or fake touches from fingers 1-2 cm above the surface. This is where clusters come from.

    But I understand the problem and will come up with some kind of solution.
     
  26. ThreeDeeZ

    ThreeDeeZ

    Joined:
    Jun 22, 2012
    Posts:
    4
    Thanks for looking into it.

    I like that the system handles it for me in most cases, but I am doing some gesture work that needs to pick apart motions, track individual touch points, and decide what to do with them, not just the whole group or cluster centroid. I had my own system, but I much prefer yours using events the way you have set it up; I just need that data along with the event. Also, if this common data were part of the event, I might not have to redo raycasts, world-to-screen conversions, or other such things in every event handler.
     
  27. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    Look at how gestures are made. With a simple custom gesture you'll get the functionality you need.
    I'm making new gestures all the time for custom behaviors, because in TouchScript a gesture is just a script which handles touch points on an object or a hierarchy of objects. This is what it was designed for: not to be limited to the Pan or Tap gestures which come with every other library.

    Check out PressGesture or TapGesture, the simple ones. You just need to override several handlers which are called when touch points are added/removed/moved. These touch points already have all the hit-test information you might need in the TouchPoint.Hit property.
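    A sketch of what such a custom gesture might look like. The touchesBegan/touchesEnded handler names and setState follow the stack trace quoted earlier in this thread; the TouchPoint namespace and exact signatures are assumptions and may differ in your TouchScript version:

    ```csharp
    using System.Collections.Generic;
    using TouchScript;
    using TouchScript.Gestures;
    using UnityEngine;

    // Minimal custom gesture: logs every touch that lands on this object's collider.
    public class HitLoggerGesture : Gesture
    {
        protected override void touchesBegan(IList<TouchPoint> touches)
        {
            foreach (var touch in touches)
            {
                Debug.Log("Touch at screen position " + touch.Position);
                // touch.Hit carries the hit-test (raycast) result mentioned above.
            }
        }

        protected override void touchesEnded(IList<TouchPoint> touches)
        {
            // Recognize once the last touch leaves the object.
            if (ActiveTouches.Count == 0) setState(GestureState.Recognized);
        }
    }
    ```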
     
  28. Kadaiyen

    Kadaiyen

    Joined:
    Jul 16, 2012
    Posts:
    14
    This is awesome! :D

    I've had some trouble finding out how to get the direction of pan movements, however. Is there a member whose value holds this information as in the Flick gesture, or will I have to calculate its direction myself? (And if so, any hints? ;) )

    Also, I notice that when I try to use TouchManager.Instance.TouchesCount / ActiveTouches.Count to check # of touches when performing logic in an OnGesture method I create in script, their count is always zero - seems to be that they're emptied before my method is given control. How then would I check for # of active touches in a registered OnGesture method, besides going through Unity's Input.Touches?

    Thanks again for this awesome tool :)
     
    Last edited: Jun 12, 2013
  29. eclipseav

    eclipseav

    Joined:
    Jun 3, 2013
    Posts:
    10
    Hi guys, I found a way to get the coordinates: from a tap gesture I cast a new ray at the screen position, in the direction of the camera, and that's how I got it working correctly.

    On request I will post some of my code.

    The next problem: I want to create a camera that pans when I use the pan gesture, but I don't really have an idea where to start.
    Well, I have a simple idea that I'm testing out right now: parent the camera under a plane, attach the script to the plane, and use the 2D transformer to rotate another parent somehow. I'm not sure if this is the correct way to do it, but I have no idea how to do it differently. Like I said before, I am really new to scripting in general and only have knowledge of Python.

    So if anyone would like to point me in the right direction, I would really appreciate it. I'm not afraid of programming, so you can tell me any complex ideas and I will try to script them. :)
     
  30. eclipseav

    eclipseav

    Joined:
    Jun 3, 2013
    Posts:
    10
    Hi guys, I just finished my next script, where I can pan the screen, but I really need some help with my next step.

    I want to combine the point-and-click with my camera pan script, but the camera pan stops me from clicking anything else because I use an invisible plane to pan around. Does anyone have any ideas on how I could solve this?
     
  31. IainStanford

    IainStanford

    Joined:
    Aug 18, 2010
    Posts:
    28
    Hi.

    I've got a large 3M display that I'm trying your script on. We're definitely using Win7TouchInput (I print it out when running), but I don't seem to be getting any touch input returned.

    For example, we have an object with a script that accesses TouchManager.Touches in the update loop, but this list is always empty.

    The 3M display is hooked up to a PC with Windows 8 (64-bit) on it; have you or anyone else ever had issues like this?

    Even just building the test scenes that come with the package (Basic Example, Everything, Hit Test) doesn't work (I replace the MouseInput on TouchScript with Win7TouchInput, but I do leave the TuioInput there, as I notice you always have both?).

    In the Everything scene you have all the input scripts on TouchScript, so I disabled Mouse and Mobile but left Tuio and Win7, and still nothing happens when I run.

    Any ideas what might be going on?
     
  32. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    First of all, you should place your invisible plane farther away from the camera than the other objects:
    the first object to get hit takes the touch points.
    If you want to be able to pan while touching other objects, you need to place them as children of that plane.
     
  33. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    Hm.

    Are you sure that multi-touch works on your system? Can you open Paint and draw with multiple fingers?
    Unfortunately I don't have Windows 8 installed anywhere around, so I can't check o_O
     
  34. Kadaiyen

    Kadaiyen

    Joined:
    Jul 16, 2012
    Posts:
    14
    While using ScaleGesture, I check the state for Gesture.GestureState.Changed, and if the state matches, I check the gesture's LocalDeltaScale for <1 and >1 (to deduce whether the scale is a pinch or a spread), zooming out / in respectively. My problem is that the LocalDeltaScale seems to vary wildly between frames, and output to a debug.log shows it go from 0.5 to infinity and back, even while executing for the same gesture.

    I've only used the documentation regarding this member, am I using it improperly? I'm lost on this one.
     
  35. eclipseav

    eclipseav

    Joined:
    Jun 3, 2013
    Posts:
    10
    Thanks, I will try that, but it might be a bit hard to implement like that since I use the Transformer2D to move the plane, so if I place them as children I'll have to make some major adjustments, I suppose. :)
     
  36. IainStanford

    IainStanford

    Joined:
    Aug 18, 2010
    Posts:
    28
    Yeah, it's definitely working; it comes with a calibration tool where you can see all (apparently 40) touch points. Only tested it with my 10 fingers though!
     
  37. eclipseav

    eclipseav

    Joined:
    Jun 3, 2013
    Posts:
    10
    Okay, I'm just another step closer to finishing this part: I can now raycast through my touch plane and it works perfectly. :)
    But now I have another problem: when the tap gesture is enabled my pan gesture is disabled, and I'm not sure why.

    Any ideas?

    By the way thanks for helping out, I really appreciate any help!
     
  38. IainStanford

    IainStanford

    Joined:
    Aug 18, 2010
    Posts:
    28
    Hmm, well, I couldn't get Win7Touch to register for some reason; the touch list was always empty.

    I got http://forum.unity3d.com/threads/152685-RELEASED-Windows-7-multitouch instead, and that works fine (removed the NGUI integration as we've got our own bridge for that anyway).

    Not sure why TouchScript wasn't working for us; if we ever get time in the future we might fork your repository and see what we find. For now the schedule's a bit tight.
     
  39. Grahammmm

    Grahammmm

    Joined:
    Jun 4, 2013
    Posts:
    12
    Hi, I have a quick question. I have both Win7TouchInput and MouseInput enabled, as I would like the option to use either at any time. The problem is that when using my finger I receive two touch points (one finger and one mouse). Is there a way I can receive only one input (only mouse or only touch)?
     
  40. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    Not sure about this.
    Unity might dispatch the first touch point as a mouse event for compatibility with software which doesn't know anything about multi-touch; at least Flash does that. You'll have to google this. I'll be glad if you post what you find (8
     
  41. Grahammmm

    Grahammmm

    Joined:
    Jun 4, 2013
    Posts:
    12
    I've been looking into the problem with some success. There is a script that someone wrote, available here, that intercepts any touch events and prevents the corresponding mouse click from triggering. This allows Unity to receive only the touch event when both MouseInput and Win7TouchInput are enabled.

    Unfortunately there are downsides (for me anyway). My application uses Unity GUI, and that only responds to mouse clicks. Normally touches would trigger it, but not with this script running. The other downside is that the cursor sits in the middle of the screen and is visible while interacting with touches. It seems that hiding the touch cursor (the little diamond) while keeping the regular cursor available is not as easy as it should be.

    I will hopefully get a chance to look into this problem again another time, until then I will be using a button to switch between mouse and touch controls.
     
  42. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    Hello guys.

    Who had trouble getting the tap position from TapGesture?

    Please try the latest develop branch. TapGesture now correctly returns ScreenPosition from the Recognized handler.
    And there's a new method, GetCentroidHitResult, which casts a ray from the current screen position and returns whether it has hit the same target and where.

    Added an example to the Basic Example.
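    Reading that position in the Recognized handler would then look roughly like this (a sketch against the develop branch described above; event and namespace names are assumptions based on earlier posts in this thread):

    ```csharp
    using TouchScript.Events;
    using TouchScript.Gestures;
    using UnityEngine;

    public class TapPosition : MonoBehaviour
    {
        private void Start()
        {
            GetComponent<TapGesture>().StateChanged += onTapStateChanged;
        }

        private void onTapStateChanged(object sender, GestureStateChangeEventArgs e)
        {
            if (e.State != Gesture.GestureState.Recognized) return;
            var tap = (TapGesture)sender;
            Debug.Log("Tapped at screen position " + tap.ScreenPosition);
        }
    }
    ```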
     
  43. dyego_s

    dyego_s

    Joined:
    Mar 26, 2013
    Posts:
    24
    Hey guys!

    I have one object that has a PanGesture, and inside it I have a list of objects that also have a PanGesture each.
    Can I get those two gestures working at the same time?

    What I want is to move the list of objects, but I also need to know when the user is trying to pan a piece out. I can't find a way to make it work.

    I put an image to exemplify.

    thanks.

    [attached image: Untitled-1.jpg]
     
  44. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    Hi. Thanks for the image q:
    But I still don't understand. So, you want the master box to be draggable, right? You want to move it around... scroll left and right, I guess.
    What do you want to happen when a user tries to move a child box? Does the big box still move? Or maybe you want the master box scrolling from left to right but the child objects from bottom to top?

    You can add the master's PanGesture to the children's WillRecognizeWith array; this way they'll work together. But the behaviour might not be the one you need.
     
  45. dyego_s

    dyego_s

    Joined:
    Mar 26, 2013
    Posts:
    24
    I'll try to explain it better.

    I have an empty GameObject with a PanGesture; inside it I create a list of pieces, each with a PanGesture.
    I'm trying to make a menu scroll. Like you said, the master box collider will pan left and right, but users should also be able to drag a piece out with the mouse and put it where they want on the screen. The problem is that if my master box collider is in front of the pieces, I only get the gesture on the master box collider.

    So, I check if the x position of (mouseFirstTouch - mouseLastTouch) is greater than 30 (for example) inside the PanGesture state-changed handler of the master box collider; if it is, the user can scroll left or right.
    But if the y position of (mouseFirstTouch - mouseLastTouch) is greater than 30 inside the PanGesture state-changed handler of a child box collider, the user will only be able to drag the piece out.

    I hope you understand, and sorry about my English :)

    Thanks!
     
  46. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    I would do it this way:

    The master game object contains the main controlling script, with several states which determine whether it's currently scrolling, doing nothing, or dragging.
    It has a PanGesture attached. Every small box has a PanGesture attached too. The small boxes have a script which is subscribed to the PanGesture's StateChanged and has several states as well. When they are not in the Moving state, they accumulate how much the user wants a box to move. If the value exceeds, for example, 30px in the y direction, the script fires an event which is caught by the master controller.

    And the logic works as follows: if an event from a box is caught during the idle state, the state changes to dragging and the master object can't scroll. If the PanGesture on the master object exceeds a certain accumulated x-movement threshold, it goes to the scrolling state and can now only scroll; any events from the small boxes are ignored.

    So this is all just a nested state machine.
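    A framework-free sketch of that nested state machine (class, method names, and the 30px threshold are hypothetical; the wiring to actual PanGesture StateChanged handlers is left out):

    ```csharp
    // Hypothetical controller sketch; plain C#, no Unity/TouchScript dependencies.
    public enum MasterState { Idle, Scrolling, Dragging }

    public class ScrollDragController
    {
        const float Threshold = 30f; // px, the example value from the discussion

        public MasterState State { get; private set; } // Idle by default
        float accumulatedX; // accumulated pan on the master object

        // Called from the master PanGesture's StateChanged handler (not shown).
        public void OnMasterPan(float deltaX)
        {
            if (State == MasterState.Dragging) return; // a child box owns the interaction
            accumulatedX += deltaX;
            if (System.Math.Abs(accumulatedX) > Threshold) State = MasterState.Scrolling;
        }

        // Called when a child box reports that its y movement exceeded the threshold.
        public void OnChildDragStarted()
        {
            if (State == MasterState.Idle) State = MasterState.Dragging;
            // Ignored while Scrolling: the master keeps control.
        }

        // Called when all touches end.
        public void Reset()
        {
            State = MasterState.Idle;
            accumulatedX = 0;
        }
    }
    ```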
     
  47. dyego_s

    dyego_s

    Joined:
    Mar 26, 2013
    Posts:
    24
    So, you are saying that the master will see every event on each piece and on the master itself at the same time?

    I'll follow your way, if I succeed, I'll post here again.

    Thanks for the help! :D
     
  48. dyego_s

    dyego_s

    Joined:
    Mar 26, 2013
    Posts:
    24
    @valyard I'm back to thank you so much; that works beautifully. Thanks for the support! See ya!
     
  49. fgarmo

    fgarmo

    Joined:
    Apr 22, 2013
    Posts:
    9
    Is it possible to avoid listening to touches under GUITextures? I mean, I have my GUI and a surface made of cubes, each with its own box collider. When I touch my GUITexture (like a button), the gesture attached to the cube under this GUITexture is activated. I don't want this; I want to avoid gestures under my GUI elements. Is it possible to do this?
     
  50. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    thanks q:
    don't forget to rate it on Asset Store