
TouchScript — multi-touch library for Unity [RELEASED]

Discussion in 'Assets and Asset Store' started by valyard, Mar 6, 2013.

  1. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
Glad it works for you. Don't forget to rate TouchScript at the Asset Store q:
     
  2. OP3NGL

    OP3NGL

    Joined:
    Dec 10, 2013
    Posts:
    267
    hi valyard,

how do I get TouchScript to work? I'm still having problems...

Basically, I'm trying to get a single touch to orbit around the scene, three fingers to pan, a two-finger pinch to zoom, and a double tap with a single touch to open a menu on a game object...

What does a drag gesture do? Can the gestures work without additional scripting?
     
    Last edited: Jan 13, 2014
  3. valerik

    valerik

    Joined:
    Aug 8, 2013
    Posts:
    11

I forgot to thank you for this fantastic framework =) Very, very useful!

I also have another question:
I'm trying to use TouchScript gestures on NGUI objects (directly on them, not only via the camera's projection of the field of view), but even with a box collider attached, nothing is recognized. Any suggestions?

(I'm trying to create a panel with a sprite and four box colliders so I can move the sprite around the screen while it's pressed and stop it when there are no more taps.)
     
  4. valerik

    valerik

    Joined:
    Aug 8, 2013
    Posts:
    11
I also have trouble with the scale gesture =/
First of all, I can't build for iPad because I'm on Windows; I build for iPad from a friend's Mac when possible.

While for the pan gesture I used WorldDeltaPosition.x/y to get the pan direction, for scale I'm using LocalDeltaScale, assuming that if it is > 0 I'm zooming in because there is more distance between the two fingers, while if LocalDeltaScale is < 0 I'm zooming out because the fingers come closer. Is this assumption correct? It worked for zooming in but not out, so I suspect LocalDeltaScale is never negative but bounded between 0 and some value.

To sum up, my code is something like this:

Code (csharp):

    if (scaleGesture.LocalDeltaScale < 0) {
        // zooming out
    } else if (scaleGesture.LocalDeltaScale > 0) {
        // zooming in
    }
     
  5. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
No, you need to multiply by the scale. I.e. a deltaScale of 1 means no scale change, a deltaScale of 2 means the object grew two times during one frame, and a deltaScale of 0.5 means it shrank by half during a frame. You can't get a deltaScale of 0 because it doesn't make sense (8
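For example, a handler applying it each frame might look like this (a sketch only; the state check and handler wiring follow the usual TouchScript pattern, so verify them against your version):

Code (csharp):

    private void onScaleStateChanged(object sender, GestureStateChangeEventArgs e)
    {
        if (e.State == Gesture.GestureState.Changed)
        {
            var gesture = (ScaleGesture)sender;
            // LocalDeltaScale is a per-frame multiplier: 1 = no change,
            // > 1 = zooming in, < 1 = zooming out.
            transform.localScale *= gesture.LocalDeltaScale;
        }
    }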
     
  6. BossHoss

    BossHoss

    Joined:
    Jan 19, 2014
    Posts:
    3
    Hi,

Does this script support 2D apps/games as well? Is it written in C# or JS?

    Thanks.
     
  7. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
It supports 2D, but doesn't support JS as a means of interacting with it.
     
  8. BossHoss

    BossHoss

    Joined:
    Jan 19, 2014
    Posts:
    3
    Wow, thanks for the fast reply. Cheers!
     
  9. CraigGraff

    CraigGraff

    Joined:
    May 7, 2013
    Posts:
    44
    Is there a way to get multiple pan or tap gestures to work at the same time (different fingers performing different tap or pan gestures)?
     
  10. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
This might be tricky.
First of all, do these pan gestures need to be on the same object? I don't know how you would separate touch points into groups for gestures, but you can do it by defining a delegate with a method called ShouldReceiveTouch in it. After that you assign this delegate to the Delegate property of a gesture. This way you can control which touch points a gesture gets and which it doesn't.
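A rough sketch of such a delegate (the interface and method signatures here are from memory and may differ in your TouchScript version, so treat them as illustrative):

Code (csharp):

    public class LeftHalfDelegate : MonoBehaviour, IGestureDelegate
    {
        // Only feed this gesture touch points that land on the left half of the screen.
        public bool ShouldReceiveTouch(Gesture gesture, ITouch touch)
        {
            return touch.Position.x < Screen.width / 2f;
        }

        public bool ShouldBegin(Gesture gesture) { return true; }

        public bool ShouldRecognizeSimultaneously(Gesture first, Gesture second) { return true; }
    }

Then assign it to the gesture: GetComponent<PanGesture>().Delegate = GetComponent<LeftHalfDelegate>();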
     
  11. CraigGraff

    CraigGraff

    Joined:
    May 7, 2013
    Posts:
    44
    Many thanks for the reply. All touches are on the same object. I ended up making a modified tap gesture that stores the last active touch points and then looped through all of those. It turned out that the active touches for the pan gesture were enough for what I needed.

    If I ever need a more robust system, I'll keep your suggestion in mind.
     
  12. valerik

    valerik

    Joined:
    Aug 8, 2013
    Posts:
    11
    So easy, just switching 0 to 1 =)

I have another question that may be a bug.

I was trying to add another camera (think of it as a minimap that can be panned around) and I added a camera layer (with a different name from the main camera's layer) to the second camera.

When I create the object at runtime, TouchManager recognizes the new layer but it still doesn't work; it only works after refreshing the layers. So I wrote a script to create and set up the new camera layer before adding it to TouchManager in code, but it still doesn't work; I need to refresh manually in the editor and then everything is fine.

In the end I worked around it by adding the game objects to the scene without creating them at runtime and refreshing the layers in TouchManager (which is missing a Refresh() method). So I think there is a problem with the layer order, because at runtime everything works fine (just switching the two layers), while when I put them in the scene and refresh, the order doesn't change anymore. Did I do something wrong, or is this a bug?
     
  13. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
Refreshing manually fills the layers array of TouchManager in undefined order.
Programmatically you can actually change the layer order using the ChangeLayerIndex method.
A new layer created in a script goes to the bottom of the layers array, which means it will be under all other layers. Maybe that's why you are not getting touch points? Can you check that? Also, can you assemble a simple project where I could reproduce the behavior you are getting?
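In code that could look something like this (the exact ChangeLayerIndex signature and the layers accessor are assumptions; check the API docs for your version):

Code (csharp):

    // A layer created at runtime lands at the bottom of the array,
    // so move it to index 0 to put it on top.
    var layers = TouchManager.Instance.Layers;  // hypothetical accessor
    TouchManager.Instance.ChangeLayerIndex(layers.Count - 1, 0);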
     
    Last edited: Jan 20, 2014
  14. totsboy

    totsboy

    Joined:
    Jul 12, 2013
    Posts:
    253
    Hello,
I'm trying to use TouchScript for virtual buttons in my game on Windows Phone 8. There are several buttons on screen during gameplay: jump, left, right, etc.
What happens is that while I'm pressing one button I MUST release it before pressing another one; if I just slide my finger over to another button, the old one remains pressed while the new one remains unpressed.
That makes sense, since the gestures are called Press and Release, but is there another gesture that would do the job here, or do I have to implement something else? I had no luck looking through the documentation or this thread :/
The code I'm using is basically the same as in your example:

Code (csharp):

    private void Start()
    {
        if (GetComponent<PressGesture>() != null) GetComponent<PressGesture>().StateChanged += onPress;
        if (GetComponent<ReleaseGesture>() != null) GetComponent<ReleaseGesture>().StateChanged += onRelease;
    }

    private void onRelease(object sender, GestureStateChangeEventArgs gestureStateChangeEventArgs)
    {
        if (gestureStateChangeEventArgs.State == Gesture.GestureState.Recognized)
            gameObject.SendMessage("UnClick", SendMessageOptions.DontRequireReceiver);
    }

    private void onPress(object sender, GestureStateChangeEventArgs gestureStateChangeEventArgs)
    {
        if (gestureStateChangeEventArgs.State == Gesture.GestureState.Recognized)
            gameObject.SendMessage("Click", SendMessageOptions.DontRequireReceiver);
    }
    Also, thanks for sharing TouchScript for free, it's a great tool! :)
     
    Last edited: Jan 21, 2014
  15. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
Yes, this particular behavior might be tricky to implement.
TouchScript follows the rules of a usual mouse-based GUI, with roll over / roll out events excluded. If you press a button in Windows, you usually can't "press" another button by rolling over it.

You will have to create a special gesture and attach it to the container which holds these buttons.
In this gesture you'll need to constantly check where your fingers are and trigger the appropriate buttons.
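As a rough illustration of the idea, not using the TouchScript gesture API but a plain per-frame raycast (VirtualButton is a hypothetical component on each button object):

Code (csharp):

    public class ButtonTracker : MonoBehaviour
    {
        private void Update()
        {
            // Each frame, press whatever button is under each finger.
            foreach (Touch touch in Input.touches)
            {
                RaycastHit hit;
                Ray ray = Camera.main.ScreenPointToRay(touch.position);
                if (Physics.Raycast(ray, out hit))
                {
                    var button = hit.collider.GetComponent<VirtualButton>();
                    if (button != null) button.SetPressed(true);
                }
            }
            // Release logic omitted: remember which buttons were pressed last
            // frame and release the ones no finger is over anymore.
        }
    }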

    Thanks. Don't forget to rate and review it in the asset store! q:
     
  16. totsboy

    totsboy

    Joined:
    Jul 12, 2013
    Posts:
    253
    Just did! :)

    Ok, forgive me if I'm being ignorant here, but how do I create another gesture? All the gestures are in .dll, I'm not really sure how to do it (kind of noobie in this area hehe)
     
  17. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
  18. totsboy

    totsboy

    Joined:
    Jul 12, 2013
    Posts:
    253
  19. madhur

    madhur

    Joined:
    May 16, 2012
    Posts:
    86
Hi, I started using TouchScript. Could I get some sample code for rotating a 3D model using TouchScript? What I want to do is zoom and rotate different 3D models using TouchScript.
Thanks.
     
  20. kylekaturn

    kylekaturn

    Joined:
    Feb 24, 2013
    Posts:
    20
When I build the example and test it with Windows 7 multitouch, the application throws the following error:

EntryPointNotFoundException: SetWindowLongPtr
at (wrapper managed-to-native) TouchScript.InputSources.Win7TouchInput:SetWindowLongPtr (intptr,int,intptr)

at TouchScript.InputSources.Win7TouchInput.init () [0x00000] in <filename unknown>:0

at TouchScript.InputSources.Win7TouchInput.Start () [0x00000] in <filename unknown>:0

Am I doing something wrong?
     
  21. kylekaturn

    kylekaturn

    Joined:
    Feb 24, 2013
    Posts:
    20
When I changed my build to the 64-bit version, it worked fine.

It seems the error only occurs in 32-bit builds.
     
  22. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
  23. OSG

    OSG

    Joined:
    Oct 28, 2013
    Posts:
    4
Hi, valyard. I'm using your TouchScript and have a little problem with TapGesture. It works well when one finger is used, but when I use more than one finger, the screen coordinates of the touch go to the centroid of all touch coordinates (as I understand it). My problem is that I need the coordinates of one touch at a time. Can you help me with this?
P.S. I use gesture.ScreenPosition to get the touch position and then use it in Physics.Raycast.
     
    Last edited: Jan 24, 2014
  24. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
Do you get ScreenPosition when TapGesture is recognized, i.e. when all fingers lift off?
It seems that at this stage it's impossible to access individual fingers. I might add an option for the behavior you described.
You want the position of the last touch point to lift off, right?
     
  25. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    I uploaded version 4.1.
    4.0 has a stupid bug in Windows touch code. Sorry for that.
     
  26. valerik

    valerik

    Joined:
    Aug 8, 2013
    Posts:
    11

Hi valyard, I finally have time to check it! I made a simple project with the main camera (TouchManager, layer script and input script), a cube with a tap gesture, and a simple script that adds a new camera (a child of an empty game object; in the original scenario my camera was inside an NGUI root object) when pressing "b".
The new camera is a rect in the bottom-left corner and watches the same object as the main camera. It's in the same position as the main camera (0, 0, -10).
When I add the new camera at runtime, its layer (TestCameraLayer) goes to the bottom of the layers array, but it works fine, no problem, and the tap is correctly recognized. Also, on refreshing the layers array at runtime, the two layers are switched and it still works correctly!

I can send the scene via mail, but I don't know how to reproduce the wrong behaviour =)

I also tried creating the camera disabled and, storing a reference to it, enabling it when pressing "c". It works fine this way too! Good, sorry for the wrong information; maybe I did something wrong in my original scenario!
     
  27. OSG

    OSG

    Joined:
    Oct 28, 2013
    Posts:
    4
Yes, I get ScreenPosition when TapGesture is recognized.

Yes, it's exactly what I need.
     
  28. mplaczek

    mplaczek

    Joined:
    Feb 13, 2012
    Posts:
    19
Firstly, thank you for the awesome framework... I have your example applications working brilliantly on the hardware I need to use for my current project, and it is really nice to see how active this thread is.

I'm having some trouble integrating your TouchScript Windows 7 touch input with NGUI and was hoping someone could help.

I found the 'hack' at https://github.com/InteractiveLab/TouchScript/issues/6 ... however, it seems to give me some errors.

I would be extremely grateful for some pointers to get these two married and playing nicely! I'm not doing any complex multi-fingered gestures; however, I need to support multiple users.
     
  29. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
Unfortunately I don't own an NGUI license.
Can you post the errors you are getting in that issue on GitHub? I'll think about what I can do.
     
  30. Hectorous

    Hectorous

    Joined:
    Feb 4, 2014
    Posts:
    5
    Hi,

I am using the touch library and found that if more than one touch point happens on the screen, it clusters them and the other touches are destroyed. I was wondering if there is a way to get the raw data of each individual touch point without having them destroyed.

I am trying to make a Play-Doh type game where you have multiple touch points outside the game object and have it be affected by the touches. Is there a way to get this to happen?

    Cheers!
     
  31. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
Not sure what you are trying to do.
Have you checked the Input.unity example?
If you are using a tap gesture, for example, then yes, you'll get a cluster of all touch points.
You can either subscribe to TouchManager events directly or use MetaGesture, which forwards touch events as C# events.
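Subscribing directly might look roughly like this (event and argument names vary between TouchScript versions, so check the Input example for the exact ones):

Code (csharp):

    private void OnEnable()
    {
        TouchManager.Instance.TouchesBegan += touchesBeganHandler;
    }

    private void OnDisable()
    {
        if (TouchManager.Instance != null)
            TouchManager.Instance.TouchesBegan -= touchesBeganHandler;
    }

    private void touchesBeganHandler(object sender, TouchEventArgs e)
    {
        // Every touch point arrives individually here, before any clustering.
        foreach (var point in e.TouchPoints)
        {
            Debug.Log("Touch began at " + point.Position);
        }
    }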
     
  32. Hectorous

    Hectorous

    Joined:
    Feb 4, 2014
    Posts:
    5
Do you have an example of either subscribing to the TouchManager or using MetaGesture that you can share?
     
  33. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
Yes, the TouchScript package has an Advanced example which includes MetaGesture, and an Input example which works with TouchManager directly.
Though usually a better way is to create a custom gesture for your own needs.
     
  34. mplaczek

    mplaczek

    Joined:
    Feb 13, 2012
    Posts:
    19
    The guys at NGUI posted a solution for all those using the script at https://github.com/InteractiveLab/TouchScript/issues/6 to get NGUI working with TouchScript

    Hopefully this will help others :)

    UICamera.Raycast has the following signature in NGUI:
    static public bool Raycast (Vector3 inPos, out RaycastHit hit)

    The wrapper seems to expect this:
    static public GameObject Raycast (Vector3 inPos, ref RaycastHit hit)

    I'm not sure which version of NGUI it was written for but it seems it needs to be updated from this:

    UICamera.hoveredObject = UICamera.Raycast(UICamera.currentTouch.pos, ref UICamera.lastHit) ? UICamera.lastHit.collider.gameObject : UICamera.fallThrough;

    To this:

    UICamera.Raycast(UICamera.currentTouch.pos, out UICamera.lastHit);
     
    Last edited: Feb 5, 2014
  35. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    I added CombineTouchPoints flag to TapGesture. It's available in the inspector and turns off clustering.
    The code is available at github: https://github.com/InteractiveLab/TouchScript/tree/develop
    Compiled DLLs are in Examples folder.
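For reference, the flag should also be settable from code, something like this (property name taken from the description above; verify it in your version):

Code (csharp):

    // Turn clustering off so each tap keeps its own screen position.
    GetComponent<TapGesture>().CombineTouchPoints = false;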
     
  36. HavocX

    HavocX

    Joined:
    Jan 5, 2014
    Posts:
    40
As I understand it, TUIO input is fully supported with free Unity as long as I deploy to Windows or OS X? Pro is only needed for deployment to mobile platforms, right?

My hope is to use TouchScript with free Unity on OS X and a touch frame from PQLabs. Maybe you have even tried this combination?

(Sorry for the basic question, I'm pretty new to Unity.)
     
  37. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    Yes, Unity free supports .NET sockets. You'll be able to use TUIO.
    But you should try to compile an example to make sure that everything works.
     
  38. HavocX

    HavocX

    Joined:
    Jan 5, 2014
    Posts:
    40
    Thank you for the quick reply!
     
  39. Hectorous

    Hectorous

    Joined:
    Feb 4, 2014
    Posts:
    5
Hey there again, thank you for the responses so far. I am just wondering how to make multiple gestures/scripts work together. I have figured out how to make the scale and rotate gestures work together by adding each to the other's friendly gestures, but I am trying to use the Transformer script with, say, the scale gesture without having to release first. I cannot drag the Transformer script into friendly gestures, so I am stuck.

    Cheers!
     
  40. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
    Sorry, can you describe better what interaction type you want to achieve?
     
  41. Hectorous

    Hectorous

    Joined:
    Feb 4, 2014
    Posts:
    5
At this point in time I can rotate and scale an object without releasing. However, I would like to be able to scale the object and move it without releasing it. I cannot seem to make the two scripts friendly to each other, so I was wondering if there is any way to do this.

    Thanks
     
  42. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
This should work if the gestures are friendly. Are you sure that you are adding each one to the other's friendly list?
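If setting it up in the inspector is fiddly, the same thing can presumably be done in code along these lines (the method name is as I remember it; verify it against your version):

Code (csharp):

    var scale = GetComponent<ScaleGesture>();
    var pan = GetComponent<PanGesture>();
    // Register the gestures as friendly so both can recognize at the same time.
    scale.AddFriendlyGesture(pan);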
     
  43. Hectorous

    Hectorous

    Joined:
    Feb 4, 2014
    Posts:
    5
I am wondering if you can add the Transformer2D script to the Scale Gesture. Since Transformer2D isn't a gesture, is there a way to make the two interact with each other, or would I have to do something else?
     
  44. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
Transformer2D is not a gesture. It listens to other gestures on the object and moves/scales/rotates the object when those gestures are recognized.
Transformer2D works automatically, so you don't need to set it up or add it anywhere; just make sure it's on the same object as your gestures.
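So a minimal setup is just components stacked on one GameObject, e.g. in code (assuming the object already has a collider so touches hit it):

Code (csharp):

    var go = GameObject.CreatePrimitive(PrimitiveType.Cube);
    go.AddComponent<PanGesture>();    // movement
    go.AddComponent<ScaleGesture>();  // scaling
    go.AddComponent<Transformer2D>(); // listens to the gestures above and applies them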
     
  45. mateustavares

    mateustavares

    Joined:
    Feb 18, 2014
    Posts:
    4
Hi, I just recently started using this asset, and while it's great and has helped a lot, I've been having trouble getting an object to rotate using pan gestures. I need it to work just like in the example scene for Flick. I would use Flick, but I need the object to rotate in real time. Any ideas on how to get this to work?
     
  46. totsboy

    totsboy

    Joined:
    Jul 12, 2013
    Posts:
    253
Hello!
I'm having a problem with PanGesture. It's working fine on PC, but it does not work on the phone (Lumia 520).
The TouchScript object in my scene has Mouse Input, Mobile Input and Touch Manager.
I have also deleted all the DLLs except TouchScript.dll.
The TouchDebugger does detect the touch on the screen, but the object does not move at all. Am I missing something?

EDIT: Removing the Mouse Input fixes it! :D

Thanks!
     
    Last edited: Mar 7, 2014
  47. tmanallen

    tmanallen

    Joined:
    Nov 8, 2009
    Posts:
    395
A simple, quick question: I am using MetaGesture to handle swipes. I created a test scene and it worked perfectly, but when I put it in my game scene, which has moving parts and other background pieces with colliders on them, the swipe never gets seen by the MetaGesture. Is it something I am doing, or is this a known issue?


Thanks
     
  48. czuczr

    czuczr

    Joined:
    Jan 23, 2014
    Posts:
    7
Hi!
Mr. Valyard! It's a beautiful product! Thanks for making it free! ;-)

I am trying to use TouchScript with StageScaleMode.SHOW_ALL; is that possible?
I have a setup with a resolution of 7680x2160 which is divided into four columns horizontally but runs in only one Unity application. To test it on a normal computer with a lower resolution, I have to scale it down but keep the aspect ratio. In every column I would like to show the same swf running separately.

Problems arise with touch coordinates when I change the scale mode and move the swf from the horizontal center to the x coordinate of the column.

Please, could you help me with where to start?

Thanks
     
  49. valyard

    valyard

    Unity Technologies

    Joined:
    Jun 4, 2010
    Posts:
    291
Yeah, sorry for this. I'm trying to solve this problem in the next release.

I don't know. It might be a bug.
Generally MetaGesture is not a good way to implement gesture recognition.

So, you are using Scaleform with 4 flash movies playing?
I'm sure each flash movie thinks it's the only one and is running fullscreen.
You can either modify ScaleformLayer to divide the coordinates by 4, or send a number (1 to 4) to all your flash movies and add a CoordinatesRemapper to ScaleformInput on the Flash side where you would scale the coordinates.

Interesting configuration. What hardware do you use? How's the performance?
     
  50. czuczr

    czuczr

    Joined:
    Jan 23, 2014
    Posts:
    7
We haven't tested it yet, but maybe we should use a lower resolution... ;-(
By the way, I have now figured out how to position all four swfs and how to handle and remap touches correctly in ScaleformLayer, thanks for the keywords! :)

One more problem: I am not able to run multiple swfs at the same time. If there is more than one ScaleformLayer, TouchManager recognises it, but it is not visible on screen and the swf doesn't load. I used ScaleformLayer.cs from the TouchScript examples. As I see it, this class comes from the original SFCamera.cs but is modified and extended by TouchLayer.

What is the right direction to get it working? Thanks!