
Unity remote and touch phase ended

Discussion in 'iOS and tvOS' started by Daladier, Jun 22, 2011.

  1. Daladier

    Daladier

    Joined:
    Jun 18, 2011
    Posts:
    49
    Hi,
    It looks like Unity Remote ignores the touch phase Ended. Or am I wrong?
     
  2. svenskefan

    svenskefan

    Joined:
    Nov 26, 2008
    Posts:
    282
    Unity remote is not very accurate when it comes to detecting input in general.
    You can get improved results by turning off "show image" at the bottom of the screen in the remote start screen.
     
  3. MikaMobile

    MikaMobile

    Joined:
    Jan 29, 2009
    Posts:
    814
    Honestly, even builds running on the actual device will ignore "touch phase ended" pretty often. In my last game I was forced to write my own touch phase tracking system to have something 100% reliable.
     
  4. Daladier

    Daladier

    Joined:
    Jun 18, 2011
    Posts:
    49
    Thanks.

    Now I've noticed it too. I'm making simple buttons for a racing game. Is there a different way to detect the Ended phase?
     
  5. MikaMobile

    MikaMobile

    Joined:
    Jan 29, 2009
    Posts:
    814
    Keep track of a boolean for determining the state of your button. If input is detected on the button, flip said variable to "true". If Input is NOT detected on your button, but the variable is still true, then you know the finger has been released. So flip it back to false and do whatever phase.ended stuff you would have done.

    The same idea can be expanded to handle all kinds of states. In Battleheart, I used an array of a custom "touchstate" class that kept track of the starting position, whether the touch was tapped or dragged, and which fingerID it belonged to, etc. By linking your input to bools or ints that you're changing in an Update() loop, you ensure that nothing is ever "skipped" because of a framerate hiccup or something. If there's any lag or hiccups, your code won't break because it's in sync with the refresh of the screen.
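    The tracking scheme described above can be sketched independently of Unity's API. Here is a minimal, framework-free Python model (the TouchState class, the on_ended callback, and all field names are illustrative, not Unity's): a per-finger dictionary is compared against what was tracked last frame, so a release is inferred even if an explicit Ended phase is never delivered.

    ```python
    # Minimal sketch of per-finger touch-state tracking (framework-free).
    # Names like TouchState and on_ended are illustrative, not Unity API.

    class TouchState:
        def __init__(self, finger_id, start_pos):
            self.finger_id = finger_id
            self.start_pos = start_pos
            self.dragged = False

    class TouchTracker:
        def __init__(self, on_ended):
            self.active = {}          # finger_id -> TouchState
            self.on_ended = on_ended  # callback fired on inferred release

        def update(self, current_touches):
            """Call once per frame with {finger_id: position}."""
            # Register new fingers; mark moved ones as dragged.
            for fid, pos in current_touches.items():
                if fid not in self.active:
                    self.active[fid] = TouchState(fid, pos)
                elif pos != self.active[fid].start_pos:
                    self.active[fid].dragged = True
            # Fingers tracked last frame but absent now: treat as Ended,
            # even if the platform never reported an explicit Ended phase.
            for fid in list(self.active):
                if fid not in current_touches:
                    self.on_ended(self.active.pop(fid))

    ended = []
    tracker = TouchTracker(on_ended=lambda st: ended.append(st.finger_id))
    tracker.update({0: (10, 10)})   # finger 0 down
    tracker.update({0: (12, 10)})   # finger 0 dragged
    tracker.update({})              # finger 0 gone -> inferred release
    print(ended)                    # -> [0]
    ```

    Because the comparison happens every frame in an update loop, a dropped Ended event can't leave a button stuck in the pressed state.
    
    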
     
  6. belias

    belias

    Joined:
    Nov 10, 2011
    Posts:
    35
    This doesn't occur for me in builds, only in Unity Remote.
     
  7. MasterVamp

    MasterVamp

    Joined:
    May 18, 2011
    Posts:
    1
    Last edited: May 26, 2012
  8. ichini

    ichini

    Joined:
    Oct 13, 2012
    Posts:
    23
    I had the same problem until I realized my input code was placed in FixedUpdate instead of Update by mistake.

    Placing the code in Update seems to have cured the problem of dropped TouchPhase.Ended events (so far).
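    The reason this matters: touch phases are refreshed once per rendered frame, while FixedUpdate runs on a fixed timestep and may execute zero times (or several times) during a single frame. A phase like Ended that is only present for one rendered frame can therefore be missed entirely when polled from FixedUpdate. A rough Python simulation of that timing (the timestep and frame times are illustrative):

    ```python
    # Rough simulation of a fixed-timestep loop: during a short rendered
    # frame the fixed step may run zero times. If a touch phase (like
    # Ended) is only visible for one rendered frame, polling it from the
    # fixed step can miss it. Numbers are illustrative.

    FIXED_DT = 0.02                       # 50 Hz fixed step
    frame_dts = [0.016, 0.016, 0.016]     # ~60 fps rendering

    accumulator = 0.0
    fixed_steps_per_frame = []
    for dt in frame_dts:
        accumulator += dt
        steps = 0
        while accumulator >= FIXED_DT:
            accumulator -= FIXED_DT
            steps += 1
        fixed_steps_per_frame.append(steps)

    print(fixed_steps_per_frame)  # -> [0, 1, 1]: frame 1 ran no fixed step
    ```

    Any input that exists only during that first frame would never be seen by code running in the fixed step, which matches the dropped-Ended symptom described above.
    
    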
     
  9. moonjump

    moonjump

    Joined:
    Apr 15, 2010
    Posts:
    2,112
    Yes, FixedUpdate isn't the place for testing touch events.

    I find that Unity is reliable for touch events when multiTouchEnabled is set to false, but unreliable when set to true.

    Now I have a var that detects if there was a touch. As it is updated last, I can check the current touch state against the var. Then I know if touches have started or ended.
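    That "updated last" scheme can be sketched as a single-touch state machine (plain Python, hypothetical names): because the flag is written at the end of the frame, it still holds the previous frame's state when compared, so began/ended transitions fall out of the comparison.

    ```python
    # Single-touch version of the "compare against last frame" idea.
    # was_touching is updated LAST each frame, so during the frame it
    # still holds the previous frame's state.

    def run_frames(touch_samples):
        """touch_samples: per-frame booleans (finger down this frame?).
        Returns the derived transition for each frame."""
        events = []
        was_touching = False
        for is_touching in touch_samples:
            if is_touching and not was_touching:
                events.append("began")
            elif not is_touching and was_touching:
                events.append("ended")
            else:
                events.append("none")
            was_touching = is_touching  # updated last, as in the post
        return events

    print(run_frames([False, True, True, False]))
    # -> ['none', 'began', 'none', 'ended']
    ```
    
    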
     
  10. SteveB

    SteveB

    Joined:
    Jan 17, 2009
    Posts:
    1,418
    A little thread-necro here but I suspect that someone (such as myself literally just now) will have this exact same issue, and I have a bit more to add.

    I too have been quite successful with TouchPhase.Ended, having no problem with the event being recognized...until today. Scratched my head, searched Google and ultimately ended up here.

    I was about to fire off some replies for some help when I decided to do a quick check of a boolean I already had in place as per Mika's advice. Sure enough it was reporting just fine, despite my calling of a function located in another script misbehaving.

    In other words a quick Debug.Log(<boolean>) in the Ended phase reported back exactly what it should, yet the function I was firing off was still malfunctioning. A quick tap and it didn't work, but touching the screen for a bit and sliding my finger around worked perfectly fine every time.

    It turned out that I had a while/yield nestled within an "if (busyDoing) return" check that was still "busy doing" when TouchPhase.Ended fired: my code did recognize that I was done touching the screen, but the coroutine was still running, ignored that fact, and screwed everything up.

    I nuked the busyDoing check and voila, everything is good again.

    Obviously your mileage will vary, and this particular problem may not be yours, but double checking the events that occur or are supposed to occur after the TouchPhase.Ended could save you some headaches.

    Cheers
     
    Last edited: Apr 21, 2013