
[Unity 3.3.0] Why does a touch trigger Input.GetMouseButton<Up/Down>

Discussion in 'Scripting' started by Semuserable, Feb 17, 2012.

  1. Semuserable

    Joined:
    Jun 4, 2011
    Posts:
    8
    I thought that each "Input" check had to match its device (Input.GetMouseButtonDown() for the mouse, Input.GetTouch() for mobile devices), but it seems that's not exactly true.

    So the following line of code
    Code (csharp):
    if (Input.GetMouseButtonDown(0)) {}
    is the same as
    Code (csharp):
    Touch touch = Input.GetTouch(0);
    if (touch.phase == TouchPhase.Began) {}
    This behaviour caused me real problems (I needed both checks to run simultaneously so I could test the game on both a computer and a device, but I didn't know that a touch would trigger both of them). After a lot of wasted time I eventually found a solution: to detect an actual touch (rather than a mouse press), I have to add an extra check
    Code (csharp):
    Touch touch = Input.GetTouch(0);
    if (touch.phase == TouchPhase.Began || touch.phase == TouchPhase.Stationary) {}
    Otherwise Unity triggers both (yes, I could split them into separate methods and use preprocessor directives or the Conditional attribute, but that's not the point; a sketch of such a split is below).
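
    For what it's worth, a minimal sketch of that preprocessor split could look like the following (HandlePress is a hypothetical helper of mine, and the exact platform defines depend on your Unity version):
    Code (csharp):
    void Update() {
    #if UNITY_IPHONE || UNITY_ANDROID
        // On a device, read touches explicitly
        if (Input.touchCount > 0) {
            Touch touch = Input.GetTouch(0);
            if (touch.phase == TouchPhase.Began) {
                HandlePress(touch.position);
            }
        }
    #else
        // In the editor or a standalone build, read the mouse
        // (Vector3 mousePosition converts implicitly to Vector2)
        if (Input.GetMouseButtonDown(0)) {
            HandlePress(Input.mousePosition);
        }
    #endif
    }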

    The main question is the one in the topic title. The more minor ones are: did the Unity developers make this behaviour intentional? Is it a bug?
     
    Last edited: Feb 17, 2012
  2. Eric5h5

    Volunteer Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,401
    Yes, it's intentional, so you can use simple interface code on touchscreens without having to rewrite it.
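
    For instance, a single check like this runs unchanged on desktop and on a touchscreen (a minimal sketch; while the simulation is active, Input.mousePosition tracks the touch position):
    Code (csharp):
    void Update() {
        // Fires for a real left click on desktop AND for a
        // simulated click generated from the first touch
        if (Input.GetMouseButtonDown(0)) {
            Debug.Log("Pressed at " + Input.mousePosition);
        }
    }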

    --Eric
     
  3. Semuserable

    Joined:
    Jun 4, 2011
    Posts:
    8
    Much appreciated, Eric.
     
  4. PixelLifetime

    Joined:
    Mar 30, 2017
    Posts:
    90
    Personally, I think this behaviour should be removed. I want to test with UnityRemote, and it won't let me: it triggers my mouse input at the same time as the touch input, even though the mouse input is only there for testing and sits inside platform-dependent compilation. Just change things to behave the way they should.

    The simulation also seems unnecessary: it's not that hard to write the touch part of the code yourself, and the touch API gives you much more control - fingerId, phases, position, pressure, radius, and so on.

    Why would anyone even use it (Input.Get..) for a mobile game? By using it, you are only making things more complicated for yourself.

    It's good that you care about these things, though; it would be nice if you created a separate method for this kind of behavior - something like GetTouchDown(int fingerId); (see the sketch below).
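
    For what it's worth, a rough sketch of what such a helper could look like (this GetTouchDown is my own illustration, not an existing Unity API):
    Code (csharp):
    static bool GetTouchDown(int fingerId) {
        for (int i = 0; i < Input.touchCount; i++) {
            Touch t = Input.GetTouch(i);
            // True only on the frame the given finger first touches down
            if (t.fingerId == fingerId && t.phase == TouchPhase.Began) {
                return true;
            }
        }
        return false;
    }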
     
  5. SoftwareGeezers

    Joined:
    Jun 22, 2013
    Posts:
    902
    This behaviour craps on touch screens. I just encountered a problem on Windows touchscreens where Input.GetMouseButton(1) is true when a second finger presses. This is deliberate behaviour?? For simplicity in creating UIs, you could just add an OR statement, "if touch || mouse" - not hard. Having one call that merges the two different inputs makes it impossible to filter them, no?

    Edit: there's a flag you can set.

    "Input.simulateMouseWithTouches = false" disables mouse simulation on touch screens. Then touch and mouse work as two discrete inputs, as you'd expect (a minimal sketch below).

    https://forum.unity.com/threads/second-touch-counts-as-mouse-press-bug.497310/#post-3232577
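
    A minimal sketch of that setup (assuming you set the flag once at startup):
    Code (csharp):
    void Awake() {
        // Stop touches from also driving the mouse API
        Input.simulateMouseWithTouches = false;
    }

    void Update() {
        // Now this is only ever a real right mouse button, never a second finger
        if (Input.GetMouseButton(1)) { /* mouse-only logic */ }

        // Touches are handled explicitly and separately
        for (int i = 0; i < Input.touchCount; i++) {
            Touch t = Input.GetTouch(i);
            // ... touch-only logic ...
        }
    }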
     
    Last edited: Sep 25, 2017