UI Buttons + GetTouch

Discussion in 'UGUI & TextMesh Pro' started by Aenah, Oct 8, 2014.

  1. Aenah

    Joined: Apr 1, 2013
    Posts: 29
    Hello, everybody.

    I'm developing FPS controls for my mobile game. With the new GUI, I created buttons for moving my character controller forward, back, right and left. For them, I use an Event Trigger (PointerEnter event). This works perfectly.
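
    Roughly, the wiring for each button looks like this (a simplified sketch with hypothetical names; the real script handles all four directions): PointerEnter sets a flag, PointerExit clears it, and Update() moves the CharacterController while the flag is set.

    Code (CSharp):
    using UnityEngine;

    public class MoveBackButton : MonoBehaviour
    {
        public CharacterController cc;   // my character controller
        public float speed = 3f;

        private bool pressed;

        // Hooked to the button's Event Trigger (PointerEnter event).
        public void OnEnter() { pressed = true; }

        // Hooked to PointerExit so the character stops when the finger leaves the button.
        public void OnExit() { pressed = false; }

        void Update()
        {
            if (pressed)
                cc.Move(Vector3.back * Time.deltaTime * speed);
        }
    }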

    On the other hand, I have a script that rotates the GameObject's head (this object has the main camera attached). If I detect that the user is touching the right half of the screen, I rotate the head (and that rotates the camera). This works fine too.

    My problem is that both systems don't work at the same time. When I use one finger to move my character and then a second finger to rotate the camera, my character stops and the camera rotates (and vice versa).

    More information:

    Script for camera rotation (from wiki):

    Code (CSharp):
    void Update () {
        if (Input.touches.Length > 0 && Input.GetTouch(0).position.x > Screen.width / 2)
        {
            if (Input.touches[0].phase == TouchPhase.Moved)
            {
                Vector2 delta = Input.touches[0].deltaPosition;
                float rotationZ = delta.x * sensitivityX * Time.deltaTime;
                rotationZ = invertX ? rotationZ : rotationZ * -1;
                float rotationX = delta.y * sensitivityY * Time.deltaTime;
                rotationX = invertY ? rotationX : rotationX * -1;
                transform.localEulerAngles += new Vector3(rotationX, rotationZ, 0);
            }
        }
    }
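
    I suspect part of the problem is that this script always reads touch index 0, so as soon as a second finger is down it can end up reacting to the wrong touch. A rough, untested sketch of what I mean by checking every active touch instead (same sensitivityX, sensitivityY, invertX and invertY fields as above):

    Code (CSharp):
    void Update () {
        // Look at every active touch, not only index 0.
        for (int i = 0; i < Input.touchCount; i++)
        {
            Touch touch = Input.GetTouch(i);

            // Only touches on the right half of the screen rotate the camera;
            // the left half is reserved for the movement buttons.
            if (touch.position.x <= Screen.width / 2)
                continue;

            if (touch.phase == TouchPhase.Moved)
            {
                Vector2 delta = touch.deltaPosition;
                float rotationZ = delta.x * sensitivityX * Time.deltaTime;
                rotationZ = invertX ? rotationZ : rotationZ * -1;
                float rotationX = delta.y * sensitivityY * Time.deltaTime;
                rotationX = invertY ? rotationX : rotationX * -1;
                transform.localEulerAngles += new Vector3(rotationX, rotationZ, 0);
            }
        }
    }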
    Code to move my character controller (where cc is my character controller):

    Code (CSharp):
    cc.Move(Vector3.back * Time.deltaTime * speed);
    How can I make both systems work together?

    Thanks a lot.