Need help stopping second touch from changing rotation

Discussion in 'Scripting' started by SiMULOiD, Apr 19, 2020.

  1. SiMULOiD

     Joined: Dec 23, 2013
     Posts: 126

    Hi all,

    I'm using the following script to orbit the main camera by rotating its parent object. It works well with both mouse and touch, but when I add a second touch while dragging, the camera rotation makes a strange jump/skip.
    I'd appreciate some suggestions on how to stop this second touch from changing the rotation.

    Code (CSharp):
    using UnityEngine;
    using System.Collections;

    namespace RuntimeSceneGizmo
    {
        public class CameraMovement : MonoBehaviour
        {
            #pragma warning disable 0649
            [SerializeField]
            private float sensitivity = 0.5f;
            #pragma warning restore 0649

            private Vector3 prevMousePos;
            private Transform mainCamParent;
            private bool myBool = false;

            void OnEnable()
            {
                sensitivity = 0.0f;
            }

            private void Awake()
            {
                mainCamParent = Camera.main.transform.parent;
            }

            void Update()
            {
                if (Input.touchCount == 1)
                    if( Input.GetMouseButtonDown( 0 ) )
                        prevMousePos = Input.mousePosition;
                    else if( Input.GetMouseButton( 0 ) )
                    {
                        Vector3 mousePos = Input.mousePosition;
                        Vector2 deltaPos = ( mousePos - prevMousePos ) * sensitivity;
                        sensitivity = 0.15f;

                        Vector3 rot = mainCamParent.localEulerAngles;
                        while( rot.x > 180f )
                            rot.x -= 360f;
                        while( rot.x < -180f )
                            rot.x += 360f;

                        rot.x = Mathf.Clamp( rot.x - deltaPos.y, -89.8f, 89.8f );
                        rot.y += deltaPos.x;
                        rot.z = 0f;

                        mainCamParent.localEulerAngles = rot;
                        prevMousePos = mousePos;
                    }
            }
        }
    }
     
  2. PraetorBlue

     Joined: Dec 13, 2012
     Posts: 7,909

    Input.mousePosition will return the average of all touches in a multi-touch scenario. So when you add a second finger, Input.mousePosition jumps to the position between your two fingers. You need to use Input.touchCount and Input.GetTouch(n) to get individual touches.

    You will probably also want to be aware of fingerId (https://docs.unity3d.com/ScriptReference/Touch-fingerId.html), as the touches returned from Input.GetTouch() are not guaranteed to be in the same order every frame.
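
    For example, something along these lines (an untested sketch, just to illustrate claiming one finger by its fingerId and ignoring any extra fingers; the class and field names are only placeholders):

    Code (CSharp):
    using UnityEngine;

    public class SingleTouchDrag : MonoBehaviour
    {
        private int activeFingerId = -1;   // the finger we are currently tracking, -1 = none
        private Vector2 prevTouchPos;

        void Update()
        {
            for (int i = 0; i < Input.touchCount; i++)
            {
                Touch touch = Input.GetTouch(i);

                // claim the first finger that goes down; any later fingers are ignored
                if (activeFingerId == -1 && touch.phase == TouchPhase.Began)
                {
                    activeFingerId = touch.fingerId;
                    prevTouchPos = touch.position;
                }
                else if (touch.fingerId == activeFingerId)
                {
                    if (touch.phase == TouchPhase.Moved)
                    {
                        Vector2 deltaPos = touch.position - prevTouchPos;
                        prevTouchPos = touch.position;
                        // ...apply deltaPos to the rotation here...
                    }
                    else if (touch.phase == TouchPhase.Ended || touch.phase == TouchPhase.Canceled)
                    {
                        activeFingerId = -1;   // our finger lifted, stop tracking it
                    }
                }
            }
        }
    }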
     
    Last edited: Apr 20, 2020
  3. Kurt-Dekker

     Joined: Mar 16, 2013
     Posts: 38,738

    What @PraetorBlue said above is right-on... I also recommend always using open/close braces to help you reason about your code flow, especially with nested if/else clauses.

    Furthermore, I would not rely on reading back .eulerAngles or .localEulerAngles: they are human-convenient values derived from the internal quaternion state, and they are subject to discontinuities.

    Instead, I would recommend keeping your own float(s) to track the angle, changing only those floats from your finger input, and then writing the result into transform.localRotation after building a Quaternion from them, probably with Quaternion.Euler() (this is safe and well-defined).
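
    Something like this, for example (an untested sketch meant to sit on the camera's parent object; the names are just placeholders, not code from your project):

    Code (CSharp):
    using UnityEngine;

    public class OrbitByAngles : MonoBehaviour
    {
        [SerializeField] private float sensitivity = 0.15f;

        private float pitch;   // our own copy of the X rotation
        private float yaw;     // our own copy of the Y rotation
        private Vector3 prevMousePos;

        void Update()
        {
            if (Input.GetMouseButtonDown(0))
            {
                prevMousePos = Input.mousePosition;
            }
            else if (Input.GetMouseButton(0))
            {
                Vector2 deltaPos = (Input.mousePosition - prevMousePos) * sensitivity;
                prevMousePos = Input.mousePosition;

                // only these floats ever accumulate the drag...
                pitch = Mathf.Clamp(pitch - deltaPos.y, -89.8f, 89.8f);
                yaw += deltaPos.x;

                // ...and the rotation is only ever written from them
                transform.localRotation = Quaternion.Euler(pitch, yaw, 0f);
            }
        }
    }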
     
  4. SiMULOiD

     Joined: Dec 23, 2013
     Posts: 126

    Thanks for the info and suggestions, guys. I didn’t realize Input.mousePosition worked the way you described and will use Input.touchCount and GetTouch moving forward.
    Also, thanks for the float suggestion, Kurt, I’ll give that a try too and appreciate your help.

    I'm now using:
    Code (CSharp):
    if (Input.touchCount == 1)
    {
        theTouch = Input.GetTouch(0);   // touches are zero-indexed, so the only touch is index 0

        if (theTouch.phase == TouchPhase.Began)
        {
            // ...
        }
        else if (theTouch.phase == TouchPhase.Stationary)
        {
            // ...
        }
    Is there a convenient way to keep using Input.mousePosition / GetMouseButton for testing in the editor, rather than using Unity Remote?
     
    Last edited: Apr 20, 2020
  5. Kurt-Dekker

     Joined: Mar 16, 2013
     Posts: 38,738

    In my proximity_buttons project I have something called a MicroTouch that you call to get all touches.

    If a mouse is present, it abstracts the mouse as just another MicroTouch, so you can write the same application code either way.

    Check out the Virtual Analog Button (VAButton) class, which uses the MicroTouch class, in proximity_buttons.
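
    The core idea is just to funnel real touches and the mouse into one small struct that your application code consumes. Roughly like this (a simplified sketch of the concept, not the actual MicroTouch code; all names here are made up):

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;

    // touch-like record the rest of the game reads, whether it came from a finger or the mouse
    public struct SimpleTouch
    {
        public int fingerId;
        public Vector2 position;
        public TouchPhase phase;
    }

    public static class UnifiedInput
    {
        public static List<SimpleTouch> GetTouches()
        {
            var result = new List<SimpleTouch>();

            // real touches pass straight through
            for (int i = 0; i < Input.touchCount; i++)
            {
                Touch t = Input.GetTouch(i);
                result.Add(new SimpleTouch { fingerId = t.fingerId, position = t.position, phase = t.phase });
            }

            // in the editor / on desktop, present the mouse button as one extra "touch"
            if (Input.touchCount == 0 && Input.mousePresent)
            {
                if (Input.GetMouseButtonDown(0))
                    result.Add(new SimpleTouch { fingerId = -1, position = Input.mousePosition, phase = TouchPhase.Began });
                else if (Input.GetMouseButton(0))
                    result.Add(new SimpleTouch { fingerId = -1, position = Input.mousePosition, phase = TouchPhase.Moved });
                else if (Input.GetMouseButtonUp(0))
                    result.Add(new SimpleTouch { fingerId = -1, position = Input.mousePosition, phase = TouchPhase.Ended });
            }

            return result;
        }
    }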

    proximity_buttons is presently hosted at these locations:

    https://bitbucket.org/kurtdekker/proximity_buttons

    https://github.com/kurtdekker/proximity_buttons

    https://gitlab.com/kurtdekker/proximity_buttons

    https://sourceforge.net/projects/proximity-buttons/
     
  6. SiMULOiD

     Joined: Dec 23, 2013
     Posts: 126

    This is excellent, Kurt, thanks for providing this solution and the links!