
Player look - inconsistent sensitivity

Discussion in 'Scripting' started by Etwus, Jun 6, 2019.

  1. Etwus

    Joined:
    Dec 12, 2016
    Posts:
    18
    I am calling the following method from Update().

    Code (CSharp):
    private void HandlePlayerLook()
    {
        transform.Rotate(0, Input.GetAxis("Mouse X") * lookSensitivity
            * Time.deltaTime, 0);
        playerCamera.transform.Rotate(-Input.GetAxis("Mouse Y") * lookSensitivity
            * Time.deltaTime, 0, 0);
    }
    When I change the vertical synchronisation setting (a 1000+ fps difference), the look sensitivity changes too. It becomes more sensitive (my testing showed roughly four times more sensitive) with sync turned on.

    Why does this happen?
     
  2. palex-nx

    Joined:
    Jul 23, 2018
    Posts:
    1,745
    It should not. Are you sure you're calling HandlePlayerLook only once per frame?
     
  3. Etwus

    Joined:
    Dec 12, 2016
    Posts:
    18
    Yes, I am sure.

    Edit: The strange thing is that as the framerate increases, the sensitivity decreases.

    Edit 2: It is somehow caused by the Input.GetAxis method. When I replaced it with a constant, the rotation ran at a constant speed even when the framerate changed; see the sketch below.
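
    For reference, the diagnostic looked roughly like this (a sketch, with the constant 1f standing in for the Mouse X axis value):

    Code (CSharp):
    private void HandlePlayerLook()
    {
        // With a constant in place of the axis value, the Time.deltaTime
        // scaling behaves as expected: the rotation speed is the same
        // at any framerate.
        transform.Rotate(0, 1f * lookSensitivity * Time.deltaTime, 0);
    }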
     
    Last edited: Jun 6, 2019
  4. Etwus

    Joined:
    Dec 12, 2016
    Posts:
    18
    Got it! For mouse axes, Input.GetAxis already returns the movement delta accumulated over the last frame, so it is framerate-independent by itself. Multiplying it by Time.deltaTime compensates twice and ties the sensitivity to the framerate.
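
    A minimal sketch of the fix, assuming the same lookSensitivity and playerCamera fields as above: drop Time.deltaTime from the mouse rotation.

    Code (CSharp):
    private void HandlePlayerLook()
    {
        // Mouse X/Y already report the per-frame movement delta,
        // so no Time.deltaTime scaling is needed here.
        transform.Rotate(0, Input.GetAxis("Mouse X") * lookSensitivity, 0);
        playerCamera.transform.Rotate(-Input.GetAxis("Mouse Y") * lookSensitivity, 0, 0);
    }

    Note that keyboard axes such as "Horizontal" behave as rates (values in -1..1), so those still need Time.deltaTime for framerate-independent movement.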
     
  5. Banaaani

    Joined:
    Feb 10, 2021
    Posts:
    2
    Thank you so much! I spent so much time trying to figure out why I was having this problem. The camera movement finally works flawlessly.