Is Time.deltaTime accurate?

Discussion in 'Editor & General Support' started by unfungmz, Aug 19, 2018.

  1. unfungmz

     Joined: Jul 29, 2017
     Posts: 162
    I am having the kind of performance issues that make me want to rip my hair out. On PC it's fine, but when I build for the Xbox One I see some strange performance issues. I believe this is because the XB1's CPU is so much slower than my 8700, so I just don't see those performance hits on PC.

    So let me try to explain what I see. I am building a horizontal shmup. Levels with less scenery run fast, around 120 fps. One scene in particular runs at 80 fps; it has a lot more GameObjects (scenery), and it's the scene where I notice the lag spikes. The lower frame rate is understandable, but what I don't understand is why the spikes happen. All of the scenery is loaded and I just pan the camera across the scene; I'm not doing LOD or any kind of component swapping. For the most part it runs smoothly, but every once in a while I hit brutal frame drops where I go from about 80 fps to around 15-20 fps, then after a few seconds it goes back up.

    It also doesn't happen every single time. Sometimes it happens twice in a scene, sometimes not at all. It does tend to happen close to the same points in the scene, but exactly when it happens and for how long isn't precise.

    I have an FPS script running, but it shows a steady 80 fps, which I know to be untrue. So I wrote another script that collects a list of all the frames that took longer than 16.7 milliseconds, i.e. any frame slower than the 60 fps budget. The problem is that it's not really showing anything: even when I know some of these frames are taking 30-40 ms, my script doesn't show them. So I'm wondering whether Time.deltaTime is even accurate. Here's my script:

    using UnityEngine;
    using UnityEngine.UI;

    public class FrameSpikeLogger : MonoBehaviour
    {
        float TotalTimePassed = 0.0f;
        Text fpsText;
        string TicksCollected;

        private void Start()
        {
            // Cache the Text component instead of calling GetComponent every frame.
            fpsText = GameObject.Find("fps").GetComponent<Text>();
            TicksCollected = "";
        }

        void Update()
        {
            float TimeSinceLastFrame = Time.deltaTime * 1000f; // frame time in ms
            TotalTimePassed += TimeSinceLastFrame;

            // Record any frame slower than the 60 fps budget (16.7 ms).
            if (TimeSinceLastFrame > 16.7f)
            {
                TicksCollected += TimeSinceLastFrame + " / ";
            }

            // Every 5 seconds, display the collected slow frames and start over.
            if (TotalTimePassed >= 5000f)
            {
                TotalTimePassed = 0.0f;
                fpsText.text = TicksCollected;
                TicksCollected = ""; // reset so old entries don't accumulate
            }
        }
    }
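
    One way to check whether Time.deltaTime itself is the problem is to time each frame with a clock that Unity doesn't manage, such as System.Diagnostics.Stopwatch, and log both values side by side. Unity can clamp Time.deltaTime (see Time.maximumDeltaTime) and has at times smoothed it on some platforms, so a wall-clock cross-check shows whether the 30-40 ms frames are real but unreported. A minimal sketch, assuming the Text reference is wired up in the Inspector (the class and field names here are mine):

    ```csharp
    using System.Diagnostics;
    using UnityEngine;
    using UnityEngine.UI;

    public class StopwatchFrameTimer : MonoBehaviour
    {
        Stopwatch frameWatch = new Stopwatch();
        public Text outputText;   // assumed: assigned in the Inspector
        string slowFrames = "";

        void Start()
        {
            frameWatch.Start();
        }

        void Update()
        {
            // Elapsed time since the previous Update, measured with a real clock.
            float ms = (float)frameWatch.Elapsed.TotalMilliseconds;
            frameWatch.Reset();
            frameWatch.Start();

            // Unity's reported delta for the same frame, for comparison.
            float unityMs = Time.deltaTime * 1000f;

            if (ms > 16.7f)
            {
                slowFrames += string.Format("{0:F1} (Unity: {1:F1}) / ", ms, unityMs);
                outputText.text = slowFrames;
            }
        }
    }
    ```

    If the Stopwatch column shows the 30-40 ms spikes while the Unity column stays near 12.5 ms, the frames really are slow and it's the reported delta that's off; if both columns agree, the spikes are coming from somewhere the script doesn't run, such as vsync waits or the render thread.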
     
  2. karl_jones

     Unity Technologies

     Joined: May 5, 2015
     Posts: 5,020