
Discussion: Everyone loves Frames Per Second

Discussion in 'Scripting' started by StarBornMoonBeam, Apr 15, 2023.

  1. StarBornMoonBeam

    Joined:
    Mar 26, 2023
    Posts:
    209
    Here

    I need you guys to tear this apart for me.

    Code (CSharp):
    FPS = 1 / Time.deltaTime;
    you can do one word answer if you want.

    Bad or Good?

    Consider we might average this over a number of frames.

    When I wrote it to get the FPS in FixedUpdate and Update, I was thinking, damn, this is awful.

    But haha ha

    Tell me what you think of using this to get frames per second, instead of, say, a timer.
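    For comparison, here's roughly what the timer-based alternative looks like (a minimal sketch, not code from the thread; the class name TimerFps is made up): count rendered frames and divide by the elapsed time once per interval, instead of sampling 1 / Time.deltaTime.

    Code (CSharp):
    using UnityEngine;

    // Sketch of the "timer" approach: count frames over a fixed window
    // and divide by the real elapsed time once per interval.
    public class TimerFps : MonoBehaviour
    {
        const float interval = 1.0f;   // seconds between readouts
        int frames;
        float elapsed;

        public float Fps { get; private set; }

        void Update()
        {
            frames++;
            elapsed += Time.unscaledDeltaTime;   // unaffected by Time.timeScale

            if (elapsed >= interval)
            {
                Fps = frames / elapsed;   // frames per second over the window
                frames = 0;
                elapsed = 0f;
            }
        }
    }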
     
  2. Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,144
    I asked GPT-4 (and GPT-3.5 which had the exact opposite response).

    [screenshot of GPT-4's response]

    Then I asked it what it thought would be a good script.

    Code (csharp):
    using System.Collections;
    using UnityEngine;
    using UnityEngine.UI;

    public class FPSCounter : MonoBehaviour
    {
        public Text fpsText;
        private int frameCount;
        private float fpsSum;
        private float lastUpdateTime;

        private const int frameRange = 60;
        private const float updateInterval = 1.0f;

        private void Start()
        {
            lastUpdateTime = Time.time;
        }

        private void Update()
        {
            frameCount++;
            fpsSum += 1 / Time.deltaTime;

            if (Time.time - lastUpdateTime >= updateInterval)
            {
                float averageFps = fpsSum / frameCount;
                fpsText.text = $"FPS: {averageFps:0.0}";
                frameCount = 0;
                fpsSum = 0;
                lastUpdateTime = Time.time;
            }
        }
    }
     
    StarBornMoonBeam likes this.
  3. StarBornMoonBeam

    Joined:
    Mar 26, 2023
    Posts:
    209
    Omg it totally could have figured out that time delta was a float.. Σ⁠(⁠ಠ⁠_⁠ಠ⁠)

    Second script is cool.

    I have some speculation about it after seeing that script. I used an average over frames.

    But I'm comparing a rolling average over a list of 60 values vs. just counting into an int each frame (a rolling-buffer sketch is below).

    Pros:
    • I don't have to wait 60 frames

    Cons:
    • you do more stuff on each frame


    ~ the cost of this frame
    ~ the cost of those frames
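    To make the "list of 60 values" side concrete, here's a minimal sketch (the name RollingFps is made up, not from the thread): a ring buffer of the most recent deltaTime samples, so a reading is available every frame without waiting 60 frames, at the cost of a little extra bookkeeping per frame.

    Code (CSharp):
    using UnityEngine;

    // Rolling average over the last 60 frames using a ring buffer of
    // deltaTime samples and a running sum.
    public class RollingFps : MonoBehaviour
    {
        const int SampleCount = 60;
        readonly float[] samples = new float[SampleCount];
        int index;
        int filled;     // how many slots hold real data so far
        float sum;      // running sum of the frame times in the buffer

        public float Fps { get; private set; }

        void Update()
        {
            float dt = Time.unscaledDeltaTime;
            sum -= samples[index];   // drop the oldest sample from the sum
            samples[index] = dt;
            sum += dt;
            index = (index + 1) % SampleCount;
            if (filled < SampleCount) filled++;

            // Frames in the window divided by the total time of the window.
            Fps = filled / sum;
        }
    }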
     
    Last edited: Apr 15, 2023
  4. StarBornMoonBeam

    Joined:
    Mar 26, 2023
    Posts:
    209
    Code (CSharp):
    int LIMIT;
    float AVERAGE;
    float FPS;
    void Update()
    {
        AVERAGE += (1 / Time.deltaTime);
        LIMIT += 1;
        FPS = AVERAGE / LIMIT;
    }

    Have you ever seen anything like it?

    Code (CSharp):
    int LIMIT;
    float AVERAGE;
    public float FPS;
    void Update()
    {
        Application.targetFrameRate = -1;
        AVERAGE += (1 / Time.deltaTime);
        LIMIT += 1;
        FPS = AVERAGE / LIMIT;

        if (LIMIT > 1000)
        {
            AVERAGE = FPS;
            LIMIT = 1;
        }
    }
     
    Last edited: Apr 15, 2023
  5. Bunny83

    Joined:
    Oct 18, 2010
    Posts:
    3,990
    Have I seen something like that? Yes. Is it good or bad?

    Well, those two code snippets you posted are both bad. You should think about why you actually want to average the fps count. The main reason is to get a value that isn't fluctuating every frame, which the framerate in reality does. Displaying a number that changes slightly 60 times per second (or even faster) makes it really hard to read. If the values are 60.2, 58.7, 61.4, 60.8, 59.8, ... all you will see is an overlapping mess of flickering numbers.

    Your first average is the total average over the whole runtime of your game. That means during the first frames you can get higher fluctuations, but the longer the game runs, the less the value would change. However, at that point you should ask yourself what the point of that value is. Say the game runs at 60 fps for 5 minutes. So we have accumulated 18000 frames (LIMIT is 18000) and your accumulated value would be about 1,080,000. When you divide those values you get a steady 60 fps. However, now imagine something changes in the game and the performance drops to just 20 fps for the next 60 seconds. That's another 1200 frames. During that time we accumulate an additional value of 24000. Let's just add this all together. So LIMIT would be 18000+1200 == 19200 and your accumulated value would be 1,104,000. Again, let's divide the two numbers and we get a value of 57.5. So even though the framerate was at a horrible low of 20 fps for over a minute, the display would still show 57.5 fps. During that last minute the value would have slowly moved from 60 to 57.5. So what would be the use of this value?

    Your second code snippet does a similar thing and suffers from a similar problem, but you do reset the counter in intervals. Though you don't gain much here. When you reset your counters, the first few frames the number would again fluctuate heavily, and the more values you accumulate, the more stable the value gets. However, when you hit your reset, the number would suddenly become unstable again. So you get a weird display pattern. Also, your reset is wrong. You set your LIMIT to 0 but your AVERAGE to the current framerate. LIMIT should reflect the number of values you have accumulated. The next frame you add the current framerate again and increase LIMIT to 1, but you have double the value in your AVERAGE now. So for the next 1000 frames the display would first show double the actual value and then quickly move towards the actual value. This happens in a cycle, so the readings make no sense at all and aren't stable to look at.
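    In other words, if you do keep the periodic reset, the sum and the sample count have to go back to zero together. A sketch of just the corrected reset, assuming the rest of the snippet stays the same (it still has the "unstable right after reset" problem described above, which is why the interval approach in the script below is nicer):

    Code (CSharp):
    AVERAGE += (1 / Time.deltaTime);
    LIMIT += 1;
    FPS = AVERAGE / LIMIT;

    if (LIMIT >= 1000)
    {
        // Reset the accumulator and the sample count together so the
        // next window starts from a clean average.
        AVERAGE = 0;
        LIMIT = 0;
    }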

    The script that @Ryiah posted serves several purposes:
    • Get an accurate and stable fps value.
    • Update the displayed value only in fixed intervals, so your eyes and brain actually have a chance of reading it.
    • The reduced update frequency also reduces the amount of garbage produced by the ToString conversion that is required in order to display the text.
    You may want to reduce the update interval in that script to get a more reactive value (say 0.2f, which would update the value 5 times per second, which is usually more than enough).

    The fact is that one over deltaTime is the most accurate FPS value you can get, as it represents the actual performance at the current moment. That means if this value stayed stable for the next second, we would get exactly that amount of frames displayed. Though the performance of your game can change from one moment to the next. The FPS value displayed should be an approximate indicator for the user of how well your game runs. As already discussed, you cannot display 60 values per second since it would be a fluctuating mess.

    Imagine you produce tons of garbage and your GC kicks in once every second, causing a huge spike where your fps drops to 10 for one frame, but the game otherwise runs fine at 60 fps. You will only see the 60, as this brief flash of 10 can't be noticed. When you average over one second, the average would still be around 59.09 fps since it's just 1 frame out of the 55 frames. So the display is not that helpful to catch single-frame glitches as they are averaged out.

    When you have an update interval of 0.2 you would get a stable 60 in 4 of those 5 updates, but the period that contains the glitch would drop to just around 52.9 fps. You may ask why. Well, we have one frame that takes 0.1s to complete because of the GC spike. Since this happens in one update interval which has a length of 0.2s, we only run "normal" at 60 fps for a timespan of 0.1s. So we render 6 frames plus the 1 with the GC spike. So in that update interval we get 7 frames, 6 with an fps value of 60 and 1 with an fps value of 10. So we get (360+10) / 7 == 52.9 fps.
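    If it helps to see that arithmetic spelled out, here is the same GC-spike example as a tiny calculation (the numbers are the hypothetical ones from the paragraph above, not measurements):

    Code (CSharp):
    // One GC-spike frame takes 0.1 s (10 fps); the rest of the 0.2 s update
    // window runs at 60 fps, i.e. 6 normal frames in the remaining 0.1 s.
    float spikeFps = 10f;
    float normalFps = 60f;
    int normalFrames = 6;

    float windowAverage = (normalFrames * normalFps + spikeFps) / (normalFrames + 1);
    // (360 + 10) / 7 ≈ 52.9 fps for the window that contains the spike.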

    Usually the best performance indicator would be a short graph (over the last few seconds) where you plot the actual fps or deltaTime value over time. It usually makes more sense to plot the deltaTime, as you would see a drop in performance as a peak in the graph. There are already solutions for that.
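    As a rough idea of what such a graph can look like in-engine, here's a minimal sketch (the name FrameTimeGraph is made up; this assumes a simple OnGUI overlay is acceptable, existing profiler/graph assets do this far more nicely):

    Code (CSharp):
    using UnityEngine;

    // Keep the last N frame times in a ring buffer and draw them as
    // vertical bars in OnGUI. Frame-time spikes show up as tall bars.
    public class FrameTimeGraph : MonoBehaviour
    {
        const int SampleCount = 300;          // a few seconds of history
        const float GraphHeight = 100f;       // pixels
        const float MaxFrameTime = 1f / 20f;  // 50 ms maps to the full bar height
        readonly float[] samples = new float[SampleCount];
        int index;

        void Update()
        {
            samples[index] = Time.unscaledDeltaTime;
            index = (index + 1) % SampleCount;
        }

        void OnGUI()
        {
            for (int i = 0; i < SampleCount; i++)
            {
                // Oldest sample on the left, newest on the right.
                float t = samples[(index + i) % SampleCount];
                float h = Mathf.Clamp01(t / MaxFrameTime) * GraphHeight;
                GUI.DrawTexture(new Rect(i * 2, GraphHeight - h, 2, h), Texture2D.whiteTexture);
            }
        }
    }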
     
    orionsyndrome likes this.
  6. StarBornMoonBeam

    Joined:
    Mar 26, 2023
    Posts:
    209
    I actually needed it for

    ETA

    Estimated time of arrival
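    The same smoothing idea applies to an ETA: estimate it from a rate averaged over time rather than the instantaneous per-frame rate, or the readout jumps around just like a raw 1 / deltaTime display. A hypothetical sketch (EtaEstimator, totalWork and completedWork are made-up names for illustration, not anything from the thread):

    Code (CSharp):
    using UnityEngine;

    // Estimate time remaining from a smoothed rate of progress.
    public class EtaEstimator : MonoBehaviour
    {
        public float totalWork = 100f;   // made-up units of work
        public float completedWork;      // updated elsewhere as work finishes

        float smoothedRate;              // work units per second, smoothed
        float previousCompleted;

        public float EtaSeconds { get; private set; }

        void Update()
        {
            float dt = Mathf.Max(Time.unscaledDeltaTime, 1e-6f);
            float instantRate = (completedWork - previousCompleted) / dt;
            previousCompleted = completedWork;

            // Exponential smoothing keeps the estimate readable, for the same
            // reason an FPS display averages instead of showing 1 / deltaTime raw.
            smoothedRate = Mathf.Lerp(smoothedRate, instantRate, 0.05f);

            EtaSeconds = smoothedRate > 0f
                ? (totalWork - completedWork) / smoothedRate
                : float.PositiveInfinity;
        }
    }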