Question Understanding how fps and frametime are being measured in the DOTSSample

Discussion in 'Entity Component System' started by coffeecatcoding, Oct 6, 2020.

  1. coffeecatcoding

    Joined:
    Sep 28, 2020
    Posts:
    10
    Hello everyone,

    I want to implement an fps and frametime counter for my builds, and I know there are different tricks and implementations for dealing with these.

    The first thing I did was to launch the DOTSSample and try to display the fps by typing show.fps 1 into the console, but I didn't see anything. Then I went to check the code, and I don't really understand how it works.

    https://github.com/Unity-Technologies/DOTSSample/blob/master/Assets/Scripts/Game/Core/GameStatistics.cs

    If I understand correctly they use System.Diagnostics.Stopwatch to measure elapsed ticks and the number of ticks per second.

    Code (CSharp):
        void SnapTime()
        {
            long now = m_StopWatch.ElapsedTicks;
            long duration = now - m_LastFrameTicks;

            m_LastFrameTicks = now;

            float d = (float)duration / m_FrequencyMS;
            m_FrameDurationMS = m_FrameDurationMS * 0.9f + 0.1f * d;

            frameTimeData.SetValue(Time.frameCount, d);
        }
    This method seems to be calculating how many ticks have passed since it was last called and then dividing that duration by the frequency.
    What exactly is this frequency?
    And why are they weighting the frame duration like this: m_FrameDurationMS = m_FrameDurationMS * 0.9f + 0.1f * d; ?
    Is there a reason for this specific weighting?
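
    For context, here is my guess at what the frequency is, since m_FrequencyMS isn't shown in the snippet: System.Diagnostics.Stopwatch.Frequency is the number of stopwatch ticks per second on the current machine, so a "frequency in milliseconds" would be Frequency / 1000, and dividing elapsed ticks by it yields milliseconds. A minimal self-contained sketch of that conversion (my own illustration, not the sample's code):

```csharp
using System;
using System.Diagnostics;

static class FrameTimer
{
    // Stopwatch.Frequency is the number of ticks per second on this machine,
    // so ticks * 1000 / Frequency gives elapsed milliseconds.
    public static double TicksToMs(long ticks, long ticksPerSecond) =>
        ticks * 1000.0 / ticksPerSecond;

    static void Main()
    {
        var sw = Stopwatch.StartNew();
        // ... one frame's worth of work would go here ...
        long elapsed = sw.ElapsedTicks;
        Console.WriteLine($"{TicksToMs(elapsed, Stopwatch.Frequency):F3} ms");
    }
}
```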

    And finally there is the DrawFPS function, which I guess calculates the min, avg and max frametimes and displays everything.
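
    If it helps anyone else, one common way to get min/avg/max for display (a generic sketch, not the sample's actual DrawFPS) is to keep a small ring buffer of recent frame times and scan it whenever the overlay redraws:

```csharp
using System;
using System.Linq;

// Ring buffer of recent frame times; order doesn't matter for min/avg/max.
class FrameStats
{
    readonly float[] buffer;
    int count, next;

    public FrameStats(int capacity) => buffer = new float[capacity];

    public void Add(float frameMs)
    {
        buffer[next] = frameMs;
        next = (next + 1) % buffer.Length;
        if (count < buffer.Length) count++;
    }

    public (float min, float avg, float max) Stats()
    {
        // Before the buffer wraps, only the first `count` slots are valid.
        var window = buffer.Take(count).ToArray();
        return (window.Min(), window.Average(), window.Max());
    }
}
```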

    I'd be thankful for any reply that helps me implement a decent fps and frametime counter.
     
  2. DreamingImLatios

    Joined:
    Jun 3, 2017
    Posts:
    3,986
    I can't speak for the sample, but I have my own graphical profiler with publicly available source (admittedly it is a bit messy, as I sort of hacked it together): https://github.com/Dreaming381/lsss-wip/blob/master/Assets/_Code/SubSystems/Tools/FrameProfilers.cs
    Lines 114 and 115 are how I calculate a time span.

    I mostly wrote this profiler to get a rough idea of performance in builds and to figure out whether I was CPU- or GPU-bound. If I am GPU-bound, there's a particular section of the game loop (it differs between the editor and builds) where the CPU blocks until the GPU finishes.

    If you have any questions, I will be happy to answer.
     
  3. coffeecatcoding

    Joined:
    Sep 28, 2020
    Posts:
    10
    @DreamingImLatios
    Hey thanks for the quick reply,
    I've looked over your code quickly, and it seems like you are calculating min, max and avg CPU and GPU times and drawing bar charts. Have you considered using avg, 1% low and 0.1% low as metrics?
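
    For reference, 1% low is usually computed as the average of the worst 1% of frame times over a recorded window, reported as an fps figure, which (unlike an exponential average) does require storing the samples. A rough sketch of how I'd compute it, assuming frame times are collected in milliseconds:

```csharp
using System;
using System.Linq;

static class PercentileLows
{
    // Average of the worst `fraction` of frame times (e.g. 0.01f for "1% low",
    // 0.001f for "0.1% low"), returned as an fps figure.
    public static float LowFps(float[] frameTimesMs, float fraction)
    {
        int n = Math.Max(1, (int)(frameTimesMs.Length * fraction));
        float worstAvgMs = frameTimesMs
            .OrderByDescending(t => t) // slowest frames first
            .Take(n)
            .Average();
        return 1000f / worstAvgMs;
    }
}
```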

    Are BeginGpuWaitProfilingSystem and EndGpuWaitProfilingSystem responsible for the CPU blocking you mentioned?

    Your calculation for the time span seems similar to the DOTSSample's, so I suppose this is the way to go.
     
  4. DreamingImLatios

    Joined:
    Jun 3, 2017
    Posts:
    3,986
    I haven't, because I didn't need it for what I wanted to know. Spikes are, for the most part, measurable independently of the system the game is running on, so I could get those stats by profiling on my own system with Unity's advanced profiling tools.

    Yes. I subtract GPU blocking time from the total time to get CPU time.
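
    The bookkeeping amounts to something like this (a simplified sketch, not my profiler's exact code; the Begin/End names just mirror the systems mentioned above): time the blocking section with its own stopwatch and subtract it from the whole frame.

```csharp
using System.Diagnostics;

class GpuWaitTracker
{
    readonly Stopwatch frame = new Stopwatch();
    readonly Stopwatch gpuWait = new Stopwatch();

    public void BeginFrame() { frame.Restart(); gpuWait.Reset(); }

    // Call these around the section of the game loop where the CPU blocks on the GPU.
    public void BeginGpuWait() => gpuWait.Start();
    public void EndGpuWait() => gpuWait.Stop();

    public (float cpuMs, float gpuWaitMs) EndFrame()
    {
        frame.Stop();
        float totalMs = (float)frame.Elapsed.TotalMilliseconds;
        float waitMs = (float)gpuWait.Elapsed.TotalMilliseconds;
        return (totalMs - waitMs, waitMs); // CPU time = total minus GPU block
    }
}
```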
     
    coffeecatcoding likes this.
  5. burningmime

    Joined:
    Jan 25, 2014
    Posts:
    845
    Looks like it converts it from ticks to milliseconds.

    The weighting is just an exponential falloff. It smooths out bumps and gives something close to a "moving average" without needing to store all the samples.
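
    To illustrate in isolation (my own demo, not the sample's code): each new sample d is blended in with weight 0.1, so a sample's influence decays geometrically, by a factor of 0.9 per subsequent frame. A single spike barely moves the smoothed value, then fades back out:

```csharp
using System;

class EmaDemo
{
    // Exponential moving average: new sample weighted 0.1, history weighted 0.9,
    // same as the sample's m_FrameDurationMS update.
    public static float Smooth(float current, float sample) =>
        current * 0.9f + 0.1f * sample;

    static void Main()
    {
        float smoothed = 16f; // start at a typical 60 fps frame time (ms)
        float[] samples = { 16f, 16f, 33f, 16f, 16f }; // one spiky frame

        foreach (float d in samples)
        {
            smoothed = Smooth(smoothed, d);
            Console.WriteLine($"sample={d,5:F1} ms  smoothed={smoothed:F2} ms");
        }
        // The 33 ms spike only nudges the smoothed value up by ~1.7 ms
        // instead of jumping to 33, then it decays back toward 16.
    }
}
```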
     
    coffeecatcoding likes this.