
Find how long Unity waits each frame for Application.targetFrameRate via script

Discussion in 'Scripting' started by Fragsteel, Sep 12, 2019.

  1. Fragsteel

    Fragsteel

    Joined:
    Nov 8, 2016
    Posts:
    6
    I'm making a ground-truth simulator where I need to capture rendered images at a specific output frame rate. The app itself is quite performance-intensive but does not need to run in real time - for example, it's fine to take more than one second to capture a second of 60 Hz video.

    So I've built in an automatic throttling system. I set Application.targetFrameRate to the desired FPS (let's say 60 Hz again). But if we fall short of that - say we only manage 30 FPS - a script automatically reduces targetFrameRate and also Time.timeScale to compensate: in this case, targetFrameRate to 30 and Time.timeScale to 0.5. The result is that after two seconds of real time, I've recorded the same output as I would if my computer could handle the full 60 Hz and I did no throttling.
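
    For reference, the throttle-down step looks roughly like this (a simplified sketch - the class name, thresholds, and lack of smoothing are just for illustration):

    Code (CSharp):
    using UnityEngine;

    // Simplified sketch of the throttle-down logic.
    public class CaptureThrottle : MonoBehaviour
    {
        public int desiredFps = 60;   // the output rate I actually want to record at

        void Start()
        {
            QualitySettings.vSyncCount = 0;           // vSync would override targetFrameRate
            Application.targetFrameRate = desiredFps;
            Time.timeScale = 1f;
        }

        void Update()
        {
            // Measured rate over the last frame (unscaled, so timeScale doesn't distort it).
            float measuredFps = 1f / Mathf.Max(Time.unscaledDeltaTime, 1e-5f);

            // If we can't keep up, run slower but keep the same simulated step per frame:
            // e.g. 30 real FPS at timeScale 0.5 still advances the sim by 1/60 s per rendered frame.
            // (Raising the rate back up again is the open question below.)
            if (measuredFps < Application.targetFrameRate * 0.9f)
            {
                int newTarget = Mathf.Max(1, Mathf.FloorToInt(measuredFps));
                Application.targetFrameRate = newTarget;
                Time.timeScale = (float)newTarget / desiredFps;
            }
        }
    }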

    The challenge is doing the opposite - say we've throttled downward, but for whatever reason we could now raise targetFrameRate again and still maintain the desired output level. In the profiler it's easy to see when this is the case, because Unity enforces targetFrameRate with a step called "WaitForTargetFPS", which appears to simply wait until the frame has lasted long enough.

    But I need to get this value via a script, and I can't find any way to do so. I can access the Initialization.PlayerUpdateTime struct, which is apparently what calls the WaitForTargetFPS function (not sure how that works when it's a struct), but I can see no relevant options there for getting this information.
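
    Just to show what I mean by "no relevant options": you can walk the player loop and find that entry, but the PlayerLoopSystem struct only exposes the type and the update function/delegate fields, nothing timing-related. Rough sketch (depending on the Unity version the namespace may still be UnityEngine.Experimental.LowLevel):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.LowLevel;

    // Sketch: dump the player loop tree. Initialization.PlayerUpdateTime shows up as one
    // of the entries, but the PlayerLoopSystem struct carries no timing data to read.
    public class PlayerLoopInspector : MonoBehaviour
    {
        void Start()
        {
            Dump(PlayerLoop.GetDefaultPlayerLoop(), 0);
        }

        static void Dump(PlayerLoopSystem system, int depth)
        {
            if (system.type != null)
                Debug.Log(new string(' ', depth * 2) + system.type.Name);

            if (system.subSystemList == null)
                return;

            foreach (var child in system.subSystemList)
                Dump(child, depth + 1);
        }
    }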

    Any ideas on how to get this value? The only idea I have is to implement my own version of forcing each frame to the desired length, but without actually modifying Unity itself I'd have to estimate how long I wait each frame from the length of past frames, which is hardly accurate.
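
    Something along those lines would look like this - time my own work inside the frame and subtract it from the target interval. It's only a sketch, and it misses everything Unity does outside the measured window (the actual wait happens at the start of the next frame), which is exactly why I'd rather read the real WaitForTargetFPS number:

    Code (CSharp):
    using System.Collections;
    using System.Diagnostics;
    using UnityEngine;

    // Very rough estimate of the per-frame headroom that WaitForTargetFPS would absorb.
    // Only measures from the earliest Update to end of frame, so it ignores whatever
    // Unity does before/after that window (present, vsync, the wait itself).
    [DefaultExecutionOrder(-10000)]   // run this Update before everything else
    public class FrameWaitEstimator : MonoBehaviour
    {
        readonly Stopwatch stopwatch = new Stopwatch();
        public float estimatedWaitSeconds;

        void Update()
        {
            stopwatch.Restart();                  // roughly the start of this frame's work
            StartCoroutine(EndOfFrame());
        }

        IEnumerator EndOfFrame()
        {
            yield return new WaitForEndOfFrame(); // after rendering, before the next frame
            float busy = (float)stopwatch.Elapsed.TotalSeconds;
            float targetInterval = Application.targetFrameRate > 0
                ? 1f / Application.targetFrameRate
                : 0f;
            estimatedWaitSeconds = Mathf.Max(0f, targetInterval - busy);
        }
    }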

    Thanks, y'all!
     

    Attached Files:

  2. Antistone

    Antistone

    Joined:
    Feb 22, 2014
    Posts:
    2,836
    This isn't a direct answer to your question, but you might consider whether it would make more sense to restructure your application so that it doesn't care about game time at all, and instead let time "pass" based on the number of elapsed frames rather than the clock.

    In essence, find all the places in Update where you use Time.deltaTime and...stop using it.

    Then you can uncap the frame rate, Unity will generate new frames as fast as it can, and you can simply ignore how long that takes.
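
    On the script side that's basically just a constant (trivial sketch; the names and the 60 here are only placeholders):

    Code (CSharp):
    using UnityEngine;

    // Sketch: a fixed per-frame step used everywhere instead of Time.deltaTime.
    public static class SimClock
    {
        public const float Step = 1f / 60f;   // one "frame" of simulated time, regardless of real time
    }

    public class Mover : MonoBehaviour
    {
        public float speed = 2f;

        void Update()
        {
            // Was: transform.position += Vector3.forward * speed * Time.deltaTime;
            transform.position += Vector3.forward * speed * SimClock.Step;
        }
    }

    With the frame rate uncapped (Application.targetFrameRate = -1), each rendered frame then advances the simulation by exactly one step, however long the frame actually took to produce.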
     
  3. Fragsteel

    Fragsteel

    Joined:
    Nov 8, 2016
    Posts:
    6
    So essentially replace all occurrences of Time.deltaTime with a global fixed value based on the desired frame rate, like 1/60 (about 0.0167) if I want 60 Hz?

    That's not a bad idea, and in theory it should do exactly what I want. I suspect I'll run into issues with certain Unity internal systems that use Time.deltaTime and that I can't override, but there might be workarounds. Animations, for example, are normally driven by deltaTime or fixedDeltaTime, but the API gives you a pretty decent amount of control over them.
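
    For the animation case I'm picturing something like this - turn off the Animator's automatic update and step it by the same fixed amount each frame (rough sketch; I haven't verified how this plays with every update mode):

    Code (CSharp):
    using UnityEngine;

    // Sketch: advance an Animator by a fixed amount per rendered frame instead of real deltaTime.
    public class ManualAnimatorStep : MonoBehaviour
    {
        public Animator animator;
        const float Step = 1f / 60f;   // same fixed step as the rest of the sim

        void Awake()
        {
            animator.enabled = false;  // stop it from advancing on its own
        }

        void Update()
        {
            animator.Update(Step);     // evaluate the controller forward by Step seconds
        }
    }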

    Appreciate the idea. I'll wait a bit more to see if any direct solutions surface, but otherwise I'll try implementing that.