I'm building a ground-truth simulator that needs to capture rendered images at a specific output frame rate. The app is performance-intensive, but it doesn't need to run in real time; for example, it's fine if capturing one second of 60 Hz video takes more than one second of wall-clock time. So I've built an automatic throttling system. I set Application.targetFrameRate to the desired FPS (say 60 again). If we fall short of that, say we only manage 30 FPS, a script automatically reduces targetFrameRate and scales Time.timeScale down to compensate; in this case, targetFrameRate to 30 and Time.timeScale to 0.5. The result is that after two seconds of wall-clock time, I've recorded exactly the same output as I would have if my computer could handle the full 60 Hz and I did no throttling at all.

The challenge is doing the opposite: say we've throttled downward, but for whatever reason we could now raise targetFrameRate again and still maintain the desired output level. In the profiler it's easy to see when this is the case, because Unity enforces targetFrameRate through a step called "WaitForTargetFPS", which appears to simply idle until the frame has lasted long enough. But I need to get that wait time from a script, and I can't find any way to do so. I can access the Initialization.PlayerUpdateTime struct, which is apparently what calls the WaitForTargetFPS function (I'm not sure how that works when it's a struct), but I can see no relevant members there for getting this information.

Any ideas on how to get this value? The only idea I have is to implement my own version of forcing each frame to the desired length, but without actually modifying Unity itself, I'd have to estimate the time to wait each frame based on the lengths of past frames, which is hardly accurate. Thanks, y'all!
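For reference, the downward-throttling side I describe looks roughly like this. This is only a sketch; the class and field names are my own, and the real script would want more smoothing and hysteresis before deciding to throttle:

```csharp
using UnityEngine;

// Hypothetical sketch of the throttle-down logic described above.
// If the measured frame rate falls well below the current target, halve
// targetFrameRate and scale Time.timeScale by the same factor, so the
// recorded output is identical to an unthrottled run, just slower to produce.
public class CaptureThrottle : MonoBehaviour
{
    public int desiredFps = 60;   // the output rate we want to record
    float smoothedDelta;

    void Start()
    {
        Application.targetFrameRate = desiredFps;
        Time.timeScale = 1f;
        smoothedDelta = 1f / desiredFps;
    }

    void Update()
    {
        // Smooth the unscaled frame time so a single hitch doesn't throttle us.
        smoothedDelta = Mathf.Lerp(smoothedDelta, Time.unscaledDeltaTime, 0.05f);
        float measuredFps = 1f / smoothedDelta;

        // Falling clearly short of the target? Halve it and compensate,
        // e.g. 60 -> 30 with timeScale 0.5 still records the same frames.
        if (measuredFps < Application.targetFrameRate * 0.9f &&
            Application.targetFrameRate > desiredFps / 8)
        {
            Application.targetFrameRate /= 2;
            Time.timeScale = (float)Application.targetFrameRate / desiredFps;
        }
    }
}
```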
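And here's roughly what the do-it-yourself fallback I mention would look like. Again a sketch under my own naming, and it illustrates the exact problem: the pad is computed from time measured inside the frame, so any work Unity does after LateUpdate (rendering, present, etc.) isn't accounted for, and the pacing drifts:

```csharp
using System.Threading;
using UnityEngine;

// Hypothetical manual frame pacing: turn off Unity's own limiter, then
// sleep at the end of each frame to pad it out to the target length.
// The remaining time is only an estimate, since rendering work happens
// after LateUpdate and isn't included in the measurement.
public class ManualFramePacer : MonoBehaviour
{
    public int targetFps = 60;
    float frameStart;

    void Start()
    {
        Application.targetFrameRate = -1;  // disable WaitForTargetFPS
        QualitySettings.vSyncCount = 0;    // vsync would also pace frames
    }

    void Update()
    {
        // Approximate the start of this frame's work.
        frameStart = Time.realtimeSinceStartup;
    }

    void LateUpdate()
    {
        float targetLength = 1f / targetFps;
        float elapsed = Time.realtimeSinceStartup - frameStart;
        float remaining = targetLength - elapsed;
        if (remaining > 0f)
            Thread.Sleep((int)(remaining * 1000f));  // blocks the main thread
    }
}
```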