I think part of the trick is that even if it drops from 60fps to 30fps and each deltaTime is 2x, it will still look bad. Not jittery (velocity will be constant)... but it looks bad, and there's nothing you can do about it. But to me (not an expert either), the key point is that you don't know the frame will run over the allowed 16.67ms until mid-way through the calculations, yet before you start the GPU/update calcs you must assume a deltaTime ahead of time (to increment positions, animations, the particle system, etc.). I think that's the big point you missed above.

So if you assume deltaTime is going to be 16.67ms and the GPU & update loop takes 10ms to complete, you're OK: 10ms of calcs, 6.67ms of idle time, and you run at a constant 60fps. But let's say you're halfway through your 10ms of calculations and another program interrupts and consumes 8ms. Now you've "missed" your monitor frame, and it will be presented 33.33ms after the last push to the monitor. But you're halfway through calculations that assumed a deltaTime of 16.67ms, so the velocity across that delayed frame won't be right. There's no way of knowing in advance whether an external application is going to interrupt your update loop and cause a delay. I think?

Now Unity may be able to do fancy stuff by looking at the past history of total frame time and adapting. If total frame time takes, let's say, >80% of the monitor interval, it could bump things up to 2 monitor frames per update loop (33.33ms instead of 16.67ms). This is fine as long as processing demands change gradually, but it won't account for spikes from external apps or from Unity itself (e.g. an explosion that releases a lot of particles and drastically changes the GPU load from one frame to the next).

In other words, there's no sure way to know whether the total GPU/update calcs will take <16.67ms *before* doing them. So handling dropped frames will never be perfect, although I imagine some work can be done to minimize the effects.
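Just to make the adaptive idea concrete, here's a rough sketch in Python of the ">80% of the monitor interval" heuristic I described. This is my guess at how such a pacer *could* work, not Unity's actual algorithm; the function name, the 80% threshold, and the averaging window are all made up for illustration. The point is that the predicted deltaTime snaps to whole multiples of the monitor interval based on recent frame cost, so it reacts to gradual load changes but is always one frame late on a spike:

```python
# Hypothetical frame pacer: predict the next deltaTime as a whole number of
# monitor intervals, bumping from 1 to 2 intervals when recent frame cost
# exceeds ~80% of one interval. NOT Unity's real algorithm, just the idea.

MONITOR_INTERVAL = 1.0 / 60.0  # 16.67 ms at 60 Hz


def predict_delta_time(recent_frame_costs, interval=MONITOR_INTERVAL, threshold=0.8):
    """Guess the deltaTime to assume for the upcoming frame.

    recent_frame_costs: seconds of CPU/GPU work for the last few frames.
    Returns a multiple of the monitor interval: 1x when history says we
    should hit every vsync, 2x when we're likely to miss (i.e. run at 30fps).
    """
    if not recent_frame_costs:
        return interval  # no history yet: optimistically assume 60fps
    avg_cost = sum(recent_frame_costs) / len(recent_frame_costs)
    frames = 2 if avg_cost > threshold * interval else 1
    return frames * interval
```

So with a steady 10ms of work per frame it keeps predicting 16.67ms, and with a steady 15ms it predicts 33.33ms. But notice it can only react *after* a spike shows up in the history, which is exactly the "no sure way to know before doing the calcs" problem.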
But bottom line, in my opinion, if you're dropping frames there's no way it'll ever look good. I don't care much about this scenario (it needs to be addressed, to be sure, but we can't expect it to look good). All the focus should be on the operating condition where we are not dropping frames (which is where most games should operate 99% of the time)... in which case Time.deltaTime exactly equals 1f/MonitorRefreshRate. Just my take. I don't know much about this stuff.