I have a mode in my game where players can speedrun levels, and the times go into a Steam leaderboard. Some of the times are very close and competitive, which made me take a moment to consider whether the time-tracking approach I'm using is "fair", i.e. that it wouldn't give an advantage to someone with a faster computer. But I seem to have found an insurmountable tradeoff: to eliminate framerate dependency, I have to use fixed time, but that reduces the granularity of the results.

For example, imagine a race to the finish line. I can think of two ways to determine that the player has reached the finish (a rough sketch of both is at the bottom of this post):

1. Put a trigger there, which will fire in FixedUpdate().
2. Continually check the player's transform in Update() to see if it's inside the "goal". Perhaps due to rigidbody interpolation, this could "trigger" the goal between two FixedUpdate() calls.

The first approach is "fair": no matter how fast (or slow) your computer, FixedUpdate() is consistent. However, it means that two scores can differ by no less than the fixed timestep, usually around 0.02 seconds. That's not a huge amount of time, but for speedrunning it feels kind of big, as people shave tiny amounts of time off their scores.

The second approach would allow for finer times, but it also means that someone with a faster system would usually have an advantage. To take it to an extreme: if one player checks for victory every 0.001 seconds while another checks every 0.05 seconds (due to a slower framerate), the player with the higher framerate would register the victory almost immediately, while the slower computer wouldn't see it until some milliseconds later.

I think I need to just go with the first approach, but I was curious what other people thought. It effectively means the times are only accurate to two decimal places (and honestly, only "even" hundredths), and that does feel a bit ugly. Players will wonder why their times always end in .x2, .x4, .x6, .x8, or .x0, but never .x1, .x3, etc.
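For concreteness, here's a minimal sketch of what I mean, as a single component on the player. The class name, the "Finish" tag, and the finishLine/finishRadius fields are just placeholders for illustration, not my actual code:

```csharp
using UnityEngine;

// Sketch comparing the two timing approaches (attached to the player).
public class FinishTimer : MonoBehaviour
{
    public Transform finishLine;   // hypothetical reference, assigned in the Inspector
    public float finishRadius = 1f;

    float fixedElapsed;            // accumulated only in fixed steps
    bool fixedDone, frameDone;

    void FixedUpdate()
    {
        if (!fixedDone)
            fixedElapsed += Time.fixedDeltaTime;
    }

    // Approach 1: a trigger volume on the finish line. Trigger callbacks run
    // as part of the physics step, so this is framerate-independent, but the
    // recorded time is quantized to the fixed timestep (0.02 s by default).
    void OnTriggerEnter(Collider other)
    {
        if (fixedDone || !other.CompareTag("Finish")) return;
        fixedDone = true;
        Debug.Log($"Fixed-step finish: {fixedElapsed:F2} s");
    }

    // Approach 2: poll the (interpolated) transform every rendered frame.
    // Finer granularity, but a higher framerate samples the crossing sooner,
    // so a faster machine tends to record a slightly better time.
    void Update()
    {
        if (frameDone || finishLine == null) return;
        if (Vector3.Distance(transform.position, finishLine.position) < finishRadius)
        {
            frameDone = true;
            Debug.Log($"Per-frame finish: {Time.timeSinceLevelLoad:F3} s");
        }
    }
}
```

Even in this toy version the tradeoff is visible: the first log can only ever print multiples of the fixed timestep, while the second prints finer values whose accuracy depends on the frame rate.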