Question Leaderboard "Best Times", and framerate dependence

Discussion in 'Game Design' started by dgoyette, Jul 25, 2022.

  1. dgoyette


    Jul 1, 2016
    I have a mode in my game where players can speedrun levels, and the times go into a Steam leaderboard. Some of the times are very close and competitive, which made me take a moment to consider whether the time-tracking approach I'm using is "fair", meaning it doesn't give an advantage to someone with a faster computer. But I seem to have found an insurmountable tradeoff: to eliminate framerate dependency, I have to measure time in fixed timesteps. But that reduces the granularity of the results.

    For example, imagine a race to the finish line. I can think of two ways to determine that the player has reached the finish:
    • Put a trigger there, which will be triggered in FixedUpdate().
    • Continually check the player's transform to see if it's in the "goal" in Update(). Perhaps due to rigidbody interpolation, this could "trigger" the goal between two FixedUpdate calls.
    The first approach is "fair". No matter how fast (or slow) your computer, FixedUpdate is consistent. However, it means that two "scores" can differ by no less than the fixed timestep, which is 0.02 seconds by default. That's not a huge amount of time, but for speedrunning it feels kind of big, as people shave tiny amounts of time off their scores.
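    As a sketch of that first approach (the field names here, like `finishTime`, are made up for illustration): a timer advanced only in FixedUpdate can only ever land on multiples of the fixed step, which is exactly where the 0.02-second granularity comes from.

    ```csharp
    using UnityEngine;

    // Illustrative sketch of a FixedUpdate-based speedrun timer.
    public class SpeedrunTimer : MonoBehaviour
    {
        float elapsed;
        public float finishTime;

        void FixedUpdate()
        {
            // Ticks in fixedDeltaTime increments (0.02 s by default),
            // identically on every machine.
            elapsed += Time.fixedDeltaTime;
        }

        void OnTriggerEnter(Collider other)
        {
            // Trigger callbacks run as part of the physics step, so the
            // recorded time is always a whole multiple of the fixed step.
            if (other.CompareTag("Player"))
                finishTime = elapsed;
        }
    }
    ```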

    The second approach would allow for finer times, but it also means that someone with a faster system would usually have an advantage. To take it to an extreme: if one player's game checks for victory every 0.001 seconds while another's checks every 0.05 seconds (due to a slower framerate), the player with the higher framerate would register the victory almost immediately, while the slower computer wouldn't see it until some milliseconds later.

    I think I need to just go with the first approach, but I was curious what other people thought. It effectively means the times are only accurate to 2 decimal places (and honestly, only "even" hundredths), and that does feel a bit ugly. Players will wonder why their times always end in .x2, .x4, .x6, .x8, or .x0, but never .x1, .x3, etc.
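    One possible middle ground, sketched below under the assumption of a flat goal plane crossed along +Z and a rigidbody player (all names here are hypothetical): keep the fair FixedUpdate trigger, but when it fires, use the player's velocity to estimate how far into the current physics step the crossing actually happened. Every quantity involved comes from the deterministic physics step, so the result stays framerate-independent, yet the recorded time is no longer quantized to 0.02 seconds.

    ```csharp
    using UnityEngine;

    // Hypothetical sketch: recover sub-step finish times from a
    // FixedUpdate trigger. Assumes the goal is a plane at goalZ and
    // elapsed is a timer advanced in FixedUpdate as in the post above.
    public class GoalLine : MonoBehaviour
    {
        public float goalZ;
        float elapsed;
        public float finishTime;

        void FixedUpdate() => elapsed += Time.fixedDeltaTime;

        void OnTriggerEnter(Collider other)
        {
            Rigidbody rb = other.attachedRigidbody;
            if (rb == null) return;

            // How far past the goal plane the player ended this step,
            // and therefore how long ago (within the step) the crossing
            // actually happened.
            float overshoot = rb.position.z - goalZ;
            float speed = Mathf.Abs(rb.velocity.z);
            float backtrack = speed > 0f
                ? Mathf.Clamp(overshoot / speed, 0f, Time.fixedDeltaTime)
                : 0f;

            finishTime = elapsed - backtrack;
        }
    }
    ```

    This assumes roughly constant velocity within a single 0.02 s step, which is usually a safe approximation; any error is far smaller than a whole step.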
  2. angrypenguin


    Dec 29, 2011
    Don't most speed run games have fixed time steps anyway? I don't know a lot about speed running, but most of what I've heard about it comes from old games, such as Super Mario 2 which, as far as I know, ran at a fixed frame rate of either NTSC or PAL, depending on the target TV.

    If so then being measured in fixed frame sizes isn't anything new. The only difference between that and a game running with a modern physics engine is that the physics frames aren't tied to the display refresh rate, so they can have nice round numbers such as 0.02 rather than long fractions such as 1 / 59.94 ≈ 0.0167 for NTSC interlaced.

    Also, if your game is meant for speed running and you're keeping the physics fairly light, you can turn your physics timestep duration down to tighten things up a bit, including making it an off-round number if for some reason you want to.
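    In Unity terms (a sketch, not a recommendation for every project), that's a one-line change, since the step length is exposed as `Time.fixedDeltaTime` and can also be set under Project Settings > Time:

    ```csharp
    using UnityEngine;

    public class PhysicsRate : MonoBehaviour
    {
        void Awake()
        {
            // Tighten the physics step from the default 0.02 s to 0.005 s,
            // i.e. 200 FixedUpdate calls per second. Profile first: this
            // roughly quadruples the per-second physics cost.
            Time.fixedDeltaTime = 0.005f;
        }
    }
    ```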
    Martin_H likes this.