Is there a fundamental difference between the frames that define an animation clip and the frames the Unity engine evaluates at runtime? My first thought was that they were the same, but my tests seem to show otherwise, and now I'm confused about how root motion works.

Say you have a simple animation clip with 30 frames, and to keep it easy, say each frame moves the avatar the same distance in terms of root motion. Since the documentation for animator.deltaPosition says it "Gets the avatar delta position for the last evaluated frame", I figured that no matter the FPS of the game, animator.deltaPosition would return the same value, since it's returning the distance moved in one frame, regardless of how fast or slow those frames are being evaluated.

After some testing I've found this isn't true: animator.deltaPosition returns higher values at lower frame rates and lower values at higher frame rates. What is the actual relationship here?
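To make the behavior I'm seeing concrete, here is a small Python sketch (not Unity code) of the relationship my tests suggest: the delta seems to be the distance the root covers during one *engine* frame of clip time, not one authored keyframe. All numbers here are hypothetical: a 1-second clip (30 authored frames at 30 fps) whose root moves 3.0 m at constant speed.

```python
# Hypothetical clip: 1 second long (30 authored frames), 3.0 m of
# total root motion at constant speed.
CLIP_LENGTH = 1.0        # seconds
TOTAL_ROOT_MOTION = 3.0  # meters
root_speed = TOTAL_ROOT_MOTION / CLIP_LENGTH  # 3.0 m/s

def delta_position(game_fps):
    """Delta per evaluated (engine) frame under this model:
    the distance covered during one frame's worth of time."""
    return root_speed * (1.0 / game_fps)

def total_over_clip(game_fps):
    """Summing the per-frame deltas across the whole clip should
    recover the authored total, regardless of game frame rate."""
    steps = round(CLIP_LENGTH * game_fps)
    return sum(delta_position(game_fps) for _ in range(steps))

print(delta_position(60))   # smaller delta per frame at high fps
print(delta_position(15))   # larger delta per frame at low fps
print(total_over_clip(60))  # total distance is the same either way
print(total_over_clip(15))
```

Under this model the per-frame delta scales with the frame's duration (so low fps gives bigger deltas, matching what I measured), while the deltas summed over the whole clip come out the same at any frame rate.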