Hey all,

This one is a bit of a doozy, and I wish I could provide more info, but this is what I've got. I've been working on a game for three years, built on various versions of Unity over that time. I regularly playtest on a range of machines, and I've never seen anything like this. I'm currently on 2019.1.0f2.

I recently had someone playtest the game remotely (Windows 10 Home, RTX 2080, 16GB RAM, i7-8700K), and for the entire playthrough Unity's internal clock appeared to be running at half speed. To be clear: the game runs just fine on every other system I've tried it on, regardless of hardware. It's a reasonably lightweight game: 2D, pixel art, Metroidvania style.

I can confirm nothing in the code is bound to frame rate, and that what the playtester experienced was basically "intended behaviour" across the board, except that a timescale of 1x played as if it were ~0.5x. Physics, graphics, particles, animation, etc. all behaved exactly as you'd expect if I ran the game at a lower timescale. Meanwhile, the parts of the game that use unscaled time were perfect: pause menu transitions, plus player input and navigation in those menus, all worked fine.

In the game logic I do set Time.timeScale to 1.0 regularly (unpausing, loading new scenes) and change its value for various reasons (lerping in and out of bullet time). In every case the actual logic ran as expected, just at half the rate.

Again, the exact same build runs without issue on every other machine I've tested. We tried a demo of the game built in 2018 (on 2017.4, I believe) and it did not have this issue, nor did other Unity games he'd played recently.

So, TL;DR: on just one playtester's PC, and only with builds of my game from 2019.1.0f2, it appeared that Unity's Time.time property was being updated at 0.5x the normal speed. Has anyone else experienced anything like this?
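For reference, my timescale handling is roughly like this. This is a simplified sketch with illustrative names, not the actual project code:

```csharp
using UnityEngine;

// Simplified sketch of the bullet-time handling described above.
// Names (BulletTimeController, slowScale, etc.) are illustrative.
public class BulletTimeController : MonoBehaviour
{
    [SerializeField] float slowScale = 0.25f; // target scale during bullet time
    [SerializeField] float lerpSpeed = 5f;    // how quickly we ease in/out

    float targetScale = 1f;

    public void EnterBulletTime() => targetScale = slowScale;
    public void ExitBulletTime()  => targetScale = 1f;

    // Called on unpause / scene load to restore normal speed.
    public void ResetTimeScale() => Time.timeScale = 1f;

    void Update()
    {
        // Step toward the target using unscaled delta time, so the
        // transition itself isn't affected by the changing timescale.
        Time.timeScale = Mathf.MoveTowards(
            Time.timeScale, targetScale, lerpSpeed * Time.unscaledDeltaTime);
    }
}
```

Even with all of that stripped out (Time.timeScale pinned at 1), the half-speed behaviour on that one machine persisted.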