
Why do so many AA+ Unity games feel bad at low FPS?

Discussion in 'General Discussion' started by frosted, Sep 25, 2021.

  1. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,044
    One thing I've noticed pretty consistently is that many Unity games feel extremely sluggish when they hit framerate drops, especially when dipping under 30fps. I'm not talking about low budget indie games. I've recently been playing Pathfinder: Wrath of the Righteous, a very well made AA game. In more demanding views (particularly w/ heavy particle/fog effects), when performance drops, control gets very sluggish.

    This is a common feature I've noticed in many Unity games. If CPU usage spikes especially, control really suffers, far more so than you generally see from Unreal or custom engine games.

    I can't be much more specific than "feels very bad at low FPS" - have others noticed this kind of thing? Not *all* Unity games suffer, but an unusual number do compared to most other engines. Again, I'm not talking about low budget indies, these are professionally made AA+ games.
    • Am I crazy or have others also noticed this?
    • Is this a code design decision that just tends to be more common in Unity games?
     
  2. EternalAmbiguity

    EternalAmbiguity

    Joined:
    Dec 27, 2014
    Posts:
    3,144
    I haven't noticed it but I go out of my way to avoid low framerates.

    Have you looked at the framepacing for those games? If that's uneven it would be more disruptive.
     
  3. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    If a game is relying on Unity's physics and FixedUpdate for gameplay, it will essentially perform "frame skipping" if the framerate drops below the fixed update rate. If the game also samples input in Update, you'll get one additional frame of input lag, since the input will only be used in the next FixedUpdate, after the current frame has been rendered.

    For example, if the game is running at 20fps and has a fixed time step of 0.02 (50 fixed updates per second), this is the approximate sequence of events:

    - Player sees frame 0
    - Player presses button
    - Game runs two fixed updates and physics updates
    - Game runs Update for frame 1, reads player input
    - Game draws frame 1
    - Player sees frame 1, but game didn't react to input yet
    - Game runs three fixed updates and physics updates using the player input read in frame 1
    - Game runs Update for frame 2, reads player input
    - Game draws frame 2
    - Player sees frame 2, finally seeing the game react to their input.

    In this example the game will only display the result of the player's actions 100 milliseconds after they first pressed the button, and the game logic will run two or three times before considering their input.

    There are ways to make a game more responsive below 30 fps, but they're neither obvious nor straightforward. For example, sampling input after rendering but before the fixed update would cut the latency in half.
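The "two or three fixed updates" pattern in the walkthrough above falls out of Unity-style time accumulation. A tiny simulation (plain Python, not any Unity API; the numbers mirror the 20fps / 0.02s example) makes it concrete:

```python
# Count how many FixedUpdates run before each rendered frame when the
# render frame (50 ms, i.e. 20 fps) is longer than the fixed step (20 ms),
# using accumulator-style catch-up like Unity's fixed timestep loop.
def fixed_updates_per_frame(frame_ms=50.0, fixed_ms=20.0, frames=4):
    acc = 0.0
    counts = []
    for _ in range(frames):
        acc += frame_ms          # time elapsed since the last frame
        steps = 0
        while acc >= fixed_ms:   # catch the simulation up to real time
            acc -= fixed_ms
            steps += 1
        counts.append(steps)
    return counts

print(fixed_updates_per_frame())  # [2, 3, 2, 3]
# Input read during frame 1's Update is only consumed by the FixedUpdates
# that run before frame 2, so the reaction shows up 2 frames (100 ms) later.
```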
     
    Last edited: Sep 25, 2021
  4. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,728
    AFAIK Unity always felt like it had higher than average input latency. I'm saying "felt" because it has been discussed many times over the years on the forums, but we've never been able to quantify whether there are extra frames of input latency.

    Then the new input system came, and it definitely had more input latency on certain platforms (source: https://forum.unity.com/threads/und...ucing-input-lag-in-unity.762161/#post-5076617). Not sure what the state is now, because we gave up on it a while back; in our eyes, their priorities were obviously wrong if reducing input lag wasn't among their top goals.

    Personally, it is my belief that Unity's input has a frame or two more latency than it should and at low frame-rate it becomes even more apparent.

    Combine that with the generally unstable delta time, which makes everything feel jittery and weird (which I guess is partially addressed for some platforms now), and that makes Unity 30fps feel extra bad.

    Combine all the above with the fact that 30fps needs a bit of care in how you design your game, and most indies don't have that expertise. 3rd person forward movement can look fine, but side-scroller-like horizontal scrolling can look extra bad and stuttery. You generally need high quality motion blur to mask a bit of this, but again that's an area where Unity is lacking.
     
  5. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,566
    I have not noticed this.

    Usually controlling things under heavy framerate drop is not going to be fun, regardless of the engine.

    By default, at least some tutorials used to use Input.GetAxis. Here's the thing: GetAxis has inertia built in and does not respond to controls immediately. The immediate variant is GetAxisRaw. When I was porting Unity code to Unreal, this kind of discrepancy came as a surprise.
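That inertia can be illustrated with a rough sketch (plain Python, not Unity's actual implementation; `sensitivity` here is a stand-in for the axis settings in the Input Manager):

```python
# Sketch of why a smoothed axis lags: the returned value moves toward
# the raw target at a limited rate per second instead of jumping there.
def smoothed_axis(value, target, sensitivity, dt):
    step = sensitivity * dt
    if target > value:
        return min(value + step, target)
    return max(value - step, target)

# Player slams the key/stick to full deflection: the raw input is 1.0
# immediately, but at 10 fps the smoothed value crawls up per frame.
v = 0.0
for _ in range(2):
    v = smoothed_axis(v, 1.0, sensitivity=3.0, dt=0.1)
print(round(v, 2))  # 0.6 -- still well short of 1.0 after 200 ms
```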
     
  6. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Yeah, AAA just have better timing, optional triple buffering, loops, motion vectors, motion blur, input etc. They solved it all long ago, while Unity didn't recognise any of it as a problem. Hard to recognise it if you're not making console games yourself.

    I remember talking about this subject on Sony's internal dev net forum many years ago (the Unity part). It was my post, but I can't share any of the answers from it.

    I think today's Unity is capable of AAA-quality stepping, timing and movement, but absolutely definitely not out of the box. A lot of work, and scouring of GDC archives, will be needed. All the little tricks, like scaling your motion vectors differently for framerate vs continual motion and so on, would be a great start.

    None of that is provided by Unity out of the box. It's just basic motion vector data that doesn't really compensate for framerate differences. You could probably make a formula to tweak the post effect settings but it would still need a lot of tuning.

    Then input buffering / timing, physics etc...

    So yeah, I very much have noticed a big difference between Unity and AAA titles when frames drop and both have motion blur. For me, because I rely on sight so much, it's a pronounced difference. For others, possibly not. It was enough for me to research, at least.

    If anyone's doing research and tests, I'm really interested in the findings.
     
  7. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    Unity doesn't support triple buffering AFAIK, except on platforms where it's enforced by the OS, like Switch and mobile.

    To make matters worse, on certain consoles setting the target framerate does nothing, so your options are either to watch your game drop straight to 20fps when it dips below 30fps, or to unlock the framerate and live with screen tearing.
     
    AcidArrow and hippocoder like this.
  8. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Triple buffering is likely the biggest win for smoothing IMHO (and internal game loop timing being constant).
     
    MadeFromPolygons likes this.
  9. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    10,674
    When you say triple buffering, what exactly are you talking about? Traditionally, that meant having 3 rendering buffers where one is displayed, one is queued to be displayed and the last one is being rendered to. This doesn't map to all graphics APIs the same way, but Unity should behave that way in most cases.

    Dropping straight to 20 fps if you dip below 30 fps sounds like a bug, tbh. Unfortunately, these things tend to be very platform specific and sometimes slip through on some platforms. I know for certain that it shouldn't be the case on Windows Standalone.
     
  10. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,566
    In all the time I've spent playing on PC, I've never seen triple buffering have any noticeable effect.

    Basically, traditionally it meant that instead of 2 buffers (front and back) there were three. The benefits of that were very dubious. If you're spending too much time painting with two buffers, you'll be spending too much time painting with three. Especially since you do not necessarily blit, but swap.

    Technically, the "triple buffering" option used to be popular back when there were tons of WWII games on the Quake 3 engine. At some point it stopped appearing as an option. Like the W-buffer.
     
  11. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    It's on PS4 where vsync in Unity works much like it did in the old days of PC gaming where vsync with plain double buffering would cause the game to wait for the next vsync if it doesn't finish the current frame in time. So if you set vsync count to 1 but your frame takes even 1ms longer than 16ms to complete, Unity will sit and wait for the next vsync, dropping the effective framerate from 60 to 30.

    The compositor on Windows and other platforms like mobile and Switch offers "free" triple buffering, so this isn't a problem on those. But on platforms like PS4 the games are supposed to implement their own sync strategies if they want anything fancier than plain old vsynced double buffering, AFAIK, which Unity doesn't do.
     
    angrypenguin likes this.
  12. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    It stopped appearing because OS compositors provide the same effect without games having to manage it: the third buffer is the copy used by the compositor for displaying.

    With double buffering and vsync on a 60hz display you can only run at 60fps, 30fps, 20fps, 15fps, and so on. If the game takes 18ms to render a frame, it will miss every other vblank and run at 30fps. But with a longer swap chain, it will be able to queue two frames instead of one, which can be displayed consecutively.

    Of course, eventually it will run out of queued frames and miss a vblank again, causing the infamous "judder": every few frames, one frame persists so the GPU can catch up. But the industry seems to much prefer that over screen tearing, considering it's now ubiquitous in console games.

    On PC, if you always had hardware that kept games well above 60fps, of course triple buffering had no noticeable effect.
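The 60/30/20/15 fps "steps" described above fall out of a few lines of arithmetic. A plain-Python sketch (not any real swap-chain API) of double buffering with vsync on a 60 Hz display:

```python
import math

VBLANK_MS = 1000 / 60  # one vblank interval on a 60 Hz display

# Double buffering + vsync: the next frame can't start until the finished
# one is presented at a vblank, so an 18 ms frame still costs two intervals.
def double_buffered_fps(render_ms, frames=600):
    t = 0.0
    for _ in range(frames):
        t += render_ms                              # render the frame
        t = math.ceil(t / VBLANK_MS) * VBLANK_MS    # wait for the vblank
    return 1000 * frames / t

print(round(double_buffered_fps(18)))   # 30 -- not the ~55 the GPU could do
print(round(double_buffered_fps(34)))   # 20 -- the next step down
```

A longer swap chain breaks these hard steps by letting the GPU queue frames ahead, at the cost of the judder and latency discussed above.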
     
    angrypenguin likes this.
  13. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    3,021
    Developers can make games that perform well or poorly with any game engine. I have personally built games in Unity that run at 300 FPS with 30ms button-to-pixel latency. As a reference, CS:GO has a button-to-pixel latency of 71ms on the same hardware.

    If it seems like an unusually high number of Unity games have problems, then I can only guess that it is because Unity is very accessible for a lot of new developers.
     
    adamgolden likes this.
  14. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    No, there are limitations which aren't simply a matter of sucking at delta time.
     
  15. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    The OP is not talking about performance, but about how games behave when their framerate goes below 30fps, something that is getting more common as games using HDRP (which has very limited scalability options and doesn't run well on integrated GPUs) hit the market.
     
  16. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,144
    Integrated graphics was never intended for gaming.
     
    Last edited: Sep 26, 2021
    PutridEx likes this.
  17. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,728
    And not because Unity has bad defaults and missing features?
     
  18. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    Most of the top popular PC games go out of their way to offer scalability options that run reasonably on integrated graphics. If you're going for mass appeal on PC, it's a must.
     
  19. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,144
    Are you telling me there are modern AAA games able to run off of Intel HD at 60 FPS?
     
  20. frosted

    frosted

    Joined:
    Jan 17, 2014
    Posts:
    4,044
    Don't Fortnite/Overwatch run on a toaster? Maybe not at 60fps, but these are definitely playable on bad hardware.
     
    ShilohGames and EternalAmbiguity like this.
  21. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,728
  22. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,566
    What's "modern AAA"? There were a couple of posts on social/near-social networks discussing games, and in the sad, sad situation of the neverending GPU shortage, people proposed gaming on integrated GPUs. They were using Witcher/GTA 5, which are several years old. And the FPS was in the ballpark of 40-60, not 60.

    I think the current situation with GPUs may result in more people targeting integrated cards and/or consoles. I mean, a 3060 sometimes costs as much as an Index right now, and that does not look like a smart way to spend money for gaming purposes.

    Regarding integrated cards not being intended for gaming, the PC wasn't intended for gaming in the beginning either. Yet here we are with tens of thousands of games on Steam...
     
  23. Neto_Kokku

    Neto_Kokku

    Joined:
    Feb 15, 2018
    Posts:
    1,751
    Console-first games? No. PC-first multiplayer games? Most of them. Valorant maker Riot made a big deal of showing the range of optimizations they employed to make their game run well on low-spec machines. Fortnite recently released an update which allows it to run at 60fps on more humble iGPUs by leveraging UE4's mobile renderer.

    Also, 720p at low settings can get you 30-ish fps from a modern Intel iGPU or AMD Vega APU in games that would have no business running on laptops without dedicated graphics.

    The horrible GPU shortage also means a lot of people are having to make do with iGPUs or dedicated GPUs like the 1050 and 1650.
     
    angrypenguin and hippocoder like this.
  24. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Intel Alchemist + XeSS and nvidia's DLSS are going to be changing how people think about budget GPUs going forward.

    AMD's FSR is not great (being a generic solution) but good enough in a pinch. All of this gets things moving on the low end but still does not solve quality of motion.

    I really want Unity to look properly at this problem. A good start was when Unity fixed when their timings occur internally - the whole delta time thing is related to that. But that's just one part of the puzzle, and it only helps when you're already at your refresh rate. There's still a large number of things needed for quality frames in motion that should be addressed by Unity.

    I think it's possibly one of the reasons why devs with bigger pockets aren't releasing with Unity. They don't have this friction elsewhere. Going to be fine for hobbyists or people with lightweight titles, but using HDRP (or URP fully loaded up) will result in a fair amount of frame time variance.

    What will you do, Unity?
     
    Martin_H and NotaNaN like this.
  25. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,144
    We've had enough discussions on this that you should know what I'm referring to. I'm not going to have more discussions on this question because I don't want to drag the thread off on a tangent.

    https://en.wikipedia.org/wiki/AAA_(video_game_industry)

    Yes, you can do that with old games. GTA V, for example, can achieve 40-ish FPS on Intel HD 530 at 720p on low. It's a far different story when the game is no more than a year old. In fact, while looking around for benchmarks of new games, I found out that Deathloop (2021) refuses to even start on Intel HD.


    Diablo II: Resurrected, which is basically a remaster of the original game, only achieves 20 FPS.


    Resident Evil Village achieves frame rates in the teens.


    Mass Effect Legendary Edition achieves 40-ish FPS. It's UE3 though so no surprise there.


    There is a tremendous difference between Intel HD and a GTX 1050. UserBenchmark, which I don't consider to be a very accurate benchmarking website, says there is a 520% difference between a GTX 1050 and Intel HD 630. There are options available that don't involve buying a brand new GPU, though.

    That said, it's not really that horrible of a graphics card shortage now. A used GTX 1050 costs around $150 to $200, depending on how long you want to wait for the right deal.
     
    Last edited: Sep 26, 2021
    NotaNaN and stain2319 like this.
  26. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,566
    I know the definition of the budget tier; the question was which game or games specifically you had in mind.
     
  27. lmbarns

    lmbarns

    Joined:
    Jul 14, 2011
    Posts:
    1,628
    Is it the engine's fault or the developer's? Does the game in question offer any quality settings? If it's dropping under 30fps, I think the dev should have optimized better or reduced effects on low-end hardware if they plan on supporting it, or designed it so it doesn't drop below 30.
     
  28. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Fault is a funny word, it implies blame. My answer is just an observation instead:

    It's just that Unity is a dumb solution. You get the features, but none of it is actually tailored or really mindfully directed toward making a AAA console title. It's just way too broad-spectrum and does not cover how things move under variable framerates. Oddly, Unity is hands-off in too many places with that.
     
    NotaNaN likes this.
  29. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,728
    The issue we're discussing here is mostly "how can you make a 30fps game in Unity feel good" / "why do low framerate Unity games feel worse than other games" and not whose fault it is that it dropped that low.

    Because Unity games at 30fps feel bad.

    And it's an engine's role to have these things solved. It's not like it's an unsolvable problem, it's solved in many AAA games. Unity could have strong defaults, features and tools you can use to assist you and have a docs page titled "Designing a game for 30fps? Here are some things to watch out for!".

    So you can spend more time designing your game and less time solving solved technical issues...

    Isn't that why we all use a ready made game engine?

    Instead, Unity has holes in its feature set and issues in its existing features, and the information on what to watch out for and what steps to take, instead of being organized in a Unity docs page, probably only exists in Hippo's brain and in a few other users here who seriously wanted to make a smooth-feeling 30fps Unity game at some point and have been banging their heads against walls for years.
     
  30. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,144
    I searched for "2021 games" on Google and picked a few of the ones I remember hearing about out of the fancy side scrolling list. :p
     
  31. Gekigengar

    Gekigengar

    Joined:
    Jan 20, 2013
    Posts:
    738
    I haven't used the new input system, but from what I heard, isn't it framerate independent? That should solve a lot of the latency issues at bad framerates.
     
  32. AcidArrow

    AcidArrow

    Joined:
    May 20, 2010
    Posts:
    11,728
  33. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,619
    Yes, you can do manual polling updates, or tell it to poll update at whatever rate you want. However, the steps between reading the input and using it in your logic are only a small portion of the overall input-to-display chain.

    Edit: Rather than "polling" I should be more clear and say "update". Input data comes from different devices at different rates. Many devices aren't polled at all, instead feeding events into buffers whenever they're received. "Polling" the input system itself at a higher rate simply flushes those buffers into internal events more often.
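The buffer-flushing idea reads roughly like this (illustrative Python with made-up names, not the actual input system API):

```python
from collections import deque

# Devices push timestamped events into a buffer as they arrive;
# "updating" the input system just drains whatever has queued up.
class InputBuffer:
    def __init__(self):
        self._events = deque()

    def on_device_event(self, timestamp_ms, payload):
        # Called at the device's own report rate, not the frame rate.
        self._events.append((timestamp_ms, payload))

    def flush(self):
        # Called at whatever update rate you choose; more frequent
        # flushes just drain the same events sooner.
        drained = list(self._events)
        self._events.clear()
        return drained

buf = InputBuffer()
for t in (0, 8, 16):                 # e.g. a 125 Hz mouse reporting
    buf.on_device_event(t, "move")
print(len(buf.flush()))  # 3 -- one flush picks up all queued events
print(len(buf.flush()))  # 0 -- nothing new has arrived since
```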
     
    Last edited: Sep 27, 2021
    NotaNaN likes this.
  34. Joe-Censored

    Joe-Censored

    Joined:
    Mar 26, 2013
    Posts:
    11,847
    While triple buffering increases smoothness, doesn't it contribute to bad-feeling input latency? You have at least 3 frames already queued up ahead of any new input.
     
    EternalAmbiguity and NotaNaN like this.
  35. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    I don't know how many AAA titles do this, but they definitely did it for a lot of PS3 games; I only know because developers on those games told me. Around 8 years ago I became annoyed at why my 30fps in Unity felt like crap vs PS3 titles back then.

    But I think nowadays there's probably much better knowledge about this out there, so I'm thinking: why aren't we sharing it? I would love to hear from current-gen experienced developers who know better and have worked on these problems, so I can learn and do better.
     
    AcidArrow, Joe-Censored and NotaNaN like this.
  36. lmbarns

    lmbarns

    Joined:
    Jul 14, 2011
    Posts:
    1,628
    Not current gen, it's dated now, but the original Shadowgun was released on Unity 3.5. It targeted the iPad 1, iPhone 3GS, and iPod touch 4 - basically 256MB of RAM and a 600MHz-1GHz CPU.

    They wrote every single shader and had all kinds of amazing magic tricks for reflections and lighting on such primitive hardware. Those guys were on another level and it was impressive what they could do with Unity.

    Targeting AAA on modern consoles, you probably need to be doing what they were doing: custom shaders, not using built-in Unity reflections or lighting.
     
  37. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Friend, you're 100% missing the point here. This is about games that run at 30fps. Shadowgun is famous for running at 60fps on mobile. None of this is about running a game at screen refresh.

    Basically you probably wouldn't notice this problem if you have not noticed this problem.
     
    Martin_H, NotaNaN, JoNax97 and 2 others like this.
  38. lmbarns

    lmbarns

    Joined:
    Jul 14, 2011
    Posts:
    1,628
    Yeah, I wouldn't subject myself to that on a 120hz monitor. I really don't understand why supporting sub-30 or even 30 is so important unless you're targeting users in Africa or Venezuela. Even on the Oculus Go we were at 70fps because it couldn't do 90, but it was pretty important to the experience to stay above 60 with a little buffer.

    If you're targeting a console and know the hardware, why would you max it out to barely hit 30fps? GC will strike at some point, dropping it below that. If you're a mobile developer, you literally uncheck all the devices you don't want to target to avoid poor ratings from people on a 2011 phone trying to play some FPS. I don't get it.

    Even if you're targeting a 2015 phone, you can do things to bring it well above 30. Even a dang amazon firetv stick can run unity... but not a bunch of 4k textures so you don't use them if you're targeting a fire tv stick.

    Edit: The OP mentions it happens when there are a lot of particles and effects on the screen. That seems like an optimization issue more than a Unity issue; if a player isn't hitting 30fps, the game should reduce the particles emitted, downsize the textures, or turn some of them off.
     
    Last edited: Sep 27, 2021
  39. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,619
    Because, with a few exceptions, audiences care less about higher frame rates than they do about shinier graphics.

    And as hippo says, you're missing the point. Even if you don't care about 30hz feel, throwing more frames at it doesn't solve the problem, it just reduces it. Your 60+hz game won't feel as good as someone else's 60+hz game. And if you're targeting enthusiasts who care, that could be an issue.
     
    NotaNaN and AcidArrow like this.
  40. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    It is why Sony and other big players have a performance mode and a cinematic mode. Nearly all console games have those, because some people love big-screen ULTRA SMOOTH FILM LIKE 30FPS and some love that crisp 60fps experience. I say let's support them both :)

    This is 30fps being chosen by users on a Playstation 5. Let that sink in!
     
    Martin_H likes this.
  41. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,566
    It might make sense to bring up soap opera effect.

    Basically, high framerate can look cheap.
     
  42. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,619
    I actually wonder how often this is the age-old multiplying-mouse-movement-by-delta-time thing.

    I've seen people have mouse and gamepad stick input bound to the same control by default (edit: by which I mean the same piece of movement code, not just the same item in the controls menu). Unless there are smarts under the hood to handle each differently, that's a bad sign. They are fundamentally different input types which need to be handled independently for good results.

    The mouse is giving you a delta of summed movement since last check, so you scale it and use it as is. It already accounts for time implicitly. The stick is reporting a desired speed only, so you need to account for the duration yourself.

    Messing those up seems common, and I've even seen it in large games.

    Another possibility is mouse acceleration or smoothing. Low frame rates also often fluctuate a lot, which can mess up the algorithms for those. Not sure on the solution, but my guess is to collect the input independently of frame rate and apply the acceleration/smoothing to the raw samples at the original (consistent) report rate before aggregating them into your game's frame data.
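The mouse/stick distinction can be sketched like this (illustrative Python with made-up function names, not an engine API):

```python
# Mouse: the delta already encodes elapsed time -- do NOT multiply by dt,
# or sensitivity will vary with framerate.
def turn_from_mouse(mouse_delta, sensitivity):
    return mouse_delta * sensitivity

# Stick: deflection is a desired turn *rate*, so duration matters.
def turn_from_stick(stick_axis, turn_speed_deg_per_s, dt):
    return stick_axis * turn_speed_deg_per_s * dt

# A held stick gives the same total turn per second at any framerate:
per_s_30 = sum(turn_from_stick(1.0, 90.0, 1 / 30) for _ in range(30))
per_s_60 = sum(turn_from_stick(1.0, 90.0, 1 / 60) for _ in range(60))
print(per_s_30, per_s_60)  # both ~90 degrees, independent of framerate
# Multiplying the mouse delta by dt instead would halve its sensitivity
# every time the framerate doubled -- the classic mistake above.
```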
     
    Last edited: Sep 28, 2021
  43. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    10,674
    frosted and neginfinity like this.
  44. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    But for me this is about *choosing* a 30fps cap that is rock-solid 30fps. This means the game is really capable of 40+, and when I do this, I don't get smooth movement. So I'm trying to figure out how best to achieve this.
     
  45. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,619


    Why? How?

    Unity deliberately attracts rookies, and they can only be as good as the example that's set for them.

    And in reference to another recent conversation here, this is a prime example of why understanding what's going on "under the hood" is of practical benefit, and not just academic hat twirling.
     
  46. neginfinity

    neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,566
  47. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,619
    Well, I was specifically answering the question I quoted from the OP. But while I'm here...

    If your display rate is 30hz but you're referring to your frame rate as being "really around 40+", then what specifically do you mean? The game runs at 40hz and you're vsyncing it to 30, or something else? What are your delta times? Are you enforcing them at 33.3ms, or are they some value around 25ms which could still be fluctuating? And for the sake of completeness, how are you handling the 40hz -> 30hz difference? (Croteam had a big blog post on this some time ago which was great. Basically, beating your frame budget != smooth motion.)

    That also suggests that there's up to 8ms of time from before the previous frame was displayed which is contributing to input latency. On some platforms that's negligible (TVs can have display latency of 100ms on their own) but on others that could be a significant factor.

    If it's an option on your platform I'd also try disabling the triple buffering. If you know that your backbuffer is always going to be ready with time to spare and at a consistent simulated timestep then I'm not sure of the benefits of keeping one in reserve. (I think some effects may rely on it?) In this case that would be a whole 33ms latency saving, which would be significant even with something like a TV in the loop.

    Also, I recall that Unity made some updates to some platforms recently-ish which improved frame timing. Are you using a version and platform where that was addressed?

    And while I'm playing 20 questions... In what way is your movement not smooth? What type of controller are we talking about (I assume gamepad) and what is it controlling (aiming in an FPS, movement in a platformer, etc.)? As you can see above, classic and common mistakes are things like treating a mouse as if it's a gamepad...
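On the "enforcing 33.3ms" point, one common pattern (a sketch of delta-time snapping, not a Unity API; the tolerance value is illustrative) is to snap the measured delta to the nearest vsync multiple so the simulation advances evenly even when raw frame times jitter around the budget:

```python
# Snap a measured frame delta to the nearest multiple of the target step
# (33.3 ms for a 30 fps cap) so simulation steps stay even under jitter.
TARGET_MS = 1000 / 30

def snapped_delta(raw_ms, tolerance=0.2):
    multiple = max(1, round(raw_ms / TARGET_MS))
    snapped = multiple * TARGET_MS
    # Only snap when close to a vsync multiple; a genuinely long frame
    # passes through so the simulation doesn't fall behind real time.
    if abs(raw_ms - snapped) / TARGET_MS <= tolerance:
        return snapped
    return raw_ms

print(round(snapped_delta(31.0), 1))  # 33.3 -- jittery ~30 fps frame
print(round(snapped_delta(35.5), 1))  # 33.3
print(round(snapped_delta(50.0), 1))  # 50.0 -- a real spike passes through
```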
     
    NotaNaN and hippocoder like this.
  48. Tautvydas-Zilys

    Tautvydas-Zilys

    Unity Technologies

    Joined:
    Jul 25, 2013
    Posts:
    10,674
    People make mistakes.
     
    frosted and hippocoder like this.
  49. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Those are illegal on these forums (if you believe most people) haha :D
     
    Martin_H and Joe-Censored like this.
  50. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,619
    Of course, myself included. But that one made it through whatever QA you have in place to get into a publicly available template project.

    For anyone concerned about input latency, note that the inconsistent mouse sensitivity a bug like that causes will have a much bigger impact on input feel than the small amount of overall input latency we're able to actually control.