I have an input lag problem that I cannot figure out. On Unity 2017.4.36f1 there are no problems; the issue only appears now that I am switching to newer versions of Unity. My game runs fine in the editor, but when I make a build there is noticeable input lag. It only happens on my laptop; there is no lag when I play on my desktop, even though my laptop is the more powerful machine. This is not a VSync issue, it is not Microsoft Game Bar, and it is not my laptop being on power-save settings. So to reiterate: a build made with the 2017 version of Unity plays fine on my laptop, no problem, but a build made with the newest version of Unity has input lag. What has changed in the new versions of Unity that could be causing this?
It's hard to say. What does your input code look like? Are you capturing input in the Update() method? And how exactly are you quantifying the input lag? Does it just feel sluggish? The two main causes of that, in my experience, are low framerates or VSync being enabled. What's your FPS while this is happening? If you don't have an FPS meter in your game, there are some free ones on the Asset Store. I use https://assetstore.unity.com/packag...ate-fps-counter-stats-monitor-debugger-105778. Anyway, if the FPS is above 45-60 or so, I personally don't detect any input lag, but when it's down to 25-30, the game feels laggy just due to the low update rate. You said it's not VSync, but I'd really make sure of that. Again, the FPS meter can tell you: a very steady framerate is a good clue that VSync is on, and VSync definitely causes noticeable input lag for me. Anyway, maybe show some code? Double-check that it's not VSync or a low framerate?
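If you'd rather not import an asset just for this, a minimal on-screen counter is only a few lines. This is just a sketch (the class name and smoothing factor are mine, not from any particular asset):

```csharp
using UnityEngine;

// Minimal FPS readout: smooths the frame time so the number doesn't flicker.
public class FpsDisplay : MonoBehaviour
{
    float smoothedDelta;

    void Update()
    {
        // Exponential smoothing of the unscaled frame time.
        smoothedDelta = Mathf.Lerp(smoothedDelta, Time.unscaledDeltaTime, 0.1f);
    }

    void OnGUI()
    {
        if (smoothedDelta > 0f)
            GUI.Label(new Rect(10, 10, 150, 25), $"FPS: {1f / smoothedDelta:0.}");
    }
}
```

Drop it on any GameObject in the scene and it will show up in builds as well as the editor, which is what matters here.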
I should mention that this issue occurs even with a brand-new project using Standard Assets, so it's not my code. And if it were a VSync issue, why wouldn't it happen when I make a build with the 2017 version of Unity? This only happens with the 2019 versions.
Not sure. It's possible the default VSync setting in Quality Settings is different in 2019 than in 2017, so maybe a default 2019 project has VSync enabled out of the box? It's also possible the game simply runs slower when built under 2019 versus 2017, which is a complaint raised many times: games built in more recent versions of Unity performing worse with each new version. Anyway, we probably need some numbers/screenshots/code to have any idea what's going on. What's your FPS when this is happening? What do you have chosen for VSync under Quality Settings? Also, what specifically does the input lag look like: sluggish mouse movement, or movement delayed by some amount of time? Can you try to quantify how much delay there is?
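One quick way to rule out a changed project default is to log and override the setting at runtime instead of trusting the inspector. A sketch (class name is mine):

```csharp
using UnityEngine;

public class VSyncCheck : MonoBehaviour
{
    void Start()
    {
        // vSyncCount: 0 = off, 1 = sync every VBlank, 2 = every second VBlank.
        Debug.Log($"vSyncCount = {QualitySettings.vSyncCount}");

        // Force it off in the build, overriding whatever the Quality Settings
        // default is for this Unity version.
        QualitySettings.vSyncCount = 0;
    }
}
```

If the lag disappears with this script in the build, the 2019 default was the culprit; if not, you've at least eliminated VSync for good.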
Try playing around with this: https://docs.unity3d.com/ScriptReference/QualitySettings-maxQueuedFrames.html Also, the old input system had a bit more lag than it needed to, and the new Input System is at best as "good" as the old one lag-wise.
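For reference, that setting is a one-liner to experiment with. A sketch (class name is mine; the exact default and supported graphics APIs vary by platform, so check the linked docs):

```csharp
using UnityEngine;

public class LatencyTweak : MonoBehaviour
{
    void Start()
    {
        // maxQueuedFrames limits how many frames the driver may buffer ahead.
        // A smaller queue trades some throughput for lower input-to-display latency.
        QualitySettings.maxQueuedFrames = 1;
    }
}
```

Comparing the feel of the build with this at 1 versus the default is a cheap way to see whether render-ahead buffering is part of your lag.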
I have new information about this. Apparently when I run OBS and record the monitor, there is no lag in the build. This is completely baffling. I was going to record a video so I could show the lag, but for some reason it goes away when I start recording the screen. I am so lost...
Is it possible that Unity could be trying to run on my dedicated GPU and on my motherboard's integrated graphics simultaneously? When I run OBS specifically on the integrated graphics, the input lag problem goes away. That is how I have to run OBS to record my monitor.
I don't think it's possible to do that. At least on Windows 10, if you right-click the executable you want to run (your game, for example), you can pick whether it runs on the integrated graphics or the dedicated GPU. So you can choose which GPU will be used, but I don't believe there's a way to use both simultaneously. It's certainly possible for one program to use the dedicated GPU while another uses the integrated graphics at the same time, but I don't think that's what you're asking. If running OBS on the built-in GPU improves things, it seems possible that you're just overloading the GPU and it can't handle the game and OBS at the same time, though that seems unlikely in a simple test build. But again, I'd love to know the specific FPS you're getting in your builds when input lag is present, and also when it's not. Also, if you have a simple/small build you can share somewhere, I could check whether I also notice input lag running it.
I get input lag whether I run the exe on the dedicated GPU or the integrated one. I don't think sending you the project would help, because I get the same lag on a brand-new project using only Standard Assets, so the problem isn't specific to my project. Just to reiterate: when I run the build without OBS, I get input lag, regardless of GPU. When I run the build while OBS is running on the integrated graphics (not the GPU), the lag goes away.
Here is a video to show what I mean. Sorry for the quality, but like I said, I can't record off the computer directly, since OBS fixes the lag. I know the difference might seem minute; I'm not sure how noticeable it is here.
Ok, even better, I got a frame rate counter. When OBS is off, the FPS is bafflingly limited to 30 for some reason. When I run OBS on my integrated graphics, the framerate is uncapped, as I intended it to be. I set the target frame rate to 300 in a script and turned off VSync, and I also made sure in my GPU settings that it never uses VSync. I don't understand why this is happening. EDIT: Additional information: the framerate is uncapped when I take focus off the application, such as by hitting the Windows key, but it drops back to 30 when I refocus the application. This happens regardless of OBS, so it is not actually related to OBS.
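For completeness, the script presumably looks something like the sketch below (class name is mine). One subtlety worth double-checking: Application.targetFrameRate is ignored while vSyncCount is non-zero, so the order and values of both settings matter.

```csharp
using UnityEngine;

public class FrameRateSetup : MonoBehaviour
{
    void Awake()
    {
        // targetFrameRate has no effect unless VSync is disabled first.
        QualitySettings.vSyncCount = 0;
        Application.targetFrameRate = 300;
    }
}
```

If vSyncCount were still 1 anywhere (for example, set per quality level in Quality Settings), the 300 FPS target would be silently ignored in the build.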
That's pretty interesting. I could understand if it were frame-limited at all times, but having another application increase the framerate seems really strange to me. Something to try: right-click the EXE for your game and choose which GPU to use. Does using one or the other make a difference? All I can think is that somehow the game defaults to your low-end GPU, and for some reason OBS forces the game onto the other GPU, though that still doesn't make much sense to me. If, with OBS off, you get 30 FPS whether you're using the high-end GPU or the low-end one, I'd check whether you can get updated drivers for your graphics cards. That's starting to sound like a weird hardware issue.
I've already tried running the exe on both GPUs; that was one of the first things I tried. The drivers are up to date. But I realized something about my problem: I use my TV as a monitor for my laptop, over an HDMI cable. When I unplug the TV, the frame rate is fine. So now I'm confused about why the frame rate is limited when using the TV as a monitor, why OBS fixes it even while plugged into the TV, and why the old version of Unity didn't have this problem.
Sounds like a mess. I'm glad you're narrowing it down, though. I don't have any idea what would cause those issues, unless the HDMI cord can't handle more than 30 FPS, or the TV can't.
The TV is old and probably not suited to more than 30 FPS. What confuses me, though, is that with the 2017 version of Unity I don't have this problem, and that I get higher framerates when OBS is open. That is baffling.