CPU overheating issue doesn't make sense, any advice?

Discussion in 'General Discussion' started by Deleted User, Mar 10, 2021.

Thread Status:
Not open for further replies.
  1. Deleted User

    Guest

    Not sure if this is the right forum; mods may move or lock the thread as they see fit.

    There's a server-based MMO title offered through a major gaming platform that looks and plays beautifully, at least judging from the trailers. It was released in mid-February, and multiple users are now reporting CPU and GPU overheating while playing.

    These reports include temps exceeding 80 °C; capping fps and dropping the resolution helps, but moderate-to-severe frame dropping still occurs. Troubleshooting by running multiple other programs and Chrome tabs with the game off does not replicate the overheating. The issues don't happen with any other games, just this one, and only while actually playing it.

    The minimum recommended specs are Win 7/8/10 with a Radeon GPU and 8 GB of video memory, which would suggest the game is potentially underoptimized. However, that's where the math stops making sense.

    My current laptop is a 4 GB Radeon potato that can handle 6 GB on the GPU; it could probably pull 8 GB with a resolution drop, but gameplay would suffer badly without overclocking or other fancy tricks... and the reports are mostly coming from users running higher-spec builds with sufficient cooling (fans and so on), not from the folks running low-to-mid specs.

    In the game's discussion forum, players are being trolled with: cap your fps; check your cooling, CPU, and GPU rigs; and "learn how to overclock properly". I know I'm new to Unity development, but my gut's telling me there's no reason a game this polished should need higher-end rigs to cap their output. When I asked about the excess output created by capping fps, the trolling increased, with more people blaming the users, and even more so when I raised the suggestion of server-based cryptojacking "because it's the only possibility left that makes any sense."

    Maybe I've been watching too many LTT videos on YouTube, and the whole NVIDIA thing's made me a bit twitchy from a cybersecurity perspective, but my gut's telling me something's off. The problem is, I don't even know how to check or what to look for if the dev's using players as a mining farm. So I can't say there is and I can't say there isn't... but I can say I'm probably overthinking things and could use a second opinion.

    On a side note, what are Unity's policies on using server-based miners in a case like this?
     
  2. Antypodish

    Joined:
    Apr 29, 2014
    Posts:
    10,769
    Yeah, most likely, as you haven't addressed whether this has anything to do with Unity, other than in the very last sentence.

    And how is this even related to your PC issue?

    You should provide both the game title and your full PC spec, including what CPU and GPU you have.
    Plus, what resolution are you playing at?
    Also, what are the game's recommended specs?

    Is this game even made with Unity?

    Other than that, you should keep addressing the problem on the developer's forum.
    We can do nothing about the issue here.
     
    Deleted User likes this.
  3. Deleted User

    Guest

    Fair enough, thanks for letting me know.

    Mods, the thread's all yours.
     
  4. Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,145
    A machine overheating is always the fault of the machine.
     
    angrypenguin likes this.
  5. Deleted User

    Guest

    This thread has already been answered. All further replies will be disregarded.
     
  6. Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,145
    Feel free to ignore the post if you like, but as you're not the only one who will read the answers in this thread, it's important to be clear about where the fault lies with an overheating machine. It's never the software.
     
  7. OrlovskyConsultingGbR

    Joined:
    Mar 17, 2020
    Posts:
    63
    Unity, like other major game engines, can be used in the wrong way, but having said that, I have never experienced hardware problems caused by inefficient Unity3D programming.

    What you should get for your laptop is very powerful cooling; there are not many laptop manufacturers who have taken proper care of that problem.

    With the current situation in the world, it may make sense to buy something like four 120 mm fans at max RPM, connect them to an external power source with an on/off switch, and switch the cooling on whenever the laptop runs hot. Taking 20 °C off the temperature would be great: if you thermal throttle at 65 °C, with the extra "cooler" it should sit at around 40 to 50 °C, which is perfect ;)
    Having said that, you need to make sure your laptop is clean inside. Sometimes there is just too much dust and it works like "heat storage": the dust builds up between the internal cooler and the heat sink, the heat won't dissipate properly, and as a result you end up at 80 °C.
    You need to maintain your hardware like a sword: clean it, train with it, and sometimes go to battle with it. I hope you'll be able to do it right.
     
  8. neginfinity

    Joined:
    Jan 27, 2013
    Posts:
    13,566
    ((nitpicking))
    Environment matters. A computer operating in the vicinity of a blast furnace or an oven will have a harder time cooling down.
     
    angrypenguin, Ryiah and Joe-Censored like this.
  9. Joe-Censored

    Joined:
    Mar 26, 2013
    Posts:
    11,847
    Without vsync or any other form of frame rate cap, a Unity desktop build will run with as high a framerate as the hardware is capable of. The framerate will be the maximum possible until it hits some form of hardware bottleneck; what the specific bottleneck is depends on the hardware and the design of the game. But generally, you're either going to bottleneck on 100% GPU usage, or on 100% usage of one or more CPU cores.
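
    (For anyone curious what a cap looks like in practice, here's a minimal sketch on the Unity side. Application.targetFrameRate and QualitySettings.vSyncCount are the standard Unity APIs; the 60 fps value is just an example.)

    Code (CSharp):
    using UnityEngine;

    public class FrameRateCap : MonoBehaviour
    {
        void Awake()
        {
            // On desktop builds, targetFrameRate is ignored while vsync is on,
            // so disable vsync first.
            QualitySettings.vSyncCount = 0;

            // Cap rendering at 60 fps; -1 restores the default (uncapped on desktop).
            Application.targetFrameRate = 60;
        }
    }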

    100% usage of anything = more heat

    You didn't mention which part is hitting 80 °C, but 80 °C doesn't mean the part is overheating. In fact, I'm not aware of any modern CPU/GPU that even thermal throttles at 80 °C. The lowest I'm aware of are Nvidia desktop GPUs, which start to thermal throttle at around 84 °C. CPUs typically go much higher.

    But thermal throttling isn't overheating. CPUs/GPUs are designed to slow down below base clock when reaching certain temperatures, but those temperatures are still well within operating spec. It is perfectly normal for manufacturers to design their products with thermal throttling as an integral part of the thermal strategy. Apple laptops are notorious for this, for example: they max boost for a short period, but under sustained load they stop boosting and then start thermal throttling. This lets the laptop be thinner, lighter, and much quieter, and is done on purpose. It is pretty normal for a lot of PC laptops too.
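
    (If it helps to picture it, here's a toy simulation of that throttle-and-recover behaviour. Every number in it is invented for illustration; real parts do this in firmware.)

    Code (CSharp):
    using System;
    using System.Threading;

    // Toy model of thermal throttling: the clock steps down (even below base
    // clock) when a temperature ceiling is reached, and recovers when cool.
    class ThrottleToy
    {
        static void Main()
        {
            double tempC = 60.0, clockGHz = 4.5;
            const double MaxGHz = 4.5, ThrottleAtC = 95.0;

            for (int tick = 0; tick < 100; tick++)
            {
                // Invented physics: heat in scales with clock speed,
                // while the heatsink removes a fixed amount per tick.
                tempC += clockGHz * 1.2 - 4.0;

                if (tempC >= ThrottleAtC)
                    clockGHz = Math.Max(0.8, clockGHz - 0.1);    // throttle down
                else
                    clockGHz = Math.Min(MaxGHz, clockGHz + 0.05); // recover when cool

                Console.WriteLine($"t={tick,3}: {tempC,5:F1} C @ {clockGHz:F2} GHz");
                Thread.Sleep(20);
            }
        }
    }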

    The same behavior applies to desktop PCs, but in this case it is usually the user who chooses the case, the fans, and how many fans, all of which impact the thermal performance of the PC. If, under max load, either the parts can't shed heat into the case fast enough, or the case design and fans can't vent heat faster than the parts are dumping it into the case, then yeah, you're going to see temperatures rise, even to high levels. This isn't a problem with the game you're playing; it's how you set up the cooling of your PC relative to how much heat the parts can create at max load.

    Higher-spec builds today draw around double the power of a high-spec build from just six years ago. All power used by the computer is converted to heat, all of it. Either your PC's cooling is set up to expel all that heat into the room faster than the parts create it, or it isn't, and you see temperatures inside the computer rise. It is really that simple.

    Nothing you've stated would have me suspicious of crypto mining. Crypto mining isn't even anything special. It just uses your graphics card at 100%, pretty much the same as if you ran a game which runs your card at 100%.

    All it has me suspicious of is that you've got a bunch of people playing this game who blew all their PC money on high-spec parts and then cheaped out on the cooling. Understand that a modern high-spec PC build can produce almost as much heat as an electric space heater, and you've got to remove all of that heat from the case. There's a reason excellent airflow cases exist. There's a reason many good-looking, expensive cases that starve their fans of airflow get terrible reviews from people who actually test them. There's a reason water cooling systems exist. There's also a reason good cooling solutions aren't cheap.

    Did these people pay up for good cooling? I'm going to guess that didn't happen.

    I don't believe they have any policy.
     
  10. angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,619
    How does it suggest that? Those things all sound pretty normal to me. Sure, 8 gigs is a lot, but it's common enough these days, and there are plenty of things with legitimate uses for it.

    This. And a well-optimised game with no bottlenecks and no caps can run more of your system at 100%.

    But that's fine. The components are designed for it. When used as designed, your CPU and GPU are able to either a) run at 100% indefinitely or b) look after themselves by throttling when they get too hot. If a system can't do that, then it's faulty, poorly designed, or being used out of spec (e.g. in a place that's too hot, being overclocked, with inadequate cooling, etc.). As such, if your system works properly then no piece of software can use "too much" of it.

    Consider this: if a piece of code could "overuse" your system and break it then the 'net would be full of posts by rookie programmers who cooked their CPU by writing an infinite loop. There'd be thread after thread of "how do I stop my code from going too fast?" Programmers would need to be in the habit of keeping spares for when we accidentally burn them out. Bugs in software would regularly destroy hardware. And... well, just imagine what pranksters and hackers would get up to!

    None of which happen. It just isn't a thing.
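
    (To make that concrete, here's a hedged sketch of the "rookie infinite loop", in plain C# rather than anything Unity-specific. It pins every core at 100%, the fans spin up, and nothing breaks.)

    Code (CSharp):
    using System;
    using System.Threading;

    class BurnCores
    {
        static void Main()
        {
            // One busy-spinning background thread per logical core: 100% CPU usage.
            for (int i = 0; i < Environment.ProcessorCount; i++)
            {
                new Thread(() => { while (true) { } }) { IsBackground = true }.Start();
            }

            Console.WriteLine("All cores at 100%. Press Enter to stop.");
            Console.ReadLine(); // Background threads die with the process; the CPU is fine.
        }
    }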
     
  11. Joe-Censored

    Joined:
    Mar 26, 2013
    Posts:
    11,847
    Yeah, 8 gigs of vram just sounds like they are using either a lot of textures/models at the same time, or some high-quality ones. Whether that is optimized or not depends on the context, and the context here isn't mentioned.

    For example, if you're using really high-quality textures on distant objects the player never gets a close look at, that sounds like a waste, and not very optimized: you could probably lower the texture quality and not notice any difference. But those same textures used on close-up shots of something, so it looks amazing? Well, that's the appearance the developer wanted, and the texture is probably optimal for that purpose. The amount of vram used doesn't tell you whether it is optimized; what you've got the vram doing tells you that.
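
    (For what it's worth, dropping texture resolution globally is a one-liner in Unity. QualitySettings.masterTextureLimit is the real API here; the rest of the sketch is illustrative.)

    Code (CSharp):
    using UnityEngine;

    public class TextureQualityToggle : MonoBehaviour
    {
        // 0 = full-resolution textures, 1 = half resolution, 2 = quarter, etc.
        // Each step skips the top mip level, roughly quartering the memory
        // a mipmapped texture needs.
        public void SetMipSkip(int mipLevelsToSkip)
        {
            QualitySettings.masterTextureLimit = mipLevelsToSkip;
        }
    }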
     
  12. Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    6,012
    Anecdotally, I have seen complaints multiple times in Steam reviews of Unity games about the CPU getting very hot (or at least running at max), even in low-spec games. It's totally possible it's the developer's fault, of course, but it seems like it might be a thing.
     
  13. zombiegorilla

    Moderator

    Joined:
    May 8, 2012
    Posts:
    9,051
    Not the place for this, contact the game developers.
     