Can DOTS Cycle Threads Across Cores to reduce heat load and wear and tear on CPUs?

Discussion in 'Entity Component System' started by Arowx, Sep 7, 2019.

  1. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Just read this article that looks into Intel's claims that AMD Ryzen chips are running too hot to last.
    There is a Windows 10 scheduler update in the works that will cycle cores more effectively.

    However, not every multi-core system we make Unity games for has an OS that cycles heavy loads evenly among cores, within and across game sessions, to reduce the wear and tear caused by power and heat load on individual cores.

    So should DOTS have a core/thread cycling system that ensures the load is cycled across the cores, minimising the wear and tear that would otherwise concentrate on a single point of failure?

    Could such a cycling system even boost performance over long game sessions, since it would allow chip cores to cool and sustain their performance?

    On a side note, do GPUs and graphics APIs/drivers have a cycling system to prevent wear-and-tear hot spots on GPU cores?
     
  2. desertGhost_

    desertGhost_

    Joined:
    Apr 12, 2018
    Posts:
    260
    I think this is more of an OS scheduler issue than something for Unity to address.

    Switching between cores too often could prevent taking full advantage of the L1 (fastest) cache on some CPU architectures and could impact performance.
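
    For reference, keeping a thread on one core is an OS-level affinity setting, not something Unity's job system lets you control. A rough, Windows-only sketch of what pinning looks like at the API level (purely illustrative, nothing to do with Unity's scheduler):

    Code (CSharp):
    using System;
    using System.Runtime.InteropServices;

    static class ThreadAffinity
    {
        // Win32: restrict the calling thread to the logical cores set in the mask.
        [DllImport("kernel32.dll")]
        static extern UIntPtr SetThreadAffinityMask(IntPtr hThread, UIntPtr mask);

        // Returns a pseudo handle for the calling thread.
        [DllImport("kernel32.dll")]
        static extern IntPtr GetCurrentThread();

        // Pin the calling thread to a single logical core (bit N selects core N),
        // which keeps its working set warm in that core's L1/L2 cache.
        public static void PinToCore(int core)
        {
            SetThreadAffinityMask(GetCurrentThread(), (UIntPtr)(1UL << core));
        }
    }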

    I also wouldn't worry about it, given that the useful lifespan of a CPU is way shorter than its actual physical lifespan.

    The best way for a game to avoid raising the thermals of the hardware is to have an adaptive frame rate or frame pacing solution that reacts to loads / thermals of components.
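
    A minimal sketch of that idea in Unity, using smoothed frame time as a stand-in for a real load/thermal signal (the thresholds and step size here are made up for illustration):

    Code (CSharp):
    using UnityEngine;

    public class AdaptiveFrameRate : MonoBehaviour
    {
        const int MaxFps = 60;
        const int MinFps = 30;
        float smoothedFrameTime;

        void Start()
        {
            Application.targetFrameRate = MaxFps;
            smoothedFrameTime = 1f / MaxFps;
        }

        void Update()
        {
            // Exponential moving average of the real frame time.
            smoothedFrameTime = Mathf.Lerp(smoothedFrameTime, Time.unscaledDeltaTime, 0.05f);

            int target = Application.targetFrameRate;
            float budget = 1f / target;

            // Back off when frames consistently blow the budget, creep back up
            // when there is clear headroom; both give the hardware time to cool.
            if (smoothedFrameTime > budget * 1.2f && target > MinFps)
                Application.targetFrameRate = Mathf.Max(MinFps, target - 5);
            else if (smoothedFrameTime < budget * 0.8f && target < MaxFps)
                Application.targetFrameRate = Mathf.Min(MaxFps, target + 5);
        }
    }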
     
    MNNoxMortem and hippocoder like this.
  3. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Everything will degrade at some point, but for the vast majority of businesses and individuals the CPU will be replaced with a faster one long before that degradation matters.

    In fact, even in industries where the CPU is hammered pretty much 24/7, the computer or CPU will likely be replaced by better alternatives as time goes by. Maybe if you're one of those people still using a 10-year-old computer it can happen.

    I think I would be more worried about my SSD (and I vaguely am since I hammer that every day, all day, and it's 6-7 years old now).

    Unity should focus all their efforts on stability and performance first and foremost, because that always matters. That is pretty much the best possible saving for a CPU as well, because a job done sooner / more efficiently means more idle time for the CPU, and that translates to less wear and tear.
     
  4. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Good point, so maybe the core/job cycling could be inter-session: every time you play the game, different cores do different jobs. A simple random job-to-core assignment system would do it.
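
    Purely as an illustration of the mapping idea (Unity doesn't expose core affinity for its job workers, so only the look-up is real):

    Code (CSharp):
    using System;
    using System.Linq;

    static class SessionCoreShuffle
    {
        // One random permutation of the logical cores per game session.
        static readonly int[] coreOrder = Enumerable
            .Range(0, Environment.ProcessorCount)
            .OrderBy(_ => Guid.NewGuid())
            .ToArray();

        // The logical core a given worker/job slot would map to this session.
        public static int CoreForWorker(int workerIndex)
        {
            return coreOrder[workerIndex % coreOrder.Length];
        }
    }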
     
  5. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    I think it would cause the program to run longer, and thus be less efficient overall and cause more wear and tear, especially when the OS is already doing exactly the right thing.

    The whole Unity engine Jobs and DOTS stuff doesn't fight itself over resources. That's why people think they get better performance when using raw C# threading, but when they run it in a real game scenario they get worse performance, because things begin to fight each other.
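
    For comparison, handing work to the job system looks roughly like this (a minimal sketch, not from any particular project); the scheduler places it on Unity's own worker pool so it cooperates with the engine's other jobs rather than competing with them:

    Code (CSharp):
    using Unity.Burst;
    using Unity.Collections;
    using Unity.Jobs;
    using UnityEngine;

    public class JobExample : MonoBehaviour
    {
        [BurstCompile]
        struct AddJob : IJobParallelFor
        {
            public NativeArray<float> values;

            public void Execute(int index)
            {
                values[index] += 1f;
            }
        }

        void Update()
        {
            var data = new NativeArray<float>(1024, Allocator.TempJob);

            // Schedule in batches of 64; the job system spreads the batches
            // across its worker threads alongside the engine's own jobs.
            JobHandle handle = new AddJob { values = data }.Schedule(data.Length, 64);

            handle.Complete();   // wait before reading the results on the main thread
            data.Dispose();
        }
    }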

    I wouldn't pick a fight with Windows really.