
Possible Impact to Game Development with Unity

Discussion in 'General Discussion' started by JamesArndt, Jan 4, 2018.

  1. JamesArndt

    JamesArndt

    Joined:
    Dec 1, 2009
    Posts:
    2,932
    I wanted to ask more knowledgeable people what the impact of this whole Intel chip flaw and/or exploit might be on our day-to-day game development. I have read that the major OSes are currently patching the flaw, but that the patches will slow down Intel processors by up to 30%. The slowdown depends on the type of tasks one does, and I have read that code compiling will take a large hit in performance. So does anyone know what the impact might end up being on development?

    For reference here is an article on the matter:
    http://www.independent.co.uk/life-s...ity-hackers-cyber-crime-amd-arm-a8141106.html

    And here is CERT's recommendation:

     
    Last edited: Jan 4, 2018
  2. FMark92

    FMark92

    Joined:
    May 18, 2017
    Posts:
    1,243
    Yeah, just replace your CPUs, you f-ing peasants.
     
    Socks, eses and JamesArndt like this.
  3. JamesArndt

    JamesArndt

    Joined:
    Dec 1, 2009
    Posts:
    2,932
    Especially considering one of these exploits affects ARM processors in our mobile phones. I did kind of want to upgrade my phone though.
     
    FMark92 likes this.
  4. FMark92

    FMark92

    Joined:
    May 18, 2017
    Posts:
    1,243
    I run an i7 2600 on an LGA 1155 socket. I'm not upgrading half my system this month (to accommodate a new CPU). Also, no way I'm getting a 7-year-old AMD.
     
    Last edited: Jan 4, 2018
  5. snacktime

    snacktime

    Joined:
    Apr 15, 2013
    Posts:
    3,356
    Anything that involves a kernel-level call will take a hit, so for desktops it's mostly just IO.

    Cloud providers are ******. Because virtualization is all kernel level. Like cloud servers aren't slow enough already.
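    A rough way to gauge how exposed a given workload is: compare time spent in pure user-space computation (largely unaffected by the page-table-isolation patches) against time spent in syscall-heavy work (where the overhead is paid). A minimal Python sketch of that comparison; the loop counts and the specific calls here are illustrative choices, not measurements from patched hardware:

    ```python
    import os
    import time

    def time_it(fn, n=20_000):
        """Return total seconds to call fn() n times."""
        start = time.perf_counter()
        for _ in range(n):
            fn()
        return time.perf_counter() - start

    # CPU-bound work: stays in user space, so KPTI-style patches barely touch it.
    compute_s = time_it(lambda: sum(i * i for i in range(100)))

    # Syscall-bound work: every os.stat() crosses into the kernel,
    # which is exactly where the post-patch overhead lands.
    syscall_s = time_it(lambda: os.stat("."))

    print(f"compute: {compute_s:.3f}s  syscalls: {syscall_s:.3f}s")
    ```

    If your own tools spend most of their time in the first bucket, the patches should be close to invisible; IO-heavy steps (asset imports, builds) are where any slowdown would show up.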
     
    Martin_H and JamesArndt like this.
  6. Martin_H

    Martin_H

    Joined:
    Jul 11, 2015
    Posts:
    4,436
    How much of a risk is that vector of attack even if you run an up to date antivirus software and firewall?
     
  7. bluescrn

    bluescrn

    Joined:
    Feb 25, 2013
    Posts:
    642
    I'm wondering how this will affect Unity build times for large projects...
     
    JamesArndt likes this.
  8. JamesArndt

    JamesArndt

    Joined:
    Dec 1, 2009
    Posts:
    2,932
  9. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    3,527
    JamesArndt likes this.
  10. JamesArndt

    JamesArndt

    Joined:
    Dec 1, 2009
    Posts:
    2,932
    I posted up the same query on the Unity Reddit as well. So far this is a promising comment: "I'm running Windows 10 and installed the update earlier today. I haven't noticed any decrease in performance and have about ~600 script files, for reference."

    Then again, I have no idea about this person's processor or system specs. It seems that newer-generation processors will feel less of an impact. Also, I have no idea what qualifies as "newer gen". I'm guessing Skylake or newer?
     
  11. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Seems like a conspiracy theory... they publish faulty chips, a "third party" seems to notify everyone of the flaw and the danger, creating fear and motivation, the solution is... buy a new computer.... well... so long as it's got intel inside. Hmm.
     
    JamesArndt likes this.
  12. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    AMD official update: https://www.amd.com/en/corporate/speculative-execution

    Quick summary: there are 3 types of issue:
    1. Rogue Data Cache Load: Cannot affect AMD due to architecture differences.
    2. Branch Target Injection: Not expected to be a problem and has not been shown by tests to affect AMD chips.
    3. Bounds Check Bypass: Can be fixed with an OS patch.
    So even AMD is not fully immune from the bug; the question then is when the OS patches will be released.

    Apparently this or a similar exploit can be used via web browser -> https://blog.mozilla.org/security/2018/01/03/mitigations-landing-new-class-timing-attack/
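    The browser mitigation in that Mozilla post works by reducing the precision of `performance.now()` to 20 microseconds, so scripts can no longer time cache hits vs. misses. A minimal Python sketch of the idea of timer quantization (the 20 µs value is the one Mozilla announced; everything else here is an illustrative construction, not Firefox's actual implementation):

    ```python
    import time

    RESOLUTION = 20e-6  # 20 microseconds, the reduced precision Firefox shipped

    def coarse_now():
        """A timer quantized the way the mitigation degrades performance.now()."""
        t = time.perf_counter()
        return (t // RESOLUTION) * RESOLUTION

    # Two events closer together than the resolution become indistinguishable,
    # which is what breaks the high-precision timing these attacks rely on.
    a = coarse_now()
    b = coarse_now()  # back-to-back calls, typically well under a microsecond apart
    print(b - a)
    ```

    The side channel itself is a timing difference of tens of nanoseconds, so a 20 µs floor on the clock makes it unmeasurable from script without other tricks.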
     
  13. LaneFox

    LaneFox

    Joined:
    Jun 29, 2011
    Posts:
    7,532
    1. Read.
    2. Verify.
    3. Remove tin foil hat.
    With a low level issue like this and the fix expected to (basically) impact processing time by 30% it's kind of just something we have to deal with. I don't think it really impacts us specifically as game devs as much as it does other businesses that heavily rely on the processor speed. I mean, we're more or less just subject to whatever our target hardware is so we'll just have to accommodate the speed adjustment in the future.

    That being said, a lot of businesses that still rely on existing processing speeds may not be affected at all if they do their work on private networks, like some render farms and such. The main target seems to be cloud services, where you can rent access to a server, pull a memory dump and move on to the next server. That, while terrible, does seem at least somewhat niche.
     
  14. Eric5h5

    Eric5h5

    Volunteer Moderator Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,401
    It was already fixed in macOS in the version from a month ago, and didn't slow anything down (to any degree that's actually noticeable for users). Put down the conspiracy theories and back away slowly.

    --Eric
     
    Ryiah, orb and JamesArndt like this.
  15. Tzan

    Tzan

    Joined:
    Apr 5, 2009
    Posts:
    736
    But I just watched the new X-Files tv show last night.
    Conspiracy theories are all I have. :(
     
  16. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    AMD must be partying hard at the moment. Not gonna bother upgrading until my work needs it. There's a lot of drama and so on about this but ultimately it's only going to screw you if you visit dodgy websites and download dodgy cracks etc.
     
    Kronnect likes this.
  17. N1warhead

    N1warhead

    Joined:
    Mar 12, 2014
    Posts:
    3,884
    What? AMD just released the Ryzens (even 16-core Ryzens) for under a grand, within like the last year or so.
     
  18. JamesArndt

    JamesArndt

    Joined:
    Dec 1, 2009
    Posts:
    2,932
    I'm not really one for any of the conspiracy theories out there. Because this seems like a highly technical issue at the processor level, and I keep reading about processor slowdowns of up to 30%, my main concern is what the impact will be on running Unity itself, code compiling, baking lightmaps, etc. (anything processor intensive). Again, I lack understanding of how exactly, or which aspect of CPU performance, this is impacting. I don't dev on a Mac, nor do I intend to, so I am mostly curious how this impacts the PC development environment using Unity.
     
  19. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    3,023
    I personally think the risk of impact and the performance degradation numbers are being overblown. These flaws (Spectre and Meltdown) will be very difficult vulnerabilities for hackers to take advantage of. OS level software mitigation of these flaws will most likely NOT yield a 30% drop in performance. My guess is that we will see a 5-10% drop in performance for some use cases and possibly even a small performance gain in other use cases. And remember that most desktop and mobile computing involves the computer waiting for the user, file system, or network resources. Modern computers spend a lot of time in energy saving states just waiting for other things to happen anyway.
     
    JamesArndt likes this.
  20. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    3,023
    The worst case slowdowns will most likely be in virtualized (cloud) computing environments running on much older hardware.
     
  21. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    3,023
    I read Haswell or newer.
     
  22. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    This will most certainly affect intel's profits.
     
  23. Lars-Steenhoff

    Lars-Steenhoff

    Joined:
    Aug 7, 2007
    Posts:
    3,527
  24. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    3,023
    JamesArndt and Martin_H like this.
  25. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    3,023
    JamesArndt likes this.
  26. Meltdown

    Meltdown

    Joined:
    Oct 13, 2010
    Posts:
    5,822
    Great to know I now have a hardware vulnerability named after me :rolleyes:
     
    theANMATOR2b, Kronnect, Kona and 3 others like this.
  27. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,190
    You're far behind the times. AMD is basically neck-and-neck with Intel right now.
     
    JamesArndt, N1warhead and Meltdown like this.
  28. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,620
    My understanding is that it's a 30% impact in the worst case in synthetic benchmarks? While that means that there's a genuine impact, that doesn't reflect what it'll be in everyday computing tasks, because everyday computing tasks don't find a worst case scenario and constantly hammer it.

    Even assuming that some things do get 30% slower, it'll only be a portion of operations or functionality that's affected, so the overall impact will probably only be a fraction of that for most use cases.

    Also keep in mind that our CPUs aren't at 100% most of the time anyway. So it could be that the impact of this manifests itself in ways other than a noticeable slowdown, though I admit to not knowing a lot about the details of CPU throttling. Certainly it'd be nice to see before / after tests that include more than just benchmark times and framerates - maybe there's a difference in temperature or CPU load during tests? (Edit: Though as someone pointed out in the comments to those benchmarks, game tests are likely to be GPU bound rather than CPU bound, so might not show up any difference anyway.)
     
    Last edited: Jan 5, 2018
    JamesArndt and Ryiah like this.
  29. SnowInChina

    SnowInChina

    Joined:
    Oct 9, 2012
    Posts:
    204
    But the news sounds way more "apocalypse now" if they write 30% instead of "yeah, most users will have a performance hit of 1-2% at most".
     
  30. FMark92

    FMark92

    Joined:
    May 18, 2017
    Posts:
    1,243
    Are you telling me it will fit on an LGA 1155, m8?? Actually, I also made a mistake, because LGA 1155 is Intel proprietary.
     
  31. FMark92

    FMark92

    Joined:
    May 18, 2017
    Posts:
    1,243
    My motherboard isn't.
     
  32. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,190
    You're missing 2,939 pins...

    [Attached images: Threadripper-Chip.jpg, Threadripper-Socket.jpg]

    You're definitely right that your motherboard isn't neck-and-neck. LGA 1155 is ancient now.
     
    Amon and JamesArndt like this.
  33. FMark92

    FMark92

    Joined:
    May 18, 2017
    Posts:
    1,243
    Do I suck at sarcasm that much?
     
    theANMATOR2b likes this.
  34. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    21,190
    Do you want an answer? :p

    Honestly I love linking pictures of the socket because it's just ridiculously oversized for a workstation processor. It's left me wondering what will happen to motherboard design if they continue designing processors of that size. Will we need to go back to slot designs?
     
    Martin_H and FMark92 like this.
  35. DominoM

    DominoM

    Joined:
    Nov 24, 2016
    Posts:
    460
    The Register's coverage of this is more tech-oriented than the mainstream coverage.

    Syscalls are slower with the fixes, so gaming is probably one of the least affected areas, unless you have an SQL backend, where hits of 20% have been mentioned. The worst case I've read is a test mentioned in the LKML KAISER notes, where it looks like syscall performance is halved!

     
  36. FMark92

    FMark92

    Joined:
    May 18, 2017
    Posts:
    1,243
    [Attached image: upload_2018-1-5_10-42-55.png]

    Seriously, though. Yes.
     
  37. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    On the scale of one year it tells a different story.
     
    angrypenguin likes this.
  38. JamesArndt

    JamesArndt

    Joined:
    Dec 1, 2009
    Posts:
    2,932
    Yeah, I was under the impression that the new AMD Ryzen processors were just as awesome as Intel's top of the line, but cheaper.
     
  39. Player7

    Player7

    Joined:
    Oct 21, 2015
    Posts:
    1,533
    Hmm, which is worse... Windows 8/10, Spectre or Meltdown...

    Hmmm, personally, if you care about privacy and data theft, I'd be more concerned running Windows S***ting 10; now that's a vulnerability- and spyware-infested piece of S***-looking OS, with Microsoft's own service agreement being that you allow them total access to all your files, emails, network connections etc. If you think a converged organization is trustworthy, think again. And good luck trying to disable all that infested S***; they practically laced the entire OS with cloud-contacting crap at every level and background service, which is probably what they spent most of their time on from Win7 > Win10, because there is really F*** all improvement to be seen in the surface or functionality of that trash idiocracy-looking OS. Same goes for the Google and Apple gardens, and probably some Linux distros // who the F*** checks all the source anyway.

    So if your OS/tech industry is like that, are consumers really that bothered about CPU hardware being even more crippled and leaving them exposed to high-level attacks aimed at stealing information? This whole media breakdown of the issue seems to be more of a problem for big business, corporate networks, cloud hosting etc. than for end users, who have already been shafted and made vulnerable multiple times by all these sociopaths at these converged-agenda companies, who haven't really given a S*** about security and privacy for consumers in the actions they've taken for years now. Their idea of security and privacy is: trust us, let us screw you and gather data on your privacy, while telling you how we are protecting you from scheming, lying criminals not as big as us.

    And Intel Skylake practically had more hardware backdoor S*** in it than actual end-user-improving features, let alone speed improvements over previous-gen chips. This is a company that is hell-bent on not really fixing design flaws and is a known anti-consumer monopoly, really. Consumers should have been hoping this vulnerability did end up reducing performance by 30% on existing CPUs, lol... that would be a great incentive for the lazy consumer to push a class-action lawsuit against Intel (who really deserve it on other grounds already; probably throw some other tech giant corps in the mix for good measure), whose own CEO has already cashed out. I guess he wasn't optimistic about the future.

    Still, I guess if new CPUs out this year are hardware-"fixed" for these problems, then this news must have come just in time to spread the good news of how business and tech consumers should look at upgrading their older CPUs later this year... or maybe they'll just be like, don't give a S***.

    Anyway, Google really are on the ball with finding these security/data-theft vulnerabilities, aren't they? That "don't be evil" old PR really shows in the good work they keep doing :p I'm sure they don't? No, of course not :)
     
    Martin_H likes this.
  40. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    AMD wouldn't be challenging intel unless it had a decade mapped out for the fight as it's more profitable to be the best second unless you're absolutely sure of winning. AMD seems to be convinced it's worth putting up a fight right now. This means they are set to come out on top OR they've misjudged intel's 10 year plan.

    Because it's very expensive OR very profitable to be market leader, depending on internal state and competition.

    As AMD historically up until 2017 deliberately played second fiddle, it's clear their intention is to either push intel's hand or more likely, they think their hardware is in a place to make a good go of being market leader.

    General-purpose computing is on a downward trend as well, thanks to GPU programs. It looks to me like AMD knows this and wants to fire all its CISC-based guns ASAP, which hints at big changes in 10-15 years' time.

    There's supporting evidence from Apple with its shift from Intel and AMD toward bespoke ARM architectures + GPU computing.
     
    Ryiah and Martin_H like this.
  41. Kronnect

    Kronnect

    Joined:
    Nov 16, 2014
    Posts:
    2,905
    From what I’ve read, some web services and applications running on AWS are already experiencing a degradation in performance. I wonder if Cloud Build will be affected as well.
     
  42. Lockethane

    Lockethane

    Joined:
    Sep 15, 2013
    Posts:
    114
  43. ShilohGames

    ShilohGames

    Joined:
    Mar 24, 2014
    Posts:
    3,023
    Definitely. And this also underscores the fact that cloud services like AWS might not be the best solution for hosting game servers. Virtualization always added a layer of performance loss, but now the amount of performance loss in virtualized solutions might be unacceptable for networked games. It might be time for big games (like PUBG, Fortnite, etc) to move to dedicated servers instead of virtualized servers.
     
  44. Max_Bol

    Max_Bol

    Joined:
    May 12, 2014
    Posts:
    168
    People who think "AMD must be partying" haven't read the whole news yet.
    This is a case known under 2 names: Meltdown and Spectre.

    Meltdown only affects Intel, as it basically uses even less "complex" data than Spectre, due to the signatures it leaves.

    Spectre affects every processor in the world, including AMD, Intel and anything related to them. The only ones that are "secured" are those that already include the fix, such as "some" of the military pre-encrypted processors. (This is one of the reasons why some encrypted military hardware is slower than its civilian counterpart.)
    It affects consoles, PCs, tablets, smartphones, etc. It's not a software issue, but a hardware issue. You could consider it a physical backdoor to all the data that goes on in your device.

    It's funny how it comes out today like news of the end of the world, like "nobody knew it was this easy", since I've known about this since I first learned what a processor is. In fact, most "spying" agencies around the world have already used this backdoor plenty of times. It's a real "hole through all security measures", like an unsecured dog flap at the bottom of a fortified steel door. It's actually been a "known" issue since the 80's, but what let it "pass" is that it was thought that nobody could understand what the fetched data meant, and that it would take millions of years to decrypt it efficiently.

    What made the news is that a company was able to decrypt sensitive data from a couple of cloud servers using state-of-the-art security measures by exploiting this "weak point", and now everyone is crying bloody murder and trying to find where the >20-year-old problem lies.

    Basically, the processor has a physical log that is created whenever it does a task on any of its threads. The log lasts only a microsecond, until the thread is used for something else. Still, if you want, you can record this thread activity. It's raw binary code, by the way, so it's not exactly "easy-to-read" data.

    The security risk depends on what's going on in the processor. If the process handled by the processor is pre-encrypted, or always involves encryption whenever it's done, there's no problem.

    The way the processor "logs" things is simple to explain: it logs results, but not the means used to reach those results.
    Let's put this as a maths example. While it's not exactly like that in the processor, it lets anyone understand why and how this is happening.

    Let's say you ask a processor to calculate this: 100-(10*5)+13. It will return 63 as a result. It will log 63, but not the 100-(10*5)+13 part. If you force a copy of the processor's log (which can be done by inserting hidden binary code within any kind of file being read), anybody can get a copy of the processor activity returning 63.
    It's not like they could check the entire past history of the processor, though. If a thread is used for something else after logging 63, then the 63 will be replaced by something else.

    The issue at hand is that most of the security measures used by today's devices use 32-bit, 64-bit or even 128-bit encryption. That's not "possible" when you run things in raw binary, which means the encryption is done through the processor after it has already logged the earlier, non-encrypted data. Record that activity and you get the binary code that represents the non-encrypted data before it gets encrypted. That's why it's a "hole" in most security on any processor, Intel's and AMD's alike, and why it has existed for years (ever since the first 8-bit encryption used in software).

    This is a copy/paste of a post I wrote on the Fortnite forum explaining the current fix being applied everywhere.

    The fix is basically:

    Before: (RAW_DATA) > Processor > ENCRYPT_DATA > Processor > STORE_DATA > Processor, then when requested, DECRYPT_DATA > Processor > CHECK_DATA > Processor > ERASE_DATA > Processor.

    Now (fix): (RAW_DATA + ENCRYPT_DATA) > Processor > STORE_ENCRYPTED_DATA > Processor, then when requested, (DECRYPT_DATA + CHECK_DATA) > Processor > ERASE_DATA_FROM_RAM > Processor.

    With the fix, the processor isn't used as much, but each time, the calculated data is bigger.
    (If you don't get why it's slower, it's similar to how we think when we calculate. Which is faster?
    (200+25) / 5
    or
    (A*20) + (B*20/10), where A is 0.5 and B is 17.5, which is confirmed in the (DECRYPT_DATA + CHECK_DATA) request and is not logged.
    In both cases the answer is 45, but before, someone who has access to the raw data and can reverse-engineer it will immediately know it's 45, while someone who gets (A*20) + (B*20/10) won't know what to do with it, because he/she lacks A and B, which are only "known" during CHECK_DATA and not logged, as they are part of the process request and not the results; CHECK_DATA returns only True or False (1 or 0).)

    This is how secure military hardware already does it: usually by having either 2 processors in a loop, where one processor's sole task is to encrypt and decrypt data while the 2nd does the "usual" processor's job, or by having the processor divided into 2 separate series of threads, which halves the processor's capacity but "scrambles" the data between the halves.

    The 2 main issues with this are the following:
    • If you get a copy of the processor's log, you can basically know every keystroke made by the user, provided you can reverse-engineer the data from its binary format. While slightly different, it's the same with digital touchscreen keyboards. The only "security" here is that, to reverse-engineer the data into readable form, you have to know how the data was originally being used. Thing is, keyboards (digital and physical) don't vary much in their "usage", as they all use the same technology and signals, hence why it's "easier" to recover anything the user typed if you create a copy of the processor's log.

    • If you use a well-known type of infrastructure for a piece of software and someone gains an understanding of it, that someone, if they have access to the logs of the processors used in the infrastructure, can basically read every bit of data extracted from it. This is why companies owning cloud servers are in a panic right now: they all use similar technologies and infrastructure (bought from someone else most of the time), and all their investment in security can be countered regardless of how much money they invest, until a different kind of processor is released with integrated hard-wired encryption. Banks and even the public share markets are also quite in a bind, since their online servers are now mostly cloud-based.

    By having access to that data, there's a risk that, if it's successfully reverse-engineered, someone could have the master keys to "their" kingdom. The chances are super low, but that's generally what's going on. You might think, "Who could decrypt such complex data?" Considering the "issues" at hand, we're talking about possible access to all the data about all the money in a country's banks, or entry codes for anything that passes through 99% of the devices in the world that use Intel, AMD and some custom processors.

    In fact, due to this "news", companies owning "supercomputers" will soon (if not already) get pestered (again), because they have hardware that "could be used" to reverse-engineer the binary data, and they sell their "computing" services to the public.

    So, how is this affecting Unity? To be honest, unless you have a really successful game with millions of players constantly investing lots of money in the game, it's not much of an issue; it has already existed for over 20 years. As for the impact of Meltdown on Intel CPUs, it's only a matter of time before it's fixed.

    For now, it's like duct tape on a cracked hourglass, but once they replace the glass with something more appropriate, it will be relatively forgotten.

    One thing you can be certain of is that hardware companies and shops will take advantage of this: as soon as a "solution" is in stores, you'll see lots of publicity about how your current hardware isn't secure and how you can be robbed of everything you have because of your PC's old processor. Intel is now in a "drop" due to their chips also being affected by Meltdown, but as soon as they release a brand new processor without the flaws, you can be certain they will rise back up.
     
    Animallica, FMark92 and Martin_H like this.