
When Moorse Law Ends?!

Discussion in 'General Discussion' started by Arowx, Dec 15, 2014.

  1. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    All my life we have had computers based on chips that get faster and smaller every year, but there is a physical limit to how small the transistors on a chip can go.

    We have managed to get down to about the 20nm scale and Intel is pushing for 16nm, but the road ends around 10nm, where the parts become too small to carry an electrical current without burning out, or the error rate becomes too high for reliable computation.

    I'm a software developer, and it has been great to think that even if I write reliable but slow code, next year's faster hardware would make it reliable fast code. ;)

    So could this decade or the next sound the death knell for Moore's Law, the cornucopia of the IT, games, graphics and software industries?

    But what will this mean for the games, mobile and graphics industry as a whole?

    A: 28nm features stacked 2 high gives you 14nm per transistor, or stacked 4 high gives you 7nm per transistor, so if you can stack them 8 high you would in effect have 3.5nm transistors. So we can circumvent the 5nm barrier even with 28nm technology; we just need to stack up or layer transistors. Doh! Nice little brain teaser.
     
    Last edited: Dec 16, 2014
    aaronhm77 and Ony like this.
  2. 3agle

    3agle

    Joined:
    Jul 9, 2012
    Posts:
    508
    1. it's Moore's Law.
    2. It already ended technically.
    3. It won't, and didn't, mean anything.

    As for games, they already under-utilise hardware; it'd be wise to simply make multi-threading a more central part of your development process (though that has been the case for a long time, regardless of this theory).
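
    A minimal sketch of the kind of thing I mean, in plain C# (nothing Unity-specific; the workload and item count are made up just to show the pattern):

    Code (csharp):
    using System;
    using System.Diagnostics;
    using System.Threading.Tasks;

    class ParallelSketch
    {
        // Arbitrary per-item work standing in for per-entity game logic.
        static double Work(int i)
        {
            double v = i;
            for (int k = 0; k < 1000; k++)
                v = Math.Sqrt(v + k);
            return v;
        }

        static void Main()
        {
            const int items = 1000000;
            var results = new double[items];

            var sw = Stopwatch.StartNew();
            for (int i = 0; i < items; i++)
                results[i] = Work(i);
            Console.WriteLine("Single-threaded: " + sw.ElapsedMilliseconds + " ms");

            sw.Restart();
            Parallel.For(0, items, i => { results[i] = Work(i); });
            Console.WriteLine("Parallel.For:    " + sw.ElapsedMilliseconds + " ms on " +
                              Environment.ProcessorCount + " cores");
        }
    }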

    I don't see the need to make it seem so melodramatic, especially considering it has already happened.

    Nothing more to say than Wikipedia already covers, so I may as well link it. http://en.wikipedia.org/wiki/Moore's_law
     
    hippocoder likes this.
  3. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Think about it: every year or two you can get a faster mobile phone, CPU or graphics card. OK, every 6+ years for consoles, but we have had an era of increasing graphics and processing power with which to make better games.



    Well, there are limits to multi-threading, as you are limited by the number of cores on the chip, which is in turn restricted by Moore's Law. You can go multi-chip, but then you run into inter-chip bandwidth and memory bandwidth limitations.
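
    Amdahl's law puts a number on that limit: if only a fraction p of your frame is parallelisable, the best speedup on n cores is 1 / ((1 - p) + p / n), no matter how many cores you throw at it. A rough C# illustration (the p values are just examples):

    Code (csharp):
    using System;

    class AmdahlSketch
    {
        // Amdahl's law: best-case speedup on n cores when a fraction p of the work is parallel.
        static double Speedup(double p, int n)
        {
            return 1.0 / ((1.0 - p) + p / n);
        }

        static void Main()
        {
            double[] parallelFractions = { 0.5, 0.9, 0.99 }; // illustrative values only
            int[] coreCounts = { 2, 4, 8, 16, 64 };

            foreach (double p in parallelFractions)
                foreach (int n in coreCounts)
                    Console.WriteLine("p=" + p + " cores=" + n +
                                      " -> speedup " + Speedup(p, n).ToString("F2") + "x");
        }
    }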

    As for under-utilised hardware, I would disagree: a lot of AAA games push the hardware to its limits. Take the new Mantle API and the Frostbite engine used in Battlefield 4, for example.

    Maybe you don't understand the games industry: every year it needs to produce a newer, better game to keep its business going. If the hardware stops advancing, the industry can't just add more art, graphics and complexity to the next game.

    The next iPhone could be no faster or smarter than the previous one; they'd just start alternating the case's bevel from flat to smooth.
     
  4. N1warhead

    N1warhead

    Joined:
    Mar 12, 2014
    Posts:
    3,884
    Didn't realize how flat her chest was back in the day LOL LOL LOL.
     
    ManAmazin likes this.
  5. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Exactly, you have Moore's Law to thank for those improved curves. And if Moore's Law had ended in 1996, Lara would have stayed that way.
     
  6. N1warhead

    N1warhead

    Joined:
    Mar 12, 2014
    Posts:
    3,884
    Hehe yeah :p....
     
  7. GarBenjamin

    GarBenjamin

    Joined:
    Dec 26, 2013
    Posts:
    7,441
    I agree that way too many developers rely on more and more powerful hardware. I never agreed that they should do that. Personally, I always optimize my projects not only for my current hardware (which is usually behind the times, such as my 6-year-old dev laptop) but for even less powerful machines. It's always good to be efficient. I have that ingrained in my head as a habit, probably because I started dev on machines that had 16 KB of RAM and CPUs that maxed out at 1 MHz.

    Innovation needs to come from the game (or whatever software) developers. Stop relying on "gee whiz" FX and focus on game play. CPUs are so incredibly fast today devs should be able to do anything they need to make an excellent game. The problem is every time hardware advances instead of them focusing on packing in more game play, better AI, populating the game worlds with more objects that can be interacted with, filling the cities in RPGs with semi-intelligent NPCs... what do they do? They waste all of the increased power focusing on graphics. "We must show reflections when the player walks by mud puddles!", "yes we are still only putting 7 dumb NPCs into the main city but they will look better than any other NPCs in other games". That is what has been happening for the past 20 years.

    Fortunately, there are many indie devs who have been focusing more on putting the game back into games instead of focusing on visual advances to the detriment of everything else.

    So... it will be fine. I hope the hardware does stop advancing. If it does we will probably see some of the greatest games ever.
     
    Kiwasi and Deleted User like this.
  8. N1warhead

    N1warhead

    Joined:
    Mar 12, 2014
    Posts:
    3,884

    Hey now, I don't think my new Toon Shaders are anything that defines *graphics*. I made them in hopes of getting a better look than a plain cartoon shader with no depth of field to it lol.

    (Which I did update the Asset Store Forum if you want to check out the new screen shots).
    :p


    But I know what you mean hahaaha, I totally agree, people only focus on graphics and forget everything else.
     
    aaronhm77 and GarBenjamin like this.
  9. GarBenjamin

    GarBenjamin

    Joined:
    Dec 26, 2013
    Posts:
    7,441
    You innovated in software! I doubt your work is something that will cause a major bottleneck due to the graphics. That kind of thing is great! If your system is sucking up 50% of all frame time then boooo hsssss shame. Lol

    But yeah seriously. Look at things like Raid on Bungeling Bay and Sim City. I doubt many modern games (especially mobile games) even have logic that advanced. Modern RPGs are 3D and you can see the veins on leaves but are they any deeper than Phantasie 3 and Questron 2 from nearly 30 years ago? Sure they definitely look and sound better. They seem a lot more like movies. I just find it amazing how much presentation has changed over the years while the actual game AI, depth and such has progressed at maybe 1/20th of the speed.
     
  10. Gigiwoo

    Gigiwoo

    Joined:
    Mar 16, 2011
    Posts:
    2,981
    My wife waited for a bigger phone. A few weeks ago, I got her the Note 4. It has a quad-core, 2.7 GHz processor with 3 GB of RAM and a 2560x1440 display. It's more powerful than the gaming rigs we bought 4 years ago, and it fits in her pocket! The technical definition of Moore's Law is no longer being met; however, the spirit of the law is alive and well.

    We live in wonderful times.

    Gigi
     
  11. N1warhead

    N1warhead

    Joined:
    Mar 12, 2014
    Posts:
    3,884
    I totally agree!
     
  12. Grimwolf

    Grimwolf

    Joined:
    Oct 12, 2013
    Posts:
    296
    Just because they can't keep making them smaller and smaller doesn't mean progress will halt and computers won't become stronger.
    At worst it just means you won't be playing Call of Duty on a wristwatch, which is impractical anyway.
    It's not like the current method of building them is the only possible one. New tech could easily end up removing such limitations at some point.
     
    aaronhm77 likes this.
  13. 3agle

    3agle

    Joined:
    Jul 9, 2012
    Posts:
    508
    Sure, there are very few games that fully utilise modern hardware; I'm not sure what you are trying to say there. It has nothing to do with Moore's Law though, so I'll leave it alone. Unity in its current state is very difficult to get to utilise much of modern hardware (thankfully 5.x fixes this somewhat).

    Just because Moore's Law isn't being kept to doesn't mean hardware will stop advancing...
    My comment about multi-threading relates to the fact that once hardware hits its size limits, it's likely to diversify its architecture, resulting in cores for specific purposes, similar to what GPU architecture does today (I have no source for this prediction, but it does make sense).

    Hardware won't stop advancing; it will simply advance in different directions (something it has in fact already started doing). Remember, Moore's Law was formulated at a time when mobile computing was a distant dream and power consumption was not considered important in design; it doesn't really apply in this modern era.
    This is all assuming we will still be using silicon-based architectures in 50 years, which is naive really.
     
  14. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Actually, have you noticed that smartphones are getting bigger now? OK, apparently it's the display, but I wonder how much is due to the slowing of Moore's Law and battery technology trying to catch up.
     
    Gigiwoo likes this.
  15. Gigiwoo

    Gigiwoo

    Joined:
    Mar 16, 2011
    Posts:
    2,981
    Lots of factors. And in the end, the hardware is still amazingly, blisteringly, completely mind-blowingly powerful. And it fits in my pocket and runs full-screen videos and games for ~8 hours! A long, long way from my pocket football game at age 8.

    Code (csharp):
    -
              -              +
              -
    L, L, Up, L, L, Down, L - Score!
    Gigi

    PS - Stupid code formatting .... sigh.
     
  16. N1warhead

    N1warhead

    Joined:
    Mar 12, 2014
    Posts:
    3,884

    I have the LG G3 and I have to say, omg, it's amazingly fast too! It's equivalent to the laptop I use,

    which has a 2.4 GHz dual-core Core i3,
    4 GB of RAM, and
    a 1 GB graphics card.

    So that's saying something about phones.
    The Note has a slightly faster processor than my phone, but I got the 4K resolution on mine :p
    Which does make things crisper, but aside from that it's virtually useless, I think.
     
  17. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    But there has to be a benefit to the customer. If your tweaked chip is 50% faster then you can probably sell it, but what if you only get a 10% improvement over an existing CPU? Would you want to spend millions on building it? Could you convince Intel or AMD to make it on such slim margins?

    Once we bottom out on Moore's Law there is one option left: larger chips. Chips can expand in area and, with improvements and advancements in 3D stacking of chip components, maybe in volume.

    But even non-silicon hardware, if it uses electrons, will have similar physical limitations; maybe photonics (chips that compute using light) or quantum computing could help. The thing is, there is nothing at the moment that can do what silicon does, and even if there are alternative options they could take decades to mature and catch up with silicon.
     
  18. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,071
    Yet that is the difference between some chip models.
     
  19. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    A lot of the time chips are built to the highest quality the fab can manage, and then the performance is adjusted via control circuitry; hence some chips are renowned for being overclockable, simply because the apparently different models are in fact made to the same standard but only tested to the level they are sold at.

    In effect they market chips at reduced capacity; in some cases they have even added control circuitry to turn off features.

    This supports their premium-priced products. It's a bit like a hotel charging different rates for the same size of room, just dressed slightly differently.
     
  20. 3agle

    3agle

    Joined:
    Jul 9, 2012
    Posts:
    508
    It's not about benefit to the customer if you come at it from the manufacturer's perspective; it's about how they market it. The generational gaps between CPUs these days are already incremental in performance terms, and it's been that way for a while.

    You say it could take decades, but does that matter? As stated before, it's not as though improvements won't happen; they just won't happen at the rate Moore predicted. It could very well be that we move on to a material/method that ends up improving faster than Moore's Law predicted; we don't know.

    Also, are larger chips a bad thing? It seems like you're holding Moore's Law up as this ideal situation to be in, which it isn't; it's simply an outdated method of predicting trends. Larger chips aren't necessarily a terrible idea: desktops and servers would be fine with a change that made chips larger but faster/more efficient. And like you said yourself, mobiles are getting bigger anyway, so why not fill some of the space with more CPU? ;)
     
  21. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Well, there might be a few problems with larger chips: heat, power and yield (the number of chips that can be made from a slice of silicon).

    With Moore's Law you got faster, smaller, cheaper, lower-power chips. A win, win, win, win situation.
     
  22. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Hmm, probably not. It looks like it on paper, but due to heat and battery conservation it's probably much lower in performance than your laptop. For instance, your phone will throttle its CPU to keep the heat level down and to maintain battery life. But prove me wrong: write a benchmark that tests the computing power of your phone, then build it for both your phone and your laptop and compare the results.

    Or find an existing one like Geekbench that you can run on both your PC and smartphone.
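
    Something as crude as this would do as a first pass; it's plain C#, so the same source builds for both targets (the workload and iteration count are arbitrary):

    Code (csharp):
    using System;
    using System.Diagnostics;

    class TinyBenchmark
    {
        static void Main()
        {
            const int iterations = 20000000; // arbitrary, just big enough to take a second or two
            double acc = 0;

            var sw = Stopwatch.StartNew();
            for (int i = 1; i <= iterations; i++)
                acc += Math.Sin(i) * Math.Sqrt(i);   // some floating-point busy work
            sw.Stop();

            // Compare the elapsed time (or iterations per millisecond) between devices.
            Console.WriteLine("Result " + acc + " in " + sw.ElapsedMilliseconds + " ms");
            Console.WriteLine("~" + (iterations / Math.Max(1, sw.ElapsedMilliseconds)) + " iterations per ms");
        }
    }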
     
    Last edited: Dec 15, 2014
  23. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    There are strategies other than just making stuff smaller. For example, adding more CPU cores, parallelization, and dedicated hardware for special jobs (GPUs etc.).
     
  24. N1warhead

    N1warhead

    Joined:
    Mar 12, 2014
    Posts:
    3,884
    Well, I built a 20,000 x 20,000 terrain filled with trees, tons of grass and 20K-poly rocks everywhere,
    and also used UniStorm with all the dynamic shadows, sun, storms, etc.

    On my computer I was getting around 200 FPS and on my phone around 160 (so close enough to compare lol).
     
  25. N1warhead

    N1warhead

    Joined:
    Mar 12, 2014
    Posts:
    3,884
    And I used parallax mapping and non-mobile shaders for everything.
     
  26. 3agle

    3agle

    Joined:
    Jul 9, 2012
    Posts:
    508
    I agree; however, it's worth remembering Moore's Law isn't a solution, it's a prediction model, and one that no longer works :p

    Heat may actually be less of a factor depending on how you design the chip: if you don't have size as a limitation, you can spread the channels over a larger area and get better heat dissipation. I'd hazard the same for power, too. I'm not a hardware engineer, though.
    Yield is certainly more of a concern, but chips have been far larger than they are today; it doesn't seem a stretch to think that with modern processor technology, larger chips could be made to be more efficient than they once were.
    I think the biggest force preventing that from happening would be the mobile market, where smaller chips and lower power consumption have been the driving factors up to now.
     
  27. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Actually, look at the GHz race: it happened, then stalled, and when it stalled multi-core chips started to appear.

     
  28. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Ahh, but these are not the older large-feature chips; they would be larger areas of smaller-feature chips. So in effect, if there are impurities or defects in the silicon (that might have been too small to affect older chips) or errors in manufacturing, your yield will drop further, as you are producing fewer dies per wafer.
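
    To put rough numbers on why bigger dies hurt yield: a standard first-order Poisson model says the fraction of good dies is about e^(-area x defect density), so yield falls off fast as die area grows. A quick sketch (the defect density and die sizes are made-up illustrative figures):

    Code (csharp):
    using System;

    class YieldDemo
    {
        // First-order Poisson yield model: fraction of good dies = exp(-area * defectDensity).
        static double Yield(double dieAreaCm2, double defectsPerCm2)
        {
            return Math.Exp(-dieAreaCm2 * defectsPerCm2);
        }

        static void Main()
        {
            const double defectsPerCm2 = 0.5;              // illustrative defect density
            double[] dieAreasCm2 = { 1.0, 2.0, 4.0, 8.0 }; // illustrative die sizes

            foreach (double area in dieAreasCm2)
                Console.WriteLine(area + " cm^2 die -> " +
                                  (Yield(area, defectsPerCm2) * 100).ToString("F1") + "% good dies");
        }
    }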
     
  29. 3agle

    3agle

    Joined:
    Jul 9, 2012
    Posts:
    508
    Yeah that's a good point, all the more reason to move onto non-silicon tech! :p
     
  30. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Or you could start learning assembly language or parallel programming to get the best out of a future of larger multi-core chips that won't be any faster.
     
  31. Ony

    Ony

    Joined:
    Apr 26, 2009
    Posts:
    1,973
    The Age of Spiritual Machines
     
    aaronhm77 likes this.
  32. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194

    If Moore's Law stops in 2020 we don't even get human-level intelligence; we get a computer smarter than a mouse.

    Also, you should notice that Kurzweil does not compare the power/energy needed to run a mouse-brained computer with that of the mouse.

    The best I have heard of was a supercomputer simulating an area of about 10% of the human brain. But what about the power used!
     
    Last edited: Dec 16, 2014
    Ony likes this.
  33. tswalk

    tswalk

    Joined:
    Jul 27, 2013
    Posts:
    1,109
    This will all be moot if they can figure out how to harness and control quantum computing...
     
  34. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Decoherence.

     
    Last edited: Dec 15, 2014
    tswalk likes this.
  35. Tomnnn

    Tomnnn

    Joined:
    May 23, 2013
    Posts:
    4,148
    Someone beat me to it lol. Moore's law will essentially reset and the limits will be pushed wayyyyy back once we move from bits to qubits. With cables that transmit light instead of electricity for internet, we already have the potential to send more than bits over a network, but we can't really make use of that on our machines. When quantum computing is portable and affordable, we'll have a leap in processing power that will probably satisfy our needs all the way up to the point where we transmit data by manipulating space.
     
  36. angrypenguin

    angrypenguin

    Joined:
    Dec 29, 2011
    Posts:
    15,500
    Does there? Apple sells new phones based on being faster and more powerful, but most people aren't downloading new apps to warrant having that extra power. (That's two different links, by the way.)

    I have a 7 year old desktop PC at home that's still in active use, as a gaming machine no less. There are workstations here still in active use which are of a similar age, too.

    You really only need new computing power for relatively niche stuff these days. As developers we here see the advantage because it helps us be more productive, and even there the actual power is typically only needed in short bursts. As gamers we see it because developers are competing to make the coolest stuff. For the majority of everyday users, though? A 10 year old PC is probably more than sufficient for most. Things are quite fast enough. There's probably more advantage in making them more efficient, instead.
     
    Kiwasi, Ony and GarBenjamin like this.
  37. ippdev

    ippdev

    Joined:
    Feb 7, 2010
    Posts:
    3,789
    One word..one layer..Graphene.
     
    VIC20 and Dantus like this.
  38. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Quantum Computers + Enlighten... can't wait.
     
  39. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,071
    Not to mention those situations where it is more than simply skipping tests. Some lower tier models were actually made for higher tiers but may have been part of a batch that failed to pass certain tests.

    AMD's Phenom II series is a good example of this. Some of their quad-core batches had a partially defective core, and rather than throw the batches away they chose to simply switch off the core. The result was a slightly lower-priced triple-core model.

    This isn't limited to chip manufacturers either. I've bought Western Digital drives that were marked as desktop drives but were actually made originally for enterprise and simply failed to run properly at the higher spin speeds.

    It isn't even necessarily circuitry that is added, but simply switches in the chip's firmware. I've known of situations where these firmware switches can be overridden at the risk of instability.

    It isn't so much about supporting premium priced products as simply cutting your losses as much as possible.
     
  40. TylerPerry

    TylerPerry

    Joined:
    May 29, 2011
    Posts:
    5,577
    Y'all are overreacting.
     
    angrypenguin and zombiegorilla like this.
  41. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    OK, think about it this way: Unity 4 had some nice features, but Unity 5 has PBR, better audio and newer, faster PhysX.

    Now to get those new features you need better hardware to run better software.

    And yes, you can optimise and improve software, but that takes time and money, and you get diminishing returns, as opposed to the "free" ride that is Moore's Law.

    Come on, some of you must have thought, "Damn, I'd love to write this <insert amazing complex game idea>", but when you do some prototyping you find that the hardware/software is not up to that level of complexity. Or, in Unity, too many things flying about with real physics bullets. So you might back off the idea, but you're thinking: maybe this year's PC/mobile/tablet version of Unity can't run it, but give it a couple of years!

    Now imagine the amazing consumer engine that is the IT juggernaut: Microsoft, Intel, AMD, Nvidia, Samsung, Apple and the rest. What happens when the new phone/pad/GPU/CPU is no faster than last year's?

    It would end up like the car and fashion industries: they would bring out trendy new models, but the engines/materials would be the same as last year's model.

    They would probably diversify, e.g. dedicated gaming/VR PCs morphing into water-cooled, multi-core, multi-chip monsters for the latest version of Battlefield/COD. Super-smart mobile phones that aren't really that smart but phone up a cloud version of HAL for advice... oh, wait a minute, that's Siri!

    On the plus side, second-hand but high-end CPUs/GPUs should start holding their value better.
     
  42. 3agle

    3agle

    Joined:
    Jul 9, 2012
    Posts:
    508
    I think you're overstating an issue that doesn't even exist yet, and may never exist.

    I can't say I've ever thought of a game idea that can't be done due to hardware limitations. Can't be done due to my intelligence, perhaps, but hardware, no. In the field of games, everything is a shortcut and an approximation; you are never going to have a full simulation. Sure, better hardware means a more accurate simulation, but it shouldn't mean an idea is impossible, simply that it can't run at the accuracy you'd like. It's called budgeting.
     
    GarBenjamin likes this.
  43. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,071
    The impression I've been getting, from comments made by those who have Unity 5, is that there isn't much of a difference in resource usage. Unless you're using very low end hardware it shouldn't require upgrading.

    Interviews with the creator of Dwarf Fortress can be an interesting read for this reason. It is amazing the number of ways that supposedly complex games take shortcuts in order to facilitate the features they have.
     
    Last edited: Dec 16, 2014
  44. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    @3agle @Ryiah I understand your points of view, and yes, game development is often sleight of hand, not simulation. And a lot of the industry is not AAA; only the 'hard-core' gamer is in that category. But this is not just about AAA games and amazing physics: all IT sectors will be impacted by this. The mobile phone industry, the console industry, the bedrock of the games industry: the hardware will stop getting cheaper and faster.

    Intel is leading the way, with 10nm technology in the works and 7nm in the pipeline (around 2018). But 7nm is about the physical limit they can go down to. They might even make it to 5nm, but that's the end of the line for silicon as we know it.

    So things will start getting more expensive: that new phone, if you need a faster one, will need a better battery and will cost more.

    You are right that we can still make great games on old hardware, but the advances we have seen in graphics and physics will only be possible if people are willing to pay more for them.

    We are just at the beginning of a VR revolution, with the potential of super-high-resolution HMDs. But running VR at those resolutions requires more processing power, and to give everyone VR we need that power to be cheap. Will we make it, or will the end of the road for Moore's Law prevent this new technology from being all it can be?
     
  45. Ryiah

    Ryiah

    Joined:
    Oct 11, 2012
    Posts:
    20,071
    Which is why companies are trying to find alternative materials such as graphene. I do think you're making a bigger deal out of this than it truly deserves. I don't doubt Intel will find an alternative path to continue improving their chips. It may simply be a more optimized core or we may progress back to more specialized hardware. We'll simply have to wait and see.
     
  46. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Well, we are still some way from the limit yet: GPUs are still using 28nm technology, and Intel has made it down to 16nm and is pushing for 10nm, so we can still expect some amazingly more powerful GPUs and mobile CPUs, at least over the next few years.

    They can also start stacking the transistors so that the chips stay small but are built up layer by layer; this is already happening in the flash memory sector. Imagine a stacked Intel CPU with 8, 16 or 32 cores, or a GPU that is actually multiple GPUs stacked together. We could reach 8K+ VR screen resolutions with a stack of 4 or 16 Nvidia or Radeon GPUs on a single chip. Might need some liquid nitrogen to keep it cool, though.
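
    Back-of-the-envelope on the fill-rate side, assuming "8K VR" means a 7680x4320 panel per eye at a 90 Hz refresh (purely illustrative figures):

    Code (csharp):
    using System;

    class VrPixelBudget
    {
        static void Main()
        {
            // Illustrative figures only.
            long desktop1080p60 = 1920L * 1080 * 60;      // pixels per second at 1080p, 60 Hz
            long vr8kPerEye90   = 7680L * 4320 * 2 * 90;  // two 8K eyes at 90 Hz

            Console.WriteLine("1080p @ 60 Hz : " + desktop1080p60 / 1e6 + " Mpixels/s");
            Console.WriteLine("8K x2 @ 90 Hz : " + vr8kPerEye90 / 1e6 + " Mpixels/s");
            Console.WriteLine("Ratio         : " + (double)vr8kPerEye90 / desktop1080p60 + "x the fill rate");
        }
    }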

    Hopefully they will be powerful enough and cheap enough to bring us a 'Better Than Reality' VR experience! ;0)

    LOL, what if you could get a future dedicated Minecraft voxel GPU that is a big cube of transistors representing the voxels in the game? ;0)
     
    Last edited: Dec 16, 2014
    Cogent likes this.
  47. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    And what about adaptable/programmable hardware? With memristors we could end up with our general-purpose CPUs having cores that use transistors and memristors so they can be configured to run programs in hardware. At the moment, mobile CPUs often use a system-on-a-chip configuration, with sections of the silicon set aside to decode video, for instance. With memristors and transistors they could have re-configurable cores that on demand change from video decoding to banking encryption.

    So precious transistor space could be re-usable and adaptable, with the potential benefit of being able to run programs faster in hardware.
     
  48. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    When moorse law ends, we'll have to use something other than moorse code.
     
    darkhog, Gigiwoo, Ony and 2 others like this.
  49. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Come on Hippocoder, this could be the end of the IT era. Haven't you got anything more thoughtful to say?!
     
  50. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Q: If we can't get chips with transistors smaller than 5nm due to natural/physical limitations, then Moore's Law ends. What impact will that have on the IT industry?

    A: 28nm features stacked 2 high gives you 14nm per transistor, or stacked 4 high gives you 7nm per transistor, so if you can stack them 8 high you would in effect have 3.5nm transistors. So we can circumvent the 5nm barrier even with 28nm technology; we just need to stack up or layer transistors. Doh! Nice little brain teaser I gave myself. Let's just hope those bright sparks at Intel and AMD aren't as silly as I am.

    These Stanford types have it stacked
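
    For what it's worth, here is that stacking arithmetic written out; it treats the gain as transistors per unit of die area (the 28nm figure and the layer counts are the ones from the answer above):

    Code (csharp):
    using System;

    class StackingDemo
    {
        static void Main()
        {
            const double featureNm = 28.0;   // planar feature size from the post above
            int[] layerCounts = { 1, 2, 4, 8 };

            foreach (int layers in layerCounts)
            {
                // Stacking multiplies transistors per unit of die area rather than
                // shrinking the transistor itself; the "effective" figure here is
                // simply the per-transistor footprint divided by the layer count,
                // as in the answer above.
                double effectiveNm = featureNm / layers;
                Console.WriteLine(layers + " layer(s) of " + featureNm + "nm -> ~" +
                                  effectiveNm + "nm effective per-transistor footprint");
            }
        }
    }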
     
    Last edited: Dec 16, 2014