
When Moore's Law Ends?!

Discussion in 'General Discussion' started by Arowx, Dec 15, 2014.

  1. Gigiwoo

    Gigiwoo

    Joined:
    Mar 16, 2011
    Posts:
    2,981
    It's a fair point. At a meta level, I suppose technology is the manipulation of data, which has been improving exponentially since the Big Bang. I find it likely that my inability to see the future is more of a reality than the idea that this exponential curve somehow just comes to a halt in my trivially small lifetime.

    It's hard to dismiss the specters of old claims: "Mobile devices will never be good enough to play real games!"

    Gigi
     
  2. darkhog

    darkhog

    Joined:
    Dec 4, 2012
    Posts:
    2,218
    By the time we have quantum computers, we'll have real-time raytracers. Frankly, we have them now, but they're crappy: every time you move, there's white noise on screen. Like here:



    On top of that it requires very powerful GPUs, but we'll get there. With real-time raytracing there will be no need for Enlighten, only a good renderer.
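
    The "white noise" described above is Monte Carlo sampling noise: a path tracer averages random light samples per pixel, and the error only shrinks roughly as 1/sqrt(samples), which is why the image is grainy whenever the camera moves and the sample accumulation resets. A toy Python sketch of that convergence (all numbers are illustrative; nothing here comes from a real renderer):

```python
import math
import random

# Toy model of per-pixel Monte Carlo noise. noisy_sample() stands in for
# one light-transport sample whose true mean is 0.5; we measure how the
# spread of the averaged per-pixel estimate shrinks as samples grow.
random.seed(42)

def noisy_sample():
    return random.random()  # uniform on [0, 1), true mean 0.5

observed = []
for n in (16, 256, 4096):
    # 200 independent pixels, each averaging n samples
    estimates = [sum(noisy_sample() for _ in range(n)) / n for _ in range(200)]
    mean = sum(estimates) / len(estimates)
    stderr = math.sqrt(sum((e - mean) ** 2 for e in estimates) / len(estimates))
    observed.append(stderr)
    print(f"{n:5d} samples/pixel -> noise ~ {stderr:.4f}")
```

    Each 16x increase in samples cuts the noise by roughly 4x, which is why "throw more GPU at it" is the standard answer to raytracing grain.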
     
    hippocoder likes this.
  3. ippdev

    ippdev

    Joined:
    Feb 7, 2010
    Posts:
    3,789
    "With 10nm, Intel hopes to carry the mantle of Moore's law forward to yet another node while continuing to decrease the price per transistor—in other words, we'll continue to see chips that consume slightly less power while also integrating yet more features onto a single die. 7nm, with a possible shift away from silicon, is more exciting; transistors fashioned out of III-V semiconductors can consume much less power while switching at much higher speeds. Individually, neither of these new processes are likely to raise the roof; but a 3D stack of 7nm dies... now we're talking."

    more at http://arstechnica.com/gadgets/2015...d-to-10nm-will-move-away-from-silicon-at-7nm/
     
    Gigiwoo likes this.
  4. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    @ippdev Only, the chip designers are no longer in charge of saying what a chip's nm scale is; it's the marketing department.

    Actual chip features are usually a lot larger, as going smaller means you have to do more to compensate for leakage and stability problems as quantum effects start kicking in.

    What they are doing is stacking: building up layers of chip components. Then the marketing guys can take the number of transistors on a chip and the area of the chip and calculate a feature size.
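
    For what it's worth, the calculation described above is just density arithmetic: divide die area by transistor count, and the square root of the area-per-transistor gives a notional "feature size". A back-of-the-envelope sketch (the transistor count and die area below are made-up illustrative numbers, not figures for any real chip):

```python
import math

# Notional "marketing" feature size: average silicon area per transistor,
# then the square root of that area as an implied pitch. Both inputs are
# illustrative placeholders, not figures for any actual die.
transistors = 1.4e9      # transistors on the die (illustrative)
die_area_mm2 = 160.0     # die area in mm^2 (illustrative)

area_per_transistor_nm2 = die_area_mm2 * 1e12 / transistors  # 1 mm^2 = 1e12 nm^2
implied_pitch_nm = math.sqrt(area_per_transistor_nm2)

print(f"implied pitch: {implied_pitch_nm:.0f} nm")
```

    Note that this comes out far larger than the marketed process node, which is exactly the point: the "nm" label no longer measures any single physical feature.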

    Don't believe the hype, check the benchmarks!
     
  5. ippdev

    ippdev

    Joined:
    Feb 7, 2010
    Posts:
    3,789
    Ummm. No. When engineering costs billions per size variant, the marketing department does not get to say how those engineering dollars are spent. They have competition that can kick their butt in the marketplace, and to listen to some highfalutin Fuller Brush salesman instead of your engineers is business suicide. You have some funny notions about nearly everything. I guess you didn't read that they are changing materials for the 7nm process either. Did the marketing guys pull that out of their butts as well?
     
    zombiegorilla likes this.
  6. Kiwasi

    Kiwasi

    Joined:
    Dec 5, 2013
    Posts:
    16,860
    Not sure how I missed this thread. But it's nice to come into one where I have insider knowledge that can be shared. There is no grand conspiracy to hold back and drip-feed technology.

    As for qualifications, I'm a manufacturing engineer in a high-tech, innovation-driven company. We don't make computer chips, we make food. We use chemistry rather than electronics. But the general ideas are the same in any high-tech manufacturing environment.

    There are two parts to the drip-feed conspiracy. The first is that the technology for the next ten years already exists.

    Currently our R&D labs have technology that is about five years ahead of the marketplace. We have various agreements with universities that have technology about 10-15 years ahead of the marketplace. This technology takes time to develop and roll out. A PhD student who can make 0.01 g of a new molecule is great, but you can't make a profit on those volumes. It might take several more years to get it to the point of a scale-up facility that can pump out a few kilograms. Then you've got several more years before you can get a full-blown manufacturing facility up and running. You can sink huge amounts of time and money into new technologies before you even sell a single unit.

    So does the next ten years worth of consumer technology already exist? Yes. Could the technology be brought to market today? No.

    The second part of the drip-feed conspiracy is that it is more profitable for companies to hold back technology.

    I can't speak for the chip industry, but the food industry is nasty. It's a constant case of lawsuits, price manipulation and one-upmanship, all aimed at getting more market share than your competitors. If a product is available that gives even a five percent edge over a competitor, it will be launched. If some new process that can cut costs is discovered, it will be used. All sorts of tactics are employed to beat the competition. In the high-tech game you are only as good as your latest product.

    Then there are the generics. Nipping at your heels the entire time are companies with low-cost manufacturing facilities in low-wage economies with low standards on safety and quality. Throw in a government subsidy or two and they become unbeatable. These companies do not invest in any R&D; rather, they reverse engineer competitors' products after the patent has expired. This means that between invention and the end of a technology's profitable life you have twenty years, and that includes any time spent in R&D before sales are made. In order to make money in this short time frame you must produce as soon as possible. A technology sitting on a shelf for two years as part of a 'drip feed' strategy is a technology you have two years less to profit from.
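
    The shelf-time argument reduces to simple arithmetic: the patent clock starts at invention, so R&D time and any deliberate delay both eat into the same fixed window of exclusive sales. A toy sketch with made-up figures (none of these numbers come from a real company):

```python
# Toy model of the patent-window economics described above. All figures
# are illustrative; the only point is that shelf time and R&D time both
# subtract from the same fixed window of exclusive sales.
PATENT_LIFE_YEARS = 20
rd_years = 8            # invention to first possible sale (illustrative)
annual_profit = 50e6    # profit per year of exclusivity (illustrative)

def exclusive_profit(shelf_years):
    # Years of exclusive sales left after R&D and any deliberate delay
    selling_years = max(PATENT_LIFE_YEARS - rd_years - shelf_years, 0)
    return selling_years * annual_profit

launch_now = exclusive_profit(0)
drip_fed = exclusive_profit(2)   # shelved two years as a 'drip feed'
print(f"cost of shelving for 2 years: ${launch_now - drip_fed:,.0f}")
```

    Every year on the shelf costs a full year of exclusive profit, which is the point: deliberately sitting on finished technology is a cost, not a strategy.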

    There is the occasional time when you have more technologies than you have resources to develop, and technologies get dropped. But this is very different from holding back for a drip feed.

    Of course I could be wrong. Chips could be this nice, friendly industry where all of the big players agree not to undercut each other's technology releases. Governments in developing countries might be ignoring the massive economic potential of subsidised generics. You might actually be able to take a chip from concept to full production in a matter of weeks.
     
    Gigiwoo, HemiMG, lorenalexm and 3 others like this.
  7. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Don't take my word for it...
    http://spectrum.ieee.org/semiconductors/devices/the-status-of-moores-law-its-complicated
     
  8. ippdev

    ippdev

    Joined:
    Feb 7, 2010
    Posts:
    3,789
    @Arowx The article is from 2013. Rather an irrelevant dinosaur. Got anything from the past few months to bolster your assertions? The article I posted was from last month.
     
  9. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Your linked article is derived from an Intel press release.
    My linked article is a journalistic piece looking into the die-size issue and the industry, with quotes from the engineers who are designing the next generation of chip features.

    But you want to rate the articles on age, not on quality and veracity?!
     
  10. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    It is drip-fed, but it's led by market physics. Basically, Intel already has graphene CPUs which are 10,000 times faster, with theoretical maximums of 500 GHz, far smaller than current CPUs (google it). Could they sell these now? Yes, absolutely yes. Would they end up making less money and thus have less research budget? Yes also.

    So yes, they are drip-feeding you, and this has several good reasons which you should actually support:

    1. The price would be too high.
    2. They would make less money, and so do less research.
    3. They'd make considerably less money if their competitor has to catch up in giant leaps, because it means they have to lower prices.
    4. Surrounding hardware would need to catch up to make use of it.

    So while, yes, they do have the next 10, maybe 20 years of tech loosely planned out and in various stages of completion, they, and you, cannot afford to go too fast in the market.

    10-15% increases each time are working out for everyone just fine.
     
  11. ippdev

    ippdev

    Joined:
    Feb 7, 2010
    Posts:
    3,789

    Yes, because it was written in the middle of their debacle and problems with that generation of chip. Excuses and obfuscation is my determination. On @hippocoder's point: they may have these, but they cannot mass-manufacture them. They are discovering, for example, that hemp-based graphene makes supercapacitors super fast, so expect this breakthrough to come to chipmakers very soon (well, in the next one to four years): https://www.asme.org/engineering-topics/articles/energy/hemp-carbon-makes-supercapacitors-superfast
     
    Last edited: Mar 2, 2015
  12. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    420 GHz.
     
  13. orb

    orb

    Joined:
    Nov 24, 2010
    Posts:
    3,033
    For CPUs it's not even that. More like 2-5% faster at the same clock, but other benefits with each new generation, and possibly a higher frequency ceiling every now and then. And for laptops we're happy with cooler processors rather than faster ones, because they're pretty darn fast now :)
     
  14. Arowx

    Arowx

    Joined:
    Nov 12, 2009
    Posts:
    8,194
    Don't confuse RF circuits with processors: http://arstechnica.com/uncategorized/2006/06/7117-2/

    Pure speculation; there is always a next big thing, but will it make it to production?

    What is more interesting is the way they are managing to layer and stack chip components: we are seeing on-chip memory layers, and bus interfaces being pulled onto the die.

    Now imagine if you could have your PC or Mac on a single chip, the CPU, GPU and memory all on one little water-cooled block. OK, you lose modularity, so you can't swap out your GPU for a faster one (though maybe you could add additional chips), but think of how much faster your system could be!