Is the end of Moore’s law probable? Is it actually different this time?

Probably the most important development in electronics over the past few decades has been Moore’s Law. Strictly speaking, it should not be called a “Law” at all, because it is not a formal law of nature. It is an observation that the number of transistors that can be economically put on a chip has doubled approximately every two years. A true law is something like the Laws of Thermodynamics, a fundamental property of the universe.
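To make the observation concrete, here is a minimal illustrative sketch in Python (the starting transistor count and years are my own assumptions, not figures from this post) of what doubling every two years implies:

```python
# Illustrative sketch only: what "doubling roughly every two years" implies.
# The starting count and years below are hypothetical, not claims from the post.

def projected_transistors(start_count, start_year, year, doubling_period=2.0):
    """Transistor count implied by doubling every `doubling_period` years."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# Example: assume roughly 2 billion transistors on a high-end CPU in 2010.
for year in (2010, 2012, 2014, 2016, 2020):
    print(year, f"{projected_transistors(2e9, 2010, year):,.0f}")
```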

Intel’s latest (as of early 2016) high-end desktop CPU, the 5960X. Image source: Sweclockers.com

Why does it even matter?

Imagine a world where next year’s computers, phones, and other electronics are not much better than this year’s. What has driven the steady improvement so far is shrinking transistor sizes; each successive manufacturing size is called a process node. Now imagine that the next node never arrives. In other words, next year’s shiny new toy isn’t going to be much better than this year’s. There may still be incremental improvements, but they will be much smaller than before.


Does this sound crazy? We might be heading towards that world. It may very well be that 28nm turns out to be the cheapest node: smaller and faster nodes can still be made, but not at a lower cost per transistor. That remains a big “may”, though. There are, as always, promising technologies on the horizon, but they face immense technical difficulties. The trouble Intel has had from 14nm onward certainly suggests as much.


I think the media has been dismissive in how it has portrayed Moore’s Law. There have been many claims over the years that it was about to die, so we have developed a sort of “the boy who cried wolf” mentality; another reason is that many journalists do not have the technical background needed to comment on these issues. When engineers say that they think Moore’s Law is about to die, what they mean is: “Here are some really big technical challenges, and if we do not solve them, Moore’s Law cannot continue. If we can solve them, it can continue until we identify the next big challenge at the next node.” At every node so far, those challenges have been solved and a smaller process has arrived.


At some point, I think it is inevitable that progress will slow down. The problem is that the challenges get exponentially more difficult as you scale down. Engineers are effectively at war with the laws of physics: heat and leakage become the bane of electronics, and at some point they will win. Either the next node will simply be too expensive in terms of capital costs, or the technical problems will be insurmountable. Compounding the issue, the marginal benefit of each new node is becoming more and more questionable.


Immediate challenges

As you scale down further and further, the challenges become exponentially larger. As I have noted, you are at war with the very laws of physics to get the node to work. Past 22nm, semiconductor engineers have been faced with unusual challenges such as quantum tunneling, where electrons leak through barriers that are only a few atoms thick.


Supposedly, there are technologies out there that will come to the rescue:

  • 450mm wafers: Essentially all of the silicon chips produced today are made on 300mm wafers. Moving to 450mm wafers would (in theory at least) allow for economies of scale: more dies per wafer and therefore a lower cost per die (see the sketch after this list). In practice, the transition has been delayed by various challenges and is not expected until the mid to late 2020s.
  • Extreme Ultraviolet Lithography (EUV): Computer chips today are patterned with 193nm light projected onto the 300mm wafers I’ve described. EUV uses a much shorter wavelength (about 13.5nm), which in theory should allow for much finer features (also illustrated in the sketch below). There have been various technical problems with its deployment, including generating a light source with enough power. Even if it does work, it will be very capital intensive.
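As a rough back-of-the-envelope sketch (all numbers below are my own illustrative assumptions, not figures from this post): wafer area grows with the square of the diameter, and the smallest printable feature in lithography scales with the wavelength of the light used (the Rayleigh criterion, resolution ≈ k1 · λ / NA):

```python
# Back-of-the-envelope arithmetic for the two technologies above.
# All numbers are illustrative assumptions, not claims from the post.
import math

# 450mm vs 300mm wafers: area grows with the square of the diameter,
# so a 450mm wafer holds roughly 2.25x as many dies (ignoring edge losses).
area_300 = math.pi * (300 / 2) ** 2
area_450 = math.pi * (450 / 2) ** 2
print(f"450mm/300mm area ratio: {area_450 / area_300:.2f}x")

# Lithography resolution (Rayleigh criterion): smallest feature ~ k1 * wavelength / NA.
# k1 and NA below are ballpark assumptions; real 193nm immersion and EUV tools
# use different numerical apertures.
def min_feature_nm(wavelength_nm, k1=0.35, na=1.0):
    return k1 * wavelength_nm / na

print(f"193 nm light: ~{min_feature_nm(193):.0f} nm features")
print(f"13.5 nm EUV:  ~{min_feature_nm(13.5):.1f} nm features")
```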

Both of these are critical for the long-term continuation of Moore’s Law and both of them have faced significant technical problems. These have caused delays and may outright prevent their deployment.


Even further out, there are unconventional solutions like graphene (which has a huge problem: no band gap, which a transistor needs in order to switch off), but these are truly ‘blue skies’ projects with, at the time of this writing, uncertain hopes for success. Whether graphene turns out to be the wonder material it has been hyped as remains to be seen. I certainly hope it does, but so far it remains unproven. Other ideas like quantum computing remain long-term solutions that may or may not work out. Much like nuclear fusion power, it is always “a few decades away”.


In truth, progress has already slowed down. Until around 2005, Dennard scaling held: as transistors got smaller, power density stayed constant, so, for example, a transistor scaled to half the linear dimensions used roughly a quarter of the power. If you are interested, I’ve uploaded the original paper here. After 2005 this no longer held; leakage losses meant that shrinking transistors no longer cut their power proportionally, and letting power density rise would have destroyed the chip. The end result was that although we could still etch smaller transistors, power density (and with it clock speed) had to be held roughly constant. That is one reason why we have multicore CPUs: to spread out power consumption and prevent thermal breakdown. Unfortunately, not everything scales well across many cores. It is only a matter of time before the rest slows down.
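For the curious, here is a minimal sketch of the arithmetic behind classic Dennard (constant-field) scaling, using normalized illustrative values rather than anything from the original paper:

```python
# Dennard (constant-field) scaling, illustratively: if linear dimensions,
# voltage, and capacitance all shrink by a factor k while frequency rises by k,
# dynamic power per transistor P = C * V^2 * f falls by ~1/k^2, and transistor
# area also falls by 1/k^2, so power density stays constant.

def scaled(power, area, frequency, k):
    """Apply one idealized Dennard scaling step by linear factor k (k > 1)."""
    new_power = power / (k ** 2)      # (C/k) * (V/k)^2 * (f*k) = C*V^2*f / k^2
    new_area = area / (k ** 2)
    new_frequency = frequency * k
    return new_power, new_area, new_frequency

p, a, f = 1.0, 1.0, 1.0               # normalized units
p2, a2, f2 = scaled(p, a, f, k=2.0)   # a "2x smaller" transistor
print(f"power: {p2:.2f}x, area: {a2:.2f}x, "
      f"power density: {p2 / a2:.2f}x, frequency: {f2:.1f}x")
```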


The net effect

I pay close attention to Intel since it is the leader when it comes to semiconductor nodes.


Intel experienced considerable difficulties with 14nm, leading to delays, and there is still a Skylake CPU shortage. Meanwhile, the 10nm node has been delayed, and it looks like the tick-tock model Intel pioneered (a new architecture one year, followed by a node shrink the next) has died. Barring a major breakthrough such as EUV, this looks like the status quo for now.


At some point, either the technical problems will become insurmountable or the next-generation fab simply will not be economically viable.


The only scenario where this changes is if something fundamentally new, such as quantum computing or graphene, completely changes the game. Whether that happens anytime soon is beyond the scope of this post.


So what if it does end?

I hope that Moore’s Law can continue in some form or another and that we continue to see faster chips for decades to come. However, given the challenges, I think it is worth considering a “what if” analysis.


That is what I plan to do in part 2 of this analysis.

3 Comments

  1. Benjamin David Steele

    It could be somewhat irrelevant.

    There are limits to how fast vehicles can be made to go and how much it would increasingly cost to get them to go faster. Yet for most people and for most purposes it simply doesn’t matter.

    Vehicles aren’t extremely different from when the first gasoline engine was mass produced. It’s only been lots of small incremental improvements, but so far nothing transformative. Still, the improvements are far from insignificant.

    It’s not so much the improvement in any single thing. Rather, it’s the improvement in lots of things and how they work together. Small cumulative improvements can add up to large changes over time. The changes we focus on likely aren’t the changes that will matter the most.

    1. Chris Liu (Post author)

There are pretty big reasons why we want Moore’s Law to continue.

      Computational power is a bottleneck for a lot of the problems of society. We need it for modelling, weather prediction, to run the data centers that drive our day to day lives, etc.

Semiconductors are unique in that their scaling has been exponential, and for decades. There have been relatively few applications like that. Perhaps we will end up with something similar to the era of aircraft: they have stopped getting faster, and improvements since the dawn of the jet age have been smaller and more incremental. That is not to say there have been no improvements; composite materials and higher-temperature turbine blades (allowing for greater fuel efficiency) are examples of how things have improved.

      The problem is that we do have pretty big uses for that power – apart from next year’s smartphone becoming much better than the last.

      1. Benjamin David Steele

I’m out of my depth on this topic. But it is endlessly fascinating to contemplate.

It is ultimately about whether the future is predictable, that is, whether further progress will continue along the same lines as the recent history of progress that has gotten us this far. That slow but steady progress could be abruptly halted in certain areas, or entirely unexpected developments could take us in new directions. It’s anyone’s guess.

        There are some indications that the future won’t be like the past. The last century has been mostly improvements within a basic paradigm of technology. But it appears that new paradigms are emerging. I haven’t a clue what that might mean, though.

        link to nytimes.com
        link to phys.org
        link to economist.com
        link to bostoncommons.net
        link to nature.com
        link to cns.utexas.edu
        link to nautil.us
        link to gizmag.com
        link to nytimes.com
        link to med.stanford.edu
        link to technologyreview.com
        link to technologyreview.com
        link to wired.com
        link to scottaaronson.com
        link to en.wikipedia.org
        link to en.wikipedia.org
        link to en.wikipedia.org
        link to en.wikipedia.org
        link to en.wikipedia.org
        link to en.wikipedia.org
        link to en.wikipedia.org
        link to en.wikipedia.org

