
Is Moore’s Law Over?

Rapid increase in computing power may become a thing of the past

In a recent piece, Jeffrey Funk and Gary Smith noted the surprising fact that AI has little measurable effect on the current economy:

Despite the hype, AI has had very little measurable effect on the economy. Yes, people spend a lot of time on social media and playing ultra-realistic video games. But does that boost or diminish productivity? Technology in general and AI in particular are supposed to be creating a new New Economy, where algorithms and robots do all our work for us, increasing productivity by unheard-of amounts. The reality has been the opposite. For decades, U.S. productivity grew by about 3% a year. Then, after 1970, it slowed to 1.5% a year, then 1%, now about 0.5%. Perhaps we are spending too much time on our smartphones.

Jeffrey Funk and Gary Smith, “Stanford’s AI Index Report: How much is BS?” at Mind Matters News

Does that fact reflect a trend? A recent article in MIT Technology Review contemplated the end, now in sight, of Moore’s Law, the prediction that the number of transistors on a chip (and, roughly, computing power) would double about every two years:

A few years ago, leading economists credited the information technology made possible by integrated circuits with a third of US productivity growth since 1974. Almost every technology we care about, from smartphones to cheap laptops to GPS, is a direct reflection of Moore’s prediction. It has also fueled today’s breakthroughs in artificial intelligence and genetic medicine, by giving machine-learning techniques the ability to chew through massive amounts of data to find answers.

David Rotman, “We’re not prepared for the end of Moore’s Law” at MIT Technology Review
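As a rough illustration (not from either article), that two-year doubling compounds very quickly. The sketch below assumes an idealized strict doubling period and uses the roughly 2,300 transistors of Intel’s 1971 chip, the 4004, only as a convenient starting point; real chips do not follow the curve exactly:

# Illustrative only: idealized Moore's Law compounding, assuming a strict
# two-year doubling period and a 1971 baseline of ~2,300 transistors.
baseline_year, baseline_transistors = 1971, 2300

def projected_transistors(year, doubling_period_years=2):
    doublings = (year - baseline_year) / doubling_period_years
    return baseline_transistors * 2 ** doublings

for year in (1971, 1991, 2011, 2021):
    print(year, f"{projected_transistors(year):,.0f}")

On that idealized curve, 2,300 transistors in 1971 becomes roughly 2.4 million by 1991, 2.4 billion by 2011, and tens of billions by 2021, which is the scale of growth Rotman credits with powering everything from cheap laptops to machine learning.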

But, as editor-at-large David Rotman explains, fundamental problems have begun to catch up with the electronics industry:

In truth, it’s been more a gradual decline than a sudden death. Over the decades, some, including Moore himself at times, fretted that they could see the end in sight, as it got harder to make smaller and smaller transistors. In 1999, an Intel researcher worried that the industry’s goal of making transistors smaller than 100 nanometers by 2005 faced fundamental physical problems with “no known solutions,” like the quantum effects of electrons wandering where they shouldn’t be.

David Rotman, “We’re not prepared for the end of Moore’s Law” at MIT Technology Review

Most of the problems are less spooky and more predictable: greater efficiency in software will squeeze more out of today’s chips for some years to come, and specialized chips can take over particular workloads. But specialized chips are less versatile, and the trade-offs could slow innovation.

Rotman presses the question: “But what happens when Moore’s Law inevitably ends? Or what if, as some suspect, it has already died, and we are already running on the fumes of the greatest technology engine of our time?”

There are a number of signs of a slowdown, whether or not an “AI winter” looms, as Brendan Dixon has noted:

Recent AI developments, notably those lumped under the rubric of “Deep Learning” have advanced the state-of-the-art in machine learning. Let’s not forget that prior efforts, such as the poorly named “Expert Systems,” had faded because, well, they weren’t expert at all. Deep Learning systems, as highly flexible pattern matchers, will endure.

What is not coming is the long-predicted AI Overlord, or anything that is even close to surpassing human intelligence. Like any other tool we build, AI has its place when it amplifies and augments our abilities…

As Samin Winiger, a former AI researcher at Google, says, “What we called ‘AI’ or ‘machine learning’ during the past 10-20 years, will be seen as just yet another form of ‘computation’.”

Brendan Dixon, “So is an AI winter really coming this time?” at Mind Matters News

In other words, AI may become a part of our lives, much as the automobile did, but not the Ruler of All, except for those who choose that lifestyle.

How would a slow settling-in of AI innovation affect fans of the Singularity foreseen by Ray Kurzweil? Perhaps it wouldn’t.

A belief that we will merge with computers by 2045, for example, is perhaps immune to the mere march of events. The belief that we will contact advanced extraterrestrial beings by a similar date is likewise immune. Entire arts and entertainment industries depend on the expression of such beliefs. Whether the Singularity is now nearer or forever impossible is a discussion on an entirely different plane from the question of whether a continued reduction of chip size is economically feasible. In the cultural world of the Singularity, spring is always around the corner.


Further reading on “AI winters,” past and foretold:

So is an AI winter really coming this time? AI did surge past milestones during the 2010s but fell well short of the hype (Brendan Dixon)

Just a light frost, or AI winter? It’s nice to be right once in a while; check out the evidence for yourself

and

AI winter is coming. Roughly every decade since the late 1960s has experienced a promising wave of AI that later crashed on real-world problems, leading to collapses in research funding. (November 29, 2018)

