Mind Matters Natural and Artificial Intelligence News and Analysis
George Gilder talking with Peter Robinson at the Hoover Institution
Video still from Uncommon Knowledge at YouTube

George Gilder on the Real-Life Prospects — and Limits — of AI

Gilder, who is organizing the COSM conference in Bellevue, Washington, in October, clears the fog about “the cloud”

Recently, tech philosopher George Gilder, who is organizing COSM 2019 (Bellevue, Washington, October 23–25) on the future of artificial intelligence, sat down with Peter Robinson of the Hoover Institution to talk about how AI can help us to a better future, not a worse one.

Few people succeed in predicting AI's future in any detail. Once we get past "In the long run, we will all be dead," things usually get pretty cloudy. Yet in Life After Television (1994), a quarter-century ago, Gilder wrote: "The computer of the future will be as portable as your watch and as personal as your wallet. It will recognize speech and navigate streets, collect your mail and your news." And that is what happened. Most consumers on dial-up internet in 1994 would have found today's online world hard to believe.


Join George Gilder and some of the world's leading tech minds at the COSM Technology Summit, October 23–25, in Bellevue, Washington (near Seattle)! Interact with Peter Thiel, Ken Fisher, Ray Kurzweil, and Babak Parviz (inventor of Google Glass): https://cosm.technology/

The service is better, but in the meantime we have also handed huge corporations vast power over the information we give them and the information they give us.

Some highlights: Gilder thinks the industry's direction is wrong, but also that it is reaching a natural limit:

Peter Robinson: You spent some time in Life after Google describing the Dalles, if I’m pronouncing that correctly-

George Gilder: Yeah, yeah, yup.

Peter Robinson: … which is the huge Google data center up in Oregon. In Life After Google, you write of the diminishing returns of big data, so let me understand if I, let me make sure I understand at least one part of your argument correctly. The delusions come next. First there are certain physical, almost-physical constraints. We’ve reached the point now at which no matter how big your data center, improvements in parsing data are only going to be incremental. It’s going to be difficult to get enough power. It’s going to be difficult to cool these machines adequately, which is why Google’s big centers up at the Dalles because there’s a huge dam there, which means cheap hydroelectric power and cold water for cooling. So that’s argument number one, they’re bumping into physical limitations. Is that correct?

George Gilder: This is a symbol. The Dalles and all their data centers, parked like aluminum plants beside big bodies of water, or near glaciers, their various other means of cooling, just like an industrial plant of the previous era. I think that this cloud computing, which was a great triumph for its time, and dominated its time, is now reaching the end of the line. A great computer scientist named Gordon Bell ordained a proposition called Bell’s Law, which is that every 10 years, Moore’s Law, which is the doubling of computer power every year, yields a hundred-to-thousand-fold rise in total computer capability and requires a completely new computer architecture. I wrote about the cloud first. I hailed the cloud in an article in Wired in 2006, and said that it would dominate the next Bell’s Law phase. But it’s now 12 years since 2006, and that Bell’s Law regime of cloud computing, huge data centers, all parked by bodies of water, is coming to an end.
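The arithmetic behind the Bell's Law claim Gilder cites can be sketched in a few lines. Assuming capability doubles once a year, as the quote above states, ten years of doublings multiply total capability by 2^10 = 1024, which lands in the "hundred-to-thousand-fold" range he mentions. (The one-doubling-per-year rate is taken from the quote, not an independent figure.)

```python
# Sketch of the Bell's Law arithmetic from the quote above:
# if total computer capability doubles once a year, ten years of
# compounding yields 2**10 = 1024x growth -- within the
# "hundred-to-thousand-fold" range Gilder describes.
doublings_per_year = 1  # assumed, per the quote
years = 10

growth = 2 ** (doublings_per_year * years)
print(growth)  # 1024
```

Even at a slower doubling rate of once every 18 months, a decade still yields roughly a 100-fold rise, so the claim is not sensitive to the exact cadence.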

Peter Robinson: Okay. I just want to tie, make sure that I understand this. I want to emphasize this because I think I’m right about it. Cloud computing, I don’t know who the genius was, maybe I’m talking to him now, who first conceived of the notion of the cloud because it puts in the mind of the ordinary user the sense that somehow or other, computing has now become ethereal. It’s just up there. It’s not up there. It’s in big, industrial-scale operations at the Dalles in Oregon and other ….

George Gilder: 80 different sites around the world, I think Google has now.

Peter Robinson: Okay.

George Gilder: Big, big data centers.

Peter Robinson: So the cloud isn’t the cloud. It’s factories, essentially, of huge computers. Full transcript.

In short, the "cloud" Gilder predicted is made of things, not ideas, and is thus subject to the limits of things rather than the limits of ideas. It is reasonable to ask where that limit lies, and it is probably a question we can both pose and answer.

PayPal’s Peter Thiel and publisher Steve Forbes will be speaking at the conference, to help figure out what our best bets are with high tech.

Here are some questions: What are the benefits and drawbacks of investing and doing business with China? Of blockchain and crypto? What about biotech, 5G, and life after Google? And how do we buy privacy?

China, for example, has a completely different idea of the purposes of high tech: there, it promises total government control as a means of preventing popular dissent. How will the world choose between that vision and the one Gilder outlines?

The discussion is just beginning and will continue long after the conference. Watch this space.


See also: COSM: George Gilder Hosts Top Tech Guys, To Ask, Where Is All This Going? Are machines replacing or helping us and how will we know the difference?

George Gilder: Cloud Computing Is Reaching Its Limits The “cloud” isn’t something ethereal “up there,” Gilder reminds us; it is giant factory floors of computers.

and

George Gilder explains what’s wrong with “Google Marxism” In discussion with Mark Levin, host of Life, Liberty & Levin

Featured image: Corridor of server room/monsitj, Adobe Stock

