everything will come to a halt. Another failure is to not anticipate the
interaction between different fields: how, for example, neurobiology
will affect the evolution of computer software.
If you just look in one narrow area you can anticipate certain trends, but you see much more powerful things happening if you consider the interaction between different fields. And the most common and, I think, most grievous failure is to not anticipate the accelerating nature of change and the accelerating pace of technology.
That's something I want to talk about in a bit more depth. The pace of technological progress and its impact on civilization are accelerating. Something that may have taken forty years in the past isn't going to take that long now. In the future, a comparable paradigm shift isn't going to take forty years; it may only take six
years. The pace of change is greatly accelerating, which is a key insight
into understanding the twenty-first century. The twenty-first century
will not be like the twentieth century; we'll make as much progress in
the next twenty years as we did in the entire twentieth century.
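A rough, back-of-the-envelope way to see the arithmetic behind that claim, assuming purely for illustration that the overall rate of progress doubles every decade (a figure consistent with the accelerating pace just described, though not stated here): the twentieth century's total progress is then a geometric series whose sum is roughly twice its final term,

\[
\sum_{k=1}^{10} 2^{k} = 2^{11} - 2 \approx 2 \cdot 2^{10},
\]

so the entire century amounts to about twenty years of progress at the year-2000 rate. If the pace keeps accelerating, the next twenty years will deliver at least that much.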
Now to Moore's Law. Moore's Law has been used synonymously with the exponential growth of computing: the price/performance of computing is growing exponentially. You only have to open the morning paper
to
see how the computers you can buy today have made the ones that
you could have bought a month ago obsolete. That pace seems to get
faster and faster. How much longer can this go on? In order to answer
that, we need to understand Moore's Law. I don't mean the definition I
just gave, but what is it an example of, and why is this happening?
It is part of a broader phenomenon, and there's very little information about it. In fact,
I can't really find anything written about the genesis of Moore's Law
other than the sort of strict, narrow, technical aspects of it.
I was at a computer conference called "Agenda 2000" two weeks
ago. There, the chairman of Intel predicted Moore's Law will continue
for ten to twenty years. The general thinking is that we can shrink the size of transistors for at least ten but no more than twenty years
because, at that point, the key features will only be a few atoms in
width, and that paradigm will break down.
So what happens then? Is that the end of the exponential growth of
computing? That's a key issue because one would come up with very different visions of the twenty-first century, depending on how one answers
the question. So, in order to gain some insight, one of the first things I
did was take forty-nine famous computers and calculating devices
(machines that manipulate information using electrical means), going
back to 1900, and put them on a chart. I had the computing device that