PARTISAN REVIEW
Bill Joy's article. So he and I often have been paired as the pessimist and
the optimist regarding the future of technology. However, I do recognize
the dangers, and I think we should not minimize them, as a lot of people
do. I was at one of these discussions at Harvard a few months ago,
and a Nobel Prize-winning biologist (referring to the danger of
nanotechnology run amok, self-replicating nanotechnology that could form
a nonbiological cancer) said, "We're not going to see that for a hundred
years." I said, "Actually, that's a pretty good estimate of the amount of
technical progress needed to achieve that milestone at today's rate of
progress. But since we're doubling the paradigm shift rate every ten
years, we'll make a hundred years of progress in the next twenty-five
years." I don't think it's fruitful to argue that these things won't happen.
We are confronted with some grave dangers, and we need to examine
them. The closest at hand is the abuse of biotechnology: the same technology
that is going to cure cancer and save billions of people from disease can
also allow terrorists to create a bioengineered pathogen. It's not that
difficult; the knowledge to do so is becoming available. A lot of people
would say we shouldn't go down that path, because it's too dangerous,
and you can describe some pretty dire scenarios. I think if you described
today's world to people two hundred years ago, including the existence of
enough nuclear weapons to wipe out all life on earth, they would think it
mad to go down this path. But, conversely, very few people today would really
down this path. But, conversely, very few people today would really
want to go back two hundred years. Most of us wouldn't be here, for
one thing, with an average life span of thirty-seven years. Ninety-nine
percent of the human race lived lives of dire poverty, disaster-prone and
labor-intensive. We've largely liberated ourselves from many of these
things. We have tremendous economic imperatives to go forward, which
is why we will go forward. An underlying economic imperative in my
mind is a moral imperative. We have not millions, but billions of people
who live in poverty, who are suffering from age-old diseases which we
will be able to overcome within ten, twenty, or thirty years. I think we
have a moral imperative to move forward.
All of these dangers have to do with self-replication. Disease is
self-replication, and we'll be able to manipulate disease so that we can
amplify the destructive potential of technology. Self-replicating
nanotechnology will be even more powerful. Killer robots are several
decades away. Maybe these super AIs won't be friendly to us anymore.
I think we can take some comfort from an experience we've had with
one new form of human-made pathogen, which is self-replicating and
which didn't exist thirty years ago: the software virus. When these
viruses first appeared, people said they were primitive, but that as they