Philosophy of Technology

Synthetic Biology: The Technoscience of Artificial Life

John Sullins


ABSTRACT: This paper uses the theory of technoscience to shed light on current criticisms of the emerging science of Artificial Life. We see that the science of Artificial Life is criticized for the synthetic nature of its research and its over-reliance on computer simulation, which is seen as contrary to the traditional goals and methods of science. However, if we break down the traditional distinction between science and technology using the theory of technoscience, then we can begin to see that all science has a synthetic nature and a reliance on technology. Artificial Life researchers are not heretical practitioners of some pseudoscience; they are just more open about their reliance on technology to help realize their theories and models. Understanding that science and technology are not as disparate as was once thought is an essential step in helping us create a more humane technoscience in the future.


Introduction

As soon as the new sciences of Complexity, Chaos Theory, and Artificial Life (hereafter referred to as AL) began to be noticed by the popular science press, a kind of "honeymoon" period began. During this time these sciences were seen as the sexy new breakthrough theories that would eventually lead to our ability to solve all the problems of the world, from the cure for AIDS to the complete understanding and synthesis of living systems. (1) Recently a number of attacks have been leveled against the studies of Complexity and Chaos Theory in general and against the study of AL in particular. The most damning of these attacks on AL has been launched by John Horgan in his article "From Complexity to Perplexity," printed in Scientific American (Horgan 6/95), and in his book The End of Science. In his article Horgan fiercely criticizes the study of AL, implying that the entire field is some kind of sham. Horgan states that:

Artificial Life — and the entire field of complexity — seems to be based on a seductive syllogism: There are simple sets of mathematical rules that when followed by a computer give rise to extremely complicated patterns. The world also contains many extremely complicated patterns. Conclusion: Simple rules underlie many extremely complicated phenomena in the world. With the help of powerful computers, scientists can root those rules out (Horgan 6/95, Pg. 107).

Horgan goes on to argue that this position held by AL researchers is untenable and that the kind of science practiced in AL is nowhere near the kind of science normally practiced in biology. I agree that mathematical models are not as accurate as one would like when studying natural systems; in fact, I have myself presented arguments that attempt to refute some of the strong claims made by AL researchers about the ability of mathematics to embody living systems (Sullins 1996). But I think that Horgan takes his argument too far. Horgan suggests that the synthetic nature of the study of AL places it closer to poetry than to science (Horgan 6/95, Pg. 107). Horgan seems to feel that this fact degrades AL in comparison to other sciences. I completely disagree with Horgan on this point. As I will attempt to show in this paper, AL is not fatally flawed in its methods of research, and it practices science in much the same way as any modern science. One can only hold a position such as Horgan's if one fails to understand that AL is an example of a technoscience, as are all modern sciences, and that synthesis is and always has been a part of these knowledge-producing endeavors.

In order to prove this point we will go through some recent theories presented in the literature of the philosophy of technology, looking at how science relates to technology and how both affect the societies in which they are embedded. Once this is accomplished we will see that it is not AL that is fundamentally flawed; rather, our problems rest in our old understanding of the relationship between science and technology.

Is Artificial Life a Technology or a Science?

When one thinks about the subject of Artificial Life it is hard to determine whether AL is the study of a new science of life-like systems or the practice of a specialized subset of computer technology. The problem arises because it is not possible to imagine the current study of AL without the use of advanced computer technology.

Artificial Life is the attempt to model and synthesize life in a dynamic and complex way (Langton 1989). As it turns out, the study of complex dynamic systems that exhibit a complexity resembling that of living systems is best accomplished by conceiving of a mathematical model and then letting it run on a suitably programmed computer. The modeled system's behavior can then be analyzed, allowing the researcher to make a judgment about the model's similarity to a truly living system (Langton 1989). These complex models can be very taxing on computer resources, and serious AL research can only be accomplished on powerful computing machinery. For this reason, the study of AL did not really develop as a separate scientific study until rather recently, when the computer had become a well-established tool in science.

However, we must not forget that the study of AL has grown out of a long and fascinating history of ideas that predates the computer. AL is a science that has tapped into certain basic drives in western culture that have sought to predict and control nature. Simon Penny suggests, in an essay printed in Scientific American, that the study of AL is motivated by a desire that has been deeply embedded in western culture for centuries, in which:

"[A]rtists and inventors have attempted to imitate nature (a process known as mimesis) and to simulate the qualities of being human (anthropomorphism). These twin drives, which lie at the very heart of western culture, blur the lines between animate and inanimate, between human and machine (Penny 1995).

These cultural motivations first appear in attempts to simulate life through artistic endeavors and lead eventually to the invention of mechanical simulations. As we will see, these mechanical contrivances became more and more lifelike as our technology advanced. The anthropologist Stefan Helmreich has also noted the tendency for these deeply rooted drives to express themselves in the field of AL and has written persuasively on the subject (see Helmreich 1994, Pg. 385). Helmreich sees AL as the current incarnation of a drive to create simulacra of living things that can trace its history through "...the Rabbi of Prague's Golem of clay, to the hydraulic and mechanical contraptions of mid-millennium, to Mary Shelley's Frankenstein, to the von Neumannian self-replicating automata...." (Helmreich 1994, Pg. 385). These cultural drives, anthropomorphism and mimesis, transcend disciplinary boundaries and inhabit the most sophisticated technologies available at any historical moment (Penny 1995). In previous centuries there were many examples of mechanical simulations of life based on simple mechanical gears and pulleys animating a life-like puppet of some sort or another. These clockwork automatons, such as the famous mechanical ducks built by Jacques de Vaucanson in 1735 (Langton 1989, P. 9), could eat, drink, quack, etc., in a way similar to a real duck. These automata were very compelling in their time but would not be considered an adequate scientific model of a truly living duck today by any but the most theory-blinded mechanist. Still, Vaucanson was an early explorer of AL, and he labored as his modern counterparts do today: "Under the influence of contemporary philosophic ideas, he tried, it seems, to reproduce life artificially" (Langton 1989, P. 8).

Vaucanson was limited by the level of technology available to him in the creation of his models. His automata were typically mechanical puppets controlled by levers and pulleys that passed through the feet of the contrivance and led to a much more complex piece of machinery below the stage. These unseen mechanisms were responsible for the complex behavior of the puppet above. Still, the puppet was not controlled by a human puppeteer; all of its motions were mechanically preprogrammed like clockwork, and this was a truly novel innovation for the time. (2) These contrivances were limited by the nature of their construction, so that they could not, for instance, move very far, and they were restricted to a certain set of programmed motions. Anyone seeing one of these mechanical ducks today would be impressed by the workmanship displayed in their construction (the wing alone was made of 400 parts), but no one would think of them as being alive or consider them a good model for studying living ducks.

In his article "Artificial Life," Christopher Langton sketches out the history of AL, starting with the mechanical ducks we have been discussing and going on to introduce other mechanical automatons. Although Langton does not state this explicitly, these automatons were the first attempts at making models that could be used to prove the theories of the philosophy of mechanism. But this pursuit did not produce anything more than drawing room curiosities until the modern era, when the work of the great logicians such as "Church, Kleene, Gödel, Turing, and Post formalized the notion of a logical sequence of steps, leading to the realization that the essence of a mechanical process — the "thing" responsible for its dynamic behavior — is not a thing at all, but an abstract control structure, or "Program" — a sequence of simple actions selected from a finite repertoire" (Langton 1989, P. 10). This conceptual freedom allowed theoreticians to move beyond the confines of simple practical mechanical processes and enter the heady realm of theoretical mechanical processes.

It is at this point in the history of ideas that modern AL studies truly begin. After the late 1940s we begin to see theoretical mathematicians and physicists taking an interest in answering questions concerning the phenomenon of life. At this time we have Erwin Schrödinger's book "What Is Life?", in which he attempts to apply the mathematical rigor of physics to the study of biology, and we also see the beginnings of the mechanical theories of life developed by John von Neumann with his theory of self-replicating automata. As von Neumann's automata are conceived of as thought experiments and not as practically constructible machines, it is clear that we have moved beyond the eccentric parlor games represented by the mechanical duck and into what we would recognize as serious modern science (Sigmund 1993).

Von Neumann's automaton is a machine that is only conceivable as a logical process. The first version of his thought experiment has us imagine a machine floating around in a pond stocked with all the basic parts necessary to build an exact duplicate of the machine. Through a complicated logical process that von Neumann thoroughly describes, the machine is able to build an exact duplicate of itself, thus achieving self-replication. Von Neumann believed self-replication was the essential ingredient for a thing to be considered living (for a complete description of the von Neumann automaton see Langton 1989, Emmeche 1994, or Sigmund 1993). Von Neumann eventually abstracted this thought experiment even further, as he felt, paradoxically, that it was necessary to remove all reference to reality in order to determine the essential logical structure of self-reproduction (Sigmund 1993). In order to achieve this abstraction from the real world, von Neumann's final version of his self-replicating machine exists as a mathematical formula expressed through the mathematical tool of the cellular automaton (Sigmund 1993).

Cellular automata are a type of mathematical model in which the variables are best conceived of as cells in a domain of some description. For example, we could make a two-dimensional cellular automaton by drawing a checkerboard pattern on some paper and creating "rules" which tell us what "state" each square is in during each time step of the program. For instance, each square could be blank or filled with a variable of some numerical value determined by the rules of the system (Rucker 1995). The rules used in determining the values of each "cell" in the automaton are usually something like this: if the cell is blank in this time step and three of its eight neighbors are filled, then the cell is to be filled in the next time step.
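
To make this concrete, a minimal sketch of such an automaton might look like the following. The code is purely illustrative Python of my own, not drawn from any of the systems cited here, and it implements only the single simple rule just described.

    # Minimal two-dimensional cellular automaton (illustrative sketch only).
    # Cells are blank (0) or filled (1); a blank cell becomes filled in the
    # next time step when exactly three of its eight neighbors are filled.

    def step(grid):
        rows, cols = len(grid), len(grid[0])
        new_grid = [[0] * cols for _ in range(rows)]
        for r in range(rows):
            for c in range(cols):
                neighbors = sum(
                    grid[(r + dr) % rows][(c + dc) % cols]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0)
                )
                if grid[r][c] == 0 and neighbors == 3:
                    new_grid[r][c] = 1            # the rule described above
                else:
                    new_grid[r][c] = grid[r][c]   # all other cells keep their state
        return new_grid

    # A small 5 x 5 board with a few filled cells, run for three time steps.
    board = [[0, 0, 0, 0, 0],
             [0, 1, 1, 0, 0],
             [0, 1, 0, 0, 0],
             [0, 0, 0, 0, 0],
             [0, 0, 0, 0, 0]]
    for _ in range(3):
        board = step(board)
        print(board)

Even a toy rule like this one shows why, as noted below, the only practical way to know what an automaton will do is to run it.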

These rules can be simple or complex depending on the needs of the programmer. Von Neumann's automaton is relatively complex: the cells in his simulation can be in any one of twenty-nine different states. When the cellular automaton's program is set in motion it can build an exact duplicate of itself using only the logical rules described in the program.

Von Neumann was able to provide us with a purely formal description of mechanical self-reproduction. Since then the cellular automaton has become the dominant paradigm used in the study of artificial life, and today there are many programs that use cellular automata to model living systems. Unfortunately, cellular automata are tedious, if not impossible, to work out by hand, and even though the behavior of a cellular automaton is completely deterministic, the interactions between the many cells are so complex that there is no way to predict the behavior of the automaton without physically running through the entire simulation.

This unfortunate situation has a happy ending, though: there is an easy technological fix for the weaknesses of the cellular automaton. Modern computers are extremely adept at running through the myriad tedious calculations required to set the automaton in motion, and, as an added benefit, the screen of a computer monitor can easily provide a compelling visual accompaniment to the cellular automaton, as the pixels on the screen are perfect for representing its abstract cells, with different colors representing different cell states (Rucker 1995).

Thus the technology of the computer has provided any interested AL researcher with relatively cheap and speedy machines on which to test out his or her theories. As one could easily guess, the most successful models of living systems have been created using the most advanced computer technology available (Langton 1989 & Emmeche 1994).

One might ask why so much faith is invested in computer technologies as the proper tool for studying life. Christopher Langton answers this question as follows:

Computers should be viewed as second-order machines — given the formal specification of a first-order machine, they will "become" that machine. Thus, the space of possible machines is directly available for study, at the cost of a mere formal description: computers "realize" abstract machines (Langton 1989 P. 11).

Since life is conceived of as a machine in mechanistic philosophies of life, it follows from the above that anyone with a good enough computer and a properly conceived program can make useful contributions to the life sciences.
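
Langton's point about computers "realizing" abstract machines can be illustrated with a trivial sketch of my own (again in Python, and not an example taken from Langton): given nothing more than the formal specification of a simple first-order machine, a few lines of code make the computer behave as that machine.

    # Illustrative sketch: a computer "realizing" an abstract machine.
    # The machine exists only as a formal specification (a transition table);
    # the interpreter below makes the computer "become" that machine.

    def run_machine(transitions, start_state, inputs):
        """Run a finite-state machine that is given purely as data."""
        state = start_state
        for symbol in inputs:
            state = transitions[(state, symbol)]
        return state

    # Formal specification of a toy machine that tracks the parity of 1s seen.
    parity_machine = {
        ("even", "0"): "even", ("even", "1"): "odd",
        ("odd", "0"): "odd",   ("odd", "1"): "even",
    }

    print(run_machine(parity_machine, "even", "10110"))   # prints "odd"

Swap in a different transition table and the same computer "becomes" a different machine; this is the sense in which the space of possible machines is available for study at the cost of a mere formal description.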

So we can see that the study of AL is intimately tied to the current technology of computers. This means that the study of AL is largely determined by what the computer is currently able to do. The AL researcher not only has to be savvy in the life sciences, he or she also has to be able to use computer technology to its fullest, through a thorough understanding of the hardware and the ability to program software. The AL researcher has to be, or have access to, a good programmer. Through the skill of programming the AL researcher, in a very real way, crafts the world that he or she plans to study. This fact has put some theoretical biologists off from seeing AL as truly contributing to the knowledge of living systems and biology (Emmeche, 1994, 160). The awkward situation of having a technology color a field of study so thoroughly places the science of AL squarely in the middle of a debate that has been going on for some time in the philosophy of technology: what is the relationship between science and technology?

Science and Technology or Technoscience?

The battle over the perceived superiority of science to technology is an old one, dating back to the ancient Greeks, or at least to the exhuming of Greek philosophical ideas in the Middle Ages. Technics have, to paint a broad picture, been seen as the province of science applied, not science practiced.

This is certainly true of the attitudes towards science and technology coined during the industrial revolution, where "...the now established nomenclature which relates the "pure" or "theoretical" sciences such as physics, to the "applied" sciences such as engineering were already in place" (Ihde 1993, Pg. 15). This concept has been slowly breaking down under the pressure of how modern science is actually practiced, with big money spent to pursue pragmatic problem-solving goals. In the early phases of western history it might have been proper to argue that the scientific musings of the idle elite would have very little impact on the daily lives of the craftsmen and workers who kept the society operating, but today each new scientific discovery quickly suggests new technological applications, and each new technology creates new tools to be used in scientific endeavors; high technology and science have become almost indistinguishable.

...[E]arly Modern Science was preceded by the beginnings of the world voyages and the discovery of the New World and the rise of science as distinct disciplines accompanied by the machine age of the Industrial Revolution, the twentieth century saw science become fully a technoscience now thought to be the "motor" which drove and developed what is called Modern Technology, presumably distinct from any ancient or traditional technologies (Ihde 1993, 17).

Without embarking on a lengthy detour we need to define more carefully what is meant by the term "technoscience." This term was first used by Bruno Latour, and it refers to his belief that modern science is embodied in its instruments (Ihde 1979, Pg. 75). What this means is that science is no longer practiced purely as an observation of natural events; the entire experimental process is shaped by the apparatus used to conduct the experiment. The scientist experiences the phenomena he or she studies through scientific instruments, and without those instruments modern science is impossible; science and technology are now inseparable.

Artificial Life is the Technoscience of Biology

In contrast to other sciences like chemistry and physics, biology has long lagged behind in its ability to produce large-scale benefits to technology and society. There has long been a kind of physics envy in certain circles of biology. Traditionally biology has not been the beneficiary of big corporate or government funding, and I would suggest that this is due to biology's failure to define its ability to contribute to profitable technological advances in fields other than medicine.

The belief that modern science should be seen as a technoscience should come easily to any AL researcher. It is a common argument among ALifers that the new science of AL has opened up an entirely new set of questions for biology to study. No longer is theoretical biology limited to questions concerning life-as-we-know-it; we can now, through the use of AL simulations on computers, look at questions derived from the study of life-as-it-could-be (Langton 1989, Pg. 2). Since, as we saw above, the science of AL was brought about through advances in computer technology, it is easy to assume that each new advance in computer technology will enable new techniques of studying AL, allowing more, as yet undreamed of, questions to be answered. AL's relationship with the technology of computers is reciprocal. The study of AL has already produced spin-off technologies that are beginning to see common use in computer graphics and in programming techniques. An example of this is the growing use of the programming technique called the Genetic Algorithm Paradigm. The Genetic Algorithm Paradigm is a relatively new form of computer programming that attempts to harness biologically inspired Darwinian evolutionary pressures in the creation of computer algorithms (Koza 1989). This programming technique was developed when AL researchers wanted to mimic the transfer of genes between two members of a species when mating occurs. In order to duplicate this phenomenon, programming methods were developed in which parts of the code used to describe one organism could be "spliced," as it were, onto parts of the code describing a second organism, thus creating a new hybrid computer program describing a third, progeny organism. This was seen as an adequate model of sexual reproduction on a computer. This model has since been modified and applied to more traditional problems in computer science and has been developed into a new technology that allows a computer to be set to a specific task and then, starting from a group of random programs, to "evolve" new programs by combining the "seed" programs in myriad combinations until one emerges that can solve the problem; thus the machine automatically programs itself (Koza 1996). This is a truly novel invention, and it stems from research in AL.
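
A deliberately simplified sketch of the splicing idea may help. The Python below is my own illustration, not Koza's system (genetic programming proper evolves program trees rather than bit-strings): a population of random "seed" individuals is repeatedly crossed over and mutated, and the fittest offspring are kept until one solves a toy problem.

    # Rough sketch of the "splicing" idea behind genetic algorithms
    # (illustrative only; genetic programming proper operates on program trees).
    import random

    TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]    # stand-in for "the problem to solve"

    def fitness(individual):
        # Number of positions matching the target: higher is better.
        return sum(a == b for a, b in zip(individual, TARGET))

    def crossover(parent_a, parent_b):
        # "Splice" part of one parent's code onto the other's, as described above.
        point = random.randrange(1, len(parent_a))
        return parent_a[:point] + parent_b[point:]

    def mutate(individual, rate=0.05):
        return [1 - bit if random.random() < rate else bit for bit in individual]

    # Start from a population of random "seed" individuals and evolve.
    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]
    for generation in range(100):
        population.sort(key=fitness, reverse=True)
        if fitness(population[0]) == len(TARGET):
            break
        parents = population[:10]               # keep the fittest half
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(10)]
        population = parents + children

    print(generation, population[0], fitness(population[0]))

The design choice that matters here is the same one the paragraph above describes: nothing in the loop "knows" the answer; fitter recombinations of existing code are simply retained until a solution emerges.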

From this quick overview one can easily see that the science of AL is a perfect example of a technoscience as it is described in the literature of the philosophy of technology. The questions that the science of AL attempts to answer are formed by the technology used to answer them, and in turn the technology used by the science is modified by the demands and breakthroughs of the science. Since AL is an outgrowth of theoretical biology, its useful additions to computer technology should allow theoretical biology to begin to move into the powerful interests of technoscience, which have previously been the exclusive realm of physics and chemistry.

Some Thoughts on Technoscience

The concept of a technoscience is unsettling to those who believe that science has a privileged access to the demonstrable truths of reality. Once we begin to accept that science can mix with, and in some ways be determined by, another human praxis like technology, it becomes harder to argue that science is not just one human praxis among many that have the ability to make adequate statements about reality. This is the kind of argument that frightens commentators on science such as Horgan, because it would seem to open the door to uncontrolled relativism and a resulting epistemological paralysis.

This is another point at which a lengthy digression could follow, but instead I will just mention that this concern has been quite effectively ameliorated by feminist theories of science such as Sandra Harding's Whose Science? Whose Knowledge? In her view, the realization that traditional science is not an automatically privileged view of the truth is a liberating experience that allows us to pursue a new science that will be "less partial and less distorted," and therefore more true to the actual goals of the pursuit of knowledge (Harding 1991).

I tend to agree with the theories critical of the view that science has an irreproachable, privileged access to the truth that no other form of human knowledge can approach. The belief that a progressive positivistic science and technology will somehow automatically save us from all the ills of our times is seen to be laughably naive today. It is of no use, as Horgan is wont to do, to pine for the lost days of the heroic age of science; they never really existed anyway. It is much more useful in a pragmatic sense to acknowledge that science is now (and probably always was) dependent on technology, and that science is a human product just like technology and does not possess the strange mystical ability to discover disembodied transcendent truths. Understanding the weaknesses of a synthetic science is an important step in making that science work in our best interests. Seeing science as technoscience allows us to create a clearer conception of science as compared to technology and of their proper relation to each other.


Notes

(1) See Out of Control by Kevin Kelly (1994), Artificial Life by Levy (1992), Complexity by Lewin (1992), and Chaos by Gleick (1987) for an overview of the initial excitement surrounding these topics.

(2) For a description of the public reaction to these devices see Langton 1989, P. 8.

Bibliography

Clark, Stephen R. L., "Tools, Machines and Marvels," in Philosophy and Technology: Royal Institute of Philosophy Supplement 38, Roger Fellows, ed. (Cambridge: Cambridge University Press, 1995).

Emmeche, Claus, The Garden in the Machine: The Emerging Science of Artificial Life, trans. Steven Sampson, (Princeton: Princeton University Press, 1994).

Gleick, James, Chaos, (New York: Viking Penguin, 1987).

Harding, Sandra, Whose Science? Whose Knowledge? (Ithaca: Cornell University Press, 1991).

Heim, Michael, The Metaphysics of Virtual Reality, (New York: Oxford University Press, 1993).

Helmreich, Stefan, "The Historical and Epistemological Ground of von Neumann's Theory of Self-Replicating Automata and Theory of Games," in Towards a Practice of Autonomous Systems: Proceedings of the First European Conference on Artificial Life, Francisco Varela and Paul Bourgine, eds. (Cambridge: MIT Press, 1994).

Horgan, John, "From Complexity to Perplexity," in Scientific American, Pg. 104, June 1995.

Horgan, John, The End of Science, (Reading: Addison-Wesley, 1996).

Ihde, Don, Philosophy of Technology: An Introduction, (New York: Paragon House Publishers, 1993).

Kelly, Kevin, Out of Control: The New Biology of Machines, Social Systems and the Economic World, (Reading: Addison-Wesley, 1994).

Langton, Christopher G. (ed.), Artificial Life, (Redwood City: Addison-Wesley, 1989).

Levy, Steven, Artificial Life: A Report From the Frontier Where Computers Meet Biology, (New York: Vintage Books, 1992).

Lewin, Roger, Complexity: Life at the Edge of Chaos, (Macmillan Publishing, 1992).

Penny, Simon, "The Pursuit of the Living Machine," in Scientific American, Pg. 216, September 1995.

Rucker, Rudy, Infinity and the Mind, (Princeton: Princeton University Press, 1995).

Sigmund, Karl, Games of Life: Explorations in Ecology, Evolution and Behavior, (Oxford: Oxford University Press, 1993).

Suchman, Lucy, "Do Categories Have Politics? The Language/Action Perspective Reconsidered," in Computer Supported Cooperative Work (CSCW), Vol. 2, No. 3, Pg. 177, 1994.

Winner, Langdon, The Whale and the Reactor, (Chicago: The University of Chicago Press, 1986).
