Proceedings of the Twentieth World Congress of Philosophy

Volume 10: Philosophy of Science

Introduction

Tian Yu Cao

At this stage in the evolution of our discipline, philosophy of science, there seems to be no single great theme that has attracted the attention of most practitioners in the field. Rather, scholarly work in the field is quite diffuse. Traditional topics, such as reductionism and the unity of science, remain to be carefully examined from various perspectives. The debate over realism versus instrumentalism, although dismissed by some as uninteresting and unproductive, is still taken by many active scholars as vital to our understanding of the nature of scientific theories and to our appraisal of the scientific enterprise. Of course, the new spirit of our times has also shown itself: sometimes in the discussion of major theses in post-empiricist philosophy of science, such as incommensurability, underdetermination, and constructivism, on the basis of detailed examination of scientific materials; at other times in the shift of attention from static aspects of science, such as theory justification and the nature of explanation, to more dynamic aspects, such as negotiations within a scientific community and their function and role in the development of science. Even more to the point in this regard is the great attention paid to the place science has occupied in modernity, the defects of the modernist understanding of science, and their possible remedy. Not surprisingly, this current state of affairs in our field has also manifested itself in the essays collected in this volume of contributions to the Twentieth World Congress of Philosophy. Thus, an inspection of these essays, although it will not provide us with a focus for the discipline, may give us some sense of what is going on in the field.

I. Reduction and the Unity of Science

According to logical empiricists, the unity of science can be realized through the reduction of all scientific disciplines to physics. Daniel Bonevac rejects the idea of such a global reduction and the associated physicalism. Instead, he argues for the idea of a local reducibility and the associated fainthearted physicalism. The latter, according to Bonevac, comes either from multiple realizability or from supervenience. Since supervenience is usually supported by inductive evidence, not conclusive proof, of connections between properties at the base and supervening levels, it offers a kind of physicalistic explanation of, or provides sufficient reason for, the higher-level properties. Although this sufficient reason, due to its inductive nature, is defeasible, when it is defeated, it is defeated for reasons.

Such a principle of defeated sufficient reason, according to Bonevac, represents one way of understanding the completeness of physics. The idea is that a higher-level scientific fact is never a brute fact. There is "always some physical ground for it. That ground is generally but defeatedly sufficient for the fact in question."

One merit of Bonevac's discussion is that he takes the higher-lower level dependence as an empirical fact, subject to inductive investigation. For most reductionists, however, the dependence has a much stronger, even metaphysical, sense. For example, Bruce Glymour and Marcelo Sabatés, in their argument for macro-level indeterminism, appeal to Jaegwon Kim's principle of causal inheritance, according to which the causal powers of a realized (higher-level) state are identical to the causal powers of its realizer (lower-level state). The counter-intuitiveness of macro-level indeterminism shows, by reductio ad absurdum, the defects of the principle of causal inheritance. It is possible that, because of a special way of organizing, or because of the influence of the environment, which is negligible for a micro state but not for a macro state, the causal powers of a realized macro state may not be identical to the causal powers of its realizer (micro state).

By arguing that ontology is supervenient on scientific theories, C. Ulises Moulines tries to reduce the question of ontological reduction (physicalism) to that of theoretical reduction (unity of science). However, Moulines is alert enough to note that the unity of science does not entail the unity of being, because even within a physical science, Newtonian mechanics for example, there can be several different categories of being, such as those of matter, force, space, and time, none of which can be reduced to the others.

The interesting question raised in Moulines's essay is, assuming both theoretical and ontological reduction are impossible, whether a unitary system of ontology, or a unitary picture of the world, is attainable. His answer is that all depends on whether the fundamental theories reducing the rest are compatible with each other or not. If they are, then we can still talk about a unitary ontological system, about one world, defined as the disjunction of the domains of the fundamental theories. If they are not, then our picture of the world is fragmented. Contrary to Moulines's assertion, a careful ontological analysis of the existing fundamental theories and their interrelationships shows that the latter is precisely the situation of present-day science. The reason is not what Moulines assumes, namely, that the principles of the present-day fundamental theories, general relativity and quantum field theory, cannot be derived from each other. Rather, it is that their ontological commitments are incompatible: one to a fixed Minkowskian space-time background, the other to a dynamic gravitational field whose properties constitute the chrono-geometric behavior of other material existence.

New light is shed on the same set of questions (physicalism, reduction, emergence, the layered image of reality, etc.) in Manuel Liz's essay, which explores the possibility that there can be new physical properties in the world. A property would be a new physical property from a certain time t if and only if (1) up to t, the property in question did not exist or, if it did exist, its possible instances did not have the appropriate causal effects, and (2) from t on, the property comes to have possible instances showing the appropriate causal effects. Liz shows convincingly that there are four general situations in which we have to admit the existence of new physical properties.

One consequence of Liz's idea is to set a limit on reductionism, and thus to justify a non-foundationalist physicalism without commitment to ontological dualism. It also justifies the idea of emergence and the layered image of reality, which is in line with P. W. Anderson's discussion of what is to be taken as basic science.

A more systematic discussion of reductionism, although put in the context of biology, is given by Michael Ruse. By ontological reduction he means the belief that all the entities in the world are of the same logical type, a belief which, although quite weak in the study of mind, is widely accepted these days in biology because of its suggestive power. By methodological reduction he means the attempt to explain bigger things in terms of smaller things, which he takes as the general philosophy of scientists ever since the scientific revolution. Ruse argues that the actual message of the critics of methodological reductionism is that higher levels must remain unexplained at lower levels, and that this is a big mistake because it is a recipe for not looking at a problem as hard as one might.

The case for theoretical reduction, which means the explanation of one theory in terms of another theory by way of logical deduction, is much more complicated. While a fair amount of intra-theoretical reduction (e.g., Mendelian biology to molecular biology) is possible, there are serious problems about the possibility, even in principle, of inter-theoretical reduction (e.g., the reduction of biology to physics). Thus, even without the intervention of Kuhn's notion of theory replacement as a substitute for theoretical reduction, the prospect for the unity of science has faded, if it has not become completely unattainable.

II. Realism and Instrumentalism

Lawrence Sklar's original study of the role of idealization in science is highly relevant to the realism/instrumentalism debate. Some philosophers of science, such as Nancy Cartwright, argue that the ineluctable need for idealizations in science, which offer only partial descriptions of physical systems in the world, involve limiting procedures, and treat physical systems as causally disconnected from the rest of the world, makes it inappropriate to take scientific descriptions as true. They claim that all of our laws of physics lie because their application is always relative to unspecifiable ceteris paribus clauses. Thus, what science aims at is not to describe the real systems of the world, but to characterize models, which bear some resemblance to the real systems.

In contrast with these instrumentalist claims, Sklar argues that the issue of idealization is not an issue of misrepresentation. Instead, the question is which of a number of alternative idealizations correctly represents the fundamental causal structure of the world, and thus can serve as the core element in choosing the explanatory structure of the phenomena in question.

Elliott Sober's discussion of epistemological instrumentalism brings the subtleties involved in statistical inference to the fore. Within the context of population biology, he examines the realist motto that "nothing is more predictively accurate than the truth." Since each hypothesis is an infinite disjunction, although the motto is correct for members of the disjunction that contain no adjustable parameters, it may fail for families of hypotheses for mathematical (statistical) reasons. In the latter case, we are confronted with situations in which some hypotheses are true but less predictively accurate, while others are false but more predictively accurate. Using this fact, Sober argues that epistemological instrumentalism, according to which predictive accuracy is the only criterion for theory evaluation, is more rational than realism, for which what matters in the end is truth, although he acknowledges that in the semantic sense, namely whether a hypothesis has a truth value, realists are on firmer ground than instrumentalists. Reading this interesting essay, readers may be intrigued to pay more attention to the nature of statistical inference and its implications for the realism/instrumentalism debate.

Another subtle point in the realism/instrumentalism debate is treated skillfully by Ryszard Wójcicki. The instrumentalist often takes advantage of the fact that false statements in a scientific theory may not affect the adequacy of the theory. By carefully defining the concepts of truth and adequacy, Wójcicki is able to give this fact a realist explanation.

While all propositions have a definite truth value, conceptual compositions, the intersubjective counterparts of cognitive schemata (e.g., Minsky's frames), can only be defined in terms of their scope, namely the set of questions to which the composition is applicable. It is not proper to talk about the truth of a composition. But if all the questions within its scope get true answers from the composition, then the composition is said to be adequate. Obviously, adequacy is a relative idea: a composition can be inadequate with respect to one scope, but adequate with respect to another. Thus, if false statements give no answers to any question within the scope of a scientific theory, the adequacy of the theory will not be affected by these false statements.

Wójcicki's study shows that realists have enough theoretical resources to meet the challenge of the instrumentalist. In fact, as Theo A. F. Kuipers has shown in his survey of the most important epistemological positions in the realism/instrumentalism debate, there are good reasons for the instrumentalist to become a constructive empiricist, who in turn is forced to become a referential realist. There are also good reasons for the referential realist to become a theory realist of a non-essentialist kind, namely a constructive realist.

III. Case Studies for Philosophical Points

As philosophical reflections on science, works by philosophers of science often rely on the examination of scientific theories. But the reliance is particularly heavy in the four case studies for philosophical points included in this volume. Eduardo H. Flichman examines Newton's dynamics in order to give some new flavor to Kuhn's incommensurability thesis. Alberto Cordero carefully examines three empirically equivalent alternative programs to the standard version of quantum theory (Bohmian quantum mechanics, the many decohering worlds approach, and the spontaneous collapse approach), and explores the implications of the underdetermination thesis on the basis of this case study. Tian Yu Cao tries to clarify the basic ontology of quantum field theory, and uses this as a ground for arguing for the structural and constructive nature of our knowledge in the mathematical sciences. Gary S. Rosenkrantz examines various situations in biology to develop a new concept of a natural kind, a natural kind of compound physical object, namely the carbon-based living organism.

It is impossible to appreciate these essays without getting into their technical details, which is impossible in this brief introduction. Still, it is desirable to highlight the most important philosophical points Cordero's essay tries to convey. The three objectivist-realist alternatives to the standard version of quantum theory that Cordero examines turn out to be empirically equivalent. Usually, anti-realist philosophers of science would use this situation to argue against a realist understanding of scientific theories and for the limits of science, namely that our knowledge cannot go deeper than observable phenomena. Cordero's careful examination shows, however, that the underdetermination encountered is contingent upon the best picture we currently have of the world (via the best theories we now accept, the best instruments available to us, and the auxiliary assumptions we have made). It may be broken by new instruments, by further theorizing, and by new auxiliary assumptions. In this and other concrete situations, empirical equivalence, and thus underdetermination, is always relative and temporary, never absolute. As a result, the limitations set by the underdetermination thesis are of the same nature.

IV. From Deduction to Discussion

In dealing with traditional topics, such as explanation and theory justification, the idea of deduction has played a great role. As David Gruender points out, both Aristotle and Hempel took deductive necessity as a crucial requirement for explanation, a requirement which, according to the contemporary fallibilist view, has to be rejected. David Grünberg compares two approaches to the question of hypothesis-testing: C. Glymour's bootstrap approach, in which the values of the theoretical quantities deduced from the hypothesis to be tested are used to test the hypothesis, and J. D. Sneed's eliminationist approach, in which theoretical terms are eliminated by means of the so-called Ramsey-Sneed sentence. This comparison gives him grounds to argue that the deduction-based bootstrap approach provides the only formally correct procedure for calculating theoretical quantities, and thus contributes to the solution of the problem of testing theoretical hypotheses involving these quantities.

But according to post-empiricist philosophy of science, what is most important is not logical deduction, but informal discussion or controversy. In this spirit, Manuel Comesaña argues that one of the important roles philosophy of science plays is to carry on unending discussions of unsolvable problems. The goal of these discussions is to set the criteria for evaluating scientific research. For this reason, Comesaña argues, what a philosopher of science should do is not to apply his philosophy to science as a methodology of science, but to advise specialists during the stages of pre-science and crisis, and to advise those who administer the scientific enterprise.

In his lengthy study of controversies, Marcelo Dascal argues that only the rigorous study of controversies can provide us with an adequate description of the history and praxis of science. This is so because controversies, as the natural dialogical context in which theories are elaborated and their meaning crystallizes, are indispensable for the formation, evolution, and evaluation of scientific theories. Moreover, the study of controversies will help us to determine empirically the nature of the crises that allegedly introduce an element of rupture into the evolution of science, against a continuing background. His detailed examination of whether extant epistemological options can handle controversies in a satisfactory way further puts controversies at the center of our discipline.

Appearing as a logical continuation of Dascal's essay, Miriam Solomon's essay stresses the social nature of controversies. Looking at controversies from a Millian fallibilist point of view, Solomon argues for an epistemology of dissent. It is not hard to detect that in contemporary work on the philosophy of science, such as Solomon's, the boundary between the philosophy of science and social philosophy has been blurred to a great extent, mainly due to the stress on the social nature of scientific activities.

V. Science, Modernity, and Beyond

The social nature of the scientific enterprise is further emphasized in the last group of essays collected in this volume. Vladislav A. Lektorsky takes advantage of the interaction between the history of science and the philosophy of science and stresses the fact that modern experimental science is the product of a definite cultural and historical situation, namely the rise of a certain system of values and ideals concerning the relationship between man and nature. On this view, man as a making and cognizing agent can control natural processes and is seemingly situated outside the objects of his activities, while nature is a resource for human activity, a plastic material that admits human interference. Lektorsky emphasizes that the ideals and norms of science that arose from such historical and cultural conditions may not work in other historical and cultural conditions. And this raises the possibility of a radical reassessment of the ideals and norms of science.

Jesús Mosterín also emphasizes the historical and cultural specificity of scientific ideals and norms, and points out that with radical changes in those conditions, such as the destruction of biodiversity and the depletion of natural resources, the opinions of scientists, including their ideals and norms, have to be submitted to critical scrutiny, aided by the humanities and the value judgments expressed in them. Philosophy, according to Mosterín, should act as a bridge between science and the humanities.

The same issue, that of bridging the gap between the sciences and the humanities, and between facts and values, is explored in a significant way by Evandro Agazzi. According to Agazzi, a satisfactory paideia must include a metaphysics, understood as a problematic horizon that includes fundamental perspectives on the world, the nature of humanity, its position in society, and its responsibilities towards itself and others, together with a critical reflection on all these, that is, an awareness of the different kinds of foundation that we must be ready to accept for our convictions about various things. But the present paideia of modernity is in crisis because of its exclusive reliance on scientific rationality, with values being excluded. As a result, such a paideia has no unifying ground for effectively producing a global worldview.

Agazzi argues that although modern science helped modernity to overcome the risk of fragmentation of the intellectual horizon implicit in its second characteristic, individualism, and gave grounds for affirming the existence of a universal human nature, in terms of which the idea of progress can be defined, it has not been able to explore the non-cognitive values, without which we have only a narrow perspective.

What we should do is, first, to extend scientific rationality to all questions where reliable knowledge is required, including the fields that we traditionally inscribe in the domain of the humanities. And second, we should be aware of the cognitive status of science and of the historical and social contextualization of science, which is an indispensable condition for a humanistically valuable approach to science. Agazzi stresses that if we can profitably exploit the deep connections of science with its socio-historical conditions, connections that substantially link it with the paideia of the different ages according to a feedback loop, namely that science is influenced by the metaphysical, ideological, moral, political, social, and economic characteristics of its environment, but at the same time influences all these sectors of its environment, then a bridge between science and the humanities can be built, and a way out of the crisis of the paideia of modernity can be found.

Paideia logo design by Janet L. Olson.
All Rights Reserved.

Managing Editor: Stephen Dawson

Page Created: July 2, 2000
Last Modified: July 2, 2000