
Theories and Models in Scientific Processes.

Steven French, Division of History & Philosophy of Science, Department of Philosophy, University of Leeds

In her essay 'Classes, Collections, Assortments and Individuals' Ruth Barcan Marcus coins the term 'assortment' for an arbitrary finite selection of objects. Although this volume is some way from being an 'assortment', it is not entirely a collection 'gathered together under a concept' either. Culled from papers presented at the 1994 conference on 'Theories and Models in Scientific Processes' and the inaugural meeting of the Association for Foundations of Science, Language and Cognition that same year, the book is ostensibly divided into three parts: 'Models in Scientific Processes', 'Tools of Science', and 'Unsharp Approaches in Science'. The first at least shows a high degree of coherence, including as it does Czarnocka's 'Models and Symbolic Nature of Knowledge', which explores the relationship of 'quasi-isomorphism' holding between a mathematical model and its object, Hartmann's 'Models as a Tool for Theory Construction', Herfel's emphasis on 'concrete' models in 'Nonlinear Dynamical Models as Concrete Construction', Kaluszynska's thoughtful 'Styles of Thinking', and Psillos' typically well-researched 'The Cognitive Interplay Between Theories and Models'. Biased towards model-oriented accounts as I am, I think this part of the book serves up the more interesting fare, although the ghost at this particular banquet is the semantic approach, which barely gets a look-in.

The second section really is an assortment and of widely varying quality, as we pass from Cartwright, Shomar, and Suarez's 'The Tool-Box of Science', which acts as a bridge with part 1 above, to Havas' neo-Hegelian 'Continuity and Change: Kinds of Negation in Scientific Progress', Krajewski's trainspotters' guide to philosophical schools of thought in 'Scientific Meta-Philosophy', and Nugayev's intemperate attack on postwar physics in his 'Classic, Modern and Postmodern Scientific Unification Patterns', which fails to live up to the promise contained in its title. Then it's another jump to 'Translation and Scientific Change', in which Rantala reconstructs Kuhn through speech-act theory, followed by Schurz's interesting and complex 'Theories and Their Applications: A Case of Non-monotonic Reasoning' before this part of the book ends with Torosian's 'Are the Ethics and Logic of Science Compatible?' which appears to suggest that the foundational problems of quantum mechanics and relativity theory might be resolved by introducing 'life' and ethics into the picture.

Finally, in part 3, things come back into focus with an emphasis on approximation, inexactitude and fuzziness. This broad umbrella covers Adams's summary of his earlier results in 'Problems and Prospects in a Theory of Inexact First-order Theories', Balzer and Zoubek's 'On the Comparison of Approximate Technical Claims', whose arid technicalities will serve to remind many why they find the structuralist programme so distasteful, Pykacz's rediscovery of a neglected paper by Lukasiewicz and its application to quantum mechanics, and Zapatrin's short but suggestive 'Logico-Algebraic Approach to Spacetime Quantization'.

These are just some of the topics spanned by the total of twenty-six papers; here, in more detail, is a handful of those I found particularly interesting and thought-provoking.

Most of the essays in the first part adopt a 'deflationary' account of models, abandoning the attempt to come up with some unitary framework in favour of tracking their different kinds and uses through the sciences. Thus Hartmann in his 'Models as a Tool for Theory Construction' distinguishes the 'static' view, according to which models are used for theory testing and qualitative probing, from the 'dynamic' view, which sets out the essential role they play in theory construction. With regard to the former, he turns to Bunge's analysis of a 'theoretical model' as a hypothetico-deductive system concerning a 'model object', itself represented set-theoretically as a relational structure. On the dynamic side, we are presented with a taxonomy which, although not as fine-grained as Black's or Achinstein's, has the virtue of being supported by suggestive examples from recent physics. Perhaps it is a disinclination to erect over-arching frameworks that blocks a coming-together of the static and dynamic views, but it would be interesting to analyse the latter in terms of Bunge's approach, in particular focusing on his set-theoretic account of analogy.

One of the advantages of Bunge's view, according to Hartmann, is that it allows for the theoretical background to be taken into account. This is also a concern of Psillos' in one of the most detailed and well-thought-out essays in the collection. His explicit aim in 'The Cognitive Interplay Between Theories and Models' is to 'revive, articulate, expand and illustrate' an 'analogical' approach to models defended by Hesse and Achinstein. Psillos follows Hesse in tying the heuristic role of models to the exploration of the space of neutral analogy and argues that a heuristically successful model may come to be believed as literally true, a typically robust realist claim he supports with a careful discussion of nineteenth-century optics. This is all good stuff but I'm not entirely convinced that Hesse and Achinstein make such comfortable bedfellows, given the latter's criticism of the former's position as situated within the received view and his explicit assertion that only limited aspects of the nature and role of models in science can be captured via analogy.

Defending realism is also the aim of Grobler in 'The Representational and Non-Representational in Models of Scientific Theories', where he suggests that van Fraassen's human-relative concept of observability should be replaced by Shapere's 'theory-infected' one. This, he argues, may allow us to claim that along with the empirical substructures, other substructures of models can be considered to represent as well. I am very sympathetic to such claims, although it is not clear to me that van Fraassen couldn't respond to this suggestion as he did to Churchland's proposal that we extend the boundaries of our epistemic community to include beings with electron microscope eyes. The mathematical structure of a theory is taken, not surprisingly, to be non-representational, but Grobler's mention of alternative mathematical formulations may actually undercut his mathematical instrumentalism, since different formulations of the 'same' theory may produce different empirical consequences and thus different empirical content.

The contention that theoretical structures may represent is supported by Grobler's account of idealization which consists in 'specifying in advance the kinds of predicates expected to occur in claims being made in a given context about objects of a given kind rather than in referring to some fictitious, idealized objects' (p. 42). An extreme form of the opposing tendency is represented by Nowak, who, in his 'Anti-realism, (Supra-)realism and Idealization', adopts a kind of 'super-platonism'. Nowak wants to break with the 'one world' consensus of both the realist and non-realist and go boldly beyond Lewis in positing a multiplicity of worlds which embrace the kinds of idealized entities we find in science, such as mass-points and inertial systems. Idealization is distinguished from 'abstraction' in that stripping away the intrinsic dimensional properties of bodies does away with the physical body itself, yielding a mass-point, for example. According to Nowak, statements involving such entities are literally true descriptions of existing, ideal worlds. By the end of this essay you may certainly feel as if you have gone where no metaphysician has gone before, but like many an alien ontology, this form of supra-realism is nothing if not coherent. The issue then is whether you buy into the initial presuppositions, and in particular into the reliance upon a version of the bundle view of individuality, according to which, if certain elements of the bundle are stripped away, you no longer have a physical body at all, but something else, to be ontologically situated somewhere else. From dubious metaphysical presuppositions one can only get a dubious metaphysical world(s)-view.

Coming back to this world with a bump, we turn to Cartwright, who is also well known for her concern with abstraction and idealization. In 'The Tool-Box of Science', (dis-)jointly authored with Shomar and Suarez, the view of models of phenomena as 'deidealizations' of theoretical models is rejected in favour of a 'toolbox' account according to which theory is just one tool among many for building this old house of science. In the first part of the paper, explicitly attributed to Cartwright, the focus of attack is the 'covering-law' account which views models as related to theories by the 'midwife' of deduction (p. 139). Who in these post-Kuhn, post-Post, post-post-'received view' days would disagree that logical deduction is inappropriate for capturing the heuristic construction of models? As for the formal relationships that can then be discerned between 'lower'- and 'higher-level' structures, the semantic approach offers a far more sophisticated analysis than the rather crude and stereotypical sketch presented here.

Contrary to Grobler, Cartwright believes that 'fundamental theory represents nothing and there is nothing for it to represent' (p. 140). 'Real' things are represented by models and the theory-independent construction of 'phenomenological' models is itself represented in the second half of the paper by Shomar and Suarez's example of the London and London macroscopical interpretation of superconductivity. This is an interesting and suggestive case study but anyone with a knowledge of the history of superconductivity might well be suspicious of its being conscripted in support of Cartwright's views; it is worth noting, for example, that F. London himself explicitly rejected the model as 'phenomenological' in his classic monograph Superfluids (Wiley, 1950).

A similar line is maintained by Kaiser, the title of whose paper, 'The Independence of Scientific Phenomena', pretty much gives the game away as to its central thesis. Kaiser argues for a separation of phenomena from both theory and data, using as his case study the by now well-known history of plate tectonics. Idealization again plays a crucial role in this separation: theory refers, not to particular data sets, but to phenomena, which are regarded as having an 'abstract or idealized nature' (p. 193). The geomagnetic reversals which were the subject of the Vine-Matthews hypothesis constitute such a scientific phenomenon, more general than the data themselves, yet independent from the theory of plate tectonics. According to Kaiser, such phenomena are generally more robust than high-level theories, making them hard to establish to begin with but then, equally, hard to overturn (p. 190). This is all nicely worked out, but, as with Cartwright, the sharpness of the distinction between phenomenological models and theories is questionable: as in the superconductivity case, the construction of the model was certainly not wholly independent of theory, as Kaiser notes. Moreover, given the role of seafloor spreading in supplying the mechanism of continental drift, to separate out geomagnetic reversals surely guts this theory, reducing it back to Wegener's 'speculation'. Resisting such isolationist tendencies, we can certainly maintain a unitary view of complex theory structure conjointly with lower-level robustness through 'theory change', as Post, for example, has shown.

Finally, Cattaneo, Dalla Chiara and Giuntini's 'The Unsharp Approaches to Quantum Mechanics' is also engaged in attempting to bridge the gap between 'an exact mathematics and a fuzzy experimental world' (p. 345). The fuzziness here is represented by events which are 'genuinely' ambiguous in the sense that the ambiguity lies with the properties rather than the states themselves, and which can be obtained by referring to 'unsharp' measurable sets (p. 353). Quantum effects are identified with such events and the structures obtained suggest different forms of paraconsistent and 'fuzzy' quantum logic. The epistemic framework is that of Dalla Chiara and Toraldo di Francia's conception of a physical model as a 'non-homogeneous' system containing both mathematical and experimental elements which are connected by translation functions. Here we see a well-thought-out philosophy of science arm in arm with technical philosophy of physics, but the connection between 'unsharp' quantum logic and the notions of 'experimental', 'idealized experimental' and 'theoretical' truth could have been spelt out a little more clearly. Furthermore, what does it mean to say that a property is 'ambiguous'? Elsewhere, Dalla Chiara and Toraldo di Francia have analysed the indistinguishability of quantum particles in terms of 'quasets', and 'fuzziness' arises from a non-standard membership relation such that, in entangled states at least, definite property values cannot be attributed to individual particles. Perhaps these ambiguous properties could be identified with Teller's 'non-supervenient' relations, but the metaphysics of fuzziness remains, well, fuzzy.

The above selection is interest-relative, of course, but I hope that it conveys something of the flavour of this semi-assortment. If you like the odd lucky dip, then this may be the book for you.

Article Details
Author: Steven French
Publication: The British Journal for the Philosophy of Science
Article Type: Book Review
Date: 1 December 1996
