
THE MONIST: Vol. 102, No. 4, October 2019.

Quantum Bayesianism Assessed, JOHN EARMAN

The idea that quantum probabilities are best construed as the personal/subjective degrees of belief of Bayesian agents is an old one. In recent years the idea has been vigorously pursued by a group of physicists who fly the banner of quantum Bayesianism (QBism). The present paper aims to identify the prospects and problems of implementing QBism, and it critically assesses the claim that QBism provides a resolution (or dissolution) of some of the long-standing foundational issues in quantum mechanics, including the measurement problem and puzzles of nonlocality.

Statistical Mechanics: A Tale of Two Theories, ROMAN FRIGG and CHARLOTTE WERNDL

There are two theoretical approaches in statistical mechanics, one associated with Boltzmann and the other with Gibbs. The theoretical apparatus of each approach offers a distinct description of the same physical system, with no obvious way to translate the concepts of one formalism into those of the other. This raises the question of the status of one approach vis-à-vis the other. The authors answer this question by arguing that the Boltzmannian approach is a fundamental theory while Gibbsian statistical mechanics (GSM) is an effective theory, and they describe circumstances under which Gibbsian calculations coincide with the Boltzmannian results. They then point out that regarding GSM as an effective theory has important repercussions for a number of projects, in particular attempts to turn GSM into a nonequilibrium theory.

Individualist and Ensemblist Approaches to the Foundations of Statistical Mechanics, SHELDON GOLDSTEIN

The author contrasts the two main approaches to the foundations of statistical mechanics: the individualist (Boltzmannian) approach and the ensemblist approach (associated with Gibbs). He indicates the virtues of each and argues that the conflict between them is perhaps not as great as often imagined.

Two Kinds of High-Level Probability, MEIR HEMMO and ORLY SHENKER

According to influential views, the probabilities in classical statistical mechanics and other special sciences are objective chances, even though the underlying mechanical theory is deterministic, since the deterministic low level is inadmissible or unavailable from the high level. Here two intuitions pull in opposite directions: one intuition is that if the world is deterministic, probability can express only subjective ignorance; the other is that the probability of high-level phenomena, especially thermodynamic ones, is dictated by the state of affairs in the world. The authors argue in support of this second intuition, and they show that there are in fact two different ways in which high-level probability describes matters of fact, even if the underlying microscopic reality is deterministic. Their analysis is novel but supports approaches by, for example, Loewer, Albert, Frigg and Hoefer, and List and Pivato. In particular, the reductive view they propose here can be seen as a naturalization of the above approaches. They consider consequences of their result for nonreductive physicalist approaches, such as functionalism, that admit multiple realization of the kinds that appear in the special sciences by physical kinds. They show that nonreductive physicalism implies the existence of nonphysical matters of fact.

Determinism, Counterpredictive Devices, and the Impossibility of Laplacean Intelligences, JENANN ISMAEL

In a famous passage drawing implications from determinism, Laplace introduced the image of an intelligence who knew the positions and momenta of all of the particles of which the universe is composed, and asserted that in a deterministic universe such an intelligence would be able to predict everything that happens over its entire history. It is not, however, difficult to establish the physical possibility of a counterpredictive device, that is, a device designed to act counter to any revealed prediction of its behavior. What would happen if a Laplacean intelligence were put into communication with such a device and forced to reveal its prediction of what the device would do on some occasion? On the one hand, it seems that the Laplacean intelligence should be able to predict the device's behavior. On the other hand, it seems that the device should be able to act counter to the prediction. An examination of the puzzle leads to clarification of what determinism does (and does not) entail, with some insights about various other things along the way.

Naturalness and Emergence, DAVID WALLACE

The author develops an account of naturalness (that is, approximately: lack of extreme fine-tuning) in physics that demonstrates that naturalness assumptions are not restricted to narrow cases in high-energy physics but are a ubiquitous part of how interlevel relations are derived in physics. After exploring how and to what extent we might justify such assumptions on methodological grounds or through appeal to speculative future physics, the author considers the apparent failure of naturalness in cosmology and in the Standard Model. He argues that any such naturalness failure threatens to undermine the entire structure of our understanding of intertheoretic reduction, and so risks a much larger crisis in physics than is sometimes suggested; he briefly reviews some currently popular strategies that might avoid that crisis.

Idealization and the Ontic Conception, CARL F. CRAVER
COPYRIGHT 2019 Philosophy Education Society, Inc.

Publication: The Review of Metaphysics
Date: Dec 1, 2019
