* * * * * * * * * * * * * *
* * * * * * * * * * * * * *
Notes on some papers available from the physics e-print archive which are relevant, or significant, or recommended in the broad context of the many-minds interpretation of quantum theory presented on my home page: http://www.bss.phy.cam.ac.uk/~mjd1014
* * * * * * * * * * * * * *
* * * * * * * * * * * * * *
A.E. Allahverdyan, R. Balian, and T.M. Nieuwenhuizen, “Understanding Quantum Measurement From the Solution of Dynamical Models” arXiv:1107.2138.
The core of this long paper is a thorough analysis of a model of a quantum measurement in which a single spin is coupled to an apparatus consisting of a magnet formed from a large number of spins in contact with a heat bath. This provides an excellent description of how a microscopic quantum property can be correlated with a stable and detectable macroscopic property. I am not however persuaded by the authors’ claim that their framework is sufficient to demonstrate the production (as opposed to the apparent production) of a unique answer for the outcome of each run of a measurement. Allahverdyan, Balian, and Nieuwenhuizen adopt what they call a statistical interpretation of quantum states. They apparently believe that, at least for macroscopic systems, this licenses them to interpret quantum probabilities as if they were classical probabilities once the apparatus registers stable outcomes and the chance of observing interference between distinct outcomes has become negligible. In particular, despite the approximations involved in these invocations of stability and of negligibility, they feel able to describe the idea of a branching universe as “superfluous”. They claim to adhere to a subjective interpretation of probability, and yet they deny that the state is a construct of the observer and they also make no attempt to include the states of observers in their model.
The model Allahverdyan, Balian, and Nieuwenhuizen construct has been carefully designed to cause one unambiguously defined quantity to correlate with another unambiguously defined quantity. It is of course important to understand that this can be done, but this does not imply that all real observations involve pointers which are so cleanly delineated. In particular, I argue in Donald 2002 that there are non-trivial ambiguities in how we might describe the observations we make of our own brains because of the varieties of, and the different coarse-grainings of, possible quasi-classical quantities which might underpin our experiences. (87)
H. Price, “Does Time-Symmetry Imply Retrocausality? How the Quantum World Says ‘Maybe’ ” arXiv:1002.0906v3.
A fundamental assumption in the derivation of Bell's inequalities is that the value of a hidden variable carried by a particle must be independent of any choice of measurements subsequently made on that particle. In a series of papers, Price has questioned this assumption. Here he considers a model involving a photon passing through a polarizer. He argues that the time-reversal of a situation in which a discrete random input produces a photon whose polarization an experimenter can then fix modulo π/2 is a situation in which the result of a measurement is not fully predictable, and in which, before that measurement, an experimenter can fix modulo π/2 the polarization of the photon entering her polarizer. Price claims that this model shows that realism plus time-symmetry plus discreteness implies retrocausality. It seems to me that in his argument he invokes an implausible form of discreteness. Whatever picture of measurement we have, the fact that reinterference is possible prior to a measurement strongly suggests that before a measurement a photon state should be described by a wave-function rather than by a classical bit. If measurement does need to be included in the picture, then I would require a detailed analysis of it before I attached any plausibility to Price's time-symmetry requirement.
I also find it difficult to understand just what Price means by “causality”. In “A Neglected Route to Realism About Quantum Mechanics” gr-qc/9406028v1, he presents a picture in which hidden variables attached to correlated spins are affected by subsequent measurements. Causality in physics seems usually to be expressed by giving a picture of how one can choose initial conditions for the dynamical equations of the theory. Often, this comes down to claiming that one can choose arbitrary conditions on any given spacelike hypersurface and that that choice will then determine the solution throughout spacetime. If this is a description of the initial conditions “causing” the solution to the future of the hypersurface, then it will simultaneously be a description of the same conditions “retrocausing” the solution to the past of the hypersurface. Some dynamical equations, such as the heat equation, do not, in general, allow such retrocausality, but a very satisfactory formulation of the idea is available in algebraic quantum field theory. The spacelike hypersurface is an essential component in almost all versions of the idea, and it is this component in particular which Price's account of causality seems to ignore. In his picture, the “initial conditions” at a time prior to a measurement would appear to be defined partly in the past, by conventional physics, and partly in the future, by Price's retrocausality. Such mixed conditions are a nightmare for a supposedly fundamental physical theory, easily giving rise to over-determined or under-determined problems. Price speaks of holding fixed only the “accessible” past, but he does not explain what accessibility means, nor why it should not be observer dependent.
In my own idealistic many-minds theory, future and external past exist, other than as possibilities, only in as far as they can be deduced from an individual observer's historical structure. Time-symmetry may be a property of the global Hamiltonian, but the symmetry is completely broken in an observer's development out of the assumed simplicity of the universal wavefunction which constitutes the primary initial condition. When dinosaur bones are uncovered, that is an observation which constructs part of the past for that observer, in just the same way as the observation of a spot on a screen reveals the apparent path of an electron. In both cases, the probabilities of the observations were constrained by the information which constituted the structure of the observer at the time of the observation. She might, for example, have known whether she had opened one slit or two, or whether she was digging in strata of the right age. These information constraints can be encoded in partial quantum states associated with the observer. However this is not a Cauchy problem in which those states have a unique extension to pre-observation states given the new observations. Rather it is merely a heuristic aspect of a theory which has as its fundamental core the idea that new localized individual observations, or information, should add to prior information to yield unique probabilities for subsequent localized individual observations, or information. (88)
L. Marchildon, “Can Everett be Interpreted Without Extravaganza?” arXiv:1001.1926.
Without going into any details, Marchildon considers some of the ways in which people have tried to make sense of the idea of multiplicity of outcomes in Everettian approaches to quantum mechanics. He argues (as I also do in Donald 2002 and elsewhere on this site) that there are unresolved ambiguities in attempts to distinguish possible outcomes by invoking decoherence.
Marchildon poses some questions for many-minds interpretations: “What kinds of mind split? Only human minds, or also cats’ minds? What, in the quantum mechanical formalism, singles out brain states?” I answer these questions with the claim that “brain states” (or histories of brain states) are singled out in the quantum formalism by having a specific type of abstract structure. Both cats and humans have brains which can have state histories with such structures. (86)
A. Kent, “Real World Interpretations of Quantum Theory” arXiv:0708.3710.
Given a natural preferred tensor product decomposition of a global Hilbert space, and given, at each moment, a natural preferred complete set of orthogonal projections on one of the component spaces, the problem of the interpretation of quantum theory might well be almost solved. However, there is little point in just postulating the existence of such structures, as Kent does here. No doubt they would have “a precise mathematical definition”, if only we could find out what it should be. Suggestions such as “the set of projections onto the simultaneous eigenstates of single-particle position operators” have been considered endlessly and unsuccessfully in non-relativistic quantum mechanics, and are even harder to do anything with in relativistic quantum field theories. Kent also ignores the locality experiments which his supposed single chosen final outcome would make appear weirdly predestined. (84)
H. Wiseman and J. Eisert, “Nontrivial Quantum Effects In Biology: A Skeptical Physicists' View” arXiv:0705.1232.
Wiseman and Eisert discuss some of the ways in which it has been claimed that quantum theory could revolutionize biology. Their dismissal of the possibility of biological quantum computing is authoritative and here I agree entirely (also here). On other matters, I agree with their scepticism, but their review is sometimes fairly shallow. For example, I think that the origin of life is less well understood than they suppose, although again I agree that it did not involve any kind of “quantum search”. Contrary to what they claim, the weak anthropic principle does not by itself explain the cosmological co-incidences, unless there is reason to believe that unobservable values for cosmological constants actually occur elsewhere. By analogy, given that there is a wide range of actual star-planet distances, it does not seem puzzling that we find ourselves having evolved on a planet which is the right distance from a star for water to be liquid. The appropriateness of the Earth-Sun distance would seem far more of a mystery if we had no direct evidence for any other stars or planets. In their final section, Wiseman and Eisert suggest that there might be a relation between free will and the unpredictability of quantum events. By itself, unpredictability is a pretty empty sort of “freedom”. However, the concluding suggestion, that some philosophical understanding could be gained by simultaneously invoking the (non-local) determinism of the Bohm interpretation, seems merely confused.
While it seems unlikely that quantum theory will revolutionize biology, it may be that the workings of some small-scale biological systems can only be understood by using ideas from advanced quantum theory. In physics/0611205, J.C. Brookes, F. Hartoutsiou, A.P. Horsfield, and A.M. Stoneham ask “Could Humans Recognize Odor by Phonon Assisted Tunneling?”. They provide a model, estimate parameters, and discuss evidence for the idea that scent molecules might mediate inelastic electron tunnelling between a donor and an acceptor. In “Quantum Zeno Effect Underpinning the Radical-Ion-Pair Mechanism of Avian Magnetoreception” arXiv:0804.2646, I.K. Kominis explores mechanisms that might underlie the ability of migrating birds to detect the Earth's magnetic field. According to the model he analyses, the magnetic field produces an effect in the eye of the bird by affecting the final reaction products of a photo-induced chemical reaction which begins with the creation of a pair of correlated spins. The reaction products depend on whether the spins are in a singlet or a triplet state, and the magnetic field affects the proportion of these states. Kominis argues that the effect requires a slower decay of singlet-triplet coherence than is predicted by semi-classical models. He presents a quantum-mechanical master equation for the density matrix of the spins prior to depopulation according to which the decay rate is reduced in certain magnetic-field dependent eigenmodes. Kominis provides more details of his model in arXiv:0804.3503 and arXiv:0805.3081. He analyses some recent data in arXiv:0806.0739, arguing that it confirms his theory and contradicts the semi-classical approach. The model that Kominis proposes seems to fit the evidence well. I am however slightly doubtful about the way that he interprets his model in terms of quantum measurements and the quantum Zeno effect. 
The master equation at the heart of his proposal might less dramatically be interpreted as simply expressing the interactions between the environment and the spin state.
When someone claims that a serious knowledge of quantum mechanics is required to understand the working of some biological system, the first question one should ask is whether it is possible to imagine the suggested mechanism having evolved in the warm and wet cellular environment. Photosynthesis or odor reception or magnetoreception are areas which might call for quantum biochemistry because highly sophisticated mechanisms can be developed and refined in small scale systems involving individual electron transfers or the reception of individual photons. (83)
S. Gröblacher, T. Paterek, R. Kaltenbaek, C. Brukner, M. Zukowski, M. Aspelmeyer, and A. Zeilinger, “An Experimental Test of Non-Local Realism” arXiv:0704.2529.
The violation of the Bell inequalities by two photons with polarizations entangled in a singlet state rules out models in which the outcomes of all possible measurements on the photons are determined by the values of pre-existing localized properties carried by the photons separately, independent of the measurements made on their partners. This paper reports on an experiment which rules out some models in which the individual photons carry their own polarization vectors, even if the outcome of a measurement is allowed to depend on a distant setting. (81)
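The quantitative content of such tests can be illustrated with the standard CHSH inequality. The following is a minimal numerical sketch of the textbook quantum prediction for polarization-entangled photons, not of the specific experiment reported in the paper; the chosen angles are the usual optimal settings.

```python
import math

# Quantum prediction for the polarization correlation of photons entangled
# in a singlet-type state: E(a, b) = -cos(2(a - b)), where a and b are the
# polarizer angles (the factor 2 arises because polarization is defined
# modulo pi).
def E(a, b):
    return -math.cos(2 * (a - b))

# CHSH combination: any local hidden-variable model satisfies |S| <= 2.
a1, a2 = 0.0, math.pi / 4              # Alice's two settings
b1, b2 = math.pi / 8, 3 * math.pi / 8  # Bob's two settings
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # ≈ 2*sqrt(2) ≈ 2.828, violating the classical bound of 2
```

The quantum value 2√2 is Tsirelson's bound, the maximum attainable with entangled states.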
D. Cohen, “EPR, Bell, Schrodinger's cat, and the Monty Hall Paradox” arXiv:0704.1087.
The Monty Hall problem describes a classical situation in which the “ignorance probabilities” which a person should assign to particular possible future events, given the knowledge she has, change when she is provided with new information. Cohen suggests, in these brief extracts from lecture notes, that quantum probabilities change in an analogous way. In order to justify this analogy, he needs to explain what he supposes quantum probabilities to express ignorance of, and whose ignorance he supposes to be in question. Cohen includes in his extracts a derivation of a Bell inequality. The observed violation of such inequalities indicates that quantum probabilities cannot be explained as ignorance of pre-existing events without contradicting conventional assumptions about locality. The natural interpretation of Cohen's suggestion may be to suppose that each separate observer is ignorant of the outcomes of their own measurements, but in that case characterizations of “observers” and of “measurements” are needed, because the sort of universal unitary evolution invoked by Cohen calls into question the existence of any “observed” physical system. We also need an explanation of how and when compatibility between different observers can be achieved. I believe that a many-minds picture can provide all this and preserve locality, but such a picture involves considerably more than Cohen's conclusion that we have “merely a unitary evolution”. (80)
N.D. Mermin, “In Praise of Measurement” quant-ph/0612216.
There are many situations in quantum mechanics in which we can describe “measurements” merely as black box processes, and yet still make accurate predictions of future observations and their probabilities. Along these lines, Mermin argues that the role of measurement in quantum computation can be adequately described in terms of copies of a basic one qubit black-box measurement device. This is all very well, but achieving a genuine understanding of quantum theory requires opening the boxes, describing their internal workings in terms of fundamental physical laws, and explaining, in equally fundamental and consistent terms, the nature of the outcomes observed and of the observations by which they become that which is observed. (82)
J. Conway and S. Kochen, “The Free Will Theorem” quant-ph/0604079.
The apparent ability of an experimenter to choose which experiment to perform plays a significant role in discussions of the apparent nonlocality of quantum theory, through statements like, “If, in the context of measurements on a suitably correlated pair of spin-one particles, Alice had chosen to measure the square of the spin component in the x-direction and had found a value of zero, then if Bob had chosen the same direction, he would have found the same value”. Such statements rely on our intuition about freedom of choice to compel belief in the suggested counterfactuals. Conway and Kochen use an extreme form of this intuition to assume, in these circumstances, that Alice makes her choice of measurement direction in a way that is undetermined by past history. They derive the consequence that the response of the particles is also undetermined by their local history.
A serious exercise of free will involves a thoughtful decision. We have to use our brains, not shrug our shoulders and allow chance to determine our actions. A brain is not a deterministic machine (Donald 2002), but neither is it so carefully constructed that it can be assumed, when, for example, Alice twiddles a knob without thinking about it, that she is producing a measurement direction entirely uncorrelated with her past history. It is much more plausible to claim merely that she could, with different probabilities, actually have produced a range of different measurements with the same past. However, although this weaker claim would still be sufficient to imply that the observed spin values are locally undetermined, it would be begging the question to derive it from quantum randomness in the brain. Conway and Kochen, on the other hand, just assume that free will requires independence of past history. They deny that what they mean by free will is simply neural randomness. They assume instead the existence of an “active kind of free will that can actually affect the future”, but they do not explain what sort of extra-physical homunculus can make active choices without input from a physical past.
The idea that quantum events are genuinely random is only one part of the nonlocality mystery. The other part is what appears to be the subtle conspiracy by which it is ensured that Alice and Bob's observations are always compatible. In my opinion, this is most plausibly explained by the many-minds idea. In this framework, the compatibility is only apparent and is only achieved, for each separate observer, at the moment when it is observable, which is when that observer finally becomes aware of the other's results. Alice's observation of the presentation of Bob's results is then as much a “quantum” observation as her observation of her own measurement device. The many-minds idea requires, and can be given, a detailed theoretical underpinning. The issue is not just, as Conway and Kochen suggest, a question of grammar. Nor, as they also seem to suggest, is the fact that the reductions produced by different observers appear to be the same, an uninvestigated problem. It is the ordinary, endlessly-discussed, problem of understanding how each separate individual's observations are governed by a history of correlated quantum probabilities. (69)
K. Svozil, “Staging Quantum Cryptography with Chocolate Balls” physics/0510050.
Using a choice of coloured eyeglasses to model the inability of an experimenter to measure complementary variables and using the edibility of chocolate to model the evanescence of quanta, Svozil provides an instructive demonstration of the working of a quantum cryptographic protocol. (78)
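The protocol that Svozil's demonstration models is of the BB84 type; its key-sifting stage can be sketched as follows. This is an idealized simulation with no eavesdropper and no noise, not a rendering of Svozil's own classroom rules.

```python
import random

# BB84-style sifting sketch: Alice encodes random bits in random bases;
# Bob measures in random bases of his own.
random.seed(1)
n = 20
alice_bits  = [random.randrange(2) for _ in range(n)]
alice_bases = [random.randrange(2) for _ in range(n)]  # 0: rectilinear, 1: diagonal
bob_bases   = [random.randrange(2) for _ in range(n)]

# If Bob measures in Alice's basis he recovers her bit; otherwise his
# outcome is random (the "wrong pair of eyeglasses" in the analogy).
bob_bits = [b if ab == bb else random.randrange(2)
            for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# The bases (but not the bits) are then compared publicly, and only the
# positions where they match are kept as the shared key.
key_alice = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
key_bob   = [b for b, ab, bb in zip(bob_bits,  alice_bases, bob_bases) if ab == bb]
print(key_alice == key_bob)  # True: the sifted keys agree
```

In the real protocol, an eavesdropper measuring in a random basis would disturb roughly a quarter of the sifted bits, which Alice and Bob can detect by sacrificing a sample of the key.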
C.G. Timpson, “The Grammar of Teleportation” quant-ph/0509048.
In this careful discussion of quantum teleportation, Timpson looks particularly at whether, leaving aside the classical signal which passes between the teleporters, it makes any sense to talk about the process as one involving the movement of information. He draws attention to the fact that analyses of quantum processes may be interpretation dependent, and introduces the idea of the simulation fallacy to describe the common mistake of thinking that if some process requires certain resources (such as non-local signaling or multiple computations) to produce a classical simulation, then those resources must somehow also be required in a quantum manifestation of the process.
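For reference, the standard teleportation protocol under discussion can be sketched as a small state-vector simulation. This is a textbook illustration rather than anything specific to Timpson's analysis; the qubit ordering and variable names are my own choices.

```python
import numpy as np

# Single-qubit gates.
I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def kron(*ops):
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

# CNOT with qubit 0 as control and qubit 1 as target, on three qubits.
CNOT01 = kron(np.diag([1.0, 0]), I, I) + kron(np.diag([0, 1.0]), X, I)

rng = np.random.default_rng(0)
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)  # arbitrary unknown state to teleport

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)  # pair shared by Alice (1) and Bob (2)
state = np.kron(psi, bell)                  # qubit 0 is Alice's input

# Alice's Bell measurement, decomposed as CNOT + Hadamard + readout.
state = kron(H, I, I) @ (CNOT01 @ state)
state = state.reshape(2, 2, 2)
probs = np.sum(np.abs(state) ** 2, axis=2)
m0, m1 = divmod(rng.choice(4, p=probs.ravel()), 2)  # the two classical bits
bob = state[m0, m1]
bob /= np.linalg.norm(bob)

# Bob's correction, conditioned on the classically transmitted bits.
if m1: bob = X @ bob
if m0: bob = Z @ bob

# Up to a global phase, Bob now holds the input state.
print(abs(np.vdot(psi, bob)))  # 1.0
```

Note that nothing reaches Bob until the two classical bits arrive: before the correction, his marginal state is maximally mixed whatever psi is, which is the formal core of Timpson's point that talk of information "flowing" is interpretation dependent.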
Several authors have produced interesting models of teleportation. In “Disentangling Nonlocality and Teleportation” quant-ph/9906123, L. Hardy stresses the relevance of the impossibility of cloning a quantum state. His model expresses in a very simple and direct way the idea that a measurement can only provide partial information about the real state of a system. In “Classical Teleportation of Classical States” quant-ph/0310017, O. Cohen demonstrates how a classical probability for a single event can apparently be transferred between two separate locations by using a shared classical correlation. Cohen's model is discussed and generalized by T. Mor in “On Classical Teleportation and Classical Nonlocality” quant-ph/0511172.
A many-minds analysis of quantum teleportation would emphasize the idea that different information may be available to different observers depending on their history. After Alice has told Bob which measurement result she has observed, she and Bob will have some common knowledge. It may be, however, that because of her original preparation procedure the teleported state appears to be known to Alice while still being unknown to Bob. This situation is comparable to an EPR experiment, in which Alice can appear to know in advance the results of some of Bob's measurements. Nevertheless, there is no non-local transfer of information, both because there is a limit on how much information Bob can extract from measuring his state and because Alice and Bob must communicate further if they are to compare their results. For Bob, what appears to be teleported is just one of the possible states which Alice might have prepared. A many-minds version even of Cohen's probability transfer would require an analysis of what each individual knows about the probability at any moment, of how the probability was chosen, and of the possible alternatives to that choice. (75)
F.H. Thaheld, “Does Consciousness Really Collapse the Wave Function? A Possible Objective Biophysical Resolution of the Measurement Problem” quant-ph/0509042.
If there is no physical process of wave-packet reduction, then quantum dynamics is described by a global unitary map. A unitary map on a tensor product Hilbert space need not, however, be unitary on the component spaces. In this paper and in “The Argument for an Objective Wave Function Collapse: Why Spontaneous Localization Collapse or No-Collapse Decoherence Cannot Solve the Measurement Problem in a Subjective Fashion” quant-ph/0604181, Thaheld points out that the dynamics of the interaction between photons and a human eye is not locally unitary. This is clearly correct, but unless the distinction between local and global unitarity (or local and global wavefunctions) is taken into account, it has no relevance to the question of the relationship between consciousness and the measurement problem. Thaheld repeats the same elementary error in “Comment on ‘Experimental Motivation and Empirical Consistency in Minimal No-Collapse Quantum Mechanics’ ” quant-ph/0602190 and in “Are We Getting Closer to a Resolution of the Measurement Problem?” quant-ph/0607127. (77)
S. Weinstein, “Anthropic Reasoning in Multiverse Cosmology and String Theory” hep-th/0508006.
Weinstein suggests that the anthropic principle should be restricted to a narrow version which merely rules out theories which are incompatible with our actual existence. It seems to me that this is over-restrictive. If I have won the lottery, then the fact that someone had to win does not explain why it was me. It is natural and appropriate for me to wonder whether there was any reason for my success. Such wondering is formalized by the versions of the anthropic principle which Weinstein rejects. Of course, in the lottery case I would presumably be wrong to assume that my winning was anything other than chance, but I do think that it would be a mistake for me not to consider the question. (73)
Y. Aharonov and E.Y. Gruss, “Two-Time Interpretation of Quantum Mechanics” quant-ph/0507269.
Aharonov and Gruss suggest that the apparent indeterminism of quantum mechanics could be removed by the imposition of a suitable future-time boundary condition. It seems to me that their proposal has little value. Constructing the future boundary condition would be at least as difficult as choosing, with appropriate probability, a result for every quantum measurement ever to be performed. By contrast, at least in a many-minds interpretation, a past boundary condition can be assumed to be simple. In their technical analysis, Aharonov and Gruss ignore the thermal nature of real measuring devices and environments and assume that device and environment are cleanly distinguishable. They also seem to assume that all quantum branchings are caused by identifiable measurements, although they do not explain how a measurement is to be identified. My many-minds interpretation requires multiple-time “boundary conditions”. In my opinion, it is hard enough to decide what a quantum event might be without losing sight of where and when such an event occurs. I also think that it is far from clear, even taking account of decoherence, that observed reality can be fully described by an unambiguous branching of the initial quantum state with each branch identifiable by its end-point. (79)
M. Schlosshauer, “Experimental Motivation and Empirical Consistency in Minimal No-Collapse Quantum Mechanics” quant-ph/0506199.
Schlosshauer reviews empirical evidence for quantum superpositions of mesoscopically distinguishable states in coherent quantum tunnelling in SQUIDS and in the interference of fullerene molecules. He also discusses the possibility of superpositions involving distinguishable particle numbers in Bose-Einstein condensates. The experiments Schlosshauer describes provide significant evidence for the validity of Schroedinger dynamics and decoherence mechanisms at ever-increasing scales. This supports the idea of a universal wavefunction which never collapses, but it leaves unanswered many questions about the nature of observation. Schlosshauer remarks that neural decoherence rates are extremely fast. He then supposes that decoherence is sufficient to explain the emergence of subjective perceptions of single outcomes. While I agree that decoherence will be important for any such explanation, I have argued in Donald 2002 that it is far from sufficient. (66)
N.P. Landsman, “Between Classical and Quantum” quant-ph/0506082.
An excellent review of a variety of ideas about the relationship between classical theories and quantum theories. Landsman begins with a well-informed discussion of the significance of such ideas in the early development of quantum mechanics and in the Copenhagen interpretation. He then turns to three relevant areas of mathematical physics and discusses theories of quantization, theories of classical limits, and some aspects of infinite system theories. I think his descriptions of these difficult but important theories are of exemplary clarity. He concludes with some remarks about decoherence and consistent histories. (67)
K. Jacobs and H. Wiseman, “An Entangled Web of Crime: Bell's Theorem as a Short Story” quant-ph/0504192.
Members of a criminal gang are being held in separate cells. Their lawyer manages to limit the questions the police can ask. Around this scenario, Jacobs and Wiseman build a complicated tale, which is contrived and far-fetched and yet is completely conventional in physical terms. The result is that they produce circumstances in which, in a way that would be impossible without quantum effects, the gang is able to take advantage of quantum non-locality to harmonize their stories without actually communicating with each other during the questioning. Jacobs and Wiseman's tale takes the form of an amusing parody. This is done well, except that the moral tone would perhaps better suit Runyon than Conan Doyle. (64)
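The gang's predicament is structurally that of the CHSH game familiar from discussions of Bell's theorem. The sketch below is illustrative only; Jacobs and Wiseman's precise rules differ. Two separated players receive question bits x and y and must return answer bits satisfying a XOR b = x AND y; without entanglement, no pre-agreed deterministic story wins more than 3 of the 4 question pairs.

```python
import math
from itertools import product

# Enumerate all deterministic classical strategies: each player fixes an
# answer for each possible question in advance (the pre-agreed "story").
best_classical = 0.0
for a0, a1, b0, b1 in product((0, 1), repeat=4):
    wins = sum(((a0, a1)[x] ^ (b0, b1)[y]) == (x & y)
               for x, y in product((0, 1), repeat=2))
    best_classical = max(best_classical, wins / 4)

# The optimal strategy using a shared entangled pair (no communication).
quantum = math.cos(math.pi / 8) ** 2
print(best_classical, round(quantum, 3))  # 0.75 vs ~0.854
```

Shared classical randomness does not help beyond 0.75, so the quantum advantage here is a genuine consequence of entanglement, not of cleverer coordination.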
B. Carter, “Micro-Anthropic Principle for Quantum Theory” quant-ph/0503113 and “Anthropic Interpretation of Quantum Theory” hep-th/0403008.
Carter seeks to define a probability for an individual perception. This, he says, involves refraining from assuming in advance that one has some particular identity. It seems to me that much of what he is doing amounts to making some preliminary steps towards a many-minds interpretation. However, he does not analyse the nature of perception; I think he misplaces the boundary between subjective and objective; and he does not provide adequate motivation for his suggested probability. In my opinion, a mind is defined by its history (which requires an abstract characterization as a pattern of information). The probabilities at a particular moment of the possible short-term futures of an individual mind are of much greater significance than the cumulative improbability of that individual's lifetime history. (70)
S. Aaronson, “NP-Complete Problems and Physical Reality” quant-ph/0502072.
Aaronson provides a clear, wide-ranging, and enjoyable discussion of ways in which the solution of apparently intractable computational problems might or might not become possible in the framework of various exotic physical theories. He then suggests that requiring the intractability of NP-complete problems places interesting constraints on physical theories. (60)
W.T.M. Irvine, J.F. Hodelin, C. Simon, and D. Bouwmeester, “Realisation of Hardy's Thought Experiment” quant-ph/0410160.
Using two Mach-Zehnder interferometers sharing a central beam splitter, Irvine, Hodelin, Simon, and Bouwmeester provide yet another forceful experimental demonstration of the failure of local hidden variable theories. However counter-intuitive it may be to do so, in my opinion one really has to stop thinking of individual particles as either going one way or the other around an interferometer; whether those particles are photons, as in the present paper, or atoms, as in “Lithium Atom Interferometer using Laser Diffraction: Description and Experiments” by A. Miffre, M. Jacquey, M. Büchner, G. Trénec, and J. Vigué, quant-ph/0410182. Physical situations are more accurately described in terms of quantum states and observations, than in terms of particles choosing paths.
In “Bell's Theorem and the Experiments: Increasing Empirical Support to Local Realism” quant-ph/0410193, E. Santos counters this opinion by arguing for the existence of loopholes in tests of Bell's inequalities. He suggests that local realism is so important as a scientific principle that we should abandon it only if faced with the most direct and unambiguous evidence. I think that Santos's conception of realism is unnecessarily narrow. Also, I am inclined to accept the overall theoretical structure of quantum mechanics because of the wide range of empirical support for all sorts of detailed aspects of the theory. There may be loopholes, but it seems to me implausible that they are jumped through. Nevertheless, Santos's paper is valuable as a thoughtful defence of an unpopular position. (56)
C.R. Shalizi, “The Backwards Arrow of Time of the Coherently Bayesian Statistical Mechanic” cond-mat/0410063.
The information-theoretic analysis of thermodynamic entropy suggests that entropy is a measure of our ignorance. In classical terms, the more states that a system can occupy given the constraints imposed by our knowledge of macroscopic parameters, the higher the entropy. Shalizi argues that if entropy is a measure of how much we don't know, then it should decrease as we learn more. His mathematics is impeccable, but his argument founders on the fact that it is only in the rarest of cases in which statistical mechanics is relevant (such as spin-echo experiments) that precise past knowledge is not washed out of the set of available observed physical states. Ordinary macroscopic thermal systems are certainly not closed to the extent that he requires.
In my opinion, the idea that entropy is a measure of ignorance is a useful heuristic which can usually be justified by investigation of a system's internal and external interactions, by facts about “typical states”, and ultimately, by the structure of quantum probability (Donald 1992). The higher the entropy of a quantum state satisfying our observational constraints, the more likely we are to make future observations compatible with that state. (71)
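The classical version of this heuristic can be made concrete in a few lines. The following toy sketch (my own illustration, not taken from Shalizi's paper) counts the microstates of a collection of two-state spins compatible with what we know, and shows that acquiring further information shrinks the compatible set and so lowers the Boltzmann entropy S = log Ω:

```python
from math import comb, log

# Boltzmann entropy S = log(number of compatible microstates), in nats.
def entropy(n_microstates):
    return log(n_microstates)

N = 100
# Knowing only the macroscopic parameter "exactly 50 of 100 spins are up":
S_coarse = entropy(comb(100, 50))
# Learning in addition that the first 10 spins are all up (so 40 of the
# remaining 90 are up) leaves fewer microstates compatible with our knowledge:
S_fine = entropy(comb(90, 40))
assert S_fine < S_coarse  # more knowledge, fewer microstates, lower entropy
```

This is, of course, exactly the sense in which Shalizi's argument is mathematically impeccable; the issue raised above is whether such precise conditioning on the past survives in real, open thermal systems.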
E. Dennis and T. Norsen, “Quantum Theory: Interpretation Cannot be Avoided” quant-ph/0408178.
It is a shame that good arguments supporting realist interpretations of quantum theory so often end up promoting the Bohm interpretation without due regard to its problems. It is also a shame that it seems to be so little understood that a theory in which observers are fundamental can be realist -- in the sense of being based on perception-independent truths (amounting to “objective realities” or “physical laws”) -- as long as it provides a complete characterization of the structures and the possible temporal developments of the observers.
Further discussion of the case for non-relativistic Bohmian mechanics is provided by R. Tumulka in “Understanding Bohmian Mechanics: A Dialogue” quant-ph/0408113.
In “Why Bohm's Quantum Theory?” quant-ph/9812059, H.D. Zeh argues that Bohmian mechanics is essentially irrelevant in the light of decoherence theory. He points out that the global wave-function, which in Bohm's theory is supposed to pilot the particle trajectories, is identical to Everett's (uncollapsing) “universal wave-function”, and has exactly the same branching components, even if all but one of these components is supposed to be empty of Bohmian particles. Similar points are made by H.R. Brown and D. Wallace in “Solving the Measurement Problem: de Broglie-Bohm Loses Out to Everett” quant-ph/0403094.
In “Why Isn't Every Physicist a Bohmian?” quant-ph/0412119, O. Passon reviews a range of objections to Bohm theory. (52)
P. Hayden, D.W. Leung, and A. Winter, “Aspects of Generic Entanglement” quant-ph/0407049.
The idea of typical states with given properties is fundamental in classical statistical mechanics, as is the idea of typical sequences in classical information theory. I believe that a similar idea will also turn out to be of fundamental importance in quantum theory; in particular in helping us to understand the apparent states of macroscopic objects and in encouraging us to give up the assumption that general physical systems ought to be described by pure states. The mathematics needed for working with this idea is gradually being developed, as exemplified by the paper of Hayden, Leung, and Winter. This paper is introduced by Hayden in “Entanglement in Random Subspaces” quant-ph/0409157. Significant earlier steps in this development include the papers gr-qc/9305007 of D.N. Page and hep-th/9601132 of S. Sen on “Average Entropy of a Subsystem” in which it is shown that the entropy of a typical unconstrained state on a small subsystem is close to maximal.
In “Canonical Typicality” cond-mat/0511091, S. Goldstein, J.L. Lebowitz, R. Tumulka, and N. Zanghi argue that a typical global pure state with a given approximate value of global energy has a reduced density matrix on a small subsystem which is close to the canonical ensemble density matrix -- exp(-H/kT)/Z -- for the appropriate temperature. A much more general result along the same lines is proved by S. Popescu, A.J. Short, and A. Winter in “The Foundations of Statistical Mechanics from Entanglement: Individual States vs. Averages” quant-ph/0511225. (57)
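The Page-Sen result mentioned above is easy to check numerically. The sketch below (my own illustration, not code from any of these papers) draws a Haar-typical pure state on a joint system, traces out the large part, and confirms that the von Neumann entropy of the small subsystem is close to its maximal value log(dA):

```python
import numpy as np

rng = np.random.default_rng(0)
dA, dB = 2, 200  # small subsystem coupled to a much larger one

# A Haar-typical pure state on the joint system: a normalized complex
# Gaussian vector is uniformly distributed on the unit sphere.
psi = rng.normal(size=dA * dB) + 1j * rng.normal(size=dA * dB)
psi /= np.linalg.norm(psi)

# Reduced density matrix on the small subsystem (partial trace over B).
M = psi.reshape(dA, dB)
rho_A = M @ M.conj().T

# Von Neumann entropy of rho_A, in nats.
evals = np.linalg.eigvalsh(rho_A)
S = -sum(p * np.log(p) for p in evals if p > 1e-12)
assert S > 0.6  # close to the maximal value log(dA) = log(2) ~ 0.693
```

Page's formula gives an average deficit from maximality of roughly dA/(2 dB), which is tiny here; the constrained version of the same phenomenon is what Goldstein, Lebowitz, Tumulka, and Zanghi, and Popescu, Short, and Winter, analyse.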
L. Marchildon, “Why Should We Interpret Quantum Mechanics?” quant-ph/0405126.
From time to time, it is suggested that quantum states are simply states of knowledge. In Donald 2002, I draw attention to the problems this raises with our understanding of psycho-physical parallelism. Marchildon argues that it raises problems with our understanding of the relationship between macroscopic and microscopic objects. I agree that calling the reality of microscopic objects into question also calls into question the reality of macroscopic objects, but whereas in “Bohmian Trajectories and the Ether: Where Does the Analogy Fail?” quant-ph/0502049, Marchildon looks to a realist interpretation of the trajectories of microscopic objects, I suggest that a realist interpretation of the structures constituting knowledge is sufficient. (55)
J.C. Baez, “Quantum Quandaries: a Category-Theoretic Perspective” quant-ph/0404040.
Baez describes similarities between a category significant for quantum theory and one significant for general relativity. The similarities are at the level of abstract mathematical structure. Baez claims that the mathematics “accounts for many of the famously puzzling features of quantum theory”. This is correct only in the sense that the mathematics provides a framework within which those puzzling features somehow arise. It does not address the most puzzling questions: specific questions like “Precisely what might we see happening next?” and “Precisely how is what we might see restricted by what we are, or by what we do?”. (54)
P.C.W. Davies, “Quantum Fluctuations and Life” quant-ph/0403017.
Davies reviews some of the more speculative literature on the relevance of quantum theory to biological processes. In my opinion (as exemplified in Donald 2001), many of the ideas he discusses lack detailed development and have little plausibility. Davies himself attempts to apply the uncertainty principle to a couple of situations in biology. This leads him to suggest a maximum time over which a protein can fold and, allowing for an error in his presentation, a minimum velocity for DNA base polymerization. Both his arguments and his conclusions are misconceived. Complex processes involving thousands of atoms cannot be modelled as single quantum fluctuations. (51)
D.M. Appleby, “Facts, Values and Quanta” quant-ph/0402015.
Appleby discusses some ideas about the nature of probability which are relevant to the understanding of quantum mechanics. He criticizes the idea that probabilities can be thought of simply as long term relative frequencies. He also criticizes orthodox methods in statistics, pointing out that many assumptions are implicit in those methods. He presents some illuminating examples showing how these implicit assumptions are similar to those made explicit by Bayesian methods. I disagree with Appleby's view that probability is epistemic and I do not believe that he has made a convincing case against the idea of probabilities as propensities. Nevertheless, this is a thought-provoking paper, with much to recommend it, particularly as an introduction to the subject.
Appleby provides another discussion of these issues in “Probabilities are Single-Case, or Nothing” quant-ph/0408058. In “Properties of the Frequency Operator Do Not Imply the Quantum Probability Postulate” quant-ph/0409144, C.M. Caves and R. Schack mount a much more technical attack on the suggestion that quantum probabilities can be explained in terms of relative frequencies in infinitely repeated trials. Appleby criticizes infinitely repeated trials as physically unobtainable. Caves and Schack show that expressing the same idea in precise mathematical terms requires us to work in a realm radically different from conventional quantum theory. They argue that, in this realm, there are ambiguities in the definition of an infinite-repetition frequency operator which can only be resolved by assuming the conventional definition of quantum probabilities.
I agree that probabilities for finite numbers of trials cannot be explained by reference to infinite numbers of trials. However, I suspect that all explanations of probability involve some circularity. The laws of large numbers are essential to such explanations. We want to say that we expect to see a fair die showing six about one sixth of the time, but this depends on expecting our observations to be reasonably typical. (47)
H. Greaves, “Understanding Deutsch's Probability in a Deterministic Multiverse” quant-ph/0312136.
Greaves is concerned by the problem of what probability can mean in a branching universe in which every possible outcome of an experiment actually occurs. She argues that a “rationality principle” needs to be adopted, requiring that we should care about future branches according to their relative quantum amplitude-squared measures. It is not clear to me that such a principle adds anything to the unsophisticated understanding of probability in such a universe, which requires that we act according to whatever mathematical structure we think defines the probability of branches, in exactly the same way that we would act according to the corresponding structure in a corresponding collapsing theory. The nature of reality may be different in the different scenarios, but it is consistent to assume that the meaning of “expected experience” does not change. More pressing questions, in my opinion, may be found by probing the adequacy of the picture that leads Greaves to believe that probabilities can be defined by squared amplitudes and, more generally, may be found by trying to understand the nature of the branches and the mathematics of their probabilities. Greaves is certainly right to query assumptions which are based on a simplistic model of measurement. (63)
M. Schlosshauer, “Decoherence, the Measurement Problem, and Interpretations of Quantum Mechanics” quant-ph/0312059.
This long paper provides an excellent discussion of the relevance of decoherence to the foundations of quantum mechanics. Schlosshauer does not examine the details of the theory or of the models which underlie it, but he does give a careful analysis of its implications for several important interpretative programs. G. Bacciagaluppi's entry, “The Role of Decoherence in Quantum Mechanics”, in the Stanford Encyclopedia of Philosophy, provides a much briefer discussion along similar lines. (46)
A. Bershadskii, E. Dremencov, J. Bershadskii, and G. Yadid, “Brain Neurons as Quantum Computers: in vivo Support of Background Physics” q-bio.NC/0311026.
Experimental evidence is presented for a similarity between scaling properties of moving averages of frequencies of background firings of single neurons in anaesthetized mutant rats and analogous properties of a simple model quantum system exhibiting “quantum chaos”. The authors suggest that their results might be relevant to the question of whether, at some level, neurons function as quantum computers. However, similarity in behaviour, particularly in systems which differ in many other respects, does not imply similarity of cause. Moreover, even if quantum chaos were, somehow, at the root of the background firings, that is hardly indicative of quantum computation. (44)
J. Eisert, “Exact Decoherence to Pointer States in Free Open Quantum Systems is Universal” quant-ph/0311022.
Eisert analyses the Wigner function of a free particle linearly coupled to a heat bath of harmonic oscillators and shows that after a sufficiently long time the state of the particle can be represented as a mixture of minimal uncertainty Gaussians. This is typical of results in decoherence theory, in that it is shown that, in many circumstances, there are decompositions into quasi-classical states. Nevertheless, these decompositions are not uniquely defined. (59)
M.A. Rubin, “There is No Basis Ambiguity in Everett Quantum Mechanics” quant-ph/0310186.
At the core of Everett's original analysis of quantum mechanics is a simple model of a measurement process. In the context of that model, Rubin demonstrates that an ambiguity in the decomposition of entangled wavefunctions can be avoided by including information about dynamical structure. This is an interesting result. Nevertheless, the model studied is a long way from the realistic situations which need to be considered if we want to show that Everett's ideas can be relevant to real macroscopic observers and their measurements. This means that, at best, Rubin's paper can only be a preliminary step towards a treatment of the preferred basis problem in the extended sense in which it is introduced in my summary, where it is considered as the problem of finding a theory to explain what observable possibilities can arise as a result of realistic “measurements”.
In “Spatial Degrees of Freedom in Everett Quantum Mechanics” quant-ph/0511188, Rubin generalizes his model to allow system and observer to have spatial wavefunctions. He also considers correlations between different observers. However, he still assumes that the system and the observers are already identified, that the interactions and the initial wavefunctions take precisely specified and idealized forms, and that the interactions act for precisely specified periods. (43)
C. Rovelli, “A Dialog on Quantum Gravity” hep-th/0310077.
We need to remain sceptical in this era of scientific fashion, when much of theoretical physics has become highly speculative, when it can be hard to distinguish between rival physical theories by direct experimental test, and when theories are often so difficult to analyze that significant questions are left open indefinitely. Some of these issues arise even in the foundations of quantum theory where the mathematics is comparatively simple, and in response I have emphasized the weight that should be given to a theory's overall consistency and lack of vagueness. In quantum gravity, the technical problems are awesome. Rovelli compares string theory with loop quantum gravity. His arguments, that string theory has failed as yet to live up to the often extravagant promises of its proponents and that other avenues are also worth exploring, echo the opinions of P. Woit in “String Theory: An Evaluation” physics/0102051.
In “Pascual Jordan, His Contributions to Quantum Mechanics and His Legacy in Contemporary Local Quantum Physics” hep-th/0303241, B. Schroer warns that the monoculture of string theory could lead to the achievements of local quantum field theory being lost. Schroer's paper is an interesting essay on those achievements, emphasizing the goal of an intrinsic formulation of quantum field theory not dependent on starting from the quantization of a classical system. Schroer's biographical notes on Pascual Jordan are also worth reading. Jordan was a brilliant physicist and a Nazi. That such a combination is possible is not something we should forget or ignore.
R. Hedrich provides an introductory review of some more recent critical analyses of string theory in “String Theory - From Physics to Metaphysics” physics/0604171. In particular, he discusses, from a philosophical standpoint, problems with the idea of string theory having a vast number of solutions, one of which is selected anthropically. (41)
A. Peres, “Einstein, Podolsky, Rosen, and Shannon” quant-ph/0310010.
The conclusion of this short note is that “reality may be different for different observers”. The question of the nature of an observer is left open.
M. Smerlak and C. Rovelli reach the same conclusion in “Relational EPR” quant-ph/0604064. They suggest that any physical system provides a potential observer, but this still leaves open many fundamental questions about what it is that we are and why our experiences should take the form that they do. (39)
L. Polley, “"Measurement" as a Neurophysical Process: a Hypothetical Linear and Deterministic Scenario” quant-ph/0309166.
Polley models the opening of a neural ion channel as a configurational tunnel process activated by thermal fluctuations. He argues that in his model the probability of an opening will vary widely between different channels on different occasions. Supposing that only neural firing events are “observed”, he suggests that this can provide a mechanism for the selection of single observed events from a superposition. However, according to this proposal, the probability of a given observation will be determined entirely by events internal to the observer and so will bear no relation to empirically-measured probability. Moreover, as I discuss in Donald 1999, any proposal that only neural firing events are observed will bias the probabilities of observations toward the observation of events causing extensive neural firing. Polley's suggestions about the emergence of agreement between several observers seem to tie the observations of each single individual to the most likely thermal fluctuations in all the brains. (38)
J. Barrett and A. Kent, “Noncontextuality, Finite Precision Measurement and the Kochen-Specker Theorem” quant-ph/0309017.
The Kochen-Specker theorem is a beautiful piece of mathematics which rules out certain types of hidden-variable theories. It tells us that we shouldn't imagine, before we measure a quantum system, that that system has already given a value to all of its projection operators in a way compatible with the mathematics of quantum theory. Pitowsky, Meyer, and Kent were able to side-step the theorem by showing that values could be given to almost all the projection operators. This idea culminates with R. Clifton and A. Kent “Simulating Quantum Mechanics by Non-Contextual Hidden Variables” quant-ph/9908031.
Clifton and Kent's work has been much discussed, and Barrett and Kent's paper is a response to that discussion. An excellent contribution to the discussion is D.M. Appleby “The Bell-Kochen-Specker Theorem” quant-ph/0308114, in which it is shown that any scheme of the Pitowsky-Meyer-Kent-Clifton (PMKC) type necessarily involves pathological discontinuities.
It would be a mistake to suppose that the PMKC work suggests a way of interpreting quantum theory of even the slightest plausibility. In particular, Clifton and Kent postulate an occult and inexplicable interaction between measuring device and system which lies entirely outside the quantum theory they are attempting to model. Nevertheless, the work is interesting because it probes the strangeness of quantum theory in new and subtle ways. The three papers cited here explain and examine these subtleties clearly, competently, and thoughtfully. (37)
W.M. de Muynck, “Towards a Neo-Copenhagen Interpretation of Quantum Mechanics” quant-ph/0307235.
de Muynck provides an interesting analysis of some of the problems of the Copenhagen interpretation, emphasizing distinctions between preparation and measurement and between different notions of completeness. He is explicit about the need for some sort of “subquantum” theory to complete his proposed “neo-Copenhagen interpretation” but he fails to provide such a theory. I do not think that he substantiates his claim to have resolved the problem of non-locality. (36)
T. Ohsaku, “The Model of the Theory of the Quantum Brain Dynamics Can be Cast on the Heisenberg Spin Hamiltonian” quant-ph/0306021.
A biologist looking at this brief note would be right to be puzzled by the complete lack of any connection to conventional neurophysiology. (35)
J.B. Hartle, “What Connects Different Interpretations of Quantum Mechanics?” quant-ph/0305089.
Definite events in quantum theory can easily be represented using projection operators, and compatible sequences of definite events using consistent sequences of projection operators. For this reason, the consistent histories formalism provides a useful preliminary framework for the interpretation of quantum theory and ideas central to several interpretations can be analysed in terms of restrictions of such a framework. I have provided this kind of analysis of my own work in section 7 of Donald 1999. Hartle considers the Copenhagen interpretation, the Bohm interpretation, and the idea of Sums over Histories. The trouble with the consistent histories formalism however is simply that it is far too general to be anything other than a broad preliminary framework. (34)
S. Roy and M. Kafatos, “Quantum Processes, Space-Time Representation and Brain Dynamics” quant-ph/0304137.
This worthless paper confuses the geometries of space-time, of Hilbert space, of spaces of neural information, and of neural anatomy. The underlying idea would seem to be to attempt quantization at the level of neural information. The fact that a brain is already part of the quantum universe is ignored. (31)
R. Omnès, “Decoherence, Irreversibility and the Selection by Decoherence of Quantum States with Definite Probabilities” quant-ph/0304100.
Omnès discusses the development of a general theory of decoherence based on a theory of irreversible processes. He investigates situations in which decoherence does not involve the simple diagonalization of a reduced density matrix but in which there is also mixing between components. (32)
D.J. Berkeland, D.A. Raymondson, and V.M. Tassin, “Tests for Non-Randomness in Quantum Jumps” physics/0304013.
Are quantum events really random? Berkeland, Raymondson, and Tassin do not find any sign of non-random behaviour in data from long series of atomic transitions. (53)
K. Hornberger and J.E. Sipe, “Collisional Decoherence Revisited” quant-ph/0303094.
Hornberger and Sipe go carefully through several methods of deriving the rate of decoherence of a massive particle due to collisions in a thermal environment. Their analysis does not alter the qualitative conclusions of previous approaches but does reveal them to have been in error by a numerical factor. In “Collisional Decoherence Observed in Matter Wave Interferometry” quant-ph/0303093, K. Hornberger, S. Uttenthaler, B. Brezger, L. Hackermueller, M. Arndt, and A. Zeilinger provide an experimental demonstration of decoherence of this kind in the interference of fullerene molecules.
In “Decoherence of Matter Waves by Thermal Emission of Radiation” quant-ph/0402146, L. Hackermueller, K. Hornberger, B. Brezger, A. Zeilinger, and M. Arndt report the observation of the decohering effect of the cooling in flight of hot fullerene molecules. (33)
M. Tegmark, “Parallel Universes” astro-ph/0302131.
Tegmark provides an introductory discussion of four conceivable levels of co-existent physical worlds: infinite approximately-homogeneous cosmology, in which any possible finitely-defined set of circumstances will be repeated in infinitely many places; chaotic inflation, in which what we think of as physical constants will take different values in different regions; conventional many-worlds quantum theory; and the class of all mathematical structures. He points out that if we are prepared to countenance the existence of the first of these levels, then many-worlds quantum theory will essentially add no new worlds.
The idea of the fourth level, in the sense of all mathematical structures having equal existence, may seem attractive as a theory apparently involving not only no free parameters, but even no laws. Tegmark provides a fuller discussion of this idea in “Is `The Theory of Everything' Merely The Ultimate Ensemble Theory?” gr-qc/9704009, but it is not, in my view, an idea which can properly be described as a “theory” at all. To start with, there are questions about the definition of the class of mathematical structures which are not answered by the idea that “mathematical existence is merely freedom from contradiction”. For example, there are questions about which language is used to express that freedom from contradiction. More importantly, any idea of total “mathematical democracy” will be lost as soon as we attempt, as we surely must, to make predictions by defining a probability measure on a set of structures. It is ridiculous to suggest that there is any way that all mathematical structures can be given equal statistical weight; at least if any structure is to be given any weight at all. Tegmark does have some interesting remarks about possible dimensions for space and time in these papers and in “On the Dimensionality of Spacetime” gr-qc/9702052, but these remarks are based on assuming that close analogs to conventional laws of physics hold uniformly over extended spacetime regions. This is a situation which is appropriately described in the context of his second level of parallel universes, in which some over-arching physical laws are assumed.
The idea of a “self-aware substructure” is central to Tegmark's discussion and is yet another idea that he fails to analyse in any detail. In my own view, the complete formulation of any parallel-universe theory requires the identification of the natural laws which define the self-aware substructures as well as of the physical laws which provide the probabilities for their future experiences. (74)
T.A. Brun, “Probability in Decoherent Histories” quant-ph/0302034.
Brun provides a rather superficial discussion of probabilistic aspects of an undeveloped observer-centered interpretation based on the consistent histories idea. He claims, “I have argued, I hope convincingly, that quantum mechanics (and decoherent histories in particular) can describe the experiences of observers without ambiguity”. His argument for this point seems to consist almost entirely of the statement, “The internal state of the robot is a valid observable, so we can choose basis states which are eigenstates of this observable”.
The most important requirement of any interpretation which purports to describe “observers” is that it can be demonstrated that the description given can be applied to human beings. For reasons I spell out at length in the core papers on this site -- in particular, the thermal nature of neural states and the mesoscopic scale of neural information processing -- I believe that the suggestion that the internal state of a human being can be described as a valid observable in an unambiguous way is totally wrong. (27)
A.M. Steinberg, “Speakable and Unspeakable, Past and Future” quant-ph/0302003.
In the traditional approach to quantum measurement theory, an initial quantum state is prepared, and inferences are made about the probabilities of various observations at a later time. A more time-symmetric formalism, however, is also possible and leads to the idea of “weak measurements”. Steinberg introduces this idea, considers some of the implications of the formalism, and discusses some relevant experiments. (48)
B. d'Espagnat, “A Tentative New Approach to the Schrödinger Cat Problem” quant-ph/0301160.
d'Espagnat makes an analogy between consciousness and hidden variables in quantum theory. Both seem, in some sense, to be markers of aspects of the physical situation, or of the universal wave function, without acting on it. d'Espagnat suggests that it is because of decoherence that the states of hidden variables can provide accurate predictions. A central problem for the panpsychistic picture which he sketches would be to explain the level at which individual human consciousness emerges.
d'Espagnat's subsequent paper (“Consciousness and the Wigner's Friend Problem” quant-ph/0402121) emphasizes difficulties which arise when the existence of more than one mind is taken into account, but does not attempt to address the technical problems of psycho-physical parallelism, and ignores much relevant recent work. (30)
S.A. Gurvitz, “Quantum Description of Classical Apparatus: Zeno Effect and Decoherence” quant-ph/0212155.
Gurvitz presents a model in which a measuring device is treated quantum mechanically. Such models are important in investigating the general issue of whether “measurement” is the defining feature of some sort of boundary between quantum and classical realms. They are also relevant to more specific debates about the validity of the projection postulate, as in Donald 2001 and Donald 2003. (42)
A. Peres and D.R. Terno, “Quantum Information and Relativity Theory” quant-ph/0212023.
Much of the work on quantum information and communication has used a framework of fixed, finite-dimensional, non-relativistic quantum systems. Peres and Terno review some of the ideas and problems which might be relevant if special relativity, quantum field theory, and even general relativity were taken into account. Although their over-confident pronouncements on the measurement problem are naive and ill-thought-out, they do discuss several important technical issues. (26)
S. Saunders, “Derivation of the Born Rule from Operational Assumptions” quant-ph/0211138.
W.H. Zurek, “Environment-Assisted Invariance, Causality, and Probabilities in Quantum Physics” quant-ph/0211037.
Two papers which use similar techniques to argue for the inevitability of the Born rule in the calculation of probabilities for wavefunctions of external quantum systems appeared in November 2002 on quant-ph. In both cases, however, so much of the mathematics of Hilbert space is assumed that I feel it would be more appropriate to refer to “consistency arguments” for the rule rather than “derivations” of it. Saunders assumes the existence of expectation values for experimental results as linear functions defined by the splitting of a wavefunction into distinguishable orthogonal components. He then assumes that the expectation values are invariant under simple unitary mappings on wavefunction and apparatus, which include permutations and phase changes in the components and refinements of the splittings of the components. He justifies these assumptions by claiming that the mappings amount merely to describing the same experiment in different ways. In his slightly earlier paper, Zurek provides a better justification for comparable assumptions by relying on the physics of locality. Zurek considers a system in contact with an environment in a joint pure state. Using a Schmidt decomposition for the joint state, Zurek assumes that the system has an unknown local wavefunction for which there are probabilities. He argues that these probabilities are invariant under the permutations and phase changes invoked by Saunders for the wavefunction of the system, by showing that such changes can be compensated for by corresponding independent changes on the environment which leave the total global state unchanged. Zurek's fine splittings also arise from processes in the environment rather than in the system.
M. Schlosshauer and A. Fine review Zurek's approach and comment on his assumptions in “On Zurek's Derivation of the Born Rule” quant-ph/0312058. H. Barnum provides another discussion in “No-Signalling-Based Version of Zurek's Derivation of Quantum Probabilities” quant-ph/0312150 and Zurek returns to the subject with an extended discussion in “Probabilities from Envariance” quant-ph/0405161.
At the core of the envariance argument is something like the following idea:
Suppose that, in the usual way, separate observers (Alice and Bob) share a singlet state of two spins. Suppose that Alice is thinking about measuring her spin in the x-direction. She knows that she will either see an up spin or a down spin and she wants to know how likely each result is. She considers using some device to rotate her spin through 180 degrees about the y-axis. This would mean that she would convert an up x-spin into a down x-spin and a down x-spin into an up x-spin. However, using her knowledge of the mathematics of quantum states, Alice notes that if Bob were simultaneously to rotate his spin through 180 degrees about the y-axis, then the total singlet state of both spins would be completely unchanged. Locality leads Alice to deduce that she is as likely to see an up spin as a down spin.
This seems quite convincing. It does depend, however, on prior acceptance of the mathematics of quantum states. It also deals only with what is, in my opinion, an unduly limited type of empirical experience in which the local outcomes for Alice are defined by a set of locally-decoherent quantum states with orthogonal support projections. When Alice considers outcomes involving her own internal states, I believe that more general outcome states need to be considered and that this requires a more general theory of quantum probabilities, as proposed in Donald 1992. (24)
C. Hewitt-Horsman, “Quantum Computation and Many Worlds” quant-ph/0210204.
Working in the incomplete framework of the Saunders-Wallace approach to the Everett interpretation (which I criticize in Donald 2002), Hewitt-Horsman argues that it is possible to describe a simple theoretical quantum computation using a many-worlds picture. Her description involves choosing to refer to as a “world” any appropriate substructure which, according to an implicit, uncharacterized, external observer, would have suitable short term stability. In my opinion, because of the artificiality of her concept of a world, Hewitt-Horsman fails to rebut the more interesting paper by A.M. Steane which she discusses. In this paper (“A Quantum Computer Only Needs One Universe” quant-ph/0003084), Steane puts forward some convincing arguments for why it may be misleading to view quantum computers as performing many computations simultaneously. In fact, any real quantum computer would merely be a physical device. Its existence might help to persuade us of the general correctness of quantum theory, but it would not directly help to solve the problem of the interpretation of the theory. Indeed, in “Copenhagen Computation: How I Learned to Stop Worrying and Love Bohr” quant-ph/0305088, N.D. Mermin gives a simple description of quantum computation in terms compatible with the Copenhagen interpretation. (22)
R.B. Griffiths, “Probabilities and Quantum Reality: Are There Correlata?” quant-ph/0209116.
Griffiths argues that his consistent histories theory is superior to Mermin's “Ithaca interpretation”. For most of this paper, as in “Choice of Consistent Family, and Quantum Incompatibility” quant-ph/9708028, Griffiths seems to be describing his own theory as one in which an external observer has complete freedom to choose any “framework” (any consistent set of histories) to describe any physical system. This makes his claim that his theory describes an objective reality independent of observers quite incomprehensible, unless that objective reality is the reality of the entire set of possible frameworks, which is presumably the reality of a no-collapse interpretation of quantum theory, and is therefore very different from the reality which we apparently observe. Towards the end of his paper, however, Griffiths suddenly turns around and says that his theory expresses a “robust realism”, and that there is a framework which is “based on projectors representing actual properties of the world”. He does not tell us how this framework is to be identified or what he means here by “actual”, except that he says that “a single quasi-classical framework suffices for answering all ‘classical’ questions about the world”. He qualifies this statement by saying, “Actually there are many different, mutually incompatible quasi-classical frameworks which yield descriptions which are indistinguishable at the macroscopic level; for present purposes we can simply think of using one of these.” In Donald 2002, I have argued that such an unspecific quasi-classical framework does not provide an adequate foundation for a theory of psycho-physical parallelism.
In “Some Recent Developments in the Decoherent Histories Approach to Quantum Theory” quant-ph/0301117, J.J. Halliwell reviews some of the ways in which the emergence of classicality can be expressed using a consistent histories formalism. C. Anastopoulos discusses the problem of choosing a framework in “On the Selection of Preferred Consistent Sets” quant-ph/9709051. He reviews the use of coherent states to construct approximately-consistent quasi-classical frameworks. (18)
M.A. Rubin, “Relative Frequency and Probability in the Everett Interpretation of Heisenberg-Picture Quantum Mechanics” quant-ph/0209055.
Rubin provides a fairly straightforward analysis of the measurement of relative frequency in an elementary version of the many-worlds formalism. He emphasizes that finite resolution in the measurement of relative frequency is necessary for physical realizability as well as for application of the mathematics of the probabilistic laws of large numbers. (11)
I. Pitowsky, “Betting on the Outcomes of Measurements: A Bayesian Theory of Quantum Probability” quant-ph/0208121.
Pitowsky considers the constraints which arise when one tries to define probabilities on families of mutually commuting projections on a Hilbert space subject to simple consistency rules. This allows him to provide finite models of the uncertainty principle, the Kochen-Specker theorem, and the violation of Bell's inequality in the context of minimal assumptions about the nature of quantum states as assignments of probabilities. (12)
P.K. Aravind, “A Simple Demonstration of Bell's Theorem Involving Two Observers and No Probabilities or Inequalities” quant-ph/0206070.
For years there has been a process of development and simplification of thought experiments expressing the non-existence of local hidden variables in quantum theory. By invoking a system with four rather than just two qubits, Aravind manages to construct an example which would be particularly striking at the macroscopic level. His example shows two spatially-separated observers (Alice and Bob) able to make a range of tests in such a way that they always appear to get the same result whenever they make the same test, and yet where it is impossible for all the results of all the possible tests to have been fixed in advance.
The explanation of the example in terms of the predicted results of specified quantum measurements does not explain how quantum mechanics itself can work. This is why an interpretation of quantum theory is needed. One possibility might be that non-local influences somehow force the compatibility between Alice's and Bob's results. The idea I prefer is that there is a second stage of measurement at the time when the results are shared, and that the compatibility of the observations is achieved at this stage, by a process of correlation between Alice's worlds and Bob's, driven by the relations between the states which define those worlds and the initial state of the system. (76)
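Aravind's construction builds on the Mermin-Peres “magic square” of two-qubit observables. The operator identities that make it impossible to fix all the results in advance can be checked in a few lines (my own illustration): each row of the square multiplies to +I and so do the first two columns, but the third column multiplies to −I, so no pre-assignment of ±1 values to the nine observables can satisfy all the product constraints at once.

```python
import numpy as np

# Pauli matrices.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def t(a, b):
    """Two-qubit observable a tensor b."""
    return np.kron(a, b)

# The Mermin-Peres magic square: the observables in each row commute,
# as do those in each column.
square = [[t(X, I2), t(I2, X), t(X, X)],
          [t(I2, Y), t(Y, I2), t(Y, Y)],
          [t(X, Y),  t(Y, X),  t(Z, Z)]]

I4 = np.eye(4, dtype=complex)
row_prods = [r[0] @ r[1] @ r[2] for r in square]
col_prods = [square[0][c] @ square[1][c] @ square[2][c] for c in range(3)]

print([np.allclose(m, I4) for m in row_prods])    # every row multiplies to +I
print([np.allclose(m, -I4) for m in col_prods])   # only the third column gives -I
# Pre-assigned values of +1 or -1 would make the product of all nine entries
# simultaneously +1 (taken by rows) and -1 (taken by columns): a contradiction.
```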
C.A. Fuchs, “Quantum Mechanics as Quantum Information (and only a little more)” quant-ph/0205039.
Fuchs begins this paper by dismissing all serious attempts to understand the ontology of the quantum formalism as quasi-religious zealotry. While I agree that excessive conviction can indeed be a problem, it is only when a rational, consistent, and complete description of reality is attempted that an interpretation can be destroyed, as, for example, facts about the eigenfunctions of reduced density matrices destroy the modal interpretation; the set selection problem destroys consistent histories; and the formalism of relativistic quantum field theory destroys the Bohm interpretation. Fuchs's pious hope is that salvation will come if we can only find the right slogans, and his claim is that those slogans will be something to do with information.
Despite this silly beginning, the remainder of this long paper fizzes with interesting ideas. First, Fuchs argues that quantum information is local and personal, which makes clear just how close his program is to a many-minds view. He then presents and discusses a very simple proof of a version of Gleason's theorem showing that density matrices provide the only way of attaching non-contextual probabilities to positive operator-valued measures on a Hilbert space. A related argument allows him to derive the tensor product structure for separate quantum systems. It is debatable however whether such positive operator-valued measures really encapsulate the “structure of our potential interventions” into the world, or whether they are merely a very carefully contrived way by which we can sometimes manage to see fairly directly into part of reality's deep structure.
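Fuchs's version of Gleason's theorem concerns probability assignments of the form p_i = Tr(ρE_i). A minimal numerical sketch (illustrative only; the dimension, outcome count, and normalisation recipe are my choices) showing that a density matrix does indeed attach a genuine probability distribution to a positive operator-valued measure:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_psd(d):
    """Random positive semi-definite matrix on a d-dimensional space."""
    b = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    return b @ b.conj().T

# An arbitrary density matrix on a 3-dimensional Hilbert space.
rho = random_psd(3)
rho /= np.trace(rho).real

# Build a 4-outcome POVM: effects E_i >= 0 with sum_i E_i = I,
# obtained by congruence with S^(-1/2) where S is the unnormalised sum.
G = [random_psd(3) for _ in range(4)]
S = sum(G)
w, V = np.linalg.eigh(S)
S_inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.conj().T
E = [S_inv_sqrt @ g @ S_inv_sqrt for g in G]

# Born probabilities p_i = Tr(rho E_i): non-negative and summing to one.
p = np.array([np.trace(rho @ e).real for e in E])
print(p.sum())         # 1.0 up to rounding
print((p >= 0).all())  # True
```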
Fuchs goes on to present his Bayesian approach to quantum measurement. He gives an interesting analysis of certain measurement processes in terms of refinements and readjustments of beliefs about quantum states. He also reviews the quantum de Finetti theorem which defines circumstances in which it is appropriate to believe that repeated trials on identical quantum systems can be represented as trials of a probability distribution over repeated quantum states.
Unfortunately, Fuchs's interesting ideas leave him close to the quagmire of total subjectivity. His suggestion that the Hilbert space dimension of the sort of simple quantum system studied in quantum information theory might be something which, in his terms, is not subjective, seems implausible because such simple systems are always state- and context-dependent idealizations. In my opinion, the existence of objective physical laws and also of some sort of objective initial quantum state, analogous to Everett's “universal wavefunction”, is required if the quagmire is to be avoided. (19)
N.E. Mavromatos, A. Mershin, and D.V. Nanopoulos, “QED-Cavity Model of Microtubules Implies Dissipationless Energy Transfer and Biological Quantum Teleportation” quant-ph/0204021.
In a long series of papers, Mavromatos and Nanopoulos and co-workers have argued for the existence of large scale coherent quantum effects in neural microtubules. In “Theory of Brain Function, Quantum Mechanics and Superstrings” hep-ph/9505374 by Nanopoulos, and “A Non-critical String (Liouville) Approach to Brain Microtubules: State Vector Reduction, Memory Coding and Capacity” quant-ph/9512021 and “Microtubules: The Neuronic System of the Neurons?” quant-ph/9702003 both by Mavromatos and Nanopoulos, the proposals involved speculations about string theory and quantum gravity and the formation and evaporation of tiny virtual black holes. In later papers, including “On Quantum Mechanical Aspects of Microtubules” quant-ph/9708003, “Quantum Mechanics in Cell Microtubules: Wild Imagination or Realistic Possibility?” quant-ph/9802063, both by Mavromatos and Nanopoulos, “Quantum Brain?”, quant-ph/0007088, by Mershin, Nanopoulos, and E.M.C. Skoulakis, and the title paper, the speculations are about coherent cavity modes within the microtubules interacting with the dimer structure of the tubulin subunits. The consequences are supposed to be the emergence of soliton coherent states which can transfer energy without loss, allow macroscopic entanglement, quantum teleportation, and quantum holography, and can somehow perform quantum computations and thus, somehow, solve various not very clearly specified supposed problems.
I am dubious about these speculative proposed physical mechanisms and about the existence of the solitons and about all their supposed consequences, but I am especially dubious about the idea that any serious form of quantum computation takes place in a functioning brain. Decoherence in the warm wet brain is the effect of the unavoidable exchange of energy between the enormous variety of available modes of excitation. In particular, both through their polarization charges and through their physical momenta, any motion of water or of tubulin will inevitably affect and be affected by their surroundings on the vibrational mode timescales of 10^{-13} seconds. The fact that the tubulin subunit itself has a molecular weight of 55 kDalton, and therefore has thousands of thermally active degrees of freedom, makes it difficult to believe that it is reasonable to model any aspect of its behaviour by an isolated qubit. Even if the solitons were actually to exist, they would be macroscopic quasi-classical waves, rather than probability amplitude waves. Serious quantum computation requires the ability to control and manipulate arbitrary superpositions within a large space of quantum wavefunctions. There is no evidence of biological mechanisms for such control and manipulation, at any level. Neural systems have evolved with sensory inputs defined by quasi-classical neural firings and muscular responses determined by quasi-classical neural firings, with control mechanisms provided by changes in synaptic connectivities. Microtubules have evolved as cellular support and transport systems. It would indeed be remarkable if there also turned out to be a significant apparatus of microtubular quantum information processing which generations of microbiologists had somehow failed to notice. (21)
S.L. Adler, “Why Decoherence has not Solved the Measurement Problem: A Response to P.W. Anderson” quant-ph/0112095.
There are two senses in which environmental decoherence theory might be thought to have solved the measurement problem. One sense is in the context of the many-worlds idea, and supposes that decoherence is sufficient to identify a splitting of the universal quantum state into observed quasi-classical parts. I dispute this claim elsewhere on this site. Adler, by contrast, disputes the claim in the ridiculously strong sense that decoherence alone, without any sort of many-worlds assumption, could somehow solve the measurement problem. (16)
H.P. Stapp, “The Basis Problem in Many-Worlds Theories” quant-ph/0110148.
Stapp reviews some of the well-known problems with the simplistic idea of a many-worlds theory which depends on a specific preferred orthonormal basis. (13)
S.M. Hitchcock, “‘Photosynthetic’ Quantum Computers?” quant-ph/0108087.
Using an extremely weak definition of a “quantum computer”, Hitchcock argues that biological processes like photosynthesis, which can be viewed, in some sense, as “information processing” and which require quantum mechanics for a complete description, are “quantum computers”. However, if we wish to invoke the properties of quantum computers which can make them more efficient as information processors than conventional computers, then it is necessary to use a much stronger definition. In particular, control over a broad range of manipulations at the level of individual quantum states is certainly required.
“Genuine” quantum computation has been the subject of a huge number of theoretical papers of which “Basic Concepts in Quantum Computation” quant-ph/0011013 by A. Ekert, P. Hayden, and H. Inamori is a good example. These papers tell us what sort of information processing techniques would become possible if we could only somehow manage to get sufficient control at the appropriate level. The most important lesson from such papers for those wishing to invoke quantum computation in biological systems is that, whatever the implementation difficulties, quantum computation would not be magic and does not obviously bring anything new to the mind-body debate.
In “The Physical Implementation of Quantum Computation” quant-ph/0002077, D.P. DiVincenzo discusses five requirements for the implementation of genuine quantum computation. These requirements are not met by any biological system. Indeed, in “Quantum Computing: A View From the Enemy Camp”, cond-mat/0110326 and in “Is Fault-Tolerant Quantum Computation Really Possible?”, quant-ph/0610117, M.I. Dyakonov argues that, even given the possibility of error correction, the requirements on the control of qubits are so difficult to meet by any means that no large scale quantum computer will be built in the foreseeable future, if ever. The statement in Dyakonov's first paper that he did not share the hope that the factorization of 15 would be achieved by a quantum computer within twenty years was however unfortunate, as publication of that achievement came within three months of his paper in “Experimental Realization of Shor's Quantum Factoring Algorithm Using Nuclear Magnetic Resonance” by L.M.K. Vandersypen, M. Steffen, G. Breyta, C.S. Yannoni, M. H. Sherwood, and I.L. Chuang, quant-ph/0112176. Nevertheless, although this demonstration that 15 equals 3 times 5 was an experimental triumph, its very difficulty perhaps does more to affirm than to deny Dyakonov's main conclusions.
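For perspective on what the Vandersypen et al. experiment computed: Shor's algorithm reduces factoring N to finding the period of a^x mod N, and only the period finding needs the quantum computer. For N = 15 the whole reduction can be run classically in a few lines (a sketch of the textbook reduction, not of the NMR experiment; the base a = 7 is my arbitrary choice):

```python
from math import gcd

# Find the period r of a^x mod N by brute force; this is the step
# that Shor's algorithm performs on quantum hardware.
N, a = 15, 7
r = 1
while pow(a, r, N) != 1:
    r += 1
print(r)   # 4

# r is even and a^(r/2) is not congruent to -1 mod N,
# so the gcds yield the nontrivial factors.
print(gcd(pow(a, r // 2) - 1, N), gcd(pow(a, r // 2) + 1, N))   # 3 5
```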
In fact, Dyakonov claims that NMR techniques cannot amount to genuine quantum computing. This is to some extent just a question of terminology, but there are significant arguments on both sides. There are also some NMR-based computations which are certainly not quantum computing in the strict sense. This applies, for example, to the factorization of 157573 by M. Mehring, K. Mueller, I. Sh. Averbukh, W. Merkel, and W. Schleich which they describe in “NMR Experiment Factors Numbers with Gauss Sums” quant-ph/0609174. For an excellent introduction to NMR quantum computing from first principles, see R. Laflamme, E. Knill, D.G. Cory, E.M. Fortunato, T. Havel, C. Miquel, R. Martinez, C. Negrevergne, G. Ortiz, M.A. Pravia, Y. Sharf, S. Sinha, R. Somma, and L. Viola, “Introduction to NMR Quantum Information Processing” quant-ph/0207172. I would also recommend “Quantum Computing and Nuclear Magnetic Resonance” by J.A. Jones, quant-ph/0106067.
As far as the question of the possible existence of biological quantum computing is concerned, we had better assume that what is commonly referred to as NMR quantum computing is genuine quantum computing, given that it can be done in warm liquid systems. Also the theoretical question of whether the process scales satisfactorily in the limit of arbitrarily many qubits is hardly of relevance for a specific finite biological system. Ultimately however the nature of the physical apparatuses required to make NMR quantum computing work shows why biological systems could not have evolved such techniques or anything similar. The relative isolation of nuclear spins from environmental influences is essential, but at the same time that very isolation makes it difficult to manipulate and to read the states of those spins. Biological systems obviously do not contain 11.7 Tesla magnets, but the necessity for NMR quantum computing of a whole variety of precisely tuned radio-frequency pulses is a much clearer indication that the biological evolution of any such system is impossible. Evolution only tunes something that is already producing some sort of biological benefit.
The evolution of photosynthetic light harvesting is an excellent example of benefit-driven biological fine-tuning at a quantum level. (See, “Robustness and Optimality of Light Harvesting in Cyanobacterial Photosystem I” by M.K. Sener, D. Lu, T. Ritz, S. Park, P. Fromme, and K. Schulten physics/0207070.) Both Hitchcock, in the title paper for this item, and B. Lovett, J.H. Reina, A. Nazir, B. Kothari, and A. Briggs in “Resonant Transfer of Excitons and Quantum Computation” quant-ph/0209078 have proposed that components of suitable light harvesting systems could be used in manufactured quantum computers. The essential fact, however, remains that for a biological organism, the benefit of a quantum computation would be the result of the computation. The ability rapidly to factor large integers would not help a plant to survive in the jungle, but even if it did, no path to an evolved number-factoring quantum computer could exist because there are so many easier ways to factor small numbers.
Fault-tolerant quantum computing, discussed by J. Preskill in “Fault-Tolerant Quantum Computation” quant-ph/9712048 and by A.M. Steane in “Quantum Computing and Error Correction” quant-ph/0304016, allows for the correction of certain types of error at the expense of considerable structural complications. A requirement for error correction would lengthen substantially any hypothetical path to a biological quantum computer.
An example of a problem which does have biological relevance is that of database searching. For searching unstructured databases, L.K. Grover has developed a well-known quantum algorithm. Nevertheless, although he gave a pivotal paper, quant-ph/9706033, the title “Quantum Mechanics Helps in Searching for a Needle in a Haystack”, he does not in fact explain how real grass stalks should be shifted. More significantly, he does not analyse the complexity of the search query. In “Could Grover's Quantum Algorithm Help in Searching an Actual Database?” quant-ph/9901068 C. Zalka draws attention to the benefits of classical parallel processing in database searching and to the artificiality of the assumption of an unstructured database. These issues would both be relevant in biological systems. The question of whether, if quantum computers could be built, Grover's method would give any practical advantage over conventional computers is investigated further by G.F. Viamontes, I.L. Markov, and J.P. Hayes in “Is Quantum Search Practical?” quant-ph/0405001. They argue that no answer can be given without an analysis of the structure of specific search problems and of specific methods for solving those problems. In the biological domain, similar specific analyses should also be required from those who suggest that biological quantum computation could have evolved and that consciousness could depend on it.
Demonstrating what was possible in 2004, M.S. Anwar, D. Blazina, H. Carteret, S.B. Duckett, and J.A. Jones construct an NMR quantum computer which uses Grover's algorithm to find one needle in a total of four stalks in “Implementing Grover's Quantum Search on a Para-Hydrogen based Pure State NMR Quantum Computer” quant-ph/0407091. (23)
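The search that Anwar et al. implemented is small enough to simulate directly. For four items, a single Grover iteration already concentrates all the amplitude on the marked item; a classical numpy sketch (mine, not the paper's NMR implementation):

```python
import numpy as np

n, marked = 4, 2                       # four "stalks", needle at index 2
s = np.full(n, 1 / np.sqrt(n))         # uniform superposition

oracle = np.eye(n)                     # oracle: flip the sign of the marked amplitude
oracle[marked, marked] = -1
diffusion = 2 * np.outer(s, s) - np.eye(n)   # "inversion about the mean"

state = diffusion @ (oracle @ s)       # one Grover iteration
print(np.abs(state) ** 2)              # all probability sits on the marked item
```

For n = 4 the iteration is exact: the outcome probabilities are (0, 0, 1, 0), which is why the four-item case makes such a clean demonstration experiment.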
D. Wallace, “Everett and Structure” quant-ph/0107144.
Wallace argues that Everett worlds are simply emergent patterns. He claims that they are merely objects of observation, and that there is no more need, at a fundamental level, to provide an exact specification of them and their dynamics than there is to characterize tigers and their behaviour in terms of the motions of individual atoms. However, although zoological classification may reasonably be seen as reflecting human interests, superficial references to decoherence hardly suffice to reduce the physics of observation to psychology.
The application of the philosophy of functionalism in a many-worlds picture requires that “function” be identified, not by a presupposed human observer, but starting from the universal wavefunction and the universal Hamiltonian. Even in the context of decoherence, there is an endless range of possible patterns out there to which, like faces seen in the flames, function might be ascribed. Indeed, there is even a vast range of kinds of pattern. The patterns that encompass human functioning do not seem to emerge from the bare mathematics of the quantum formalism in any obvious or immediate fashion.
In “Quantum Probability from Subjective Likelihood” quant-ph/0312157, Wallace makes a similar attempt to reduce probability to psychology. He succeeds in expressing a particularly simple set of assumptions which lead to the assignment of Born probabilities to events characterized by a fixed sequence of orthogonal projections. However, when it comes to assigning probabilities to events, such as the physical correlates of distinct individual thoughts, which require a different and more sophisticated delineation, he simply abandons the task. Even if I take a subjective view of probability or even if I am unable to quantify my objective futures, if all I hear of the result of a die throw is the word “Heads” if a six is thrown and “Tails” otherwise, I would be wrong to assume that a fair coin has been tossed, and I can expect to become increasingly sure of my error if the situation is repeated sufficiently often. A theory of probabilities cannot be achieved without an analysis of events. (62)
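The die-and-coin point is easily checked numerically; a short sketch with an arbitrary seed:

```python
import random

random.seed(1)
N = 100_000
# Hear "Heads" only when the die shows a six.
heads = sum(1 for _ in range(N) if random.randint(1, 6) == 6)
freq = heads / N
print(freq)   # close to 1/6, nowhere near the 1/2 of a fair coin
```

The observed frequency settles near 1/6, so an observer who persisted with the fair-coin model would find it increasingly untenable: the labels “Heads” and “Tails” alone do not fix the underlying events.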
C.M. Caves, C.A. Fuchs, and R. Schack, “Quantum Probabilities as Bayesian Probabilities” quant-ph/0106133.
Caves, Fuchs, and Schack argue that quantum states are states of belief defined for individual observers. They require that beliefs be constrained by consistency in betting behaviour, and they invoke Gleason's theorem to produce the standard mathematics of quantum theory.
As with many discussions of Bayesian interpretations of quantum probability, a problem with this paper is that there is a blurring between the idea of purely subjective probabilities, in which a person's probabilities have to be consistent but otherwise can be anything that individual chooses; and the idea of knowledge-based observer-dependent probabilities, in which an individual's probabilities are what he should rationally choose given everything he knows. The latter seems to me to improve on the former, but requires analysis of what constitutes knowledge and of why some probabilities would be rational. I have proposed a theory of observer-dependent propensities, in which objective observer-dependent probabilities, defined by physical laws, exist, whether or not we are actually aware of them. This is based on the analysis of probabilities in terms of sequences of fundamental observer-defining random events.
Because Caves, Fuchs, and Schack do not analyse knowledge or observers, their theory should be seen as useful rather than fundamental. This means that it is unnecessary for them to avoid invoking ideas such as the “probability of a probability” or the “probability of an unknown quantum state”. In “Subjective and Objective Probabilities in Quantum Mechanics” quant-ph/0501009, M. Srednicki argues for the utility of such ideas in a Bayesian framework.
In “Subjective Probability and Quantum Certainty” quant-ph/0608190, Caves, Fuchs, and Schack take account of the problem of locality. A standard Kochen-Specker argument leads them to propose that even certainty can be just an observer-dependent probability. This proposal would seem to verge on a many-minds theory, were it not for their continuing refusal to address the nature of observers. They do say that, “Any attempt to give a complete specification of the preparation device in terms of classical facts and thus to derive the quantum operation from classical facts alone comes up against the device's quantum-mechanical nature.” Perhaps they will eventually notice that observers are also quantum-mechanical devices. They will then be in a position to consider how the different certainties of different observers can consistently be observed to be compatible.
Caves, Fuchs, and Schack make the claim that there is a category distinction between “facts” and “probabilities”. In my opinion, although this claim may seem plausible, it is in fact wrong. Suitable quantum states play an essential role in describing both the probabilities that individual observers should assign to their future observations and the actual quantum-mechanical nature of those observers. (49)
C.A. Fuchs, “Notes on a Paulian Idea: Foundational, Historical, Anecdotal and Forward-Looking Thoughts on the Quantum” quant-ph/0105039.
This is a rambling collection of letters and e-mails written between 1995 and 2001 by Chris Fuchs to a wide circle of colleagues. In his friendly, enthusiastic, opinionated style, Fuchs thinks out loud about quantum theory and its meaning. Lots of fun to dip into, to learn from, and to disagree with. Sequels are available from his web site. (1)
C. Simon, V. Buzek, and N. Gisin, “The No-Signaling Condition and Quantum Dynamics” quant-ph/0102125.
Quantum theory and empirical results derived from it appear to tell us that there are situations in which one experimenter, “Alice”, can find out about the genuinely random outcome of observations made by a distant colleague, “Bob”, simultaneously with his observation, but without there having been anything in Bob's laboratory which predetermined his result, or any message of any kind passed between the laboratories at the time of the observations. Assuming standard ideas about quantum observables and probabilities, Simon, Buzek, and Gisin argue that such “spooky actions at a distance” can only be compatible with the impossibility of superluminal communications as long as the average local dynamics is always given by a linear and completely positive map. (28)
L. Hardy, “Quantum Theory From Five Reasonable Axioms” quant-ph/0101012.
Hardy supposes that certain physical systems can be prepared in identifiable replicable states and that measurements made on those states have a finite range of outcomes with determinable probabilities. He supposes that the probabilities for any measurement can be determined from some given minimum number of appropriately chosen measurements, and he identifies a state with the probabilities to which it gives rise. This gives the set of states a convex structure. Hardy then proposes some not unreasonable properties for that structure, and uses these properties to derive the mathematical framework of quantum theory. He provides an introductory account of his ideas in “Why Quantum Theory?” quant-ph/0111068.
Hardy's work is impressive. Like so much work on quantum measurement theory, however, it depends on an entirely external view of quantum systems. Thus, there is no acknowledgement of the fact that the measuring devices used are themselves also quantum systems. From the point of view of my proposed many-minds interpretation, the fundamental change in a quantum measurement is not from an external quantum state to an observed outcome, but from one experienced quantum state to another. In this context, Hardy's results can be seen as a derivation of the framework of quantum theory from a set of plausible assumptions about one particularly simple class of physical investigations, and his work is not incompatible with the set of axioms for a more fundamental type of quantum probability proposed in Donald 1992 which takes that framework as given.
In “Quantum Theory from Four of Hardy's Axioms” quant-ph/0210017, R. Schack argues that Hardy's probabilities can be interpreted in Bayesian terms. (14)
B. Rosenblum and F. Kuttner, “The Observer in the Quantum Experiment” quant-ph/0011086.
This is an introductory account of the sort of experiment which has led people to consider that observers may have a significant role in quantum mechanics. Rosenblum and Kuttner sketch several ways in which their thought experiments might be interpreted. I would say that, in each experiment, each of us individually encounters a set of possibilities. This set, and the probabilities for each of its elements of our finding ourselves observing that element at the end of the encounter, are determined by physical laws, by our physical natures, and by our prior histories; which include our observations of the outcomes of previous encounters, as well as of the setup of the experiment. (72)
R. Schack, T.A. Brun, and C.M. Caves, “Quantum Bayes Rule” quant-ph/0008113.
Schack, Brun, and Caves use conventional formalism for quantum measurements to derive a method of updating beliefs about quantum states in the light of measured information. Under suitable conditions, they show that this method is precisely analogous to Bayesian updating in classical probability theory. They argue against naive use of maximum entropy methods but do not mention relative entropy. (50)
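The analogy can be seen in a toy calculation (my own illustration, not taken from the paper): for an efficient measurement whose Kraus operators are diagonal in a fixed basis, the diagonal of the updated density matrix is exactly the classical Bayesian posterior over the basis states.

```python
import numpy as np

# A state of belief about a qubit.
rho = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)

# An efficient measurement with Kraus operators diagonal in the
# computational basis; M0†M0 + M1†M1 = I (the numbers are arbitrary).
M0 = np.diag([np.sqrt(0.9), np.sqrt(0.4)])
M1 = np.diag([np.sqrt(0.1), np.sqrt(0.6)])

p0 = np.trace(M0 @ rho @ M0.conj().T).real   # probability of outcome 0
post = M0 @ rho @ M0.conj().T / p0           # updated state given outcome 0

# Classical Bayes on the diagonal: prior * likelihood / evidence.
prior = np.diag(rho).real                    # (0.7, 0.3)
likelihood = np.array([0.9, 0.4])            # P(outcome 0 | basis state)
bayes = prior * likelihood / (prior * likelihood).sum()

print(np.allclose(np.diag(post).real, bayes))   # True: the same update rule
```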
F.J. Tipler, “Does Quantum Nonlocality Exist? Bell's Theorem and the Many-Worlds Interpretation” quant-ph/0003146.
Tipler argues that the problem of non-locality in quantum theory can be eliminated in a many-worlds interpretation. The crucial point is that for correlations between the results of measurements at spacelike separated regions to be observed, the results need to be compared at a single locality.
This point is also emphasized by C.G. Timpson and H.R. Brown in “Entanglement and Relativity” quant-ph/0212140, which provides a brief review of the relations between entanglement, non-locality, and Bell's inequality. (2)
R.W. Spekkens and J.E. Sipe “Non-Orthogonal Preferred Projectors for Modal Interpretations of Quantum Mechanics” quant-ph/0003092.
Spekkens and Sipe take a fresh look at modal interpretations, for which they provide a useful introduction and review. One novelty in their approach is that they allow for differences in ontology depending on the results of measurements. I believe that a related requirement ought to be expressed in many interpretations of quantum theory. Indeed, this is one of the reasons why I think that, in many-world interpretations, “worlds” need to be observer-dependent. Spekkens and Sipe go on to propose that preferred decompositions of the universal wavefunction be identified by minimizing an entropy function. This is an interesting idea, although, in as far as their proposal merely generalizes the Schmidt decomposition, it inherits that decomposition's problems, including instability under small changes of global wavefunction and under small changes in Hilbert space factorization. Spekkens and Sipe's assumption of a distinguished factorization seems to me to be highly implausible. (58)
C.M. Caves, “Predicting Future Duration from Present Age: A Critical Assessment” astro-ph/0001414.
It is all too easy to make mistakes in probability theory. For example, one might suppose that one could estimate human life expectancy by taking a random sample of the people living at some moment and waiting to discover their ages when they die. Such a sample, however, would be very likely to be under-weight in short-lived individuals. Caves provides a thorough analysis of a range of related issues as he investigates the problems with a suggestion that it might be possible to make universal probabilistic predictions about how long a phenomenon will last, based only on knowing how long it has already lasted at the instant when one happens to stumble across it. (65)
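The sampling bias described above is easy to demonstrate numerically. The following toy simulation (my own illustration, not Caves's analysis) draws exponential lifetimes and shows that sampling individuals in proportion to lifespan, as an inspection at a random moment does in a stationary population, roughly doubles the observed mean:

```python
import random

random.seed(0)

# True lifetimes of a large population, exponential with mean 1.
true_lifetimes = [random.expovariate(1.0) for _ in range(200_000)]
true_mean = sum(true_lifetimes) / len(true_lifetimes)

# Length-biased sample: an individual alive at a random inspection moment
# is encountered with probability proportional to lifespan, so long-lived
# individuals are over-represented.
sampled = random.choices(true_lifetimes, weights=true_lifetimes, k=200_000)
biased_mean = sum(sampled) / len(sampled)

# For exponential lifetimes, the length-biased mean is twice the true mean.
print(true_mean, biased_mean)
```

This is the renewal-theory "inspection paradox"; the factor of two is special to the exponential distribution, but the direction of the bias is general.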
R. Clifton and H. Halvorson, “Entanglement and Open Systems in Algebraic Quantum Field Theory” quant-ph/0001107.
Tragically, Rob Clifton died in July 2002 at the age of 38. He was one of the small community of philosophers of science with a truly extensive knowledge and understanding of the mathematical formalism of quantum theory. In this paper, written for that community, Clifton and Halvorson explain how the nature of quantum states changes and how entanglement becomes ubiquitous when we move from the mathematics of non-relativistic quantum mechanics to that of local relativistic quantum field theory. (17)
D. Buchholz and R. Haag, “The Quest for Understanding in Relativistic Quantum Physics” hep-th/9910243.
Buchholz and Haag give a rapid survey of the development of a mathematical understanding of relativistic quantum field theory. Buchholz provides a rather less abstract discussion, with more emphasis on recent work, in “Algebraic Quantum Field Theory: A Status Report” math-ph/0011044.
Local algebras, and the relations between them, are central to this approach to quantum field theory. In my opinion, the states on these algebras ultimately have more physical relevance than pure state wavefunctions, and should have a more significant role in the interpretation of quantum theory. Although wavefunctions are at the heart of the formalism of the conventional interpretation, they can only be fundamental in non-relativistic zero-temperature models or at the level of the entire global Hilbert space.
Nevertheless, in “Localization and the Interface Between Quantum Mechanics, Quantum Field Theory and Quantum Gravity” arXiv:0711.4600, it seems to me that B. Schroer rather exaggerates the difficulties for measurement theory of local quantum field theory. Much of the empirical support for relativistic quantum field theory depends on the idea that it is an improvement on non-relativistic quantum mechanics. This requires that many of the structures of non-relativistic quantum mechanics have to be mirrored in the deeper theory, at least to good approximation. Thus while it is important to note the absence, at a fundamental level, of localized wavefunctions and of localized particles, there is no conflict with the idea of patterns of values of local projection operators. (15)
H.D. Zeh, “The Problem of Conscious Observation in Quantum Mechanical Description” quant-ph/9908084.
In this thoughtful paper, originally published in 1981, Zeh provided the first explicit published analysis of the Everett interpretation as a many-minds interpretation, or, as he then referred to it, a “multi-consciousness interpretation”. (3)
M. Tegmark, “The Importance of Quantum Decoherence in Brain Processes” quant-ph/9907009.
Using somewhat crude analyses, Tegmark produces numerical estimates of decoherence timescales for certain mechanisms acting on certain superpositions within the brain. Whether this is more convincing than a broad physical understanding of how and on what timescale one charge or ionic or molecular movement in the brain inevitably produces a variety of effects in the surrounding fluid is perhaps a matter of taste. However, despite having recognised the millions of ions which move in response to a single neural firing, Tegmark then proceeds to discuss a model of perception involving a six-dimensional Hilbert space! Given that the brain is a hugely complex, warm, wet, dynamic, open system, such simplistic models are surely valueless, as, for example, they assume without argument that states of perception can be modelled by pure states in some unidentified Hilbert space and that those pure states can be exactly repeated.
In quant-ph/9907052 “Can All Neurobiological Processes be Described by Classical Physics?”, A.M. Lisewski responds to Tegmark by arguing that the physics of neural processing is ultimately quantum mechanical. This is undeniable, but it misses the point of Tegmark's paper which is to ask whether there can be large scale quantum coherence in the brain.
S. Hagan, S.R. Hameroff, and J.A. Tuszynski (“Quantum Computation in Brain Microtubules? Decoherence and Biological Feasibility” quant-ph/0005025) respond to Tegmark's paper with a strenuous defence of the Penrose-Hameroff “orchestrated objective reduction” model of quantum computation in neural microtubules. I am actually sceptical about this entire program; partly because I am not persuaded by any of the motivations for it; partly because the mechanisms invoked seem biologically superfluous; and partly because empirical evidence for those mechanisms seems, at least so far, to be lacking. The strongest argument in the paper is that the specific superpositions ruled out by Tegmark are not those claimed in the model. Hagan, Hameroff, and Tuszynski also speculate about ways in which the quantum coherences they require might be protected. If these speculations are to be convincing, however, then ultimately they will have to be presented in much more detail. Moreover, when they suggest the existence of quantum error correction, they will ultimately need to explain how such error correction could have evolved. Indeed, in my opinion, explaining the evolution of any form of quantum computation is a crucial stumbling block for any proposal invoking it in a biological system. Mother nature works with the technology of the day, constantly demanding useful improvements in function; she has no time for blue skies research. Wings can evolve because even a little flight -- a long jump -- is useful. Eyes can evolve because even a little sight -- shadow detection -- is useful. A little quantum computation, however, is just a very expensive ordinary computation.
There are a couple of serious flaws in the Hagan, Hameroff, Tuszynski paper. One of the ways in which Tegmark's analyses are “crude” is that he neglects the difference between the dielectric permittivity of the vacuum and that of the neural medium. Hagan, Hameroff, and Tuszynski point out that the medium's dielectric constant may be quite high, but fail to notice that this is precisely because the medium, considered as an environment, is sensitive to the movement of charges and therefore is itself decohering. To argue for decoherence, it is only necessary to demonstrate one decohering mechanism, but coherence requires that every such mechanism be excluded. Even ordered water will have its modes of excitation. Hagan, Hameroff, and Tuszynski also criticize the temperature dependence of Tegmark's formulas without giving any explanation of what they think is wrong with the velocity dependence of the Rutherford scattering cross section. In the context of the strong interactions in the interior of a material system, the intuition that local decoherence is an effect of high rather than low temperatures is certainly arguable. The singlet state, for example, is typical of ground states in combining global coherence with decoherence of the substates of individual constituents. (20)
D. Deutsch, “Quantum Theory of Probability and Decisions” quant-ph/9906015.
Probability theory was invented so that gamblers could decide whether or not accepting any given bet would be sensible in the long term. Conversely, a gambler who knows exactly what bets would be sensible in the long term can use that knowledge to calculate the probabilities of the events on which he is betting. Decision theory gives us a general framework within which any complete rational scheme of preferences for acts implies probabilities for the possible outcomes of those acts. Deutsch uses structures and symmetries of quantum mechanics to argue that any rational scheme of preferences governed by quantum theory should imply standard quantum probabilities. His arguments are criticized by H. Barnum, C.M. Caves, J. Finkelstein, C.A. Fuchs, and R. Schack in “Quantum Probability from Decision Theory?” quant-ph/9907024. In “Quantum Probability and Decision Theory, Revisited” quant-ph/0211104, D. Wallace provides an introduction to decision theory and extends and revises Deutsch's analysis. A shorter version of Wallace's paper is “Everettian Rationality: Defending Deutsch's Approach to Probability in the Everett Interpretation” quant-ph/0303050. By constructing an unphysical counter-example in “On the Everett Programme and the Born Rule” quant-ph/0505059, P. Van Esch shows the necessity of some of the assumptions made in the analysis; for example, assumptions about the equivalence of certain procedures.
Deutsch and Wallace both make extravagant claims for the significance of decision theory. It is certainly the case that the mathematical structure of quantum mechanics makes the Born rule by far the most plausible way of calculating objective probabilities for those external events which have objective probabilities and which can accurately be characterized by projection operators. Decision theory does provide some sort of framework for this argument, but it does not solve any of the deeper conceptual or technical problems. For a gambler, knowing the odds may or may not be straightforward, but what he really wants to know is whether he is going to be lucky. Physicists, on the other hand, already know that it is rational to expect to be typical. Their problems are to understand the nature of the events to which probabilities should be attached, and to discover what sets of events should be thought to be typical by constructing convincing theories which adequately encapsulate what has been learnt from the past.
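The flavour of the symmetry argument at the core of Deutsch's derivation can be sketched as follows (a textbook-style outline, not his full decision-theoretic treatment):

```latex
% For the equal-amplitude state
\lvert\psi\rangle
  = \tfrac{1}{\sqrt{2}}\bigl(\lvert 0\rangle + \lvert 1\rangle\bigr),
% a rational agent whose preferences are indifferent under the swap
% |0> <-> |1> must value bets on either outcome equally, forcing
% p(0) = p(1) = 1/2. Unequal rational amplitudes are handled by
% fine-graining into equal-amplitude pieces, for example
\sqrt{\tfrac{2}{3}}\,\lvert 0\rangle + \sqrt{\tfrac{1}{3}}\,\lvert 1\rangle
  \;\longmapsto\;
  \tfrac{1}{\sqrt{3}}\bigl(\lvert 0a\rangle + \lvert 0b\rangle
                           + \lvert 1\rangle\bigr),
% to which the equal-amplitude symmetry applies, giving p(0) = 2/3.
% Irrational amplitudes then follow by a continuity assumption.
```

The disputed questions concern which of the rationality and equivalence assumptions used at each step are innocent and which smuggle in the conclusion.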
In “What is Probability?” quant-ph/0412194, S. Saunders introduces the decision theory approach. A wide-ranging exploration of the nature of quantum probability, Saunders' paper is full of interesting ideas and suggestions. However, for reasons which I explained in Donald 2002, I believe that Saunders is wrong to suppose that decoherence theory provides a sufficiently unambiguous branching structure to underlie the emergence of probability. I also think that a framework in which measurement is modelled as the measurement of a disjoint family of projection operators is not adequate for describing all physical observations. (25)
D. Deutsch and P. Hayden, “Information Flow in Entangled Quantum Systems” quant-ph/9906007.
In this intriguing paper, Deutsch and Hayden analyse information flow in quantum mechanics in terms of local changes in Heisenberg operators. Their approach has some similarities with the theory of local algebras, which is used in algebraic quantum field theory to express the locality of relativistic interactions. Deutsch and Hayden, however, consider only short term developments of finite-dimensional systems with interactions defined by quantum computational gates. Focusing on changes in operators, and essentially ignoring local states and observed information, they attempt to argue that “a complete description of a composite system can always be deduced from complete descriptions of its subsystems”. This leads them to posit the rather convoluted idea of local but locally-inaccessible information. I suspect that, without the introduction of observed information, all information in their terms would become locally-inaccessible on cosmological time scales. C.G. Timpson discusses the paper in “Nonlocality and Information Flow: The Approach of Deutsch and Hayden” quant-ph/0312155. (45)
Y.H. Kim and Y. Shih, “Experimental Realization of Popper's Experiment: Violation of the Uncertainty Principle?” quant-ph/9905039.
In 1934, even before the famous Einstein-Podolsky-Rosen paper, Karl Popper considered whether it was possible from measurements of one of a pair of interacting particles to infer properties of the other. Popper subsequently realized that there were serious problems with his initial discussion, but many years later he proposed a related experiment. A version of this has been performed by Kim and Shih. In this experiment, pairs of correlated photons head in opposite directions. On one side of the experiment the photons meet a slit. Popper claimed that the uncertainty principle implies that the partners of those which pass through should scatter as if there were slits on both sides, on the grounds that what we learn about one of a pair we then know about its partner. He predicted that this would not be what is observed. His prediction was confirmed, but he was wrong to suggest that this contradicted quantum theory.
A. Peres points out in “Popper's Experiment and the Copenhagen Interpretation” quant-ph/9910078 that Popper's reasoning is unacceptable because it depends on the counterfactual idea of where the partner particle would have been at a time when it was not being observed. Detailed treatments of the situation which explain the experimental results in quantum theoretical terms have been given by T. Qureshi in “Popper's Experiment, Copenhagen Interpretation and Nonlocality” quant-ph/0301123, “Understanding Popper's Experiment” quant-ph/0405057, and “On the Realization of Popper's Experiment” quant-ph/0505158, and by A. Bramon and R. Escribano in “Popper's Test of Quantum Mechanics” quant-ph/0501134 and “Popper's Test of Quantum Mechanics and Two-Photon ‘Ghost’ Diffraction” quant-ph/0507040. (68)
U. Mohrhoff, “The Pondicherry Interpretation of Quantum Mechanics” quant-ph/9903051.
Mohrhoff claims that problems with quantum mechanics arise because we try to impose a conceptual framework that is more detailed than the actual world. For example, he says that Einstein-Podolsky-Rosen correlations are no problem because, “At a fundamental level, ‘here’ and ‘there’ are the same place”. He argues that reality is built on “facts” and says that, “The actually existing spatial distinctions are those that are warranted by facts”. This is by no means entirely implausible, but the trouble is that Mohrhoff does not provide any characterization of the nature of “facts”. “Clicks” are apparently “facts” (U. Mohrhoff, “Making Sense of a World of Clicks” quant-ph/0202148), but it is not made clear whether a click is supposed to be the movement of a loudspeaker, or the longitudinal wave to which it gives rise, or some eigenstate of some operator defined by the situation, or indeed whether there is supposed to be any relation between the click and what is conventionally taken to be its underlying physics. Just leaving us to imagine that we know a fact when we see one is not giving us an explanation of anything and is missing the point of most technical work on the interpretation of quantum theory.
In “Quantum Mechanics and Consciousness: Fact and Fiction” quant-ph/0102047, Mohrhoff expresses his annoyance at consciousness being dragged into discussions of physics. The reason that it has been brought in, however, is just that consciousness does appear to form a natural and ultimate division between possibilities and facts. The idea is that the least detailed framework compatible with our observations should be a framework defined by mental “facts”. Even this idea, however, does still require us to provide a characterization of the nature and temporal development of mental facts.
Mohrhoff's work is discussed by L. Marchildon in “Remarks on Mohrhoff's Interpretation of Quantum Mechanics” quant-ph/0303170. (29)
W.H. Zurek, “Decoherence, Einselection, and the Existential Interpretation (the Rough Guide)” quant-ph/9805065.
Zurek reviews his analysis of the existence of comparatively stable “pointer states” for quantum systems in contact with suitable environments. Although, in general, these states cannot be uniquely defined, they can, to a good approximation, be reproducibly observed without disturbance; for example, by looking (perhaps literally) at the environment. This is interesting and significant work, based on a wide variety of detailed models. However, it remains phenomenological. Zurek's identification of “pointer states for neurons” is approximate and depends on how those states would behave under investigation by external observers or over extended time periods. While this is certainly useful in explaining the externally-observed functioning of neurons for practical purposes, it is, in my opinion, a long way from what is required for the foundation of a fundamental interpretation of quantum theory.
In “Decoherence, Einselection, and the Quantum Origins of the Classical” quant-ph/0105127, Zurek provides another and longer review of his work. He suggests that he is providing a new paradigm appropriate for textbooks on quantum theory. I have no doubt that serious students of the subject will learn much from his papers, but there are many questions, in particular those raised in Donald 2002, which they do not answer. If we try to analyse the detailed real-time functioning of an individual human brain for itself, there is considerable ambiguity as to the precise scales on which information is being experienced and as to the precise physical structures which represent that information. These ambiguities cannot be resolved by looking for the existence at longer time scales of less detailed information; whether in entities external to the brain or in subsequent memory traces.
In “Emergence of Objective Properties from Subjective Quantum States: Environment as a Witness” quant-ph/0307229 and “Environment as a Witness: Selective Proliferation of Information and Emergence of Objectivity” quant-ph/0408125, H. Ollivier, D. Poulin, and W.H. Zurek investigate and model the mirroring of multiple redundant copies of certain information out of a quantum system into the environment of that system. (4)
C. Kiefer and E. Joos, “Decoherence: Concepts and Examples” quant-ph/9803052.
Kiefer and Joos give a brief sketch of the theory of decoherence. A brief philosophical analysis is given by H.D. Zeh in “What is Achieved by Decoherence?” quant-ph/9610014. Another discussion of the same material is given by E. Joos in “Elements of Environmental Decoherence” quant-ph/9908008.
“Decoherence and the Transition from Quantum to Classical -- Revisited” quant-ph/0306072 by W.H. Zurek is an update of an introduction to decoherence theory which appeared in 1991 in Physics Today. Zurek has added comments on some more recent theoretical work, mainly by himself and his collaborators, and he also refers to some recent experimental results. (5)
M. Tegmark, “The Interpretation of Quantum Mechanics: Many Worlds or Many Words?” quant-ph/9709032.
An elementary and mainly naive discussion of the many-worlds interpretation. Tegmark suggests that decoherence theory has solved the outstanding problems of the interpretation; a suggestion which I dispute at length elsewhere in this site. Tegmark also suggests that by performing many repetitions of a version of the Schroedinger cat experiment with oneself as the cat, one would be able to distinguish between the many-worlds interpretation and the Copenhagen interpretation. While the question of observer effects in a many-minds interpretation is a complicated and highly technical one, the idea that, because an individual will not be aware of his own death, he need not treat it as a possibility is simply absurd. (6)
G.C. Hegerfeldt, “Problems about Causality in Fermi's Two-Atom Model and Possible Resolutions” quant-ph/9707016.
For a Hamiltonian H which is bounded below, an inner product like f(t) = (φ, exp(-itH)ψ) is the boundary value of an analytic function. As a result, either f vanishes identically, or it can only have isolated zeroes. This calls into question, in relativistic as well as non-relativistic quantum theories, the idea that a wavefunction like ψ could describe a system as having been precisely localized in one region with no possibility that an effect, which would apparently be due to that system, could occur in some distant region until light could travel between them.
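The analyticity argument can be outlined as follows (a standard sketch under the assumption H ≥ 0, not Hegerfeldt's precise formulation):

```latex
% Using the spectral measure P_E of H, write
f(t) \;=\; \bigl(\varphi,\, e^{-itH}\psi\bigr)
     \;=\; \int_0^{\infty} e^{-itE}\, d\bigl(\varphi,\, P_E\,\psi\bigr).
% For complex t = t_1 - i t_2 with t_2 > 0, the integrand carries the
% damping factor e^{-t_2 E}, so f extends to a bounded analytic function
% on the lower half-plane, with the physical f(t) as its boundary value.
% By a classical uniqueness theorem for such boundary values, f cannot
% vanish on any interval of times (indeed, on any set of positive
% measure) unless it vanishes identically. Hence if the transition
% amplitude f is ever non-zero, it is non-zero at almost all times.
```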
In an earlier discussion of this problem, in “There are No Causality Problems for Fermi's Two Atom System” hep-th/9403027, D. Buchholz and J. Yngvason point out that, although the mathematics of local quantum field theory provides a formalism within which a plausible concept of locality compatible with relativity can be defined, that formalism does not allow local states to be specified by single local projection operators. Moreover, almost all physical states in local quantum field theory are locally faithful. This means that any local projection will have non-zero expectation, and so there is always some possibility that any given local effect will appear to occur spontaneously. Hegerfeldt's result can be seen as a generalization of this statement: it says that, in any quantum system, if an effect can ever be brought about, then at almost any moment it can spontaneously appear to occur. (85)
A. Kent, “Against Many-Worlds Interpretations” gr-qc/9703089.
In this 1990 paper, Kent argues forcefully and correctly that various elementary versions of the many-worlds idea are incomplete. (7)
R. Geroch, “Suggestions For Giving Talks” gr-qc/9703019.
Really bad talks are given by people who don't understand their subject. These are rare. Ordinary bad talks are given by people who overestimate their audience's knowledge and underestimate their own. These are common. We all endure listening to such talks, each assuming that we are alone in not being able to follow. Also common are talks in which we are given no reason to care about what is being said. The situation would improve, if, as speakers, we were all to act on Geroch's excellent advice. Although directed at relativists, his suggestions are universally applicable. (61)
L. Vaidman, “On Schizophrenic Experiences of the Neutron or Why We Should Believe in the Many-Worlds Interpretation of Quantum Theory” quant-ph/9609006.
An introductory account of the many-worlds (or many-minds) idea at an elementary level. The problem with the paper is the assumption, without detailed definition, of the existence of sentient observers. Given those observers, Vaidman supposes that a local, decoherent, preferred basis can be defined and that probabilities can be explained as ignorance probabilities. (8)
M. Tegmark, “Does the Universe In Fact Contain Almost No Information?” quant-ph/9603008.
Tegmark argues that in a many-worlds (or many-minds) interpretation, it is possible for the universal wave-function (or universal quantum state) to be a simple state (for example, the vacuum state of some “theory of everything”) with the complexities we see around us being a result of how that state is observed from our subjective points of view. (9)
D.N. Page, “Sensible Quantum Mechanics: Are Only Perceptions Probabilistic?” quant-ph/9506010.
Page sketches a tedious series of broad proposals for variants of a type of many-minds (or “many-perceptions”) interpretation. He either ignores, chooses to repudiate, or gives completely vague answers for, all the most difficult issues; including the nature of psycho-physical parallelism and the way in which minds develop in time. The shortened version of this paper (“Sensible Quantum Mechanics: Are Probabilities only in the Mind?” gr-qc/9507024) still does not address the difficult issues, but it is much more readable and it does have a few interesting remarks. A later version with more introductory material is “Mindless Sensationalism: A Quantum Framework for Consciousness” quant-ph/0108039. (10)
F. Dowker and A. Kent, “On the Consistent Histories Approach to Quantum Mechanics” gr-qc/9412067.
In this superb paper, Dowker and Kent provide a thorough critical analysis of consistent histories formalisms. They argue convincingly that any such formalism could, at best, form only part of a complete interpretation of quantum mechanics. (40)
* * * * * * * * * * * * * *
* * * * * * * * * * * * * *
Notes on some relevant, or significant, or recommended books.
home page: http://www.bss.phy.cam.ac.uk/~mjd1014