From Prediction to Readability: Quantum Mechanics as a Constructive Program
Abstract
Einstein's distinction between "principle theories" and "constructive theories" is revisited to re-evaluate quantum mechanics in a material-operative key. It is argued that the Bohmian reading realizes the constructive ideal: it introduces microstructure (particles with positions and a guiding law) and takes the wave function as an operative symbol—a material structure that represents material relations and organizes non-signaling global dependencies—dispensing with ontological collapses. Intelligibility is treated as functional readability: trajectories and marks stabilized by measurement make the genesis of the phenomenon traceable. The transition to the classical is described as the operative stabilization of global dependencies (not as a mere limit ħ→0). Explicit evaluation criteria—ontological cost vs. explanatory gain—are proposed and applied to canonical cases (interference, Bell), showing that the particle/symbol asymmetry is a structural feature compatible with relativistic causality. The contribution consists in translating the principle/constructive opposition into a grammar of inscription and stabilization, offering a procedural narrative from the micro-process to the mark that preserves empirical adequacy and clarifies the formalism's ontological options.
Keywords: Quantum mechanics; Principle theories; Constructive theories; Bohmian interpretation; Operative symbol; Non-locality; Functional readability; Inscription; Operative stabilization.
Introduction
In the framework adopted here, Einstein's distinction between "principle theories" and "constructive theories" is recoded as an opposition between regimes of invariances (rules of prediction and consistency) and constructive programs (models with microstructure that make the genesis of phenomena traceable). In this reading, the Copenhagen interpretation operates as a regime of principles, while the Bohmian proposal configures a constructive program: it introduces a microstructure (particles with positions and a guiding law) and takes the wave function as an operative symbol—a material structure for dynamic guidance, an immanent representation of material relations that reorganizes non-signaling global dependencies—not a descriptive convention or a transcendent entity, and without postulating an ontological collapse.
Three shifts are highlighted: (i) intelligibility through functional readability (trajectories and marks stabilized in measurement); (ii) the transition to the classical as the effective stabilization of global dependencies (instead of the formal limit ħ→0); (iii) non-locality as a non-signaling global dependency, compatible with relativistic causality. This framework provides comparative criteria—ontological cost vs. explanatory gain (additional postulates; particle/symbol asymmetry; micro-complexity → mark; explicit treatment of global dependencies)—and supports the thesis that the Bohmian reading realizes the ideal of a constructive theory envisioned by Einstein.
Thus, a distinctive argumentative architecture is contributed, anchored in the notions of inscription and functional readability, which operationalizes the criteria of ontological cost and explanatory gain in canonical cases without resorting to ontological collapses. The novelty lies in the material translation of the principle/constructive distinction into a grammar of inscription and operative stabilization, making explicit the non-signaling global dependencies as a structural condition and not as a methodological exception. Any terminological or illustrative coincidences (e.g., double-slit, Bell, wave function) are inherent to the domain and do not result from textual reuse.
Intelligibility is not decorative: it replaces the operational opacity of the standard reading with a traceable procedural narrative, from the micro-process to the mark, while preserving empirical adequacy.
The standard/Copenhagen interpretation has been treated as a regime of principles: the priority is empirical adequacy, and the formalism functions as a prediction algorithm. This attitude is expressed by Fuchs and Peres (2000, p. 70): "quantum theory does not describe physical reality; it provides an algorithm for probabilities of macroscopic events ('detector clicks')." The pedagogical result is an emphasis on operative skills (mathematical and computational) at the cost of suspending the ontological question: what does the theory describe as physical reality? This position revives instrumentalist empiricism, which excludes ontology as a condition of meaning. In light of the Ontology of Emergent Complexity (OEC), adopted as a reference framework, this refusal is unfounded: explicit ontological commitments are required beyond the predictive algorithm, formalized as an obligation to spell out the procedural chain of inscription. What is missing is not a calculation technique, but a procedural model that makes the path from the micro to the stabilized traceable.
In the constructive program advocated here, this requirement is met by entities and processes with objective status (particles with positions and not-necessarily-classical trajectories; marks inscribed in measurement; the wave function understood as an operative symbol that guides dynamics and reorganizes possibilities), while maintaining empirical equivalence with standard QM.
The standard interpretation, founded on Bohr's worldview and instrumentalism, declares itself complete: it would not admit relevant additional microstructures. From this arose the well-known impossibility arguments against deterministic or hidden-variable readings. In contrast, Einstein maintained that QM is incomplete and that determinism could be restored in a manner analogous to statistical mechanics, with underlying trajectories and statistical descriptions motivated by epistemic limitations on the initial conditions of ensembles.
Misunderstandings in the Ontology of Quantum Mechanics
Thesis: While maintaining empirical equivalence with standard QM, the constructive reading defended here makes the genesis of phenomena traceable, takes the wave function as an operative symbol that organizes non-signaling global dependencies, and reinterprets the transition to the classical as stabilization, offering greater intelligibility at lower ad hoc ontological cost.
This section identifies fundamental misunderstandings crystallized in the Copenhagen tradition and in current teaching, and formulates rectifications within the adopted theoretical framework (constructive program and readability criteria). The goal is not to reopen historical controversies, but to reorder categories to recover intelligibility without loss of empirical adequacy.
For historical clarity: (i) von Neumann's (1932) proof rests on an operator realism that is not mandatory; (ii) Bohm (1952) provides a constructive counterexample; (iii) Bell (1964) shows the necessity of non-locality in correlations, regardless of determinism; (iv) Kochen–Specker (1967) makes contextuality explicit, not a veto on microstructure.
(1) Reification of the formalism.
Misunderstanding: Taking the wave function as a complete physical entity with a robust ontological status in itself.
Rectification: The wave function is an operative symbol that guides dynamics and reorganizes possibilities; it does not exhaust the system's description. The constructive reading requires additional microstructure (positions/trajectories) to make the genesis of results readable (Bohm, 1952; Bohm & Hiley, 1993).
(2) "Completeness" ≠ veto on microstructures.
Misunderstanding: Declaring the theory complete forbids ontological complements.
Rectification: "Completeness" is a practical criterion dependent on explanatory ends. A microstructure that increases readability without altering predictions is philosophically pertinent (Einstein, 1919). Clarification: The unobservable is not metaphysical—it indicates quantities that are operationally inaccessible in the measurement regime but can be integrated into a procedural model that explains the observable marks.
(3) Theorems as impossibilities of principle.
Misunderstanding: Reading von Neumann as a general proof against hidden variables.
Rectification: The proof presupposes a naive realism of operators that is not mandatory; it is therefore innocuous against constructive programs (von Neumann, 1932). Kochen–Specker (1967) reinforces limits on the simultaneous attribution of values (contextuality), but does not prohibit microstructure.
(4) The meaning of Bell's theorem.
Misunderstanding: Bell would refute determinism or hidden variables.
Rectification: Bell (1964) shows that EPR-type correlations imply non-locality as a global dependency of the relevant variables, whether the theory is deterministic or probabilistic—without superluminal signaling. A minimal procedural example (EPR–Bohm): correlated preparation → spatial separation → local measurement → inscribed correlation without a message channel (non-signaling); a numerical sketch after point (7) makes this non-signaling structure explicit.
(5) Measurement as revelation, not inscription.
Misunderstanding: Measurement would reveal pre-existing properties of the isolated subsystem or would require a special collapse.
Rectification: Measurement is a coupling that stabilizes differences under controlled material conditions, producing readable marks. It does not require ad hoc collapses; it requires a procedural model of the genesis of inscription (Fuchs & Peres, 2000; Ghirardi, Rimini & Weber, 1986, as a dynamic contrast).
Comparative note: In contrast to QBism (Caves, Fuchs & Schack, 2002), which reads quantum probabilities as an agent's degrees of belief, the adopted perspective anchors prediction in materially controlled inscription processes. Similarly, in the face of the relational reading (Rovelli, 1996), once the mark is stabilized in the device, the result becomes objective as a persistent difference, independent of the observer as an agent.
(6) The double-slit as an irresolvable paradox.
Misunderstanding: The "wave/particle" alternation is resolved only by complementarity.
Rectification: Each particle passes through exactly one slit; the trajectories are guided by global dependencies encoded in the operative symbol; the pattern results from the genealogy of the marks on the screen (de Broglie, 1927; Bohm, 1952). Procedural template: initial conditions → couplings → global guidance (operative symbol) → inscription (marks) → stabilization.
(7) Locality confused with causality.
Misunderstanding: The requirement of locality would be inseparable from relativistic causality; therefore, non-locality would be unacceptable.
Rectification: The relevant non-locality is non-signaling: it admits global dependencies in correlations without faster-than-light transport of information/energy; relativistic causality remains intact (Bell, 1964). Classical separability is a useful ideal (Newton, 1999, as a backdrop), not an ontological dogma (in contrast with Einstein).
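To make points (4) and (7) concrete, the following minimal numerical sketch uses only the standard textbook formalism (singlet state and projective spin measurements; the angles and code are illustrative, not drawn from this text). The CHSH combination of correlations exceeds the local bound of 2, yet each wing's marginal statistics are identical for every distant setting, which is precisely the non-signaling character of the global dependency.

```python
# Minimal sketch (standard textbook formalism): singlet correlations violate the CHSH
# bound of 2, while each wing's marginals are independent of the distant setting.
import numpy as np

def spin_projectors(theta):
    """Projectors onto the -1/+1 spin outcomes along a direction at angle theta in the x-z plane."""
    n_sigma = np.cos(theta) * np.array([[1.0, 0.0], [0.0, -1.0]]) \
            + np.sin(theta) * np.array([[0.0, 1.0], [1.0, 0.0]])
    _, eigvecs = np.linalg.eigh(n_sigma)          # eigenvalues in ascending order (-1, +1)
    return [np.outer(v, v.conj()) for v in eigvecs.T]

# Singlet state |psi> = (|01> - |10>)/sqrt(2) and its density matrix.
psi = (np.kron([1.0, 0.0], [0.0, 1.0]) - np.kron([0.0, 1.0], [1.0, 0.0])) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

def correlation(a, b):
    """E(a, b): expectation of the product of the two +/-1 outcomes."""
    return sum((2 * i - 1) * (2 * j - 1) * np.real(np.trace(rho @ np.kron(Pa, Pb)))
               for i, Pa in enumerate(spin_projectors(a))
               for j, Pb in enumerate(spin_projectors(b)))

def alice_marginal(a, b):
    """P(Alice reads +1 along a), computed while Bob measures along b."""
    Pa_plus = spin_projectors(a)[1]
    return sum(np.real(np.trace(rho @ np.kron(Pa_plus, Pb))) for Pb in spin_projectors(b))

a0, a1, b0, b1 = 0.0, np.pi / 2, np.pi / 4, -np.pi / 4
chsh = abs(correlation(a0, b0) + correlation(a0, b1) + correlation(a1, b0) - correlation(a1, b1))
print(f"CHSH value: {chsh:.3f}  (local bound 2, quantum maximum ~2.828)")
print("Alice's P(+1) for Bob's two settings:",
      [round(alice_marginal(a0, b), 3) for b in (b0, b1)])   # identical: no signaling
```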
From this, comparative criteria emerge for evaluating quantum interpretations: (i) ontological cost vs. explanatory gain; (ii) procedural readability (from micro-process to inscription); (iii) explicit treatment of non-signaling global dependencies. The Copenhagen tradition maintains empirical adequacy but tends to confuse levels of description (formal, operational, ontological). The constructive reading preserves predictions and recovers intelligibility by articulating objects and processes (particles and trajectories), the operative symbol (wave function), and measurement marks into a coherent story of phenomenon production. Thus prepared, the next step is to reread the formalism without formulas, but with clear ontological commitments.
Quantum Formalism and Its Interpretation
A Rereading without Formulas—Five Operative Statements
(i) Physical State. The state of a system is represented in a state space appropriate to the domain, which encodes possibilities of outcomes.
(ii) Temporal Evolution. The evolution is determined by a well-defined dynamics (in the quantum case, Schrödinger's equation), which governs how possibilities reconfigure over time.
(iii) Observables. To each measurable quantity corresponds an operational procedure; the admissible values are fixed by the formalism (discrete and/or continuous spectra).
(iv) Expected Values. Averages over repeated series of measurements obey the standard rule of the formalism (the Born rule), without needing to postulate ad hoc collapses.
(v) Superposition and Update. States admit superposition; the post-measurement update is handled by the adopted interpretation (what follows compares two competing regimes).
The formalism as such is ontologically neutral: it fixes relationships between quantities and rules of prediction. The interpretation gives it an ontological structure; in the Bohmian reading, this is done through microstructure and an operative symbol that guides the dynamics.
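For readers who want statements (i)–(v) in symbols, a standard textbook rendering follows; the notation is conventional and not specific to this text.

```latex
% Statements (i)-(v) in conventional notation (textbook form).
\begin{align*}
\text{(i) State: }            & |\psi\rangle \in \mathcal{H}, \qquad \langle\psi|\psi\rangle = 1,\\
\text{(ii) Evolution: }       & i\hbar\,\partial_t |\psi(t)\rangle = \hat{H}\,|\psi(t)\rangle,\\
\text{(iii) Observables: }    & \hat{A} = \hat{A}^{\dagger},\ \text{admissible values given by } \operatorname{spec}(\hat{A}),\\
\text{(iv) Expected values: } & \langle \hat{A}\rangle_{\psi} = \langle\psi|\hat{A}|\psi\rangle, \qquad p(a) = \lVert \hat{P}_a|\psi\rangle\rVert^{2},\\
\text{(v) Superposition: }    & |\psi\rangle = \textstyle\sum_k c_k\,|k\rangle,\ \text{with the post-measurement update left to the interpretation.}
\end{align*}
```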
Bohm's Reading (1952)—revisiting de Broglie (1927).
A microstructure is introduced: particles with positions and (non-classical) trajectories, guided by a wave function taken as an operative symbol. This symbol does not reveal a "hidden state"; it organizes global dependencies and guides dynamics, allowing the genealogy of marks in measurements to be reconstructed (Bohm & Hiley, 1993). A conceptual rewriting of the wave function (amplitude and phase) reveals an additional term—the quantum potential—that captures deviations from classical mechanics and makes explicit non-local dependencies between parts of the system (Holland, 1993).
The trajectories are deterministic and sensitive to initial conditions (explaining, for example, the double-slit pattern), while the observed statistics emerge from the distribution of initial positions and the coupling with the device. To measure is to couple the system to the device in order to stabilize differences as readable marks; there is no ontological collapse, but a dynamics of coupling and inscription. The relevant non-locality is non-signaling: it correlates distant results without allowing superluminal transmission of information.
The quantum→classical transition does not require forcing ħ→0; it consists of stabilizations in which the quantum potential becomes ineffective and global dependencies are deactivated in the relevant degrees of freedom, resulting in effectively Newtonian behaviour for macroscopic variables (Bohm & Hiley, 1993; Holland, 1993). Decoherence (Zurek, 2003) acts as a stabilization mechanism that suppresses interference in the relevant degrees of freedom; it does not solve the measurement problem by itself, but integrates into the procedural picture of mark inscription, clarifying the quantum→classical transition.
Functional readability increases (tracing from micro-process to marks) and objectivity is reinforced (intersubjective stability of marks), at the price of accepting non-locality and an action-reaction asymmetry between particles and the operative symbol—understood here as a structural feature of the quantum regime, not a defect. This asymmetry does not violate the classical principle of action-reaction: the wave function does not act as a localized agent, but rather encodes global coupling conditions that constrain the trajectories.
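For reference, the conceptual rewriting invoked above corresponds to the standard de Broglie–Bohm relations (a textbook summary, not original to this text): the polar decomposition of the wave function yields the guiding law, the quantum potential, and the continuity of |ψ|².

```latex
% Standard de Broglie-Bohm relations, in textbook form.
\begin{align*}
\psi(\mathbf{x},t) &= R(\mathbf{x},t)\, e^{\,i S(\mathbf{x},t)/\hbar},\\
\frac{d\mathbf{X}}{dt} &= \frac{\nabla S(\mathbf{X},t)}{m}
  = \frac{\hbar}{m}\,\operatorname{Im}\frac{\nabla\psi}{\psi}\bigg|_{\mathbf{x}=\mathbf{X}(t)}
  && \text{(guiding equation)},\\
\frac{\partial S}{\partial t} &+ \frac{(\nabla S)^{2}}{2m} + V + Q = 0,
  \qquad Q = -\frac{\hbar^{2}}{2m}\,\frac{\nabla^{2}R}{R}
  && \text{(quantum Hamilton--Jacobi)},\\
\frac{\partial R^{2}}{\partial t} &+ \nabla\!\cdot\!\Bigl(R^{2}\,\frac{\nabla S}{m}\Bigr) = 0
  && \text{(continuity of } |\psi|^{2}=R^{2}\text{)}.
\end{align*}
```

When Q becomes negligible for the relevant degrees of freedom, the second equation reduces to the classical Hamilton–Jacobi equation; this is the formal counterpart of the stabilization described above.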
Standard Reading (Bohr; Heisenberg; von Neumann).
Prediction and consistency rules are privileged, treating the formalism as an algorithm for probabilities of outcomes (Fuchs & Peres, 2000). Three theses structure this regime:
(1) Complementarity. Certain descriptions are mutually limiting (wave/particle), requiring distinct vocabularies depending on the experimental conditions.
(2) Completeness. The state vector is considered descriptively sufficient; additional trajectories or microstructures are deemed unnecessary.
(3) Instrumentalism. Physical meaning is anchored in measurement procedures; collapse functions as an update rule linked to the act of measuring, without an agreed-upon dynamic mechanism.
This reading is empirically adequate but pays the price of lower readability: the preparation → inscription interval remains opaque; the transition to the classical is often formulated through unenlightening formal limits; and non-locality appears as a formal feature, rather than as an ontologically explicit structure of dependencies.
Comparative Assessment.
Both preserve the empirical success of the formalism. Copenhagen maximizes economy of principles; Bohm maximizes intelligibility by making explicit micro-processes, global dependencies, and inscription mechanisms. The philosophical choice compares ontological cost and explanatory gain, criteria that will guide the subsequent analysis. Constructive does not just mean "more detailed," but readable in the genesis of phenomena through explicit microstructure (entities, couplings, trajectories) and functional readability criteria. A regime of principles privileges constraints and symmetries that ensure predictive consistency, without additional ontological commitment.
Constructive Theories and Locality
What is established at this point is: (i) separability as a heuristic ideal, not an ontological dogma; (ii) non-locality as a structure of non-signaling global dependencies; (iii) the constructive status of the Bohmian reading versus principle regimes; (iv) ontological cost/explanatory gain as a criterion for rational choice between interpretations.
Einstein distinguished between constructive theories and principle theories. The former start from microstructural models to reconstruct complex phenomena (the kinetic theory of gases explains the macroscopic through molecular collisions); the latter fix general constraints that any model must obey (thermodynamics regulates without descending to individual trajectories). In this framework, the kinetic theory is subordinated to more comprehensive principles. Einstein valued constructive theories for their mechanistic clarity and adaptability, while recognizing the logical perfection and foundational stability of principle theories; he considered special relativity a principle theory. In contemporary debate, Brown (2005) proposes a dynamical/constructive reading of relativity, contested by Janssen (2009), who clarifies its scope and limits.
Applied to quantum mechanics, Bub (1997) argued that the standard framework is, fundamentally, a principle theory of logical structure. However, this perspective can be broadened: the Bohmian interpretation, with its additional microstructure, qualifies as a constructive theory for non-relativistic phenomena. It starts from micro-systems (particles in motion) and reconstructs phenomena based on the principle that trajectories are guided by a wave function taken as an operative symbol. Thus, it enhances intelligibility by offering a coherent representation of the production of phenomena. As Einstein wrote (1919, p. 13): "to understand a set of natural processes" means to have "a constructive theory" that encompasses them.
Despite this, Einstein received Bohm with indifference. For him, the EPR argument anchored the demand for locality as a condition of completeness: a physically intelligible theory should specify the state of each part without violating the relativistic constraint that causal influences cannot exceed the speed of light. From this follows the principle of separability as a criterion for the objectivity of composite systems.
From a constructive perspective, the goal is not to restore "classical" ideals, but to reconstruct procedural readability: guided trajectories, couplings, and inscriptions explain how marks are produced—without deterministic nostalgia. Visualization is the effect of a material chain (conditions → couplings → global guidance → inscription → stabilization), not an inherited aesthetic value. Objectivity is not "observer independence," but the intersubjective stability of marks under reproducible conditions. The elimination of ad hoc collapses and the explication of non-signaling global dependencies increase the operational coherence of the picture and fulfill the constructive desideratum in the Einsteinian sense.
As for locality, it remains a methodological ideal in classical domains; however, in the quantum regime, the relevant non-locality is non-signaling: it admits correlations at a distance without superluminal transport of information/energy, preserving relativistic causality. The trade-off is real: procedural readability is gained by accepting global dependencies. Intelligibility does not require strict locality; it requires traceable chains to the mark and non-signaling. The tension with local field-theoretic frameworks remains fertile, demanding an account that reconciles non-signaling global dependencies with relativistic spacetime structure.
In light of Bell's theorems, insisting on a strictly local description is untenable; the point is not determinism, but the acceptance of global dependencies in correlations. Bell's re-evaluation is decisive:
"In 1952 I saw the impossible done... Bohm showed... [that] the subjectivity of the orthodox version... could be eliminated... The essential idea had already been advanced by de Broglie... Why did the 'impossibility proofs' continue after 1952?... Should not [the pilot wave] be taught—not as the only way, but as an antidote to the prevailing complacency?" (Bell, 2004, p. 160)
The testimony underscores two points: (1) vagueness and subjectivity do not stem from the facts, but from theoretical choices; (2) a constructive alternative is possible without empirical loss, provided one accepts non-signaling non-locality. Thus, the evaluation between readings is not a dispute over predictions, but a comparative examination of intelligibility.
In operative summary: (a) principle regimes maximize consistency with ontological parsimony; (b) constructive theories maximize readability through microstructure and mechanisms; (c) separability is a contingent ideal; (d) the relevant non-locality is non-signaling; (e) the Bohmian reading fulfills the constructive desideratum by accepting explicit global dependencies.
The Bohmian and Copenhagen interpretations are empirically equivalent in the non-relativistic domain (spectra, scattering, superconductivity, tunneling). The difference is conceptual: the standard reading privileges the predictive algorithm; the constructive reading privileges the procedural history that leads to the marks. Since the 17th century (Descartes; Leibniz; Newton), the ideal of understanding has combined formalism with intelligible models. Today, formalism remains central, but interpretation is not dispensable: calculating without saying what exists and how the inscription is produced is philosophically insufficient. The constructive reading shows how one moves from the micro-process (guided trajectories) to the inscription (stabilized marks), without invoking special collapses (Fuchs & Peres, 2000; Bohm & Hiley, 1993).
Conclusion
In teaching, a practical advantage of this approach lies in its functional readability: the student follows the causal chain and learns to relate initial conditions, couplings, and results. Instead of a generic appeal to "complementarity," the mechanism that explains the pattern is described (in the double-slit: one slit per particle; global guidance; inscription of marks on the screen): initial conditions → couplings → global guidance → inscription → stabilization.
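As an illustration of this chain, the following sketch integrates guided trajectories for a one-dimensional "two-slit" toy model: a superposition of two freely spreading Gaussian packets. All parameters (ħ = m = 1, packet width, separation) are illustrative assumptions, and the code is a pedagogical sketch rather than a simulation of any specific experiment.

```python
# Pedagogical sketch: Bohmian trajectories for a 1D "two-slit" toy model, i.e. a
# superposition of two freely spreading Gaussian packets. Parameters are illustrative.
import numpy as np

hbar, m = 1.0, 1.0
sigma, d = 1.0, 8.0                      # initial packet width and half-separation

def packets(x, t):
    """Superposed packets psi(x, t) and d/dx psi(x, t), up to a common prefactor."""
    s_t = sigma * (1 + 1j * hbar * t / (2 * m * sigma**2))     # complex width parameter
    psi = np.zeros_like(x, dtype=complex)
    dpsi = np.zeros_like(x, dtype=complex)
    for x0 in (-d, +d):
        g = np.exp(-(x - x0) ** 2 / (4 * sigma * s_t))
        psi += g
        dpsi += -(x - x0) / (2 * sigma * s_t) * g
    return psi, dpsi

def velocity(x, t):
    """Guiding law: v = (hbar/m) * Im(psi'/psi)."""
    psi, dpsi = packets(x, t)
    return (hbar / m) * np.imag(dpsi / psi)

# Initial positions sampled from |psi(x, 0)|^2 (an equal-weight mixture of the two packets).
rng = np.random.default_rng(0)
x = rng.normal(loc=rng.choice([-d, d], size=2000), scale=sigma)

# Integrate the trajectories with simple Euler steps (a sketch, not a production integrator).
t, dt, t_final = 0.0, 0.01, 30.0
while t < t_final:
    x += velocity(x, t) * dt
    t += dt

# Crude text histogram of arrival positions: trajectories bunch where marks accumulate.
counts, edges = np.histogram(x, bins=48, range=(-60, 60))
for c, left in zip(counts, edges[:-1]):
    print(f"{left:7.1f} | " + "#" * int(c // 4))
```

Each sampled trajectory starts in one packet and ends in one bin; the fringe pattern exists only at the level of the ensemble of inscribed marks.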
On a methodological level, the analogy between quantum equilibrium and thermal equilibrium opens space for techniques from statistical physics (ensemble averages, molecular dynamics) and for computational uses in quantum chemistry and materials science (e.g., DFT/TD-DFT (Hohenberg & Kohn, 1964; Kohn & Sham, 1965; Runge & Gross, 1984) as a heuristic for reading outputs), without requiring new phenomenology (Holland, 1993; Bohm & Hiley, 1993). This is not to confuse quantum equilibria with classical ergodicity: in the quantum framework, effective equilibria emerge from decoherence and preparation regimes, not from strong ergodic hypotheses.
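The analogy invoked here rests on the equivariance property that is standard in the de Broglie–Bohm literature, stated only for reference:

```latex
% Quantum equilibrium and equivariance (standard result, stated for reference).
\begin{equation*}
\rho(\mathbf{x}, t_0) = |\psi(\mathbf{x}, t_0)|^{2}
\;\;\Longrightarrow\;\;
\rho(\mathbf{x}, t) = |\psi(\mathbf{x}, t)|^{2} \quad \text{for all } t.
\end{equation*}
```

Both ρ and |ψ|² obey the same continuity equation under the guiding flow v = ∇S/m; it is this property that licenses the ensemble-average techniques mentioned above.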
Regarding quantum computing and information, the non-local structure can be read as a network of dependencies that clarifies gates and protocols without contradicting relativistic causality. This reading facilitates the intuition of a circuit (qubits, gates, measurements) without ontological collapses, emphasizing processes of inscription and non-signaling correlation.
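As a minimal illustration (a generic statevector calculation, not tied to any particular hardware or software library), the following sketch prepares a Bell pair with a Hadamard and a CNOT: the joint outcome statistics are perfectly correlated, while each qubit's marginal remains uniform, i.e., correlation without signaling.

```python
# Minimal statevector sketch: H on qubit 0 followed by CNOT prepares (|00> + |11>)/sqrt(2).
# Joint statistics are perfectly correlated; each single-qubit marginal stays uniform.
import numpy as np

I2 = np.eye(2)
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)     # control = qubit 0, target = qubit 1

state = np.zeros(4); state[0] = 1.0              # |00>
state = CNOT @ np.kron(H, I2) @ state            # Bell state (|00> + |11>)/sqrt(2)

probs = np.abs(state) ** 2
for idx, p in enumerate(probs):
    print(f"P(|{idx:02b}>) = {p:.2f}")           # 0.50 for |00> and |11>, 0.00 otherwise

# Marginal of qubit 0 (sum over qubit 1 outcomes): uniform, carrying no signal.
print("Marginal of qubit 0:", probs.reshape(2, 2).sum(axis=1))
```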
Summary comparison. Everett's (1957) proposal and the developments of "Many-Worlds" (Wallace, 2012) eliminate collapse but increase the ontological cost (proliferation of branches) and tend to reduce procedural readability (the micro → mark chain is less clear). By the adopted criterion—functional readability and cost/benefit—the constructive solution is preferable.
In cosmology and other observerless regimes (e.g., the early universe), the constructive reading avoids paradoxes: a latent state is not "revealed"; an inscription is generated from couplings that stabilize differences under specific physical conditions. Intelligibility results from the genealogy of marks, not from a postulate of collapse.
Quantum equilibrium versus non-equilibrium. Beyond the equilibrium regime (where stable distributions allow for direct statistical reading), the relevance of non-equilibrium regimes is admitted, in which the structure of global dependencies may leave observable signatures. The philosophical utility lies in distinguishing stabilization from transience: the former provides robust readability; the latter requires interpretive care and may offer windows for conceptual testing.
In dense, strongly correlated media (quantum plasmas) and in quantum-optics setups, reading processes as couplings that generate marks helps to clarify both collective phenomena and communication protocols. The emphasis remains on the genealogy of inscription, not on additional collapse hypotheses.
Limitations and scope. The framework adopted here remains non-relativistic; non-locality is read as a non-signaling global dependency; the asymmetry between the operative symbol and particles is taken as a structural feature of the quantum regime, not a defect. Open questions include relativistic extensions and fine empirical criteria to distinguish stabilization from masking by noise.
Work agenda: (i) metrics of functional readability (from micro-process to inscription); (ii) systematic integration with methods from statistical physics (heuristics like DFT/TD-DFT (Hohenberg & Kohn, 1964; Kohn & Sham, 1965; Runge & Gross, 1984) for reading results without imposing new phenomenology); (iii) didactic clarification through procedural models (double-slit, interferometry) that make explicit the chain of initial conditions → couplings → marks; (iv) comparative mapping with objective collapse models (GRW/CSL), making explicit the additional ontological cost versus gains in prediction/explanation.
In summary, the constructive reading preserves the predictions and proposes an ontologically clear narrative: an operative symbol (wave function) for global dependencies, microstructure (positions/trajectories) for objectivity, inscription for measurement, and stabilization for the transition to the classical. The balance between ontological cost and explanatory gain favours, in this framework, the constructive option: it describes how results emerge from an explicit microstructure and non-signaling global dependencies, dispensing with ad hoc collapses; the transition to the classical is read as an effective stabilization of behaviours, not as a formal limit on ħ. Thus, the comparison between interpretations is made by cost/gain, not by divergence of predictions, and confirms the pedagogical and methodological utility of a procedural narrative (from micro to inscription) that enhances readability without sacrificing the precision of the formalism.
Bibliography
Bell, John S. 1964. "On the Einstein–Podolsky–Rosen Paradox." Physics Physique Fizika 1 (3): 195–200.
Bell, John S. 2004. Speakable and Unspeakable in Quantum Mechanics. 2nd ed. Cambridge: Cambridge University Press.
Bohm, David. 1952. "A Suggested Interpretation of the Quantum Theory in Terms of 'Hidden' Variables I & II." Physical Review 85 (2): 166–193.
Bohm, David, and Basil J. Hiley. 1993. The Undivided Universe: An Ontological Interpretation of Quantum Theory. London: Routledge.
Brown, Harvey R. 2005. Physical Relativity: Spacetime Structure from a Dynamical Perspective. Oxford: Oxford University Press.
Bub, Jeffrey. 1997. Interpreting the Quantum World. Cambridge: Cambridge University Press.
Caves, Carlton M., Christopher A. Fuchs, and Rüdiger Schack. 2002. "Quantum Probabilities as Bayesian Probabilities." Physical Review A 65: 022305.
de Broglie, Louis. 1927. "La mécanique ondulatoire et la structure de la matière et du rayonnement." Journal de Physique et le Radium 8 (5): 225–241.
Einstein, Albert. 1919. "What is the Theory of Relativity?" The London Times, 28 November 1919.
Everett, Hugh. 1957. "'Relative State' Formulation of Quantum Mechanics." Reviews of Modern Physics 29 (3): 454–462.
Fuchs, Christopher A., and Asher Peres. 2000. "Quantum Theory Needs No 'Interpretation'." Physics Today 53 (3): 70–71.
Ghirardi, GianCarlo, Alberto Rimini, and Tullio Weber. 1986. "Unified Dynamics for Microscopic and Macroscopic Systems." Physical Review D 34 (2): 470–491.
Hohenberg, Pierre, and Walter Kohn. 1964. "Inhomogeneous Electron Gas." Physical Review 136 (3B): B864–B871.
Holland, Peter R. 1993. The Quantum Theory of Motion: An Account of the de Broglie-Bohm Causal Interpretation of Quantum Mechanics. Cambridge: Cambridge University Press.
Janssen, Michel. 2009. "Drawing the Line Between Kinematics and Dynamics in Special Relativity." Studies in History and Philosophy of Modern Physics 40 (1): 26–52.
Kochen, Simon, and Ernst P. Specker. 1967. "The Problem of Hidden Variables in Quantum Mechanics." Journal of Mathematics and Mechanics 17 (1): 59–87.
Kohn, Walter, and Lu Jeu Sham. 1965. "Self-Consistent Equations Including Exchange and Correlation Effects." Physical Review 140 (4A): A1133–A1138.
Newton, Isaac. 1999. The Principia: Mathematical Principles of Natural Philosophy. Translated by I. Bernard Cohen and Anne Whitman. Berkeley: University of California Press.
Rovelli, Carlo. 1996. "Relational Quantum Mechanics." International Journal of Theoretical Physics 35 (8): 1637–1678.
Runge, Erich, and E. K. U. Gross. 1984. "Density-Functional Theory for Time-Dependent Systems." Physical Review Letters 52 (12): 997–1000.
von Neumann, John. 1932. Mathematische Grundlagen der Quantenmechanik. Berlin: Springer.
Wallace, David. 2012. The Emergent Multiverse: Quantum Theory according to the Everett Interpretation. Oxford: Oxford University Press.
Zurek, Wojciech H. 2003. "Decoherence, Einselection, and the Quantum Origins of the Classical." Reviews of Modern Physics 75 (3): 715–775.