*In their quest for simplicity, scientists fall into error: they mistake the abstract concepts that describe reality for reality itself – the map for the territory. This leads to dogmatic overstatements, paradoxes, and mysteries such as quantum gravity. To avoid such errors, we should invoke the thinking of philosopher Alfred North Whitehead and conceive of the universe as a universe-in-process, in which physical relations beget new physical relations, writes Michael Epperson.*

When celebrity physicists disagree about some fundamental prediction or hypothesis, there’s often a goofy and well-publicized wager to reassure us that everything is under control. Stephen Hawking bets Kip Thorne a one-year subscription to *Penthouse* that Cygnus X-1 is not a black hole; Hawking and Thorne team up and bet John Preskill a baseball encyclopedia that quantum mechanics would need to be modified to be compatible with black holes. *Et cetera*, *et cetera*. And even as we roll our eyes, we’re grateful because at least some part of us does not want to see these people violently disagreeing about anything.

So when celebrity physicist Lawrence Krauss publicly called celebrity physicist David Albert a “moron” for not appreciating the significance of Krauss’s discovery of the concrete physics of nothingness, it caused quite a stir. In his book, *A Universe from Nothing*, Krauss argued that in the same way quantum field theory depicts the creation of particles from a region of spacetime devoid of particles (a quantum vacuum), quantum mechanics, if sufficiently generalized, could depict the creation of spacetime itself from *pure* *nothingness*. In a scathing *New York Times* review of Krauss’s book, Albert argued that claiming that physics could concretize “nothing” in this way was at best naïve, and at worst disingenuous. Quantum mechanics is a physical theory, operative only in a physical universe. To contort it into service as a cosmological engine that generates the physical universe from “nothing” requires that the abstract concept of “nothing” be concretized as physical so that the mechanics of quantum mechanics can function. What’s more, if quantum mechanics is functional enough to generate the universe from nothing, then it’s not really nothing; it’s nothing plus quantum mechanics.

This is a familiar maneuver in popular physics books these days—claims of concretizing what is inescapably abstract, usually by way of a purely speculative and untestable assertion costumed mathematically as a testable hypothesis. It is a cheap instrument, as attractive as it is defective, used more often as cudgel than tool for exploration. Fortunately, as we saw with David Albert, few despise its dull edge more than other physicists and mathematicians. During the first years of modern mathematical physics and the construction of its two central pillars, quantum theory and relativity theory, Alfred North Whitehead warned, “There is no more common error than to assume that, because prolonged and accurate mathematical calculations have been made, the application of the result to some fact of nature is absolutely certain.”

Whitehead would later generalize this error as the “fallacy of misplaced concreteness.” It is often oversimplified as merely mistaking an abstract conceptual object, like a mathematical or logical structure (e.g., the number zero, or the concept of “nothingness”), for a concrete physical object. But the fallacy has more to do with what Whitehead argued was the chief error in science and philosophy: dogmatic overstatement. We commit the fallacy of misplaced concreteness when we identify *any* object, conceptual *or* physical, as *universally fundamental* when, in fact, it only exemplifies selective categories of thought and ignores others. In modern science, the fallacy of misplaced concreteness usually takes the form of a fundamental reduction of some complex feature of nature—or even the universe itself—to some simpler framework. When that framework fails, it is replaced with a new reduction—a new misplaced concreteness, and the cycle repeats.

Scientific progress is marked by these cycles because “failure” doesn’t mean the reduction was entirely wrong; it just means it wasn’t as fundamental—as concrete—as previously supposed. Our understanding of nature does increase, just not at the expense of nature’s complexity. In this regard, the reductive mathematization of natural philosophy over the last 500 years has proven to be both its greatest strength and its greatest hazard. The fundamental objects of modern physics are no longer understood as material physical structures but rather as *mathematical structures* that produce physically measurable effects. The waves of quantum mechanics are not material-mechanical waves; they are mathematical probability waves. The “fabric” of spacetime in relativity theory is pure geometry.

With his “Mathematical Universe Hypothesis,” physicist Max Tegmark has concretized these and other examples of fundamental mathematical objects into a simple reduction: the universe *is* mathematics. In contemporary mathematical physics, he argues, there is no longer a distinction between a world *described by* mathematics and a world *explained as* mathematics. Tegmark characterizes physics as entailing nothing less than a one-to-one correspondence between physical reality and the mathematical structure by which we define this reality. “If our external physical reality is isomorphic to a mathematical structure,” he concludes, “it therefore fits the definition of *being* a mathematical structure… In other words, our successful theories are not mathematics approximating physics, but mathematics approximating mathematics.”

The artful incoherence of “mathematics approximating mathematics” evinces the misplaced concreteness of the premise it presupposes. The fact that *some* features of reality are usefully *describable as* abstract mathematical structures does not necessarily entail that *all* features of reality are *reducible to* concrete mathematical structures. What would compel Tegmark to attempt such a leap? What would compel Krauss? “The aim of science,” Whitehead writes, “is to seek the simplest explanations of complex facts. We are apt to fall into the error of thinking that the facts are simple because simplicity is the goal of our quest. The guiding motto in the life of every natural philosopher should be, ‘Seek simplicity and distrust it.’” And then investigate further.

There is no more potent lesson in the history of natural philosophy than this: it is when we make simplicity the end of science rather than its beginning that we doom our fundamental theories to failure. Two and a half millennia after the Pythagoreans developed their own mathematical universe hypothesis and the Milesians felt compelled to reduce nature’s fundamental elements to one, modern physics likewise feels compelled to reduce the four fundamental forces to one—the persistently elusive “Theory of Everything.” After nearly a century, this compulsion has yet to yield a single empirically viable theory, let alone a successful one. What’s worse, the misplaced drive to concretize nature into a single fundamental framework has resulted in two completely incompatible frameworks, each considered equally fundamental: quantum mechanics—the physics by which we understand nature at the small scale—and the general theory of relativity—the physics of the large scale. Given that the large scale is presumed to be reducible to the small scale, this fundamental incompatibility amounts to nothing less than a crisis for physics.

When viewed through the lens of the fallacy of misplaced concreteness, the root of this crisis is clear: the general theory of relativity concretizes spacetime as a continuum. The problem with this, as Zeno famously demonstrated, is the infinite divisibility of a finite interval—the finite containing the infinite—which is fine if you’re thinking about numbers alone, but highly problematic for physics. One example referenced frequently in popular science is the concept of a physical “singularity”—a physical concretization of a geometric point defined as infinitely dense but with zero volume. Given that density is defined as mass divided by volume, this is nothing less than a physical concretization of a conceptual paradox—a concretization of the “divide by zero” error you get on your calculator when you’re reckless with your math.
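To make the arithmetic behind both of these paradoxes explicit, here is a brief sketch (a standard textbook illustration, not drawn from the article itself):

```latex
% Zeno: a finite interval contains infinitely many parts,
% yet those parts sum to the finite whole.
\sum_{n=1}^{\infty} \frac{1}{2^{n}} \;=\; \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots \;=\; 1

% A singularity concretizes the limit of density as volume vanishes:
% for any fixed mass m > 0,
\rho \;=\; \frac{m}{V}, \qquad \lim_{V \to 0^{+}} \frac{m}{V} \;=\; \infty
```

The series converges perfectly well as a number; the trouble arises only when the limit point itself is treated as a concrete physical object, which is precisely the concretization the article warns against.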

Quantum mechanics was explicitly designed to immunize physics against such concretized infinities and their associated paradoxes, and this is the heart of its incompatibility with the general theory of relativity. It avoids the misplaced concreteness of a fundamental continuum by instead describing physical systems as serial “physical histories” of discrete physical states. But quantum mechanics is not without its own misplaced concretizations. Like relativity theory, quantum mechanics concretizes mathematical structures, including those defining *potential* physical states like Schrödinger’s “cat alive” and “cat dead” states. The implication for many is that the cat is fundamentally “really alive” and “really dead” at the same time—perhaps each in its own separate universe, as asserted in the infamous “Many Worlds Interpretation” of quantum mechanics. Despite the fact that Schrödinger had intended his cat to warn us away from this kind of misplaced concretization, most of today’s interpretations engage in it either explicitly, as in the case of the Many Worlds approach, or implicitly by failing to account for how measurement generates *observable* “traditional” physical states from unobservable but *calculable* “potential” physical states. Quantum mechanics contains no physical “mechanism” to explain this; it only contains mathematical structures that describe the process.

If you read any of Whitehead’s earlier works in mathematics, it’s clear that he anticipated the problem of concretizing mathematics in physics. And it is equally clear in his later philosophical magnum opus *Process and Reality* that he recognized the incompatibility of general relativity and quantum theory as a critical manifestation of this problem. He argued that the best solution would be a mathematical framework that was itself impossible to concretize—one not rooted in the rigidity of geometry, the conventional foundation of mathematical physics depicting the universe as a closed totality, a block universe reducible to the sum of its concretized parts. Whitehead’s mathematical framework, by contrast, would have a built-in openness, and its robustness would derive not from rigidity, but rather from *elasticity*. He called this framework, simply, “the algebraic method.” When applied to physics, the universe becomes formalized not as a closed totality with fixed and deterministic relations, but as an *open totality-in-process* with novel “creative” relations—that is, physical relations that beget *new* physical relations, not just reconfigurations of previous relations. In such a universe, the business of physics is not the concretizing of what is, but rather the historicizing of what becomes.

Today, the most promising interpretations of quantum mechanics exemplify this Whiteheadian idea, formalizing physical systems, and the universe itself, as ongoing histories of actualizations of potential states. They avoid the concretization of Schrödinger’s cat by defining *actual* states and *potential* states as two separate categories of reality—an idea first proposed by Aristotle and rehabilitated by Heisenberg. In these interpretations the classical conception of a history as a derivative story about fundamental physical objects is reversed: the classical “physical object” becomes the derivative story about fundamental histories of quantum states. The consistent histories interpretation of Robert Griffiths and the decoherent histories interpretation of Roland Omnès are cardinal examples of this approach to quantum mechanics.

Another example, the relational realist interpretation (1), begins with the decoherent histories approach and formalizes it explicitly as a Whiteheadian interpretation, taking his “algebraic method” and applying it to quantum mechanics via algebraic topology. This branch of mathematics can be thought of as a “de-concretized” geometry in that it focuses only on the logical relations among objects and regions (and relations *of* these relations) rather than the fixed magnitudes of lines, planes, and other such rigid intervals. It has all the robust logical structure of geometry yet allows for an elasticity of relations that makes topological mathematical structures inherently immune to “misplaced concretization.” Among the many advantages of the relational realist topological interpretation is that it is wholly compatible with the general theory of relativity, reformulating the latter in algebraic topological terms (1,2). Likewise, one of the most promising approaches to quantum computation is the topological quantum computing framework currently in development at Microsoft.

It is not surprising that it took a mathematician and philosopher as brilliant as Whitehead to emphasize the fallacy of misplaced concreteness and its hazards, or that the most promising solutions to our current problems in fundamental physics would be those that explicitly aim to avoid that fallacy. Deconstructing the “simple” and “self-evident” concretized categories by which we habitually (and often dogmatically) coordinate our thoughts and experiences of the world: this, for Whitehead, was the only route to progress. “If science is not to degenerate into a medley of *ad hoc* hypotheses,” he writes, “it must become philosophical and must enter upon a thorough criticism of its own foundations.” While it’s true that the strongest foundations are often concretized, it is equally true that this strength always begins and ends with what lies beneath.

References

(1) M. Epperson & E. Zafiris. *Foundations of Relational Realism: A Topological Approach to Quantum Mechanics and the Philosophy of Nature*. Lexington Books, New York, 2013. (csus.edu/cpns)

(2) E. Zafiris, A. Mallios, “The Homological Kähler-De Rham Differential Mechanism, Part I: Application in the General Theory of Relativity, and Part II: Sheaf-Theoretic Localization of Quantum Dynamics,” *Advances in Mathematical Physics* (2011) (csus.edu/cpns)
