We still don't understand the measurement problem

An interview with Sabine Hossenfelder

Progress in the foundations of physics has been slow since the 1970s. The Standard Model of particle physics is very successful, but it has a fundamental weakness: it can’t make sense of gravity. Physicists have pursued a number of pet projects, such as supersymmetry and string theory, claiming these held the key to overcoming the Standard Model’s shortcomings, but at this point they look like dead ends. So how can theoretical physics get unstuck? According to Sabine Hossenfelder, any improvement on the Standard Model goes through understanding the central conundrum of quantum mechanics: the measurement problem.

 

Physicists have been promising us for more than a century that a theory of everything is just around the corner, and still that goal isn’t in sight. Why haven’t we seen much progress with this?

Physicists made good progress in the foundations until the 1970s, when the Standard Model of particle physics was formulated in pretty much its current form. Some of the particles were only experimentally confirmed later, the final one being the Higgs boson, discovered in 2012, and neutrinos turned out to have masses, too, but the mathematics for all of that was in place by the 1970s. After that, not much happened. Physicists pursued some dead ends, like grand unification, supersymmetry, and string theory, but nothing has come out of it.

The major reason that nothing has come out of it, I think, is that physicists haven’t learned from their failures. Their method of inventing new theories and then building experiments to test them has resulted in countless falsifications (or “interesting bounds”, as they put it in the published literature) but they keep doing the same thing over and over again instead of trying something new.

Not only do we have plenty of evidence that the current method does not, as a matter of fact, work; we also have no reason to think it should work. Guessing some new piece of math has basically zero chance of giving you a correct theory of nature. As I explained at length in my 2018 book “Lost in Math”, historically we have made progress in theory development in physics by resolving inconsistencies, not by guessing equations that look pretty. Physicists should take a clue from history.


Join the conversation

Bud Rapanault 14 July 2022

"... particles which we produce in an LHC collision go into a detector and yet the detector never picks up the quantum properties the particles had. We don’t know why. Somehow detectors make quantum effects disappear."

The other, far more realistic, interpretation of this situation is that the so-called quantum effects don't exist. It is perfectly reasonable to conclude (after much effort has been expended) that the reason we can't observe something is that the "something" is not there. Unfortunately, theoretical physics has developed a tendency to believe that a mathematical model is an accurate description of physical reality even when that description is at odds with empirical observations. That, in a nutshell, is the ancient, half-baked philosophy known as mathematicism: the belief that math underlies and determines the nature of physical reality. That belief has no scientific basis, but it is the default operating paradigm of the modern theoretical physics community.

Mathematicism is the reason that most theoretical physicists currently believe in a host of entities and events that are not present in empirical reality. From the invisible particle zoo of Quantum Field Theory (quarks, gluons, etc.) to the "dark" components that comprise 95% of the Big Bang "Universe", modern theoretical physics is as infused with metaphysical entities and events as any religion.

With regard to the measurement problem, it has long been obvious that the "problem" is self-induced. As long as you interpret the wavefunction as a complete description of physical reality at the quantum scale, you will have a measurement problem, because the wavefunction does not describe a physical process; it describes only the statistical outcomes of the undescribed physical process that produces those results. When scientists who cling to the "wavefunction is all" postulate attempt to describe the underlying physical process, they resort to metaphysical nonsense: superposition of states and wave-particle duality, many worlds, and so on.
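The point that the wavefunction delivers only statistics can be made concrete with the Born rule. In a minimal sketch (a toy two-state system of my own, not from the comment), the formalism hands you outcome probabilities |ψₖ|², and sampling from them is the entire empirical content:

```python
import numpy as np

# Toy two-state system: the wavefunction gives amplitudes, and the Born
# rule turns them into outcome probabilities -- the formalism says nothing
# about what physically happens during an individual measurement.
psi = np.array([1.0, 1.0j]) / np.sqrt(2)  # equal superposition of |0> and |1>
probs = np.abs(psi) ** 2                  # Born rule: P(k) = |psi_k|^2

rng = np.random.default_rng(0)
outcomes = rng.choice(len(psi), size=10_000, p=probs)

print(probs)                  # [0.5 0.5] (up to floating point)
print(np.bincount(outcomes))  # roughly 5000 of each outcome
```

Whatever interpretation one favors, this sampling step is all that standard QM predicts about individual detection events.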

A far more rational interpretation of quantum mechanics, called Bohmian mechanics, has long been known; John Bell was a strong proponent of the model. In Bohmian mechanics there are, in addition to the wavefunction, particles and waves, just as there are on all other observed scales. A charged particle like an electron induces a guiding wave in an underlying, chaotic sub-field; the wavefunction simply describes the outcome probabilities for the resulting physical interaction. The statistical outcomes for BM are the same as for standard (wavefunction-only) QM. The one essential difference is that BM describes the underlying physical processes in terms of well-established physical concepts (waves and particles), whereas standard QM explains the underlying physics with scientifically superfluous metaphysics.
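For readers who want the formalism behind "guiding wave": in standard textbook notation (an editorial sketch, not part of the comment), Bohmian mechanics for a single spinless particle of mass m pairs the usual Schrödinger equation with a guidance equation for the particle's actual position Q(t):

```latex
% Schrodinger equation for the wavefunction psi:
i\hbar\,\partial_t \psi = -\frac{\hbar^2}{2m}\nabla^2\psi + V\psi
% Guidance equation: the actual position Q(t) follows the phase gradient of psi.
\frac{dQ}{dt} = \frac{\hbar}{m}\,\mathrm{Im}\!\left(\frac{\nabla\psi}{\psi}\right)\Bigg|_{x=Q(t)}
```

For an ensemble of particles initially distributed as |ψ|², the guidance equation reproduces the standard quantum statistics, which is why BM and wavefunction-only QM agree on measurement outcomes.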

To be clear, Bohmian mechanics is not yet a complete and robust theory, but it can be argued that this is mostly a function of neglect. There is simply little or no research money available for BM; it survives only at the margins of scientific respectability in the theoretical physics community. This is especially surprising since, unlike the other current interpretations of QM, which rely on metaphysical arguments, Bohmian mechanics is a physical theory, and as such it should be subject to direct physical experimentation. The preference for the metaphysical, wavefunction-only interpretation of QM in the scientific academy is difficult to comprehend in scientific terms. The measurement problem is not only self-induced; it seems willfully delusional.

Ronald Sechler 2 June 2022

The measurement problem is very simple. We cannot detect subatomic particles directly. So we hit a target subatomic particle, or group of particles, with a stream of high-energy particles, like a laser, and then try to detect the reflection off the target. Therein lies the problem: the laser adds energy to the target, which causes the target to react.

This brings us to the theory of everything. The universe is energy, and it exists as energy and mass. Energy does not flow and does not move, but energy acts on mass and causes mass to move. Energy and mass cannot occupy the same space at the same time, because they are simply different states of the same thing: energy. Energy moves mass from where there is more energy to where there is less energy, from where there is less mass to where there is more mass, because where there is mass there is no energy, and where there is more mass there is less energy than where there is less mass.

Energy acting on mass creates heat, pressure, movement, speed, gravity, time, information, and evolution. Hitting a target subatomic particle with a laser adds some combination of these effects to the target. Heat, pressure, movement, speed, gravity, time, information, and evolution are not things; they are effects. And gravity does not pull; gravity pushes. And we only perceive the passing of time relative to our perspective, point of reference, and position in the universe.

I suppose this is the beginning of a theory of everything. How about that?