Escaping cosmology’s failing paradigm

Why we may be radically wrong about the universe’s size and expansion

The current orthodoxy of cosmology rests on unexamined assumptions that have massive implications for our view of the universe. From the size of the universe to its expansion, does the whole programme fail if one of these assumptions turns out to be wrong?

 

There is a great paradox haunting cosmology.

The science relies on a theoretical framework that struggles to fit and make sense of the observations we have but is so entrenched that very few cosmologists want to seriously reconsider it.

When faced with discrepancies between theory and observation, cosmologists habitually react by adjusting or adding parameters to fit the observations, proposing additional hypotheses, or even invoking “new physics” and ad hoc solutions that preserve the core assumptions of the existing model.

 


 

Today, there is increasing critical attention on some problematic parts of the Standard Model of Cosmology. Dark matter, dark energy and inflation theory are parts of the standard theoretical framework that remain empirically unverified - and about which new observations prompt ever more questions.

However, little questioning is heard of the many unverifiable core assumptions that make up our model of the universe.

Before any physics or mathematics is involved, the framework is based on a series of logical inference leaps - we count 13 - that work as invisible premises for the theory. Of these, some are not testable and some are barely plausible. But they are necessary as simplifying conditions that enable scientists to articulate a scientifically consistent theory of the universe.

What if any of these hidden inferences happen to be fundamentally wrong?

In this article, we would like to focus on just a few of these unverified core assumptions that make up today's standard cosmology, in order to raise a question:

Has the current standard model become orthodoxy because it is very well-founded and proven - as the consensus view would have it? Or is it rather orthodoxy because it’s become ‘paradigm stuck’ - that is, path dependent and unable to generate a viable alternative?

How do we know the Universe?

Let's first look at this science in the big picture.

No, not the big picture story of the "Big Bang" - the hot and dense state of the universe as it was billions of years ago - but rather the empirical problem of how we as Earth-dwellers come to picture the universe scientifically.

Cosmology is different from other sciences in a fundamental way. The sheer scope of the subject matter covers the largest extent imaginable - literally - and it does so based only on observations from our own local place within it.

Unlike physics at the micro scale, cosmology cannot repeat its experiments under controlled conditions. And the scale of the macrophysical universe as we know it is at least 30 orders of magnitude larger than that of particle physics.

In examining the unfathomably large universe, astronomers face serious difficulties. How can we, from the very limited region of space that is visible, comprehend the entire universe - let alone measure it with confidence?

What is today called the Standard Model of Cosmology emerges in the context of these enormous limitations, which in turn require some far-reaching simplifying assumptions to make a universal theory possible.

As the physicist Sabine Hossenfelder recently pointed out, a key assumption like ‘the cosmological principle’ - that the universe is on average the same in all directions - does not hold up well against observations (or plausibility). But abandoning the cosmological principle would have enormous consequences and so it is resisted.

Some problematic assumptions run even deeper and may have been forgotten by cosmologists in the historical development of the model.

Cosmic Leap #1: measuring the universe

We measure the universe in billions of light years and megaparsecs with ostensibly astonishing precision. But how do we really know its true scale and how far away distant galaxies are from our own tiny place in the cosmos?

Astronomy has developed brilliant techniques for measuring distances but their validity is assumed to stretch far beyond what we can ascertain. Most of our cosmology is based on things we know with empirical confidence about our own galaxy, then hyperextended outwards toward infinity. In the case of the Big Bang model, this extension goes backward to a hypothetical 'early universe' horizon.

Certainly, within our own Milky Way galaxy we can measure distances quite accurately by triangulating visible stars. This ‘high-confidence zone’ for our empirical measurements corresponds to an estimated 0.00001% of the theoretical observable universe.
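
To make the geometry concrete, here is a minimal sketch of the parallax method behind this ‘triangulation’, using Proxima Centauri’s well-known parallax as an example:

```python
# Minimal sketch of stellar parallax, the base of the distance ladder:
# a star's apparent shift against distant background stars as Earth orbits
# the Sun gives its distance directly: d (parsecs) = 1 / parallax (arcsec).

def parallax_distance_pc(parallax_arcsec: float) -> float:
    """Distance in parsecs from a measured parallax angle."""
    return 1.0 / parallax_arcsec

# Proxima Centauri's measured parallax is about 0.768 arcseconds:
print(parallax_distance_pc(0.768))  # ~1.30 parsecs (~4.2 light years)
```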

Venturing beyond our galaxy with the mathematical framework of General Relativity to guide us, scientists can measure up to about 5% of the theoretical universe on a reasonably convincing empirical basis. Beyond this, however, the choice of cosmological model begins to affect both the measurement and the explanation of what astronomers see. This is because relativistic mathematical corrections must be applied in order to interpret the observations. For example, images of galaxies need to be resized and their brightness adjusted to take into account that the universe was expanding while their light was travelling towards us. But these recalculations are in turn based on the very model that cosmologists seek to confirm in the first place.
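
A minimal sketch of this model dependence (not the authors’ calculation): converting an observed redshift into a distance requires assuming a cosmological model and its parameter values. The Hubble constant and density parameters below are illustrative assumptions, and the same redshift yields noticeably different distances under different choices.

```python
# Sketch of the model dependence: turning an observed redshift z into a
# luminosity distance requires a cosmological model. H0, omega_m and
# omega_l below are illustrative assumptions, not measured values.
import math

C_KM_S = 299_792.458  # speed of light in km/s

def luminosity_distance_mpc(z, h0=70.0, omega_m=0.3, omega_l=0.7, steps=10_000):
    """Luminosity distance (Mpc) in a flat FLRW model, by trapezoidal integration."""
    def inv_e(zp):  # 1 / E(z) for a flat matter + lambda universe
        return 1.0 / math.sqrt(omega_m * (1.0 + zp) ** 3 + omega_l)
    dz = z / steps
    comoving = sum(0.5 * (inv_e(i * dz) + inv_e((i + 1) * dz)) * dz for i in range(steps))
    return (C_KM_S / h0) * (1.0 + z) * comoving

# The same observed redshift implies different distances under different models:
print(luminosity_distance_mpc(1.0))                            # ~6600 Mpc (dark-energy model)
print(luminosity_distance_mpc(1.0, omega_m=1.0, omega_l=0.0))  # ~5000 Mpc (matter-only model)
```

The point is not the particular numbers, but that the “measured” distance is inseparable from the model assumed in computing it.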

Astronomers use a so-called distance ladder to measure much greater distances - up to 30% of the theoretical universe size by some estimates - using light from supernova explosions as guideposts. At that distance and beyond, however, model-dependent errors could add up to more than 50% of the measured value. And the further out into the universe we go, the more we rely on the theoretical framework to make any estimate, and the more confidence in the accuracy of the distance ladder declines.
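
As a rough sketch of how errors propagate up this ladder: supernova distances come from comparing an assumed intrinsic brightness with the observed one, so any error in the assumed brightness rescales every distance derived from it. The magnitudes below are illustrative, not fitted values.

```python
# Sketch of the supernova rung of the distance ladder: an assumed intrinsic
# brightness (absolute magnitude M) converts an observed brightness
# (apparent magnitude m) into a distance via  m - M = 5 * log10(d / 10 pc).
# The magnitudes below are illustrative, not fitted values.

def distance_pc(m_apparent: float, m_absolute: float) -> float:
    """Distance in parsecs from the distance modulus m - M."""
    return 10 ** ((m_apparent - m_absolute + 5.0) / 5.0)

M_ASSUMED = -19.3  # commonly assumed Type Ia peak magnitude (a calibration choice)
m_obs = 24.0       # a faint, distant supernova

d = distance_pc(m_obs, M_ASSUMED)
print(f"{d / 1e6:.0f} Mpc")  # ~4570 Mpc

# A 0.5-magnitude calibration error rescales every inferred distance:
d_biased = distance_pc(m_obs, M_ASSUMED + 0.5)
print(f"{(d_biased / d - 1.0) * 100:+.0f}%")  # about -21%
```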

At these large distances the astronomer is forced to rely more heavily on the parameters derived from General Relativity and on the redshift-distance inference (more on that below) to interpret observations as distance. But this interpretation is needed to support the Big Bang Model, which is itself the model being used to find the parameters needed for the interpretation of observations.

Thus, at the basis of scientific work a self-reinforcing circularity creeps into the measurements - which in turn is used to reinforce key theoretical assumptions. Measurements corrected using a theoretical distance ladder are routinely taken as evidence for the model itself. Over time, this lends more invisible credence to the hypothetical premise underlying the model, even though this premise is not at all what the observations test.

This logical process is familiar to anyone who has studied myth creation, ideologies or religion, in which there is always a founding fiction. Is it outrageous to think that an advanced science could be based on little more than a continual repetition of the same idea?

Science is above all a practice, and it is simply very hard to do scientific work without making a set of self-reinforcing assumptions. What starts as "If A, then B" soon becomes "A, therefore B", and the fictional premise hardens into a 'fact' that must be protected in order to continue the line of research.

In other words, data analysis based on a hypothetical premise over time becomes taken as implicit validation of the same premise, allowing astronomers to speak confidently about 'mountains of data' as 'incontrovertible proof' of the Big Bang theory - when in fact no data set on its own or aggregated can test the fundamental assumptions of this model.

In the Standard Model of Cosmology, then, almost the entire universe can only be the result of a gigantic inference leap from the one finite neighborhood we know, on a scale determined by an unverifiable hypothesis and a form of circular reasoning. You might argue that this is the only or the best option we have under the circumstances - but this underlying lack of confidence is rarely treated with the humility it warrants.

Cosmic Leap #2: observing the expansion of space

It is considered a universal fact that space is expanding. But how do we really know this - and how do we infer from it that the universe must have been expanding ever since a primordial hot, dense state?

While the astronomical distance ladder used to measure large distances leaps outwards with progressively lower confidence the further out we go, some key inferences in the cosmic framework are of a different kind: they leap from what we can observe to universal principles and universal laws.

One such principle is known as Hubble's law, upon which the entire Big Bang hypothesis rests. This 'law' is really a consensus interpretation of an observed phenomenon - it is not based on a demonstrated fact.

In the 1920s, the astronomer Hubble discovered a certain relation between the distance and redshift of galaxies. This redshift, a displacement of spectral lines toward longer wavelengths, appeared larger for galaxies at larger distances. The redshift phenomenon is well-known on Earth as being a result of the Doppler effect: the motion of a light source produces a shift of its apparent colour. When galaxies were seen to have a spectral redshift, this was interpreted as a measurement of their velocities as they move away from us. This was called a ‘recession velocity’.
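
In schematic terms, the inference runs from a measured wavelength shift to a velocity, as in this minimal sketch (the spectral line and shift below are illustrative):

```python
# Sketch of the redshift-velocity inference: the measured quantity is a
# wavelength shift; the "recession velocity" appears only once the Doppler
# interpretation is assumed. The spectral line values are illustrative.

C_KM_S = 299_792.458

def redshift(lambda_observed: float, lambda_emitted: float) -> float:
    """Dimensionless redshift z from a shifted spectral line."""
    return (lambda_observed - lambda_emitted) / lambda_emitted

def doppler_velocity_km_s(z: float) -> float:
    """Recession velocity under the low-z Doppler interpretation, v = c * z."""
    return C_KM_S * z

# Hydrogen-alpha emitted at 656.3 nm but observed at 662.9 nm:
z = redshift(662.9, 656.3)
print(f"z = {z:.4f}  ->  v = {doppler_velocity_km_s(z):.0f} km/s")  # z ~ 0.01, v ~ 3000 km/s
```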

At the time, Hubble and other astronomers noted that although a galaxy's recession velocity always causes a redshift, the logic doesn't necessarily run the other way: observing a redshift in the light received from a galaxy does not have to imply a recession velocity. But with few other plausible explanations for the redshift on hand at the time, the redshift-velocity inference became the accepted interpretation. In the context of General Relativity, the expansion of space mimics the Doppler effect, which can then explain the redshift observed by Hubble.

Hubble's discovery meant that more distant galaxies appear to have larger recession velocities. This follows a pattern familiar from explosions: things flying apart with larger velocities at larger distances must all have been clustered together in the past.
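
The arithmetic behind this inference is simple: if v = H0 × d holds for every galaxy, each one needed the same travel time, d / v = 1 / H0, to reach its present distance. A back-of-envelope sketch, with an assumed H0 of 70 km/s/Mpc:

```python
# The "explosion" inference in numbers: if v = H0 * d for every galaxy, the
# travel time back to a common origin is d / v = 1 / H0 for all of them.
# H0 = 70 km/s/Mpc is an assumed, illustrative value.

KM_PER_MPC = 3.0857e19      # kilometres in one megaparsec
SECONDS_PER_GYR = 3.156e16  # seconds in a billion years

h0 = 70.0                   # km/s per Mpc
h0_per_s = h0 / KM_PER_MPC  # convert H0 to 1/s
hubble_time_gyr = 1.0 / h0_per_s / SECONDS_PER_GYR

print(f"1/H0 = {hubble_time_gyr:.1f} Gyr")  # ~14 billion years
```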

With something that looked like relativistic expansion, the Big Bang theory made its first appearance. The inference leap cosmologists made was to extrapolate Hubble's redshift-velocity relation to the entire universe. Assuming the expansion happens everywhere, they inferred for their mathematical models that the universe must have expanded and that all observed galaxies must at an earlier time have been compressed together in a hot and dense state.

The redshift-velocity interpretation is the most fundamental building block of Big Bang theory - and it has its share of empirical challenges. Under this interpretation, galaxies appear to rotate much faster than should be possible, and their motions within galactic clusters are faster than the laws of gravity allow. If the Doppler effect is the right explanation for the redshift, the measurements indicate that far more mass is needed to explain the observed velocities.
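
A rough sketch of the rotation problem: with only the visible mass, orbital speeds beyond a galaxy's luminous disc should decline in Keplerian fashion, yet measured curves stay roughly flat. The mass and radii below are illustrative round numbers.

```python
# Sketch of the rotation-curve discrepancy: with only the visible mass,
# circular speed beyond the luminous disc should fall off as
# v = sqrt(G * M / r), yet observed curves stay roughly flat (~200 km/s).
# The mass and radii are illustrative round numbers.
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_VISIBLE = 2e41       # ~1e11 solar masses of luminous matter, in kg
M_PER_KPC = 3.0857e19  # metres in one kiloparsec

for r_kpc in (10, 20, 40):
    v_km_s = math.sqrt(G * M_VISIBLE / (r_kpc * M_PER_KPC)) / 1000.0
    print(f"r = {r_kpc:>2} kpc: Keplerian prediction {v_km_s:5.0f} km/s")
# Prints ~208, ~147, ~104 km/s - a decline that measured curves do not show.
```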

Based on the redshift-velocity interpretation, a consensus hypothesis arose with the development of Big Bang theory: that these otherwise inexplicable observations are caused by “Dark Matter” - an invisible substance for which there is no empirical evidence, but which has the important function of keeping the cosmological framework intact.

Moreover, in observations of distant quasars, for example, an association with nearby galaxies is clearly detected in the data - which would make no sense if the model were correct. Cosmologists explain these quasar-galaxy associations as improbable chance alignments, despite the thousands of examples found in observational data.

Cosmologists today extrapolate the redshift-distance pattern well beyond observed galaxies on the assumption that "Hubble's Law" is universal. Because they observe a pattern that extends over a certain range, scientists assume this pattern will hold for the entire universe.

To be clear, this is not an unreasonable assumption - but the implications are enormous if it turns out to be even a little bit wrong, not least for the framework that scientists rely on.

Protecting the Core

The fundamental uncertainty about scale and the interpretation of redshift in faraway galaxies are only two of many cosmic inference leaps that underpin the Big Bang theory - parts of the theory that are as grounded in metaphysics as in physics.

Over decades of scientific labor, the Standard Model of Cosmology has become a multi-layered construction that resembles the children's game of Jenga - where the stability of the upper layers depends on the layers below.


The ‘crisis in cosmology’ often referred to today usually focuses on Dark Matter, Dark Energy or Inflation - all ideas that caught on more than 40 years ago and have been perpetuated in scientific research ever since. But these are Jenga blocks that rest on the core theories at the base of the structure, where more problems reside.

In this sense, the Standard Model of Cosmology is exemplary of what the philosopher of science Imre Lakatos defined as a research programme - a sharper version of Kuhn’s more famous concept of a scientific paradigm.

A research programme consists of a hard core of theoretical assumptions that cannot be abandoned or altered without abandoning the programme altogether - and, around this core, a set of auxiliary hypotheses that are expendable: they may be altered or abandoned as empirical discoveries require in order to protect the core. Dark matter, dark energy and inflation are all auxiliary parts of the cosmological research programme. The hard core includes General Relativity, the Big Bang model with its afterglow, the expansion of space, and the not inconsiderable assumption that the universe is uniform in all directions.

It is common scientific practice to add to or tweak the auxiliary hypotheses rather than question the core. For a scientist doing research, it is more constructive to propose "new physics" that is compatible with the hard core framework than to call fundamentals into question - at least if you want to get funding, publications, graduate students, and tenure.

Because cosmology as a professional discipline only really came about with the invention of Big Bang theory in the mid-20th century, that theory has effectively been the only major operative hypothesis for astronomical research - and therefore the only model cosmologists can get funded to research. The observational evidence it produces and accumulates is usually interpreted in its favour. This gives the model an appearance of solidity while lending cosmologists a false sense of security.

However, it would take a lot of scientists, funding and time to produce a reasonable alternative theory that could account for the almost nine decades of observations made within the Big Bang framework. As a result, cosmology seems locked into a ‘zombie state’ - path dependent, stuck, and too big to fail.

As astrophysicist Stacy McGaugh says in the context of dark matter theory, “like a fifty year mortgage, we are still basically stuck with this decision we made in the 1980s… we’re stuck still pounding these ideas into the heads of innocent students, creating a closed ecosystem of stagnant ideas self-perpetuated by the echo chamber effect.”

McGaugh and Hossenfelder are among a growing group of scientists concerned about the ‘dark stuff’ who are making progress in questioning some of the most critical theories in cosmology.

Their effort may help the new generation of cosmologists realize that if these decades-old theories can be overturned, there is hope of solving cosmology’s deeper problems by re-examining the core principles of cosmology.

Join the conversation

Ian Williams 6 November 2021

This article didn't set out to prove anything; rather, it sets out to disprove or critically question the current thinking upon which popular cosmological 'understanding' is built. Therefore, although welcomed, it is not revolutionary.

The real challenge is to create an alternative perspective that explains everything we observe. Therein lies the golden biscuit!

I've never been comfortable with Dark Matter or Dark Energy, especially the latter. I believe the problem arises from a fundamental misunderstanding of the universal construct.

What if our observations and assumptions are correct and the big bang happened 13.8 billion years ago, bringing about the creation of our spacetime construct? However what if our spacetime doesn't mark the beginning of time overall?

How about this for a crazy idea:

Imagine the birth of time, trillions and trillions of years ago. Beforehand there is nothing but a vacuum, although we now know that the 'nothing' of a vacuum consists of something. In this infinity of nothing, probability suggests that something must eventually occur. At one point, there is an 'imbalance' in the vacuum, and the first wave, the first particle is born (working on the principle that every particle is a wave and every wave is a particle). This is the chronos. If we think of 4-dimensional time as a three-dimensional construct, then this imbalance in the vacuum can be thought of as the birth of a singularity. And so, the multiverse is born.

However, this multiverse should not be thought of as a series of parallel universes, rather as a common chronological construct upon which separate universes can be 'built'. Thinking about time three-dimensionally, the multiverse can be thought of as an ever-expanding 'bubble'. Each unique 'direction' from the singularity (of which there are an infinite number) is a separate timeline/dimension.

However, elsewhere in the vacuum, another 'bubble-like' multiverse is born, with its own three-dimensional chronos bubble expanding outwards, with each unique direction from its singularity presenting a separate timeline/dimension.

The two multiversal, chronological bubbles expand towards each other over trillions of years, and eventually they meet. A dimension from our multiverse and a dimension from the other multiverse have a head-on collision. This direct hit from two chronological timelines from two different multiverses creates a natural particle collider at a magnitude unimaginable to us. New particles are formed at this collision and with it the birth of our space time. This is what we call the Big Bang.

Matter and antimatter are created; however, they are created from two colliding dimensions from separate multiverses travelling in opposite directions. The particles in our timeline/dimension are matter and the particles from the opposing dimension are antimatter. However, rather than matter and antimatter cancelling each other out, our timeline takes more of the matter with us and the opposing timeline takes more of the antimatter with it. Spacetime constructs for both opposing timelines are created, across or on the 'surfaces' of the already existing time constructs of each multiverse.

This theory proposes that our Universe is built upon a double construct, namely a time construct built on a unique multiversal dimension born trillions of years ago, and a spacetime construct, built 13.8 billion years ago, upon the much older time construct.

Our 'flat' universe grows (i.e. inflation) 'outwards' across the surface of the much older multiversal time construct. Its 'growth' is subject to two separate forces, namely the inflationary growth emanating from the big bang, plus the existing expansion of the multiversal construct, which is expanding at a much faster rate because it is a much older, much bigger construct (i.e. the 'bubble').

The universe appears 'flat' to us (looked at four dimensionally across the CMB) because it is built across a bubble-like chronological construct that is so old and large, the 'part' we are able to see appears to be flat. The older the age of the multiverse, the flatter the appearance of the time construct.

Following this premise, there are two 'forces' acting upon the rate of expansion of our universe, namely inflation and multiversal chronological expansion. This being the case, could this explain the increasing rate of universal expansion? With this model, would Dark Energy even be needed?

Furthermore, could the double construct also explain the impact of so-called Dark Matter? Could matter be being held together by the presence of two constructs working together, and not just the gravity apparent within a single construct?

Or are these the ramblings of a mad and ignorant fool with an overactive imagination?

David Dilworth 6 November 2021

Excellent article.
Great metaphors and it doesn't draw any lines outside the bounds of accuracy.
The article exposes the extremely awkward state cosmology has developed into.
Not unlike an avalanche of mathematical snow that has stopped and hardened into a solid block of ice fiction.

Jim Balter 5 November 2021

The one good thing about this article is the link to Stacy McGaugh's blog entry, but it rips that quote out of context and severely misrepresents that context as "of dark matter theory". The decision he refers to is to "alter Newtonian gravitational theory only as a last resort". As he says,
"[LCDM and MOND] are both examples of what philosophers of science call a No Miracles Argument. The problem is that it cuts both ways. I will refrain from editorializing here on which would be the bigger miracle, and simply note that the obvious thing to do is try to combine the successes of both, especially given that they don’t overlap much. And yet, the Venn diagram of scientists working to satisfy both ends is vanishingly small."

I advise ignoring this article and just reading McGaugh's piece and the comments on it (ignoring the stuff from crackpots like J Mark Morris).

Jim Balter 5 November 2021

"Bjørn Ekeberg and Louis Marmet point the way to a new paradigm."

No they don't ... at best they point to a very old paradigm: question everything.

This is a lot of pointless handwaving. Science is in the business of inference to the best explanation. If a better explanation appears, it will be adopted. Nothing in this piece helps us get there.

Roy Lofquist 5 November 2021

There are two assumptions that underpin modern cosmology that are in question due to recent observations: the expansion of the universe, and that gravity is the dominant force.

That the Universe is expanding is based on the premise that the Hubble Red Shift is due to a Doppler effect recessional velocity. When Hubble published his observations of red shifted light from distant objects there were two possible explanations that came to the fore. One, originated by Georges Lemaitre, was that the Universe was expanding. The other, from Fritz Zwicky, was that light lost energy as it traveled, termed "tired light". At that time, ca. 1930, interstellar and intergalactic space were assumed to be perfect vacuums, and thus there was no mechanism to redden the light.

Now, 90 years later, we have actual observational evidence that Zwicky was right. In the radio astronomy of pulsars we find that the shorter wavelengths of the leading edge of the pulse arrive before longer wavelengths. The velocity of light, c, is NOT constant but varies by wavelength. The implication is that the interstellar medium is not a vacuum but rather affects light waves in a way best described as having an index of refraction greater than unity. We find the same phenomenon in the observation of Fast Radio Bursts from other galaxies, thus indicating that the intergalactic medium is not an electromagnetic vacuum. The distance to these pulsars can be computed from the time dispersion by a formula that is algebraically identical to the one used to compute the distance to distant objects by redshift. This implies that the Hubble redshift is the result of the light traversing a distance through a medium denser than Einstein's "in vacuo" rather than a recessional velocity.
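
For illustration, the frequency-dependent delay follows a simple inverse-square law in frequency; the dispersion measure used below is an assumed example, not a measurement:

```python
# Illustration of the pulsar dispersion delay: lower radio frequencies lag by
# dt ~ 4.149 ms * DM / f_GHz^2, where DM (pc cm^-3) is the integrated
# electron column along the line of sight. The DM value is an assumed example.

K_DM_MS = 4.149  # dispersion constant, ms GHz^2 cm^3 / pc

def dispersion_delay_ms(dm_pc_cm3: float, f_ghz: float) -> float:
    """Arrival delay (ms) relative to an infinitely high frequency."""
    return K_DM_MS * dm_pc_cm3 / f_ghz ** 2

dm = 50.0  # example dispersion measure for a Galactic pulsar
for f_ghz in (0.4, 0.8, 1.4):
    print(f"{f_ghz} GHz: delay {dispersion_delay_ms(dm, f_ghz):7.1f} ms")
# Lower frequencies arrive later, exactly as the pulse profiles show.
```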

The second questionable assumption is that gravity is the dominant force in the universe, this despite the fact that electromagnetism is 36 orders of magnitude stronger than gravity. Electromagnetism was thought to be a strictly local phenomenon, effective only near stars and planetary bodies. Since that time we have discovered the Solar Wind (Russian Luna 7, 1959); interstellar magnetic fields (Voyager 1, 2012, and Voyager 2); galactic magnetic fields; and magnetic fields BETWEEN galaxies. Magnetic fields manifest only in conjunction with electrical currents. That we have detected magnetic fields between galaxies means that vast electrical currents permeate the universe and the potential differences (voltages) are, can we say it, astronomical.