The Hubble Tension is the discrepancy between early- and late-universe measurements of the current rate of cosmic expansion. The tension was initially assumed to be systematic: many thought that some error in our measurements would be eliminated by better telescopes and better data. But following results from the James Webb Space Telescope (JWST), many now believe that something more mysterious might be at play. Do we need new physics beyond cosmology's standard model? In this piece, Marco Forgione explores recent attempts to resolve the tension and highlights the role of philosophy when science can't make up its mind.
The best tool that scientists have for describing the history and structure of our universe is the Lambda Cold Dark Matter model (ΛCDM). Thanks to this model we can describe (among other things) the acceleration of the universe, its large-scale structure, its temporal evolution, and the many different forms of radiation we observe with our telescopes. However, the model is also characterized by six free parameters that must be fixed "manually", since they cannot be determined from theory alone. One such parameter is the expansion rate of the universe, also known as the Hubble constant (H0), which relates the velocity of a cosmological object receding from us to its distance.
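The relation just mentioned is Hubble's law, v = H0 · d. As a minimal sketch (with made-up distances and velocities chosen purely for illustration), H0 can be estimated as the slope of a straight-line fit through the origin:

```python
def estimate_h0(distances_mpc, velocities_kms):
    """Least-squares slope of v = H0 * d, fit through the origin.
    Distances in Mpc, velocities in km/s, result in (km/s)/Mpc."""
    num = sum(d * v for d, v in zip(distances_mpc, velocities_kms))
    den = sum(d * d for d in distances_mpc)
    return num / den

# Illustrative (made-up) data, consistent with H0 ~ 70 (km/s)/Mpc:
distances = [10.0, 50.0, 120.0, 300.0]          # Mpc
velocities = [700.0, 3500.0, 8400.0, 21000.0]   # km/s

print(round(estimate_h0(distances, velocities), 1))  # -> 70.0
```

In practice, of course, the hard part is not the fit but obtaining reliable velocities and, especially, distances; that is what the rest of this piece is about.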
The parameter itself is not new in the literature: a first estimate dates back to 1927, when Lemaître derived a value of roughly 600 (km/s)/Mpc from the relation between redshift and galaxy distance, followed in 1929 by Hubble's estimate of about 500 (km/s)/Mpc. However, it was soon found that such a value implied a universe that was only about 2 billion years old, younger than the estimated age of the Earth. With data from the Hale Telescope in the 1950s, Humason, Mayall, and Sandage calculated a value for the Hubble constant of approximately 180 (km/s)/Mpc, while a few years later, in 1958, Sandage and collaborators placed the value of H0 between 50 and 100 (km/s)/Mpc. The task was clear, but its execution was impeded by inadequate technology. Yet even with the development of better instruments, the discrepancy between measured values of H0 persisted, becoming known in the 1970s and 1980s as the 50-100 controversy (see: Tully 2024).
Then, in the 1990s, the Hubble Space Telescope (HST) was launched, and the HST Key Project used Cepheid stars to calibrate the distance to Type Ia supernovae in distant galaxies. The project delivered a value for the Hubble constant of 72±8 (km/s)/Mpc. It seemed that the H0 tension had been appeased.
___
Each rung of the distance-ladder method comes with systematic uncertainties
___
Then, with the Planck mission (2009-2013), observations of the Cosmic Microwave Background radiation (CMB) aligned well with the predictions of ΛCDM, but with a Hubble constant of 67.5±0.5 (km/s)/Mpc: a value that differs from the most precise local distance-ladder measurements (around 73±1 (km/s)/Mpc) by about 5σ.
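The 5σ figure follows from simple error propagation: the difference between two independent measurements divided by their combined uncertainty. A minimal sketch, using representative values (roughly 73±1 (km/s)/Mpc for the distance ladder and 67.4±0.5 for Planck):

```python
import math

def tension_sigma(x1, err1, x2, err2):
    """Separation of two independent measurements, in units of their
    combined (quadrature-summed) standard deviation."""
    return abs(x1 - x2) / math.sqrt(err1**2 + err2**2)

# Representative values in (km/s)/Mpc: distance ladder vs. CMB
print(round(tension_sigma(73.0, 1.0, 67.4, 0.5), 1))  # -> 5.0
```

A 5σ discrepancy between two Gaussian measurements has a chance probability of about one in a million, which is why it is hard to dismiss as a statistical fluke.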
How is it possible that cosmologists and astrophysicists are measuring the same parameter yet obtaining significantly different results? Better analyses and measurements seem to confirm the tension, which is one of the reasons why philosophers started looking into the matter from an epistemological perspective.
Figure 1. Plot showing previous attempts to measure the Hubble constant H0. (Source: Verde 2019)
In theory, the value of H0 is not hard to determine: it should be enough to observe a sufficiently large number of cosmological objects and determine their velocity with respect to us (the observers). However, as oftentimes happens, the practical side of such observations and measurements hides significant difficulties. For example, while the velocity can be measured using the redshift of spectral lines, cosmological objects also move with respect to one another (peculiar velocity) due to their reciprocal gravitational effects. Since it is very difficult to isolate such effects from the calculation of the recessional velocity, scientists look at very distant galaxies whose motion is purely due to the expansion of the universe. However, an accurate measurement of the distance of these far-away objects is much more difficult, since direct measurements are not possible.
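On the velocity side, at low redshift the recessional velocity follows directly from the measured redshift z of the spectral lines via the approximation v ≈ cz (a sketch only; at high redshift the full relativistic relation is needed):

```python
C_KMS = 299_792.458  # speed of light in km/s

def recession_velocity_kms(z):
    """Low-redshift approximation: v = c * z."""
    return C_KMS * z

# A galaxy whose spectral lines are redshifted by z = 0.01:
print(round(recession_velocity_kms(0.01)))  # -> 2998 km/s
```

The peculiar velocities mentioned above, typically a few hundred km/s, are why scientists prefer galaxies whose recessional velocity is many thousands of km/s: there the contamination becomes negligible.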
To overcome these difficulties, scientists have implemented the so-called cosmological distance ladder: a stepwise approach that uses different methods to calculate the distance of astronomical objects. Each method is calibrated on the previous one so that, when combined, they allow us to indirectly measure the distance of galaxies in the Hubble flow. Generally speaking, the distance ladder consists of three rungs: (1) the determination of the distance of nearby objects (usually within the Large Magellanic Cloud (LMC), the Milky Way, or NGC 4258) using geometry-based methods such as trigonometric parallax. That is, the first rung consists of a "direct measurement", since scientists observe the apparent change in position of an object from two different points of the Earth's orbit around the Sun. Once we know the parallax angle and the distance between the two observation points, it is relatively easy to use trigonometric relations to calculate the distance between the observers and the target.
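In its simplest form, with the Earth-Sun distance (1 AU) as the baseline, a parallax angle p measured in arcseconds gives a distance of 1/p parsecs; indeed, the parsec is defined as the distance at which the parallax is exactly one arcsecond. A minimal sketch:

```python
def parallax_distance_pc(parallax_arcsec):
    """Small-angle parallax distance: d [pc] = 1 / p [arcsec],
    assuming a 1 AU baseline (half of Earth's orbital diameter)."""
    return 1.0 / parallax_arcsec

# A star with a measured parallax of 0.1 arcsec lies at 10 parsecs:
print(parallax_distance_pc(0.1))  # -> 10.0
```

The method's limitation is clear from the formula: the farther the object, the smaller the angle, so parallax only anchors the nearest rung of the ladder.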
Fig. 2. Diagram showing the trigonometric parallax method of determining astrophysical distances. (Source: Encyclopedia Britannica)
The second rung (2) of the ladder measures the distance of objects that can be used as standard candles, such as Type Ia supernovae (SNeIa) or stars with a fixed pulsating periodicity (Cepheid stars).
Finally, astronomers measure the distance of galaxies that are far enough into the Hubble flow that their peculiar velocity is negligible compared to their recessional velocity. To do so, they can no longer rely on variable stars, and thus they look at extremely bright objects such as Type Ia supernovae, or at the correlation between the luminosity and the maximum rotation velocity of spiral galaxies (the Tully-Fisher relation). Each rung of the distance-ladder method comes with systematic uncertainties which, one might argue, could be the cause of the discrepancy with the values of H0 calculated from the CMB.
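Standard candles work because an object of known intrinsic brightness reveals its distance through how faint it appears. In magnitudes, the distance modulus m - M = 5·log10(d / 10 pc) can be inverted for the distance. A sketch, with an illustrative Type Ia-like absolute magnitude of -19.3:

```python
def distance_from_modulus(apparent_mag, absolute_mag):
    """Invert the distance modulus m - M = 5*log10(d / 10 pc).
    Returns the distance in parsecs."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# A candle of absolute magnitude -19.3 observed at apparent magnitude 15.7:
d_pc = distance_from_modulus(15.7, -19.3)
print(round(d_pc / 1e6, 1))  # -> 100.0 Mpc
```

The systematics discussed below (crowding, blending, extinction) enter precisely here: anything that distorts the measured apparent magnitude m propagates directly into the inferred distance.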
Here, philosophers of science can contribute to the discussion by employing analytical tools to evaluate whether some systematics can indeed be taken to explain away the Hubble Tension. One such tool is called Robustness Analysis. The idea, pace the obvious differences among authors (see: Wimsatt 1981; Soler et al. 2012; Weisberg 2006; Levins 1966, and others), is that the trustworthiness of an experimental result (model, or theory) is strengthened by its invariance under different independent derivations. Applied to the case of the Hubble constant and the distance-ladder method, the question becomes: if different independent methods deliver the same (or similar) values for the Hubble constant, does this warrant our trust in those results? For example, recent work by Riess et al. (2019) presents a determination of the value of H0 from HST observations using different and independent estimators: "The higher local value results from the use of any one of 5 independently determined, geometric distance estimators used to determine the luminosity of Cepheids, including masers in NGC 4258 […], 8 detached eclipsing binaries [in the Large Magellanic Cloud (LMC)] […], and 3 distinct approaches to measuring Milky Way (MW) parallaxes […]" (Riess et al. 2019, p.2). The use of different estimators on the same dataset (HST observations), and the fact that the results converge on the high value of H0, might be considered a genuine case of robustness.
However, the robustness of the results obtained from the observation of Cepheids might be tainted by phenomena such as crowding and blending (among others), which can affect the precision of our measurements. Cepheid stars are good distance indicators because they are bright and reliably identifiable thanks to their characteristic pulsating periodicity. Yet the nearby presence of bright celestial objects can affect the apparent magnitude of the calibrators, thereby degrading the precision of the distance measurements. Such phenomena can lend new strength to the hypothesis that the Hubble tension is reducible to unknown systematics in the calculation of H0 (as compared to early-universe measurements from the CMB or from baryon acoustic oscillations (BAO)).
However, Riess and collaborators suggest that the new observations from the James Webb Space Telescope (JWST) provide evidence that systematic errors in calculating the Hubble constant via Cepheid distance measurements play no role in the current Hubble tension (see: Riess et al. 2023). Indeed, the JWST (Gardner et al. 2023; Rigby et al. 2023) delivers observations with much higher resolution, making it possible to separate individual Cepheids from their surrounding crowds and thereby mitigating the effect of various systematics. Hence, there might be room to argue that the convergence of the values of the Hubble constant determined with the HST and with the JWST is an indicator of the robustness of such results. But would such a conclusion resolve the infamous tension?
Not at all. If anything, the tension is only sharpened. While robustness analysis and the recent JWST observations might strengthen our trust in the correctness of the late-universe results, the discrepancy with the values of H0 calculated from the early universe remains unaccounted for (see, for example, the results from Planck Collaboration 2018).
___
Then, the explanation for the tension should be that there are some problems with our cosmological model
___
If we accept the analysis presented thus far, we are justified in believing that the results from the distance-ladder method are correct. Then the explanation for the tension should be that there are some problems with our cosmological model (which is not a novel statement either).
It would be interesting to run a robustness analysis on the measurements of H0 based on the CMB and BAO, but one considerable difficulty would need to be accounted for. The theoretical background that grounds the late-universe measurements of the Hubble constant is well known and tested. The ΛCDM model is also well tested, but we still lack a complete knowledge of the physics behind cold dark matter. While we have evidence that non-baryonic matter plays an important role in the model, we do not have a theory that describes its properties. Unfortunately, in most cases robustness analysis requires that the assumptions and inner workings of the models are all well known; otherwise it would be impossible to pinpoint the culprit of an alleged discrepancy in the empirical data. What should we expect, and what can we do, then?
One possibility is that our measurements of H0 from the early universe are affected by some physics that is still not properly accounted for by the ΛCDM model. For example, cold dark matter allegedly influences the formation of large-scale structures, and its gravitational effects might impact the current and past cosmological expansion rate. Therefore, dark matter might affect our observations and the measurement of H0.
A second possibility is that our cosmological model ought to be replaced with, for example, some modified gravity model (among others: Modified Newtonian Dynamics, f(R) gravity models, or Tensor-Vector-Scalar gravity). Philosophers of science can and should contribute to the debate by, among other things, employing tailored forms of robustness analysis on selected predictions of ΛCDM and of alternative models. However, non-empirical arguments such as robustness analysis will not be sufficient to justify the retention or dismissal of an empirically well-tested cosmological model; still, they might offer additional insight into the debate by narrowing the space of possible solutions.
Finally, the possibility that unknown systematics are the cause of the tension cannot yet be ruled out. The most recent results from Wendy Freedman and collaborators (see: Freedman et al. 2024) provide new measurements of H0 based on JWST observations using three independent methods: (i) Tip of the Red Giant Branch stars (TRGB), (ii) Cepheids, and (iii) J-Region Asymptotic Giant Branch stars (JAGB). These new results appear to be consistent with the standard cosmological model, thereby suggesting the possibility of further systematics: "These differences are pointing to systematics affecting one or more of the distances and need to be better understood. However, while they do not rule it out, the results presented here do not lend strong support to the suggestion that there is missing fundamental physics in the early universe" (Freedman et al. 2024, p.54). The debate on whether the Hubble tension is due to unknown systematics or to new physics therefore remains open; new data, better analyses, and more accurate and precise observations are still needed before we can draw more definitive conclusions.
Acknowledgments
This work was made possible by the project (2021-0567 - COSMOS) funded by Cariplo Foundation.