Astronomers Improve Their Distance Scale for the Universe. Unfortunately, it Doesn't Resolve the Crisis in Cosmology

Measuring the expansion of the universe is hard. For one thing, the expansion rate you infer depends directly on the scale of your distance measurements: if the distances are off, the expansion rate is off with them. And since light from distant galaxies takes time to reach us, you can’t measure the universe as it is, but rather as it was. Then there is the challenge of the cosmic distance ladder.

The distance ladder stems from the fact that while we have lots of ways to measure cosmic distance, none of them work at all scales. For example, the greatest distances are determined by measuring the apparent brightness of supernovae in distant galaxies. That works great across billions of light-years, but there aren’t enough supernovae in the Milky Way to measure nearby distances. Perhaps the most accurate distance measurement uses parallax, which measures the apparent shift in the position of a star as the Earth orbits the Sun. Parallax is a matter of simple geometry, but it’s only accurate out to a couple of thousand light-years.
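To see just how simple that geometry is, here’s a toy Python snippet that turns a parallax angle into a distance. The only inputs are the definition of a parsec and Proxima Centauri’s well-known parallax of about 0.77 arcseconds.

```python
# A star's parallax is the apparent angular shift it makes as Earth moves
# by 1 AU; by definition, a parallax of 1 arcsecond corresponds to 1 parsec.
def parallax_distance_ly(parallax_arcsec: float) -> float:
    """Distance in light-years from a parallax angle in arcseconds."""
    distance_pc = 1.0 / parallax_arcsec   # definition of the parsec
    return distance_pc * 3.26156          # 1 parsec is about 3.26 light-years

# Proxima Centauri shifts by about 0.768 arcseconds:
print(f"{parallax_distance_ly(0.768):.2f} light-years")  # about 4.25
```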

Some of the methods used to measure cosmic distances. Credit: Tabitha Dillinger

Because of this, astronomers often measure cosmic distances by building one method upon another. Use parallax for the closest stars, including a type of variable star known as a Cepheid variable. The period over which a Cepheid varies in brightness is proportional to its average luminosity, so you can use Cepheids to measure distances up to 100 million light-years or so. Supernovae occur all the time within that range, so you can then use supernova measurements to determine distances over billions of light-years. These aren’t the only methods used in the cosmic distance ladder, but each method has a limited range and a limited accuracy.
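As a rough sketch of how the Cepheid rung works, here’s a small Python function built on the standard distance-modulus formula. The period-luminosity coefficients below are illustrative placeholders, not the calibration any actual survey uses.

```python
import math

def cepheid_distance_ly(period_days: float, apparent_mag: float,
                        alpha: float = -2.43, beta: float = -4.05) -> float:
    """Estimate the distance to a Cepheid from its pulsation period and
    apparent magnitude, using a linear period-luminosity relation
    M = alpha * (log10(P) - 1) + beta. The alpha/beta defaults are
    illustrative placeholders standing in for a real calibration.
    """
    absolute_mag = alpha * (math.log10(period_days) - 1.0) + beta
    # Distance modulus: m - M = 5*log10(d_pc) - 5
    distance_pc = 10 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)
    return distance_pc * 3.26156  # parsecs to light-years

# A 10-day Cepheid that appears at magnitude 10 would sit at roughly:
print(f"{cepheid_distance_ly(10.0, 10.0):,.0f} light-years")
```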

Since there is an uncertainty to any measurement you make, errors can build up the distance ladder. If your parallax measurements are a bit off, your Cepheid measurements inherit that error from the get-go, and your supernova measurements end up even less accurate. Because of this, when we measure cosmic expansion using different methods, we get results that disagree slightly. This is known as cosmic tension. In the past, this wasn’t a huge problem. While different methods gave different results, the uncertainties were large enough that the results overlapped. But as our measurements get more accurate, they aren’t overlapping anymore. They downright disagree.
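To get a feel for how those errors stack up, here’s a toy error budget that simply adds independent fractional uncertainties in quadrature. The percentages are invented for illustration; they aren’t the real error budget of any survey.

```python
import math

def combined_fractional_error(*fractional_errors: float) -> float:
    """Combine independent fractional errors in quadrature, as a toy model
    of how uncertainty accumulates up the distance ladder."""
    return math.sqrt(sum(e * e for e in fractional_errors))

# Illustrative (made-up) per-rung uncertainties:
parallax_err = 0.01   # 1% on the parallax calibration
cepheid_err = 0.02    # 2% on the period-luminosity relation
supernova_err = 0.03  # 3% on supernova standardization

total = combined_fractional_error(parallax_err, cepheid_err, supernova_err)
print(f"Total distance uncertainty: {total:.1%}")  # about 3.7%
```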

The new distance ladder measure disagrees with the Planck measure. Credit: Riess, et al

To address this problem, a team of astronomers recently focused on making the cosmic distance ladder more accurate. Their focus is on parallax measurements, the ground on which the distance ladder stands. In this case, they use data from the Gaia spacecraft. Gaia has measured the parallax and motion of more than a billion stars, including Cepheid variables. From this, the team reduced the uncertainty of the Cepheid distance method to just 1%. Using this new result in the cosmic distance ladder, they find the Hubble constant (the rate of cosmic expansion) to be between 71.6 and 74.4 km/sec/Mpc. This is great, but it further conflicts with other methods, particularly the Planck satellite’s measurement of the cosmic microwave background, which gives a value between 67.2 and 68.1 km/sec/Mpc.
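Treating those two quoted ranges as rough, symmetric one-sigma intervals (a simplification of how the teams actually report their uncertainties), you can put an approximate number on the disagreement:

```python
import math

def tension_sigma(value_a, err_a, value_b, err_b):
    """Rough number-of-sigma disagreement between two independent
    measurements, assuming Gaussian, uncorrelated errors."""
    return abs(value_a - value_b) / math.sqrt(err_a**2 + err_b**2)

# Distance-ladder range 71.6-74.4 -> roughly 73.0 +/- 1.4 km/s/Mpc
# Planck CMB range 67.2-68.1     -> roughly 67.65 +/- 0.45 km/s/Mpc
print(f"{tension_sigma(73.0, 1.4, 67.65, 0.45):.1f} sigma")  # about 3.6
```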

It seems the more accurate our measurements, the worse the tension problem becomes. There’s something about cosmic expansion we clearly don’t understand, and we can only hope that more and better data will lead us to a solution.

Reference: Riess, Adam G., et al. “Cosmic Distances Calibrated to 1% Precision with Gaia EDR3 Parallaxes and Hubble Space Telescope Photometry of 75 Milky Way Cepheids Confirm Tension with LambdaCDM.” arXiv preprint arXiv:2012.08534 (2020).

3 Replies to “Astronomers Improve Their Distance Scale for the Universe. Unfortunately, it Doesn't Resolve the Crisis in Cosmology”

  1. As foreshadowed here: https://www.quantamagazine.org/astronomers-get-their-wish-and-the-hubble-crisis-gets-worse-20201217/

    The supernova data are ladder-dependent, sparse, and contain two populations.

    “But he thinks cosmologists will run into trouble as they put their theories to more rigorous tests that require more precise standard candles. “Supernovae could be less useful for precision cosmology,” he says.

    Astronomers already knew the peak brightness of type Ia supernovae isn’t perfectly consistent. To cope, they have worked out an empirical formula, known as the Phillips relation, that links peak brightness to the rate at which the light fades: Flashes that decay slowly are overall brighter than those that fade quickly. But more than 30% of type Ia supernovae stray far from the Phillips relation. Perhaps low-mass D6 explosions can explain these oddballs, Shen says. For now, those who wield the cosmic yardstick will need to “throw away anything that looks weird,” Gaensicke says, and hope for the best.”

    [ https://www.sciencemag.org/news/2020/06/galaxy-s-brightest-explosions-go-nuclear-unexpected-trigger-pairs-dead-stars ]
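    As a rough illustration of that standardization, here is a minimal sketch of a linear Phillips-style relation; the coefficients are indicative of Phillips’ original 1993 fit, not what modern light-curve fitters actually use.

    ```python
    def phillips_peak_magnitude(delta_m15_B: float,
                                a: float = -21.726, b: float = 2.698) -> float:
        """Estimate the peak absolute B magnitude of a type Ia supernova from
        how much it fades in the 15 days after peak (delta m15).
        Slower-fading flashes come out intrinsically brighter (more negative M).
        The a/b defaults are illustrative, close to Phillips' original fit;
        modern pipelines use more elaborate light-curve models.
        """
        return a + b * delta_m15_B

    # A slow decliner vs. a fast decliner:
    print(phillips_peak_magnitude(0.9))   # about -19.3 (brighter)
    print(phillips_peak_magnitude(1.7))   # about -17.1 (fainter)
    ```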

    That leads to uncertainties in the determination of cosmological parameters when different spectral techniques give discordant results.

    [ https://iopscience.iop.org/article/10.3847/1538-4357/abb140 ]

    Meanwhile, there have been new H_0 results.

    A paper has used strong lensing of quasars, as H0LiCOW does; that method has customarily reported low-z, high-H_0 values from sparse data, much like the supernova papers thus far.

    H0LiCOW got H_0 = 73 km s^-1 Mpc^-1 in January.

    [ https://hubblesite.org/contents/news-releases/2020/news-2020-04 ]

    But the new paper got H_0 = 71 (+2 -3) km s^-1 Mpc^-1 in August by modeling each lens with its range of possible mass distributions.

    [ https://academic.oup.com/mnras/article-abstract/498/2/2871/5894941?redirectedFrom=fulltext ]

    The really interesting mass-statistics result in this context comes from looking at galaxies with different methods of estimating distances. Nearly 12,000 galaxies with z < 1.3 yielded consistent distances across all the methods, resulting in H_0 = 70 km s^-1 Mpc^-1.

    [ https://www.forbes.com/sites/startswithabang/2020/10/23/ask-ethan-could-measurement-inaccuracies-explain-our-cosmic-controversies/?sh=40a627be4f8d , https://iopscience.iop.org/article/10.3847/1538-3881/abafba ]

    And we should not forget the eBOSS galaxy survey’s 20-year results.

    "The inverse distance ladder measurement under this model yields H0 = 68.20 ± 0.81 km s^-1Mpc^-1, remaining in tension with several direct determination methods; the BAO data allow Hubble constant estimates that are robust against the assumption of the cosmological model. In addition, the BAO data allow estimates of H0 that are independent of the CMB data, with similar central values and precision under a ?CDM model."

    [ https://arxiv.org/abs/2007.08991 ]

    Or the first multimessenger binary neutron star merger results, which have now been extended but still rest on scant data. They prefer the CMB + BAO integrative measurements.

    "We performed a joint analysis of the gravitational-wave event GW170817 with its electromagnetic counterparts AT2017gfo and GRB170817A, and the gravitational-wave event GW190425, both originating from neutron-star mergers. We combined these with previous measurements of pulsars using x-ray and radio observations, and nuclear-theory computations using chiral effective field theory, to constrain the neutron-star equation of state. We found that the radius of a 1.4–solar mass neutron star is 11.75+0.86?0.81 km at 90% confidence and the Hubble constant is 66.2+4.4?4.2 at 1? uncertainty."

    [ https://science.sciencemag.org/content/370/6523/1450 ]

    Perhaps the most fascinating reconciliation would be, besides the cosmic ladder results sampling two supernova populations, if magnetic fields from before recombination were to shift the CMB results towards the median 70 km s^-1 Mpc^-1, which is also the result of the fit to nearby galaxies:

    "Here we show that accounting for the enhanced recombination rate due to additional small-scale inhomogeneities in the baryon density may solve both the H0 and the S8 ? ?m tensions. The additional baryon inhomogeneities can be induced by primordial magnetic fields present in the plasma prior to recombination. The required field strength to solve the Hubble tension is just what is needed to explain the existence of galactic, cluster, and extragalactic magnetic fields without relying on dynamo amplification."

    "Allowing for clumping using Model 1 makes the decisive difference, moving the best fit to H0 = 71.03 ± 0.74 km s?1 Mpc?1 … This means that Planck+H3 M1 is essentially as good a fit to CMB as the Planck ?CDM."

    [ https://www.quantamagazine.org/the-hidden-magnetic-universe-begins-to-come-into-view-20200702/ , https://arxiv.org/abs/2004.09487 ]

  2. This just in:

    “Now we’ve come up with an answer where Planck and ACT agree,” said Simone Aiola, a researcher at the Flatiron Institute’s Center for Computational Astrophysics and first author of one of two papers. “It speaks to the fact that these difficult measurements are reliable.”

    [ https://www.sciencedaily.com/releases/2021/01/210104131925.htm ]

    “ΛCDM is a good fit. The best-fit model has a reduced χ^2 of 1.07 (PTE=0.07) with H_0 = 67.9 ± 1.5 km/s/Mpc.”

    [ https://arxiv.org/pdf/2007.07289.pdf ]

  3. Short, but great article! Very useful in explaining cosmic distance calculations to students.
