For over thirty years, the Hubble Space Telescope has operated continuously in Low Earth Orbit (LEO), revealing never-before-seen aspects of the Universe. In addition to capturing breathtaking images of our Solar System and discovering extrasolar planets, Hubble has probed the deepest reaches of time and space, causing astrophysicists to revise many of their previously-held theories about the cosmos.
Unfortunately, Hubble may finally be reaching the end of its lifespan. In recent weeks, the telescope’s payload computer suddenly stopped working, causing Hubble and all of its scientific instruments to go into safe mode and shut down. After many days of tests and checks, technicians at the NASA Goddard Space Flight Center have yet to identify the root cause and bring Hubble back online.
Cosmologists have been struggling to understand an apparent tension in their measurements of the present-day expansion rate of the universe, known as the Hubble constant. Observations of the early cosmos – mostly the cosmic microwave background – point to a significantly lower Hubble constant than the value obtained from observations of the late universe, primarily from supernovae. A team of astronomers has dug into the data and found that one possible way to relieve this tension is to allow the Hubble constant, paradoxically, to evolve with time. This result could point to either new physics…or just a misunderstanding of the data.
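The size of this disagreement can be expressed in rough statistical terms. Below is a minimal sketch; the numbers (roughly 67.4 ± 0.5 km/s/Mpc from the CMB and 73.2 ± 1.3 km/s/Mpc from the supernova distance ladder) are illustrative published values, not figures taken from the study discussed here:

```python
import math

# Illustrative early- and late-universe values (assumptions,
# not from this article): CMB fit vs. supernova distance ladder.
H0_EARLY, SIG_EARLY = 67.4, 0.5   # km/s/Mpc
H0_LATE, SIG_LATE = 73.2, 1.3     # km/s/Mpc

def tension_sigma(a, sig_a, b, sig_b):
    """Gaussian tension between two independent measurements."""
    return abs(a - b) / math.hypot(sig_a, sig_b)

print(f"{tension_sigma(H0_EARLY, SIG_EARLY, H0_LATE, SIG_LATE):.1f} sigma")
```

With these inputs the two measurements disagree at roughly the 4-sigma level, far beyond what measurement error alone should allow.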
“The point is that there seems to be a tension between the larger values for late universe observations and lower values for early universe observations,” said Enrico Rinaldi, a research fellow in the University of Michigan Department of Physics and coauthor on the study. “The question we asked in this paper is: What if the Hubble constant is not constant? What if it actually changes?”
If you’ve been following developments in astronomy over the last few years, you may have heard about the so-called “crisis in cosmology,” which has astronomers wondering whether there might be something wrong with our current understanding of the Universe. This crisis revolves around the rate at which the Universe expands: measurements of the expansion rate in the present Universe don’t line up with measurements of the expansion rate during the early Universe. With no indication of why these measurements might disagree, astronomers are at a loss to explain the disparity.
The first step in solving this mystery is to try out new methods of measuring the expansion rate. In a paper published last week, researchers at University College London (UCL) suggested that we might be able to create a new, independent measure of the expansion rate of the Universe by observing black hole-neutron star collisions.
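The appeal of such collisions is that they act as “standard sirens”: the gravitational-wave signal yields the distance directly, while an electromagnetic counterpart supplies the redshift, so no distance ladder is needed. A minimal sketch of the first-order estimate, using a made-up merger for illustration (the numbers are not real detections):

```python
# Low-redshift standard-siren estimate: the gravitational-wave
# amplitude gives the luminosity distance directly, and an
# electromagnetic counterpart gives the redshift.
C_KM_S = 299792.458  # speed of light, km/s

def hubble_from_siren(z, d_mpc):
    """H0 ~ c*z/d, the first-order Hubble law, valid for z << 1."""
    return C_KM_S * z / d_mpc

# A hypothetical merger at z = 0.01 measured at d_L = 43 Mpc:
print(round(hubble_from_siren(0.01, 43.0), 1), "km/s/Mpc")
```

Each event gives only a rough value, but averaging over many detections gradually tightens the estimate independently of both the CMB and supernovae.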
In 2025, the Nancy Grace Roman Space Telescope will launch. Named in honor of NASA’s first chief astronomer (and the “Mother of Hubble”), the Roman telescope will be the most advanced and powerful observatory ever deployed. With a camera as sensitive as its predecessors, and next-generation surveying capabilities, Roman will have the power of “One-Hundred Hubbles.”
In order to meet its scientific objectives and explore some of the greatest mysteries of the cosmos, Roman will be fitted with a number of infrared filters. But with the decision to add a new near-infrared filter, Roman will exceed its original design and be able to explore 20% of the infrared Universe. This opens the door for exciting new research and discoveries, from the edge of the Solar System to the farthest reaches of space.
The oldest light in the universe is the cosmic microwave background (CMB). This light was released when the hot, dense matter of the early universe finally cooled enough to become transparent. It has traveled for billions of years to reach us, stretched from a bright orange glow into cool, invisible microwaves. Naturally, it is an excellent source for understanding the history and expansion of the cosmos.
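That stretching from glow to microwaves follows a simple scaling: the CMB temperature goes as (1 + z), so light from the era of last scattering has cooled by a factor of roughly a thousand. A quick sketch, using the standard measured values (today’s 2.725 K temperature and a recombination redshift of about 1090):

```python
T_CMB_NOW = 2.725       # K, CMB temperature measured today
Z_RECOMBINATION = 1090  # approximate redshift of last scattering

def cmb_temperature_at(z, t_now=T_CMB_NOW):
    """CMB temperature scales as (1 + z): hotter further back in time."""
    return t_now * (1 + z)

print(round(cmb_temperature_at(Z_RECOMBINATION)), "K")
```

Running the numbers backward gives roughly 3,000 K at recombination, the temperature of a dull orange glow, consistent with the description above.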
Measuring the expansion of the universe is hard. For one thing, because the universe is expanding, the scale of your distance measurements affects the measured scale of the expansion. And since light from distant galaxies takes time to reach us, you can’t measure what the universe is now, only what it was. Then there is the challenge of the cosmic distance ladder.
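The dependence on distance calibration is easy to see from the Hubble law itself, H0 = v/d: any systematic error in the distance ladder propagates directly into the expansion rate. A minimal sketch with made-up numbers (the galaxy and its distances are hypothetical):

```python
def hubble_constant(v_km_s, d_mpc):
    """H0 = v/d: recession velocity divided by distance."""
    return v_km_s / d_mpc

# A galaxy receding at 7,000 km/s, placed at 100 Mpc by the ladder:
h0 = hubble_constant(7000, 100)
# If the ladder is miscalibrated by 10% (110 Mpc instead of 100),
# H0 shifts by the same 10%:
h0_biased = hubble_constant(7000, 110)
print(h0, round(h0_biased, 1))
```

This is why so much effort goes into each rung of the ladder: a small calibration error at the bottom skews every measurement above it.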
Our universe is best described by the ΛCDM model. That is an expanding universe filled with dark energy (Lambda, Λ), and dense clumps of cold dark matter (CDM). It is also sprinkled with regular matter that makes up planets, stars, and us, but that only makes up about 4% of the cosmos. While we don’t know what dark matter and dark energy are, we know how they behave, so the ΛCDM model works exceptionally well. There’s just one small problem.
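In practice, “knowing how they behave” means the expansion history of a flat ΛCDM universe is fixed by just a few density fractions. A minimal sketch of that expansion rate as a function of redshift, using round assumed values (0.3 matter, 0.7 dark energy, H0 of 70 km/s/Mpc) rather than fitted ones:

```python
import math

# Flat LCDM expansion history. Density fractions and H0 below are
# illustrative round numbers, not fitted cosmological parameters.
OMEGA_M, OMEGA_L = 0.3, 0.7  # matter (mostly CDM) and dark energy
H0 = 70.0                    # km/s/Mpc

def hubble_rate(z):
    """H(z) = H0 * sqrt(Om*(1+z)^3 + OL) for a flat universe."""
    return H0 * math.sqrt(OMEGA_M * (1 + z)**3 + OMEGA_L)

print(round(hubble_rate(0), 1), round(hubble_rate(1), 1))
```

Matter density dilutes as (1 + z)^3 while dark energy stays constant, so the expansion was matter-dominated in the past and is dark-energy-dominated today, which is exactly why the expansion is now accelerating.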
Once again a new measurement of cosmic expansion is encouraging astronomers to reconsider the standard cosmological model. The trouble lies with the Hubble constant and dark energy. While we have a broad understanding of dark energy, pinning down the value of the Hubble constant has proven difficult, since different measurements keep yielding different results. Now a new study has been published which further complicates things.
In the standard model of cosmology, dark energy fills the universe. It causes the universe to expand at an ever-increasing rate, and it makes up more than 70% of the cosmos. But there’s a problem. When we measure the rate of cosmic expansion in different ways, we get results that disagree with each other.
In a recent post I wrote about a study that argued dark energy isn’t needed to explain the redshifts of distant supernovae. I also mentioned we shouldn’t rule out dark energy quite yet, because there are several independent measures of cosmic expansion that don’t require supernovae. Sure enough, a new study has measured cosmic expansion without all that mucking about with supernovae. The study confirms dark energy, but it also raises a few questions.