According to the most widely accepted cosmological theories, the Universe began roughly 13.8 billion years ago in a massive explosion known as the Big Bang. Ever since then, the Universe has been in a constant state of expansion, at a rate that astrophysicists characterize with the Hubble Constant. For decades, astronomers have attempted to measure this rate of expansion, traditionally in two ways. One consists of measuring expansion locally using variable stars and supernovae, while the other involves cosmological models and redshift measurements of the Cosmic Microwave Background (CMB).
Unfortunately, these two methods have produced different values over the past decade, giving rise to what is known as the Hubble Tension. To resolve this discrepancy, astronomers believe that some additional force (like “Early Dark Energy”) may have been present in the early Universe that we haven’t accounted for yet. According to a team of particle physicists, the Hubble Tension could be resolved by a “New Early Dark Energy” (NEDE) in the early Universe. This energy, they argue, would have undergone a phase transition as the Universe began to expand, then disappeared.
In 1916, Einstein finished his Theory of General Relativity, which describes how gravitational forces alter the curvature of spacetime. Among other things, this theory implied that the Universe could be expanding, which was confirmed by the observations of Edwin Hubble in 1929. Since then, astronomers have looked farther into space (and hence, back in time) to measure how fast the Universe is expanding – aka the Hubble Constant. These measurements have become increasingly accurate thanks to the discovery of the Cosmic Microwave Background (CMB) and observatories like the Hubble Space Telescope.
Astronomers have traditionally done this in two ways: directly measuring it locally (using variable stars and supernovae) and indirectly based on redshift measurements of the CMB and cosmological models. Unfortunately, these two methods have produced different values over the past decade. As a result, astronomers have been looking for a possible solution to this problem, known as the “Hubble Tension.” According to a new paper by a team of astrophysicists, the existence of “Early Dark Energy” may be the solution cosmologists have been looking for.
According to our current cosmological models, the Universe began with a Big Bang roughly 13.8 billion years ago. During the earliest periods, the Universe was permeated by an opaque cloud of hot plasma that prevented atoms from forming. About 380,000 years later, the Universe had cooled enough for neutral atoms to form, allowing light to travel freely for the first time. This afterglow is now visible to astronomers as the Cosmic Microwave Background (CMB), first observed during the 1960s.
One peculiar characteristic of the CMB that attracted a lot of attention was its tiny fluctuations in temperature, which could provide information about the early Universe. In particular, there is a rather large spot in the CMB that is cooler than the surrounding afterglow, known as the CMB Cold Spot. After decades of studying these temperature fluctuations, a team of scientists recently confirmed that one of the largest known structures in the Universe – the Eridanus Supervoid – might be the explanation for the CMB Cold Spot that astronomers have been looking for!
Since the “Golden Age of General Relativity” in the 1960s, scientists have held that much of the Universe consists of a mysterious invisible mass known as “Dark Matter”. Scientists have since attempted to resolve this mystery with a double-pronged approach. On the one hand, astrophysicists have attempted to find a candidate particle that could account for this mass.
On the other, astrophysicists have tried to find a theoretical basis that could explain Dark Matter’s behavior. So far, the debate has centered on the question of whether it is “hot” or “cold”, with cold enjoying an edge because of its relative simplicity. However, a new study led by the Harvard-Smithsonian Center for Astrophysics (CfA) revisits the idea that Dark Matter might actually be “warm”.
Back in 2013, the European Space Agency released its first analysis of the data gathered by the Planck observatory. Between 2009 and 2013, this spacecraft observed the remnants of the radiation that filled the Universe immediately after the Big Bang – the Cosmic Microwave Background (CMB) – with the highest sensitivity of any mission to date and in multiple wavelengths.
In addition to largely confirming current theories on how the Universe evolved, Planck’s first map also revealed a number of temperature anomalies – like the CMB “Cold Spot” – that are difficult to explain. Unfortunately, with the latest analysis of the mission data, the Planck Collaboration team has found no new evidence for these anomalies, which means that astrophysicists are still short of an explanation.
According to the Big Bang cosmological model, our Universe began 13.8 billion years ago when all the matter and energy in the cosmos began expanding. This period of “cosmic inflation” is believed to be what accounts for the large-scale structure of the Universe and why space and the Cosmic Microwave Background (CMB) appear to be largely uniform in all directions.
However, to date, no evidence has been discovered that can definitively prove the cosmic inflation scenario or rule out alternative theories. But thanks to a new study by a team of astronomers from Harvard University and the Harvard-Smithsonian Center for Astrophysics (CfA), scientists may have a new means of testing one of the key parts of the Big Bang cosmological model.
For thousands of years, human beings have been contemplating the Universe and seeking to determine its true extent. And whereas ancient philosophers believed that the world consisted of a disk, a ziggurat or a cube surrounded by celestial oceans or some kind of ether, the development of modern astronomy opened their eyes to new frontiers. By the 20th century, scientists began to understand just how vast (and maybe even unending) the Universe really is.
And in the course of looking farther out into space, and deeper back in time, cosmologists have discovered some truly amazing things. For example, during the 1960s, astronomers became aware of microwave background radiation that was detectable in all directions. Known as the Cosmic Microwave Background (CMB), the existence of this radiation has helped to inform our understanding of how the Universe began.
For decades, scientists have theorized that beyond the edge of the Solar System, at a distance of up to 50,000 AU (0.79 ly) from the Sun, there lies a massive cloud of icy planetesimals known as the Oort Cloud. Named in honor of Dutch astronomer Jan Oort, this cloud is believed to be where long-period comets originate from. However, to date, no direct evidence has been provided to confirm the Oort Cloud’s existence.
This is due to the fact that the Oort Cloud is very difficult to observe, being rather far from the Sun and dispersed over a very large region of space. However, in a recent study, a team of astrophysicists from the University of Pennsylvania proposed a radical idea. Using maps of the Cosmic Microwave Background (CMB) created by the Planck mission and other telescopes, they believe that Oort Clouds around other stars can be detected.
The study – “Probing Oort clouds around Milky Way stars with CMB surveys“, which recently appeared online – was led by Eric J. Baxter, a postdoctoral researcher from the Department of Physics and Astronomy at the University of Pennsylvania. He was joined by University of Pennsylvania professors Cullen H. Blake and Bhuvnesh Jain (Baxter’s primary mentor).
To recap, the Oort Cloud is a hypothetical region of space that is thought to extend from between 2,000 and 5,000 AU (0.03 and 0.08 ly) to as far as 50,000 AU (0.79 ly) from the Sun – though some estimates indicate it could reach as far as 100,000 to 200,000 AU (1.58 and 3.16 ly). Like the Kuiper Belt and the Scattered Disc, the Oort Cloud is a reservoir of trans-Neptunian objects, though it is over a thousand times more distant from the Sun than these other two.
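The unit conversions above are easy to verify. A quick sketch, using the standard value of roughly 63,241 AU per light-year:

```python
# Convert Oort Cloud distance estimates from AU to light-years.
AU_PER_LY = 63_241.08  # astronomical units in one light-year

def au_to_ly(au):
    """Convert a distance in astronomical units to light-years."""
    return au / AU_PER_LY

# The estimates quoted in the text:
for au in (2_000, 5_000, 50_000, 100_000, 200_000):
    print(f"{au:>7,} AU = {au_to_ly(au):.2f} ly")
# 50,000 AU comes out to 0.79 ly, matching the figure above.
```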
This cloud is believed to have originated from a population of small, icy bodies within 50 AU of the Sun that were present when the Solar System was still young. Over time, it is theorized that orbital perturbations by the giant planets caused those objects with relatively stable orbits to form the Kuiper Belt along the ecliptic plane, while those with more eccentric and distant orbits formed the Oort Cloud.
According to Baxter and his colleagues, because the existence of the Oort Cloud played an important role in the formation of the Solar System, it is therefore logical to assume that other star systems have their own Oort Clouds – which they refer to as exo-Oort Clouds (EXOCs). As Dr. Baxter explained to Universe Today via email:
“One of the proposed mechanisms for the formation of the Oort cloud around our sun is that some of the objects in the protoplanetary disk of our solar system were ejected into very large, elliptical orbits by interactions with the giant planets. The orbits of these objects were then affected by nearby stars and galactic tides, causing them to depart from orbits restricted to the plane of the solar system, and to form the now-spherical Oort cloud. You could imagine that a similar process could occur around another star with giant planets, and we know that there are many stars out there that do have giant planets.”
As Baxter and his colleagues indicated in their study, detecting EXOCs is difficult, largely for the same reasons that there is no direct evidence of the Solar System’s own Oort Cloud. For one, there is not a lot of material in the cloud, with estimates ranging from a few to twenty times the mass of the Earth. Second, these objects are very far away from our Sun, which means they do not reflect much light or have strong thermal emissions.
For this reason, Baxter and his team recommended using maps of the sky at the millimeter and submillimeter wavelengths to search for signs of Oort Clouds around other stars. Such maps already exist, thanks to missions like the Planck telescope which have mapped the Cosmic Microwave Background (CMB). As Baxter indicated:
“In our paper, we use maps of the sky at 545 GHz and 857 GHz that were generated from observations by the Planck satellite. Planck was pretty much designed *only* to map the CMB; the fact that we can use this telescope to study exo-Oort clouds and potentially processes connected to planet formation is pretty surprising!”
This is a rather revolutionary idea, as the detection of EXOCs was not part of the intended purpose of the Planck mission. By mapping the CMB, which is “relic radiation” left over from the Big Bang, astronomers have sought to learn more about how the Universe has evolved since the early Universe – roughly 378,000 years after the Big Bang. However, their study does build on previous work led by Alan Stern (the principal investigator of the New Horizons mission).
In 1991, along with John Stocke (of the University of Colorado, Boulder) and Paul Weissmann (from NASA’s Jet Propulsion Laboratory), Stern conducted a study titled “An IRAS search for extra-solar Oort clouds“. In this study, they suggested using data from the Infrared Astronomical Satellite (IRAS) for the purpose of searching for EXOCs. However, whereas this study focused on certain wavelengths and 17 star systems, Baxter and his team relied on data for tens of thousands of systems and at a wider range of wavelengths.
“Furthermore, the Gaia satellite has recently mapped out very accurately the positions and distances of stars in our galaxy,” Baxter added. “This makes choosing targets for exo-Oort cloud searches relatively straightforward. We used a combination of Gaia and Planck data in our analysis.”
To test their theory, Baxter and his team constructed a series of models for the thermal emission of exo-Oort clouds. “These models suggested that detecting exo-Oort clouds around nearby stars (or at least putting limits on their properties) was feasible given existing telescopes and observations,” he said. “In particular, the models suggested that data from the Planck satellite could potentially come close to detecting an exo-Oort cloud like our own around a nearby star.”
In addition, Baxter and his team also detected a hint of a signal around some of the stars that they considered in their study – specifically in the Vega and Fomalhaut systems. Using this data, they were able to place constraints on the possible existence of EXOCs at a distance of 10,000 to 100,000 AU from these stars, which roughly coincides with the distance between our Sun and the Oort Cloud.
However, additional surveys will be needed before the existence of any EXOCs can be confirmed. These surveys will likely involve the James Webb Space Telescope, which is scheduled to launch in 2021. In the meantime, this study has some rather significant implications for astronomers, and not just because it involves the use of existing CMB maps for extra-solar studies. As Baxter put it:
“Just detecting an exo-Oort cloud would be really interesting, since as I mentioned above, we don’t have any direct evidence for the existence of our own Oort cloud. If you did get a detection of an exo-Oort cloud, it could in principle provide insights into processes connected to planet formation and the evolution of protoplanetary disks. For instance, imagine that we only detected exo-Oort clouds around stars that have giant planets. That would provide pretty convincing evidence that the formation of an Oort cloud is connected to giant planets, as suggested by popular theories of the formation of our own Oort cloud.”
As our knowledge of the Universe expands, scientists become increasingly interested in what our Solar System has in common with other star systems. This, in turn, helps us to learn more about the formation and evolution of our own system. It also provides possible hints as to how the Universe changed over time, and maybe even where life could be found someday.
In the 1920s, Edwin Hubble made the groundbreaking discovery that the Universe was in a state of expansion. Originally predicted as a consequence of Einstein’s Theory of General Relativity, the rate of this expansion came to be known as the Hubble Constant. Today, and with the help of next-generation telescopes – like the aptly-named Hubble Space Telescope (HST) – astronomers have remeasured and revised this value many times.
These measurements confirmed that the rate of expansion has increased over time, though scientists are still unsure why. The latest measurements were conducted by an international team using Hubble, who then compared their results with data obtained by the European Space Agency’s (ESA) Gaia observatory. This has led to the most precise measurements of the Hubble Constant to date, though questions about cosmic acceleration remain.
Since 2005, Adam Riess – a Nobel Laureate Professor with the Space Telescope Science Institute and the Johns Hopkins University – has been working to refine the Hubble Constant value by streamlining and strengthening the “cosmic distance ladder”. Along with his team, known as Supernova H0 for the Equation of State (SH0ES), they have successfully reduced the uncertainty associated with the rate of cosmic expansion to just 2.2%.
To break it down, astronomers have traditionally used the “cosmic distance ladder” to measure distances in the Universe. This consists of relying on distance markers like Cepheid variables in distant galaxies – pulsating stars whose distances can be inferred by comparing their intrinsic brightness with their apparent brightness. These measurements are then compared to the way light from distant galaxies is redshifted to determine how fast the space between galaxies is expanding.
From this, the Hubble Constant is derived. Another method that is used is to observe the Cosmic Microwave Background (CMB) to trace the expansion of the cosmos during the early Universe – roughly 378,000 years after the Big Bang – and then use physics to extrapolate that to the present expansion rate. Together, the measurements should provide an end-to-end measurement of how the Universe has expanded over time.
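The logic of the ladder’s final rung can be sketched numerically. A Cepheid’s distance follows from the distance modulus, m − M = 5 log10(d / 10 pc), and the Hubble Constant from a galaxy’s recession velocity divided by its distance. The numbers below are purely illustrative, not values from the study:

```python
import math

def distance_from_modulus(m_apparent, M_absolute):
    """Distance in parsecs from the distance modulus m - M = 5*log10(d / 10 pc)."""
    return 10 ** ((m_apparent - M_absolute + 5) / 5)

def hubble_constant(recession_velocity_km_s, distance_mpc):
    """H0 in km/s per megaparsec, from Hubble's law v = H0 * d."""
    return recession_velocity_km_s / distance_mpc

# Illustrative: a Cepheid of absolute magnitude -5 observed at
# apparent magnitude 25 sits at 10^7 pc = 10 Mpc.
print(distance_from_modulus(25, -5))   # 1e7 parsecs

# Illustrative: a galaxy receding at 1,460 km/s at a ladder-calibrated
# distance of 20 Mpc yields H0 = 73 km/s/Mpc.
print(hubble_constant(1460, 20))       # 73.0
```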
However, astronomers have known for some time that the two measurements don’t match up. In a previous study, Riess and his team conducted measurements using Hubble to obtain a Hubble Constant value of 73 km/s (45.36 mps) per megaparsec (3.3 million light-years). Meanwhile, results based on ESA’s Planck observatory (which observed the CMB between 2009 and 2013) predicted that the Hubble constant value should now be 67 km/s (41.63 mps) per megaparsec and no higher than 69 km/s (42.87 mps) – which represents a discrepancy of 9%.
“The tension seems to have grown into a full-blown incompatibility between our views of the early and late time universe. At this point, clearly it’s not simply some gross error in any one measurement. It’s as though you predicted how tall a child would become from a growth chart and then found the adult he or she became greatly exceeded the prediction. We are very perplexed.”
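The 9% figure is simple arithmetic on the two published values:

```python
# Hubble Tension: fractional gap between the local (Cepheid/supernova)
# and early-Universe (Planck/CMB) values of the Hubble Constant.
h0_local = 73.0  # km/s/Mpc, from the distance-ladder measurements
h0_cmb = 67.0    # km/s/Mpc, extrapolated from the CMB

discrepancy = (h0_local - h0_cmb) / h0_cmb * 100
print(f"{discrepancy:.0f}%")  # → 9%
```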
In this case, Riess and his colleagues used Hubble to gauge the brightness of distant Cepheid variables while Gaia provided the parallax information – the apparent change in an object’s position based on different points of view – needed to determine the distance. Gaia also added to the study by measuring the distance to 50 Cepheid variables in the Milky Way, which were combined with brightness measurements from Hubble.
This allowed the astronomers to more accurately calibrate the Cepheids and then use those seen outside the Milky Way as milepost markers. Using both the Hubble measurements and newly released data from Gaia, Riess and his colleagues were able to refine their measurements on the present rate of expansion to 73.5 kilometers (45.6 miles) per second per megaparsec.
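Parallax-based distances like Gaia’s rest on a simple relation: a star’s distance in parsecs is the reciprocal of its parallax angle in arcseconds. A minimal sketch with illustrative numbers (not actual mission data):

```python
def distance_pc(parallax_arcsec):
    """Distance in parsecs from a parallax angle in arcseconds: d = 1 / p."""
    return 1.0 / parallax_arcsec

# Illustrative: a Cepheid with a 2-milliarcsecond parallax lies at 500 pc.
print(distance_pc(0.002))  # 500.0
```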
As Stefano Casertano, of the Space Telescope Science Institute and a member of the SH0ES team, added:
“Hubble is really amazing as a general-purpose observatory, but Gaia is the new gold standard for calibrating distance. It is purpose-built for measuring parallax—this is what it was designed to do. Gaia brings a new ability to recalibrate all past distance measures, and it seems to confirm our previous work. We get the same answer for the Hubble constant if we replace all previous calibrations of the distance ladder with just the Gaia parallaxes. It’s a crosscheck between two very powerful and precise observatories.”
Looking to the future, Riess and his team hope to continue to work with Gaia so they can reduce the uncertainty associated with the value of the Hubble Constant to just 1% by the early 2020s. In the meantime, the discrepancy between modern rates of expansion and those based on the CMB will continue to be a puzzle to astronomers.
In the end, this may be an indication that other physics are at work in our Universe, that dark matter interacts with normal matter in a way that is different than what scientists suspect, or that dark energy could be even more exotic than previously thought. Whatever the cause, it is clear the Universe still has some surprises in store for us!
For decades, the predominant cosmological model used by scientists has been based on the theory that in addition to baryonic matter – aka “normal” or “luminous” matter, which we can see – the Universe also contains a substantial amount of invisible mass. This “Dark Matter” accounts for roughly 26.8% of the mass of the Universe, whereas normal matter accounts for just 4.9%.
While the search for Dark Matter is ongoing and direct evidence is yet to be found, scientists have also been aware that roughly 90% of the Universe’s normal matter has remained undetected. According to two new studies that were recently published, much of this normal matter – which consists of filaments of hot, diffuse gas that link galaxies together – may have finally been found.
Based on cosmological simulations, the predominant theory has been that the previously-undetected normal matter of the Universe consists of strands of baryonic matter – i.e. protons, neutrons and electrons – floating between galaxies. These regions are what is known as the “Cosmic Web”, where low-density gas exists at temperatures of 10^5 to 10^7 K (100,000 to 10 million K).
For the sake of their studies, both teams consulted data from the Planck Collaboration, a venture maintained by the European Space Agency (ESA) that includes all those who contributed to the Planck mission. This data was released in 2015, when it was used to create a thermal map of the Universe by measuring the influence of the Sunyaev-Zeldovich (SZ) effect.
This effect refers to a spectral distortion in the Cosmic Microwave Background, where photons are scattered by ionized gas in galaxies and larger structures. During its mission to study the cosmos, the Planck satellite measured the spectral distortion of CMB photons with great sensitivity, and the resulting thermal map has since been used to chart the large-scale structure of the Universe.
However, the filaments between galaxies appeared too faint for scientists to examine at the time. To remedy this, the two teams consulted data from the North and South CMASS galaxy catalogues, which were produced from the 12th data release of the Sloan Digital Sky Survey (SDSS). From this data set, they then selected pairs of galaxies and focused on the space between them.
They then stacked the thermal data obtained by Planck for these areas on top of each other in order to strengthen the signals caused by the SZ effect between galaxies. As Dr. Hideki Tanimura told Universe Today via email:
“The SDSS galaxy survey gives a shape of the large-scale structure of the Universe. The Planck observation provides an all-sky map of gas pressure with a better sensitivity. We combine these data to probe the low-dense gas in the cosmic web.”
While Tanimura and his team stacked data from 260,000 galaxy pairs, de Graaff and her team stacked data from over a million. In the end, the two teams came up with strong evidence of gas filaments, though their measurements differed somewhat. Whereas Tanimura’s team found that the density of these filaments was around three times the average density in the surrounding void, de Graaf and her team found that they were six times the average density.
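The stacking approach both teams relied on can be sketched in a few lines: cutouts of the thermal map around many galaxy pairs are averaged, so a faint common signal grows relative to uncorrelated noise (which averages down as 1/√N). Below is a toy version on synthetic data, not the teams’ actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

def stack_cutouts(sky_map, centers, half_size):
    """Average square cutouts of a 2D map around a list of (row, col) centers."""
    stacked = np.zeros((2 * half_size, 2 * half_size))
    for r, c in centers:
        stacked += sky_map[r - half_size:r + half_size, c - half_size:c + half_size]
    return stacked / len(centers)

# Toy sky: unit-variance noise, plus a weak common signal at 1,000 sites.
sky = rng.normal(0.0, 1.0, (1000, 1000))
centers = [(rng.integers(20, 980), rng.integers(20, 980)) for _ in range(1000)]
for r, c in centers:
    sky[r - 2:r + 2, c - 2:c + 2] += 0.3  # signal well below the noise level

stacked = stack_cutouts(sky, centers, half_size=10)
# After stacking, the buried signal emerges at the center of the average cutout.
print(stacked[8:12, 8:12].mean())  # close to 0.3, versus near 0 away from center
```

The same principle lets a statistically significant filament signal be extracted from maps in which no individual filament is visible.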
“We detect the low-dense gas in the cosmic web statistically by a stacking method,” said Tanimura. “The other team uses almost the same method. Our results are very similar. The main difference is that we are probing a nearby Universe, on the other hand, they are probing a relatively farther Universe.”
This particular aspect is particularly interesting, in that it hints that over time, baryonic matter in the Cosmic Web has become less dense. Between these two results, the studies accounted for between 15 and 30% of the total baryonic content of the Universe. While that means a significant amount of the Universe’s baryonic matter still remains to be found, it is nevertheless an impressive find.
As Tanimura explained, their results not only support the current cosmological model of the Universe (the Lambda CDM model) but also go beyond it:
“The detail in our universe is still a mystery. Our results shed light on it and reveals a more precise picture of the Universe. When people went out to the ocean and started making a map of our world, it was not used for most of the people then, but we use the world map now to travel abroad. In the same way, a map of the entire universe may not be valuable now because we do not have a technology to go far out to the space. However, it could be valuable 500 years later. We are in the first stage of making a map of the entire Universe.”
It also opens up opportunities for future studies of the Comsic Web, which will no doubt benefit from the deployment of next-generation instruments like James Webb Telescope, the Atacama Cosmology Telescope and the Q/U Imaging ExperimenT (QUIET). With any luck, they will be able to spot the remaining missing matter. Then, perhaps we can finally zero in on all the invisible mass!