Japanese astronomers have captured images of an astonishing 1,800 supernovae. Fifty-eight of these are the scientifically important Type Ia supernovae, located 8 billion light-years away. Type Ia supernovae are known as ‘standard candles’ in astronomy.
In the 1920s, Edwin Hubble made the groundbreaking discovery that the Universe is in a state of expansion. The rate of this expansion – originally predicted as a consequence of Einstein’s Theory of General Relativity – came to be known as the Hubble Constant. Today, with the help of next-generation telescopes – like the aptly-named Hubble Space Telescope (HST) – astronomers have remeasured and revised this value many times.
These measurements confirmed that the rate of expansion has increased over time, though scientists are still unsure why. The latest measurements were conducted by an international team using Hubble, who then compared their results with data obtained by the European Space Agency’s (ESA) Gaia observatory. This has led to the most precise measurements of the Hubble Constant to date, though questions about cosmic acceleration remain.
The study which describes their findings appeared in the July 12th issue of the Astrophysical Journal, titled “Milky Way Cepheid Standards for Measuring Cosmic Distances and Application to Gaia DR2: Implications for the Hubble Constant.” The team behind the study included members from the Space Telescope Science Institute (STScI), the Johns Hopkins University, the National Institute for Astrophysics (INAF), UC Berkeley, Texas A&M University, and the European Southern Observatory (ESO).
Since 2005, Adam Riess – a Nobel Laureate Professor with the Space Telescope Science Institute and the Johns Hopkins University – has been working to refine the Hubble Constant value by streamlining and strengthening the “cosmic distance ladder”. He and his team, known as Supernova H0 for the Equation of State (SH0ES), have successfully reduced the uncertainty associated with the rate of cosmic expansion to just 2.2%.
To break it down, astronomers have traditionally used the “cosmic distance ladder” to measure distances in the Universe. This consists of relying on distance markers like Cepheid variables in distant galaxies – pulsating stars whose distances can be inferred by comparing their intrinsic brightness with their apparent brightness. These measurements are then compared to the way light from distant galaxies is redshifted to determine how fast the space between galaxies is expanding.
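That “intrinsic vs. apparent brightness” comparison boils down to the distance modulus relation. Here is a minimal sketch in Python – the magnitudes below are hypothetical illustrative values, not numbers from the study:

```python
def cepheid_distance_pc(apparent_mag, absolute_mag):
    """Distance in parsecs from the distance modulus: m - M = 5*log10(d) - 5."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# A hypothetical Cepheid: intrinsic brightness M = -4.0 (from its pulsation
# period), observed brightness m = 21.0 in a distant galaxy.
d_pc = cepheid_distance_pc(21.0, -4.0)
print(d_pc)  # 1,000,000 pc, i.e. 1 megaparsec
```

The whole point of the ladder is that the period of a Cepheid’s pulsation reveals its intrinsic brightness, so the only unknown in the formula above is the distance.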
From this, the Hubble Constant is derived. Another method is to observe the Cosmic Microwave Background (CMB) to trace the expansion of the cosmos during the early Universe – circa 378,000 years after the Big Bang – and then use physics to extrapolate that to the present expansion rate. Together, the measurements should provide an end-to-end picture of how the Universe has expanded over time.
However, astronomers have known for some time that the two measurements don’t match up. In a previous study, Riess and his team conducted measurements using Hubble to obtain a Hubble Constant value of 73 km/s (45.36 mi/s) per megaparsec (3.3 million light-years). Meanwhile, results based on the ESA’s Planck observatory (which observed the CMB between 2009 and 2013) predicted that the Hubble Constant value should now be 67 km/s (41.63 mi/s) per megaparsec and no higher than 69 km/s (42.87 mi/s) – which represents a discrepancy of 9%.
As Riess indicated in a recent NASA press release:
“The tension seems to have grown into a full-blown incompatibility between our views of the early and late time universe. At this point, clearly it’s not simply some gross error in any one measurement. It’s as though you predicted how tall a child would become from a growth chart and then found the adult he or she became greatly exceeded the prediction. We are very perplexed.”
In this case, Riess and his colleagues used Hubble to gauge the brightness of distant Cepheid variables while Gaia provided the parallax information – the apparent change in an object’s position based on different points of view – needed to determine the distance. Gaia also added to the study by measuring the distance to 50 Cepheid variables in the Milky Way, which were combined with brightness measurements from Hubble.
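The parallax step itself is the simplest rung of the ladder: a star’s distance in parsecs is just the reciprocal of its parallax angle in arcseconds. A quick sketch (the example parallax is illustrative, not a Gaia measurement):

```python
def parallax_distance_pc(parallax_arcsec):
    """Distance in parsecs from trigonometric parallax: d = 1 / p,
    with the parallax angle p measured in arcseconds."""
    return 1.0 / parallax_arcsec

# A Milky Way Cepheid a couple of kiloparsecs away shows a parallax of
# only half a milliarcsecond -- the level of precision Gaia was built for.
print(parallax_distance_pc(0.5e-3))  # 2000 pc
```

The tininess of these angles is why a purpose-built astrometry mission like Gaia matters: at thousands of parsecs, the parallax shift is smaller than a thousandth of an arcsecond.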
This allowed the astronomers to more accurately calibrate the Cepheids and then use those seen outside the Milky Way as milepost markers. Using both the Hubble measurements and newly released data from Gaia, Riess and his colleagues were able to refine their measurements on the present rate of expansion to 73.5 kilometers (45.6 miles) per second per megaparsec.
As Stefano Casertano, of the Space Telescope Science Institute and a member of the SH0ES team, added:
“Hubble is really amazing as a general-purpose observatory, but Gaia is the new gold standard for calibrating distance. It is purpose-built for measuring parallax—this is what it was designed to do. Gaia brings a new ability to recalibrate all past distance measures, and it seems to confirm our previous work. We get the same answer for the Hubble constant if we replace all previous calibrations of the distance ladder with just the Gaia parallaxes. It’s a crosscheck between two very powerful and precise observatories.”
Looking to the future, Riess and his team hope to continue to work with Gaia so they can reduce the uncertainty associated with the value of the Hubble Constant to just 1% by the early 2020s. In the meantime, the discrepancy between modern rates of expansion and those based on the CMB will continue to be a puzzle to astronomers.
In the end, this may be an indication that other physics are at work in our Universe, that dark matter interacts with normal matter in a way that is different than what scientists suspect, or that dark energy could be even more exotic than previously thought. Whatever the cause, it is clear the Universe still has some surprises in store for us!
Further Reading: NASA
Since the late 1920s, astronomers have been aware of the fact that the Universe is in a state of expansion. Initially predicted by Einstein’s Theory of General Relativity, this realization has gone on to inform the most widely-accepted cosmological model – the Big Bang Theory. However, things became somewhat confusing during the 1990s, when improved observations showed that the Universe’s rate of expansion has been accelerating for billions of years.
This led to the theory of Dark Energy, a mysterious invisible force that is driving the expansion of the cosmos. As with Dark Matter, which was invoked to explain the “missing mass”, it then became necessary to find this elusive energy, or at least provide a coherent theoretical framework for it. A new study from the University of British Columbia (UBC) seeks to do just that by postulating that the Universe is expanding due to fluctuations in space and time.
The study – which was recently published in the journal Physical Review D – was led by Qingdi Wang, a PhD student with the Department of Physics and Astronomy at UBC. Under the supervision of UBC Professor William Unruh (the man who proposed the Unruh Effect) and with assistance from Zhen Zhu (another PhD student at UBC), they provide a new take on Dark Energy.
The team began by addressing the inconsistencies arising out of the two main theories that together explain all natural phenomena in the Universe. These theories are none other than General Relativity and quantum mechanics, which effectively explain how the Universe behaves on the largest of scales (i.e. stars, galaxies, clusters) and the smallest (subatomic particles).
Unfortunately, these two theories are not consistent when it comes to a little matter known as gravity, which scientists are still unable to explain in terms of quantum mechanics. The existence of Dark Energy and the expansion of the Universe are another point of disagreement. For starters, candidate theories like vacuum energy – which is one of the most popular explanations for Dark Energy – present serious incongruities.
According to quantum mechanics, vacuum energy would have an incredibly large energy density to it. But if this is true, then General Relativity predicts that this energy would have an incredibly strong gravitational effect, one which would be powerful enough to cause the Universe to explode in size. As Prof. Unruh shared with Universe Today via email:
“The problem is that any naive calculation of the vacuum energy gives huge values. If one assumes that there is some sort of cutoff so one cannot get energy densities much greater than the Planck energy density (or about 10⁹⁵ Joules/meter³) then one finds that one gets a Hubble constant – the time scale on which the Universe roughly doubles in size – of the order of 10⁻⁴⁴ sec. So, the usual approach is to say that somehow something reduces that down so that one gets the actual expansion rate of about 10 billion years instead. But that ‘somehow’ is pretty mysterious and no one has come up with an even half convincing mechanism.”
Whereas other scientists have sought to modify the theories of General Relativity and quantum mechanics in order to resolve these inconsistencies, Wang and his colleagues sought a different approach. As Wang explained to Universe Today via email:
“Previous studies are either trying to modify quantum mechanics in some way to make vacuum energy small or trying to modify General Relativity in some way to make gravity numb for vacuum energy. However, quantum mechanics and General Relativity are the two most successful theories that explain how our Universe works… Instead of trying to modify quantum mechanics or General Relativity, we believe that we should first understand them better. We take the large vacuum energy density predicted by quantum mechanics seriously and just let it gravitate according to General Relativity without modifying either of them.”
For the sake of their study, Wang and his colleagues performed new sets of calculations on vacuum energy that took its predicted high energy density into account. They then considered the possibility that on the tiniest of scales – billions of times smaller than electrons – the fabric of spacetime is subject to wild fluctuations, oscillating at every point between expansion and contraction.
As it swings back and forth, the result of these oscillations is a net effect where the Universe expands slowly, but at an accelerating rate. After performing their calculations, they noted that such an explanation was consistent with both the existence of quantum vacuum energy density and General Relativity. On top of that, it is also consistent with what scientists have been observing in our Universe for almost a century. As Unruh described it:
“Our calculations showed that one could consistently regard [that] the Universe on the tiniest scales is actually expanding and contracting at an absurdly fast rate; but that on a large scale, because of an averaging over those tiny scales, physics would not notice that ‘quantum foam’. It has a tiny residual effect in giving an effective cosmological constant (dark energy type effect). In some ways it is like waves on the ocean which travel as if the ocean were perfectly smooth but really we know that there is this incredible dance of the atoms that make up the water, and waves average over those fluctuations, and act as if the surface was smooth.”
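The “waves on the ocean” picture can be caricatured numerically: an enormous, rapid oscillation averages to nearly zero over many cycles, leaving only a tiny residual drift. The toy model below is our own illustration of that averaging idea, not the paper’s actual calculation, and all the numbers in it are arbitrary:

```python
import math

# Toy expansion rate: a huge fast oscillation plus a tiny residual drift.
def rate(t):
    return 1e6 * math.sin(1e3 * t) + 1e-3  # oscillation dwarfs the drift

# Average the rate over an exact whole number of oscillation periods
# using a simple Riemann sum.
n, T = 100_000, 2 * math.pi
avg = sum(rate(i * T / n) for i in range(n)) / n
print(avg)  # ~1e-3: the giant oscillation cancels; the tiny drift survives
```

Despite the oscillating term being nine orders of magnitude larger than the drift, the average recovers the small positive residual – the analogue of a tiny effective cosmological constant emerging from violent small-scale fluctuations.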
In contrast to theories in which the various forces governing the Universe must precisely cancel each other out, Wang and his colleagues present a picture where the Universe is constantly in motion. In this scenario, the effects of vacuum energy are largely self-cancelling, and the residue gives rise to the expansion and acceleration we have been observing all this time.
While it may be too soon to tell, this image of a Universe that is highly-dynamic (even on the tiniest scales) could revolutionize our understanding of spacetime. At the very least, these theoretical findings are sure to stimulate debate within the scientific community, as well as experiments designed to offer direct evidence. And that, as we know, is the only way we can advance our understanding of this thing known as the Universe.
Clearly I need to learn to be more specific when I write these articles. Every time I open my mouth, I need to prepare for the collective imagination of the viewers.
We did a whole article about the biggest things in the Universe, and identified superclusters of galaxies as the best candidate. Well, the parts of superclusters that are actually gravitationally bound tightly enough to eventually merge together in the future. But you had other ideas, including dark energy, or the Universe itself as the biggest thing. Even love? Aww.
One intriguing suggestion, though, is the idea of the vast cosmic voids between galaxies. Hmm, is the absence of something a thing? Whoa, time to go to art school and talk about negative space.
Ah well, who cares? It’s a super interesting topic, so let’s go ahead and talk about voids.
When most people imagine the expansion of the Universe after the Big Bang, they probably envision an equally spaced smattering of galaxies zipping away from one another. And that’s pretty accurate at the smallest scales.
But at the largest scales, like when you can see billions of light-years in a cube that fits on your computer screen, then a larger structure starts to take shape.
It looks less like an explosion, and more like a tasty tasty sponge cake, with huge filaments, walls, and the vast gaps in between. The gaps, the voids, the supervoids, are the point of today’s article, but to understand the gaps, we’ve got to understand why the Universe is clumped up the way it is.
Run the Universe clock backwards, all the way to the beginning, to a fraction of a second after the Big Bang, when the entire cosmos was compressed down into a tiny region of superheated plasma.
Although it was mostly uniform in density, there were slight variations – quantum fluctuations in spacetime itself. And as the Universe expanded, those differences were magnified. What started out as tiny differences in the density of matter at the smallest scale, turned into regions of higher and lower density of matter in the Universe.
Here we are, 13.8 billion years after the Big Bang, and we can see how the microscopic variations at the beginning of time were magnified to the largest scales. Instead of individual galaxies, we see huge walls containing thousands of galaxies; filaments of galaxies connect in nodes. These structures are huge; hundreds of millions of light-years across, containing thousands of galaxies. But the gaps, the voids, between these clusters can be even larger.
Astronomers first started thinking about these voids back in the 1970s, when the first large-scale surveys of the Universe were made. By measuring the redshift of galaxies, and determining how fast they were speeding away from us, astronomers started to realize that the distribution of galaxies wasn’t even.
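The survey logic sketched above can be boiled down to two steps: redshift gives a recession velocity, and Hubble’s law turns that velocity into a distance. A rough Python sketch, valid only for small redshifts (real surveys use the full relativistic treatment), with an illustrative round value for the Hubble constant:

```python
C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # Hubble constant, km/s/Mpc (round illustrative value)

def redshift_to_distance_mpc(z):
    """Low-redshift approximation: recession velocity v = c*z,
    then Hubble's law d = v / H0. Only reasonable for z << 1."""
    velocity = C_KM_S * z
    return velocity / H0

# A galaxy at redshift 0.023 recedes at ~6,900 km/s...
print(redshift_to_distance_mpc(0.023))  # ~98.5 Mpc away
```

Map out thousands of galaxies this way and the gaps jump out: runs of redshift with almost no galaxies in them are the voids.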
Some galaxies were relatively close, but then there were huge gaps in distance, and then another cluster of galaxies collected together.
Over the last few decades, astronomers have built sophisticated 3-dimensional models that map out the Universe at the largest scales. The Sloan Digital Sky Survey, updated in 2009, has provided the most accurate map so far. The Large Synoptic Survey Telescope, destined for first light in a few years, will take this to the next level.
The largest void that we currently know of is known as the Giant Void (original, I know), and it’s located about 1.5 billion light-years away. It has a diameter of 1 billion to 1.3 billion light-years.
To be fair, these regions aren’t really completely empty. They just have less density than the regions with galaxies. In general, they’ve got about a tenth the density of matter that’s average for the Universe.
Which means that there’s still gas and dust in these regions, as well as dark matter. There will still be stars and galaxies out in the middle of those voids. Even the Giant Void has 17 separate galaxy clusters inside it.
You might imagine continuing to scale outward. Maybe you’re wondering if this spongy distribution of matter is actually just the next step to an even larger structure, and so on, and so on. But it isn’t. In fact, astronomers call this “the End of Greatness”, because it doesn’t seem like there’s any larger structure to the Universe.
As the expansion of the Universe continues, these voids are going to get even larger. The walls and filaments connecting clusters of galaxies will stretch and break. The voids will merge with each other, and only gravitationally bound galaxy clusters will remain as islands, adrift in the expanding emptiness.
The full scale of the observable Universe is truly mind boggling. We’re here in this tiny corner of the Local Group, which is part of the Virgo Supercluster, which is perched on the precipice of vast cosmic voids. So much to explore, so let’s get to work.
In 1929, Edwin Hubble forever changed our understanding of the cosmos by showing that the Universe is in a state of expansion. By the 1990s, astronomers determined that the rate at which it is expanding is actually speeding up, which in turn led to the theory of “Dark Energy“. Since that time, astronomers and physicists have sought to determine the existence of this force by measuring the influence it has on the cosmos.
The latest in these efforts comes from the Sloan Digital Sky Survey III (SDSS III), where an international team of researchers have announced that they have finished creating the most precise measurements of the Universe to date. Known as the Baryon Oscillation Spectroscopic Survey (BOSS), their measurements have placed new constraints on the properties of Dark Energy.
The new measurements were presented by Harvard University astronomer Daniel Eisenstein at a recent meeting of the American Astronomical Society. As the director of the Sloan Digital Sky Survey III (SDSS-III), he and his team have spent the past ten years measuring the cosmos and the periodic fluctuations in the density of normal matter to see how galaxies are distributed throughout the Universe.
And after a decade of research, the BOSS team was able to produce a three-dimensional map of the cosmos that covers more than six billion light-years. And while other recent surveys have looked further afield – up to distances of 9 and 13 billion light years – the BOSS map is unique in that it boasts the highest accuracy of any cosmological map.
In fact, the BOSS team was able to measure the distribution of galaxies in the cosmos, at a distance of 6 billion light-years, to within an unprecedented 1% margin of error. Determining the nature of cosmic objects at great distances is no easy matter, due to the effects of relativity. As Dr. Eisenstein told Universe Today via email:
“Distances are a long-standing challenge in astronomy. Whereas humans often can judge distance because of our binocular vision, galaxies beyond the Milky Way are much too far away to use that. And because galaxies come in a wide range of intrinsic sizes, it is hard to judge their distance. It’s like looking at a far-away mountain; one’s judgement of its distance is tied up with one’s judgement of its height.”
In the past, astronomers have made accurate measurements of objects within the local universe (i.e. planets, neighboring stars, star clusters) by relying on everything from radar to redshift – the degree to which the wavelength of light is shifted towards the red end of the spectrum. However, the greater the distance of an object, the greater the degree of uncertainty.
And until now, only objects that are a few thousand light-years from Earth – i.e. within the Milky Way galaxy – have had their distances measured to within a one-percent margin of error. As the largest of the four projects that make up the Sloan Digital Sky Survey III (SDSS-III), what sets BOSS apart is the fact that it relies primarily on the measurement of what are called “baryon acoustic oscillations” (BAOs).
These are essentially subtle periodic ripples in the distribution of visible baryonic (i.e. normal) matter in the cosmos. As Dr. Daniel Eisenstein explained:
“BOSS measures the expansion of the Universe in two primary ways. The first is by using the baryon acoustic oscillations (hence the name of the survey). Sound waves traveling in the first 400,000 years after the Big Bang create a preferred scale for separations of pairs of galaxies. By measuring this preferred separation in a sample of many galaxies, we can infer the distance to the sample.
“The second method is to measure how clustering of galaxies differs between pairs oriented along the line of sight compared to transverse to the line of sight. The expansion of the Universe can cause this clustering to be asymmetric if one uses the wrong expansion history when converting redshifts to distance.”
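The first method Eisenstein describes treats the BAO scale as a “standard ruler”: the preferred separation between pairs of galaxies has a known physical size (roughly 150 megaparsecs in comoving terms), so measuring how large that separation appears on the sky yields a distance. A minimal small-angle sketch – the numbers here are illustrative, and real BAO analyses fit the full galaxy correlation function rather than a single angle:

```python
import math

R_BAO_MPC = 150.0   # the BAO "preferred separation", roughly 150 Mpc

def standard_ruler_distance_mpc(angle_deg):
    """Small-angle standard ruler: distance = ruler length / angle (radians)."""
    return R_BAO_MPC / math.radians(angle_deg)

# The smaller the BAO scale appears on the sky, the farther the galaxy sample.
print(standard_ruler_distance_mpc(4.0))  # ~2150 Mpc
```

The second method works because converting a redshift to a distance requires assuming an expansion history: pick the wrong one, and a statistically spherical pattern of galaxy pairs looks squashed along the line of sight.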
With these new, highly-accurate distance measurements, BOSS astronomers will be able to study the influence of Dark Energy with far greater precision. “Different dark energy models vary in how the acceleration of the expansion of the Universe proceeds over time,” said Eisenstein. “BOSS is measuring the expansion history, which allows us to infer the acceleration rate. We find results that are highly consistent with the predictions of the cosmological constant model, that is, the model in which dark energy has a constant density over time.”
In addition to measuring the distribution of normal matter to determine the influence of Dark Energy, the SDSS-III Collaboration is working to map the Milky Way and search for extrasolar planets. The BOSS measurements are detailed in a series of articles that were submitted to journals by the BOSS collaboration last month, all of which are now available online.
And BOSS is not the only effort to understand the large-scale structure of our Universe, and how all its mysterious forces have shaped it. Just last month, Professor Stephen Hawking announced that the COSMOS supercomputing center at Cambridge University would be creating the most detailed 3D map of the Universe to date.
Relying on CMB data obtained by the ESA’s Planck satellite and information from the Dark Energy Survey, they also hope to measure the influence Dark Energy has had on the distribution of matter in our Universe. Who knows? In a few years time, we may very well come to understand how all the fundamental forces governing the Universe work together.
Further Reading: SDSS-III
Back in 1997, a team of leading scientists and cosmologists came together to establish the COSMOS supercomputing center at Cambridge University. Under the auspices of famed physicist Stephen Hawking, this facility and its supercomputer are dedicated to the research of cosmology, astrophysics and particle physics – ultimately, for the purpose of unlocking the deeper mysteries of the Universe.
Yesterday, in what was themed as a “tribute to Stephen Hawking”, the COSMOS center announced that it will be embarking on what is perhaps the boldest experiment in cosmological mapping. Essentially, they intend to create the most detailed 3D map of the early universe to date, plotting the position of billions of cosmic structures including supernovas, black holes, and galaxies.
This map will be created using the facility’s supercomputer, located in Cambridge’s Department of Applied Mathematics and Theoretical Physics. Currently, it is the largest shared-memory computer in Europe, boasting 1,856 Intel Xeon E5 processor cores, 31 Intel Many Integrated Core (MIC) co-processors, and 14.5 terabytes of globally shared memory.
The 3D map will also rely on data obtained by two previous surveys – the ESA’s Planck satellite and the Dark Energy Survey. From the former, the COSMOS team will use the detailed images of the Cosmic Microwave Background (CMB) – the radiation left over from the Big Bang – that were released in 2013. These images of the oldest light in the cosmos allowed physicists to refine their estimates for the age of the Universe (13.82 billion years) and its rate of expansion.
This information will be combined with data from the Dark Energy Survey which shows the expansion of the Universe over the course of the last 10 billion years. From all of this, the COSMOS team will compare the early distribution of matter in the Universe with its subsequent expansion to see how the two link up.
While cosmological simulations that looked at the evolution and large-scale structure of the Universe have been performed in the past – such as the Evolution and Assembly of GaLaxies and their Environments (EAGLE) project and the survey performed by the Institute for the Physics and Mathematics of the Universe at Tokyo University – this will be the first time scientists have compared data from the early Universe to its subsequent evolution.
The project is also expected to receive a boost from the deployment of the ESA’s Euclid probe, which is scheduled for launch in 2020. This mission will measure the shapes and redshifts of galaxies (looking 10 billion years into the past), thereby helping scientists to understand the geometry of the “dark Universe” – i.e. how dark matter and dark energy influence it as a whole.
The plans for the COSMOS center’s 3D map will be unveiled at the Starmus science conference, which will be taking place from June 27th to July 2nd, 2016, in Tenerife – the largest of the Canary Islands, located off the coast of Spain. At this conference, Hawking will be discussing the details of the COSMOS project.
In addition to being the man who brought the COSMOS team together, the theme of the project – “Beyond the Horizon – Tribute to Stephen Hawking” – was selected because of Hawking’s long-standing commitment to physics and cosmology. “Hawking is a great theorist but he always wants to test his theories against observations,” said Prof. Shellard in a Cambridge press release. “What will emerge is a 3D map of the universe with the positions of billions of galaxies.”
Hawking will also present the first ever Stephen Hawking Medal for Science Communication, an award established by Hawking that will be bestowed on those who help promote science to the public through media – i.e. cinema, music, writing and art. Other speakers who will be attending the event include Neil deGrasse Tyson, Chris Hadfield, Martin Rees, Adam Riess, Rusty Schweickart, Eric Betzig, Neil Turok, and Kip Thorne.
Naturally, it is hoped that the creation of this 3D map will confirm current cosmological theories, which include the current age of the Universe and whether or not the Standard Model of cosmology – aka. the Lambda Cold Dark Matter (ΛCDM) model – is in fact the correct one. As Hawking is surely hoping, this could bring us one step closer to a Theory of Everything!
Further Reading: Cambridge News
Ever since Lemaître and Hubble first proposed it in the 1920s, scientists and astronomers have been aware that the Universe is expanding. And from these observations, cosmological theories like the Big Bang Theory and the “Arrow of Time” emerged. Whereas the former addresses the origins and evolution of our Universe, the latter argues that time flows in one direction only and is linked to the expansion of space.
For many years, scientists have been trying to ascertain why this is. Why does time flow forwards, but not backwards? According to a new study produced by a research team from the Yerevan Institute of Physics and Yerevan State University in Armenia, the influence of dark energy may be the reason for the forward flow of time, which may make one-directional time a permanent feature of our Universe.
Today, theories like the Arrow of Time and the expansion of the Universe are considered fundamental facts about the Universe. Between measuring time with atomic clocks, observing the redshift of galaxies, and creating detailed 3D maps that show the evolution of our Universe over the course of billions of years, one can see how time and the expansion of space are joined at the hip.
The question of why this is the case though is one that has continued to frustrate physicists. Certain fundamental forces, like gravity, are not governed by time. In fact, one could argue without difficulty that Newton’s Laws of Motion and quantum mechanics work the same forwards or backwards. But when it comes to things on the grand scale like the behavior of planets, stars, and entire galaxies, everything seems to come down to the Second Law of Thermodynamics.
This law, which states that the total disorder (aka. entropy) of an isolated system always increases over time, has come to be accepted as the basis for the Arrow of Time – making the direction in which time moves crucial and non-negotiable. In the past, some have ventured that if the Universe began to contract, time itself would begin to flow backwards. However, since the 1990s and the observation that the Universe has been expanding at an accelerating rate, scientists have come to doubt this.
If, in fact, the Universe is being driven to greater rates of expansion – the predominant explanation is that “Dark Energy” is what is driving it – then the flow of time will never cease being one way. Taking this logic a step further, two Armenian researchers – Armen E. Allahverdyan of the Center for Cosmology and Astrophysics at the Yerevan Institute of Physics and Vahagn G. Gurzadyan of Yerevan State University – argue that dark energy is the reason why time always moves forward.
In their paper, titled “Time Arrow is Influenced by the Dark Energy“, they argue that dark energy accelerating the expansion of the universe supports the asymmetrical nature of time. Often referred to as the “cosmological constant” – referring to Einstein’s original theory about a force which held back gravity to achieve a static universe – dark energy is now seen as a “positive” constant, pushing the Universe forward, rather than holding it back.
To test their theory, Allahverdyan and Gurzadyan used a large-scale scenario involving gravity and mass – a planet with increasing mass orbiting a star. What they found was that if dark energy had a value of 0 (which is what physicists thought before the 1990s), or if gravity were responsible for pulling space together, the planet would simply orbit the star without any indication as to whether it was moving forwards or backwards in time.
But assuming that the value of dark energy is positive (as all the evidence we’ve seen suggests), then the planet would eventually be thrown clear of the star. Running this scenario forward, the planet is expelled because of its increasing mass; whereas when it is run backwards, the planet closes in on the star and is captured by its gravity.
In other words, the presence of dark energy in this scenario was the difference between having an “arrow of time” and not having one. Without dark energy, there is no time, and hence no way to tell the difference between past, present and future, or whether things are running in a forward direction or backwards.
But of course, Allahverdyan and Gurzadyan were also sure to note in their study that this is a limited test and doesn’t answer all of the burning questions. “We also note that the mechanism cannot (and should not) explain all occurrences of the thermodynamic arrow,” they said. “However, note that even when the dark energy (cosmological constant) does not dominate the mean density (early universe or today’s laboratory scale), it still exists.”
Limited or not, this research is representative of some exciting new steps that astrophysicists have been taking of late. This involves not only questioning the origins of dark energy and the expansion force it creates, but also questioning its implication in basic physics. In so doing, researchers may finally be able to answer the age-old question about why time exists, and whether or not it can be manipulated (i.e. time travel!)
Further Reading: Physical Review E
On June 30th, 1905, Albert Einstein started a revolution with the publication of his theory of Special Relativity. This theory, among other things, stated that the speed of light in a vacuum is the same for all observers, regardless of the source. In 1915, he followed this up with the publication of his theory of General Relativity, which asserted that gravity has a warping effect on space-time. For over a century, these theories have been an essential tool in astrophysics, explaining the behavior of the Universe on the large scale.
However, since the 1990s, astronomers have been aware of the fact that the Universe is expanding at an accelerated rate. In an effort to explain the mechanics behind this, suggestions have ranged from the possible existence of an invisible energy (i.e. Dark Energy) to the possibility that Einstein’s field equations of General Relativity could be breaking down. But thanks to the recent work of an international research team, it is now known that Einstein had it right all along.
This graph illustrates the Cepheid period-luminosity relationship, which scientists use to calculate the size, age and expansion rate of the Universe. Credit: NASA/JPL-Caltech/Carnegie
How fast is our Universe expanding? Over the decades, there have been different estimates used and heated debates over those approximations, but now data from the Spitzer Space Telescope have provided the most precise measurement yet of the Hubble constant, or the rate at which our universe is stretching apart. The result? The Universe is getting bigger a little bit faster than previously thought.
The newly refined value for the Hubble constant is 74.3 plus or minus 2.1 kilometers per second per megaparsec.
The previous estimate came from a study using the Hubble Space Telescope: 74.2 plus or minus 3.6 kilometers per second per megaparsec. A megaparsec is roughly 3 million light-years.
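To get a feel for what a number like 74.3 km/s/Mpc means, here is a back-of-envelope sketch in Python. It applies Hubble's law (v = H0 × d) to a galaxy at an illustrative distance of 100 megaparsecs, and computes the "Hubble time" 1/H0, a crude age scale for the Universe; the distance chosen is just an example, not from the study.

```python
H0 = 74.3                # Hubble constant, km/s per megaparsec (Spitzer value)
KM_PER_MPC = 3.0857e19   # kilometers in one megaparsec
SEC_PER_YR = 3.156e7     # seconds in one year

# Hubble's law: recession velocity grows linearly with distance, v = H0 * d.
d_mpc = 100.0            # illustrative distance in megaparsecs
v = H0 * d_mpc           # recession velocity in km/s
print(f"Recession velocity at {d_mpc:.0f} Mpc: {v:.0f} km/s")  # 7430 km/s

# The "Hubble time" 1/H0 sets a rough timescale for the age of the Universe.
hubble_time_yr = KM_PER_MPC / H0 / SEC_PER_YR
print(f"1/H0 is roughly {hubble_time_yr / 1e9:.1f} billion years")
```

Note that 1/H0 comes out near 13 billion years, in the same ballpark as the 13.7-billion-year age quoted below; the two differ because the true age also depends on how the expansion rate has changed over cosmic history.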
To make the new measurements, Spitzer scientists looked at pulsating stars called Cepheid variable stars, taking advantage of Spitzer’s ability to observe them in long-wavelength infrared light. In addition, the findings were combined with previously published data from NASA’s Wilkinson Microwave Anisotropy Probe (WMAP) on dark energy. The new determination brings the uncertainty down to 3 percent, a giant leap in accuracy for cosmological measurements, scientists say.
WMAP obtained an independent measurement of dark energy, which is thought to be winning a battle against gravity, pulling the fabric of the universe apart. Research based on this acceleration garnered researchers the 2011 Nobel Prize in physics.
The Hubble constant is named after the astronomer Edwin P. Hubble, who astonished the world in the 1920s by confirming our universe has been expanding since it exploded into being 13.7 billion years ago. In the late 1990s, astronomers discovered the expansion is accelerating, or speeding up over time. Determining the expansion rate is critical for understanding the age and size of the universe.
“This is a huge puzzle,” said the lead author of the new study, Wendy Freedman of the Observatories of the Carnegie Institution for Science in Pasadena. “It’s exciting that we were able to use Spitzer to tackle fundamental problems in cosmology: the precise rate at which the universe is expanding at the current time, as well as measuring the amount of dark energy in the universe from another angle.” Freedman led the groundbreaking Hubble Space Telescope study that earlier had measured the Hubble constant.
Glenn Wahlgren, Spitzer program scientist at NASA Headquarters in Washington, said the better views of cepheids enabled Spitzer to improve on past measurements of the Hubble constant.
“These pulsating stars are vital rungs in what astronomers call the cosmic distance ladder: a set of objects with known distances that, when combined with the speeds at which the objects are moving away from us, reveal the expansion rate of the universe,” said Wahlgren.
Cepheids are crucial to the calculations because their distances from Earth can be measured readily. In 1908, Henrietta Leavitt discovered these stars pulse at a rate directly related to their intrinsic brightness.
To visualize why this is important, imagine someone walking away from you while carrying a candle. The farther the candle traveled, the dimmer it would appear. Its apparent brightness would reveal its distance. The same principle applies to Cepheids, the standard candles of our cosmos. By measuring how bright they appear on the sky, and comparing this to their known intrinsic brightness, astronomers can calculate their distance from Earth.
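The candle analogy above is just the inverse-square law: apparent flux falls off as 1/d², so knowing a star's intrinsic luminosity and measuring its flux gives its distance, d = sqrt(L / 4πF). A minimal sketch, using purely illustrative numbers (not values for any real Cepheid):

```python
import math

def candle_distance(luminosity_w, flux_w_m2):
    """Distance to a standard candle from the inverse-square law:
    F = L / (4 * pi * d**2), solved for d."""
    return math.sqrt(luminosity_w / (4 * math.pi * flux_w_m2))

# Hypothetical Cepheid: intrinsic luminosity inferred from its pulsation
# period via Leavitt's period-luminosity relation, flux measured on the sky.
L = 1.0e30    # watts (illustrative, not a real star)
F = 1.0e-12   # watts per square meter, as measured at the telescope
d_m = candle_distance(L, F)
print(f"Distance: {d_m:.2e} m")
```

Halve the measured flux (say, from undetected dust) and the inferred distance grows by a factor of √2, which is why Spitzer's dust-penetrating infrared view, described below, sharpens the measurement.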
Spitzer observed 10 Cepheids in our own Milky Way galaxy and 80 in a neighboring galaxy called the Large Magellanic Cloud. Without cosmic dust blocking their view, the Spitzer research team was able to obtain more precise measurements of the stars’ apparent brightness, and thus their distances. These data opened the way for a new and improved estimate of our Universe’s expansion rate.
“Just over a decade ago, using the words ‘precision’ and ‘cosmology’ in the same sentence was not possible, and the size and age of the universe was not known to better than a factor of two,” said Freedman. “Now we are talking about accuracies of a few percent. It is quite extraordinary.”
“Spitzer is yet again doing science beyond what it was designed to do,” said project scientist Michael Werner at NASA’s Jet Propulsion Laboratory. Werner has worked on the mission since its early concept phase more than 30 years ago. “First, Spitzer surprised us with its pioneering ability to study exoplanet atmospheres,” said Werner, “and now, in the mission’s later years, it has become a valuable cosmology tool.”
The study appears in the Astrophysical Journal.
Paper on arXiv: A Mid-Infrared Calibration of the Hubble Constant