Astronomers Closing in on Dark Energy with Refined Hubble Constant



The name “dark energy” is just a placeholder for the force — whatever it is — that is causing the Universe to expand. But astronomers are perhaps getting closer to understanding this force. New observations of several Cepheid variable stars by the Hubble Space Telescope have refined the measurement of the Universe’s present expansion rate to a precision where the error is smaller than five percent. The new value for the expansion rate, known as the Hubble constant, or H0 (after Edwin Hubble, who first measured the expansion of the universe nearly a century ago), is 74.2 kilometers per second per megaparsec, with an error margin of ± 3.6. The results agree closely with an earlier Hubble measurement of 72 ± 8 km/sec/megaparsec, but are now more than twice as precise.
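
To make the units concrete, here is a minimal sketch (not part of the SHOES analysis) of what the Hubble constant means in practice: multiply a galaxy’s distance by H0 to get its recession velocity.

```python
# A minimal sketch of Hubble's law, v = H0 * d, using the values quoted
# above. Real measurements must also account for each galaxy's "peculiar"
# motion, which this toy calculation ignores.

H0 = 74.2      # Hubble constant, km/s per megaparsec (SHOES value)
H0_ERR = 3.6   # quoted 1-sigma error, km/s per megaparsec

def recession_velocity(distance_mpc):
    """Return (velocity, uncertainty) in km/s for a distance in Mpc."""
    return H0 * distance_mpc, H0_ERR * distance_mpc

v, dv = recession_velocity(100.0)   # a galaxy 100 megaparsecs away
print(f"v = {v:.0f} +/- {dv:.0f} km/s")   # -> v = 7420 +/- 360 km/s
```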

The Hubble measurement, conducted by the SHOES (Supernova H0 for the Equation of State) Team and led by Adam Riess, of the Space Telescope Science Institute and the Johns Hopkins University, uses a number of refinements to streamline and strengthen the construction of a cosmic “distance ladder,” a billion light-years in length, that astronomers use to determine the universe’s expansion rate.

Hubble observations of the pulsating Cepheid variables in a nearby cosmic mile marker, the galaxy NGC 4258, and in the host galaxies of recent supernovae, directly link these distance indicators. The use of Hubble to bridge these rungs in the ladder eliminated the systematic errors that are almost unavoidably introduced by comparing measurements from different telescopes.

Steps to the Hubble Constant. Credit: NASA, ESA, and A. Feild (STScI)

Riess explains the new technique: “It’s like measuring a building with a long tape measure instead of moving a yardstick end over end. You avoid compounding the little errors you make every time you move the yardstick. The higher the building, the greater the error.”

Lucas Macri, professor of physics and astronomy at Texas A&M, and a significant contributor to the results, said, “Cepheids are the backbone of the distance ladder because their pulsation periods, which are easily observed, correlate directly with their luminosities. Another refinement of our ladder is the fact that we have observed the Cepheids in the near-infrared parts of the electromagnetic spectrum where these variable stars are better distance indicators than at optical wavelengths.”
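
To see how the Leavitt law turns a pulsation period into a distance, here is a hedged sketch. The relation has the general form M = a·log10(P) + b; the coefficients below are illustrative placeholders, not the SHOES team’s near-infrared calibration.

```python
import math

# Illustrative Cepheid distance sketch. The Leavitt law has the form
# M = a * log10(P_days) + b; the coefficients here are placeholders,
# NOT the SHOES calibration (real fits depend on passband, e.g. near-IR).
A, B = -2.8, -1.4   # hypothetical slope and zero point (absolute magnitude)

def cepheid_distance_pc(period_days, apparent_mag):
    """Infer distance from a Cepheid's period and apparent magnitude
    via the distance modulus m - M = 5*log10(d / 10 pc)."""
    M = A * math.log10(period_days) + B          # absolute magnitude
    mu = apparent_mag - M                        # distance modulus
    return 10.0 ** (mu / 5.0 + 1.0)              # distance in parsecs

print(f"{cepheid_distance_pc(10.0, 12.0):.3e} pc")  # ~1.7e4 pc
```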

This new, more precise value of the Hubble constant was used to test and constrain the properties of dark energy, the form of energy that produces a repulsive force in space, which is causing the expansion rate of the universe to accelerate.

By bracketing the expansion history of the universe between today and when the universe was only approximately 380,000 years old, the astronomers were able to place limits on the nature of the dark energy that is causing the expansion to speed up. (The measurement for the far, early universe is derived from fluctuations in the cosmic microwave background, as resolved by NASA’s Wilkinson Microwave Anisotropy Probe, WMAP, in 2003.)
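
For reference, the quantity such tests actually constrain is dark energy’s equation-of-state parameter, sketched below in its standard textbook form (not taken from the article itself):

```latex
% Dark energy is characterized by the equation-of-state parameter w,
% the ratio of its pressure to its energy density. A cosmological
% constant has w = -1 exactly; the measurements described here test
% how far w can stray from -1.
\[
  w \;=\; \frac{p}{\rho c^{2}},
  \qquad
  \rho(a) \;\propto\; a^{-3(1+w)}
  \;\;\xrightarrow{\;w=-1\;}\;\;
  \rho = \text{const.}
\]
```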

Their result is consistent with the simplest interpretation of dark energy: that it is mathematically equivalent to Albert Einstein’s hypothesized cosmological constant, introduced a century ago to push on the fabric of space and prevent the universe from collapsing under the pull of gravity. (Einstein, however, removed the constant once the expansion of the universe was discovered by Edwin Hubble.)

Detail from NGC 3021. Credit: NASA, ESA, and A. Riess (STScI/JHU)

“If you put in a box all the ways that dark energy might differ from the cosmological constant, that box would now be three times smaller,” says Riess. “That’s progress, but we still have a long way to go to pin down the nature of dark energy.”

Though the cosmological constant was conceived of long ago, observational evidence for dark energy didn’t come along until 11 years ago, when two studies, one led by Riess and Brian Schmidt of Mount Stromlo Observatory, and the other by Saul Perlmutter of Lawrence Berkeley National Laboratory, discovered dark energy independently, in part with Hubble observations. Since then astronomers have been pursuing observations to better characterize dark energy.

Riess’s approach to narrowing alternative explanations for dark energy—whether it is a static cosmological constant or a dynamical field (like the repulsive force that drove inflation after the big bang)—is to further refine measurements of the universe’s expansion history.

Before Hubble was launched in 1990, the estimates of the Hubble constant varied by a factor of two. In the late 1990s the Hubble Space Telescope Key Project on the Extragalactic Distance Scale refined the value of the Hubble constant to an error of only about ten percent. This was accomplished by observing Cepheid variables at optical wavelengths out to greater distances than obtained previously and comparing those to similar measurements from ground-based telescopes.

The SHOES team used Hubble’s Near Infrared Camera and Multi-Object Spectrometer (NICMOS) and the Advanced Camera for Surveys (ACS) to observe 240 Cepheid variable stars across seven galaxies. One of these galaxies was NGC 4258, whose distance was very accurately determined through observations with radio telescopes. The other six galaxies recently hosted Type Ia supernovae that are reliable distance indicators for even farther measurements in the universe. Type Ia supernovae all explode with nearly the same amount of energy and therefore have almost the same intrinsic brightness.

By observing Cepheids with very similar properties at near-infrared wavelengths in all seven galaxies, and using the same telescope and instrument, the team was able to more precisely calibrate the luminosity of supernovae. With Hubble’s powerful capabilities, the team was able to sidestep some of the shakiest rungs along the previous distance ladder involving uncertainties in the behavior of Cepheids.
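
The logic of the ladder can be compressed into a toy calculation. Everything below is invented for illustration (the real analysis jointly fits 240 Cepheids and multiple supernovae), but it shows how a Cepheid distance calibrates a supernova, which in turn yields the Hubble constant:

```python
import math

# A toy version of the distance-ladder logic described above. All the
# numbers are made up for illustration.
C = 299792.458  # speed of light, km/s

# Step 1: a supernova in a galaxy whose Cepheid distance is known
d_calib_mpc = 30.0          # hypothetical Cepheid-based distance
m_calib = 13.1              # hypothetical observed peak magnitude
mu = 5 * math.log10(d_calib_mpc * 1e6 / 10)   # distance modulus
M_sn = m_calib - mu         # calibrated Type Ia absolute magnitude

# Step 2: apply that calibration to a far-away supernova
m_far, z_far = 17.2, 0.05   # hypothetical peak magnitude and redshift
d_far_mpc = 10 ** ((m_far - M_sn) / 5 + 1) / 1e6

# Step 3: the Hubble constant is recession velocity over distance
print(f"H0 ~ {C * z_far / d_far_mpc:.1f} km/s/Mpc")  # ~76 with these inputs
```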

Riess would eventually like to see the Hubble constant refined to a value with an error of no more than one percent, to put even tighter constraints on solutions to dark energy.

Source: Space Telescope Science Institute

Dark Matter, Dark Energy; Now There’s “Dark Gulping”

The HST WFPC2 image of gravitational lensing in the galaxy cluster Abell 2218, indicating the presence of a large amount of dark matter. Credit: Andrew Fruchter (STScI)

For all you dark matter and dark energy fans out there, now there’s another new “dark” to add to the list. It’s called “dark gulping,” and it involves a process that may explain how supermassive black holes were able to form in the early universe. Astronomers from University College London (UCL) propose that dark gulping occurred when there were gravitational interactions between the invisible halo of dark matter in a cluster of galaxies and the gas embedded in that halo, back when the Universe was less than a billion years old. They found that the interactions cause the dark matter to form a compact central mass, which can be gravitationally unstable and collapse. This fast dynamical collapse is the dark gulping.

Dr. Curtis Saxton and Professor Kinwah Wu, both of UCL’s Mullard Space Science Laboratory, developed a model to study the process. They say that the dark gulping would have happened very rapidly, without a trace of electromagnetic radiation being emitted.

There are several theories for how supermassive black holes form. One possibility is that a single large gas cloud collapses. Another is that a black hole formed by the collapse of a giant star swallows up enormous amounts of matter. Still another possibility is that a cluster of small black holes merge together. However, all these options take many millions of years and are at odds with recent observations suggesting that black holes were present when the Universe was less than a billion years old. Dark gulping may explain how the slowness of gas accretion was circumvented, enabling the rapid emergence of giant black holes. The affected dark mass in the compact core is compatible with the scale of supermassive black holes in galaxies today.
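
How fast is “fast”? The standard yardstick for a gravitationally unstable mass is the free-fall time, a general result from gravitational dynamics rather than a figure from the UCL paper:

```latex
% For a self-gravitating region of mean density \rho, the characteristic
% collapse timescale is the free-fall time:
\[
  t_{\mathrm{ff}} \;=\; \sqrt{\frac{3\pi}{32\,G\rho}} .
\]
% It depends only on the local density, not on how quickly radiation can
% carry energy away, which is why the collapse can be rapid and emit no
% tell-tale light.
```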

Dark matter appears to gravitationally dominate the dynamics of galaxies and galaxy clusters. However, there is still a great deal of conjecture about the origin, properties and distribution of dark particles. While it appears that dark matter doesn’t interact with light, it does interact with ordinary matter via gravity. “Previous studies have ignored the interaction between gas and the dark matter,” said Saxton, “but, by factoring it into our model, we’ve achieved a much more realistic picture that fits better with observations and may also have gained some insight into the presence of early supermassive black holes.”

According to the model, the development of a compact mass at the core is inevitable. Cooling by the gas causes it to flow gently in towards the center. The gas can be up to 10 million degrees at the outskirts of the halos, which are a few million light-years in diameter, with a cooler zone towards the core, which surrounds a warmer interior a few thousand light-years across. The gas doesn’t cool indefinitely, but reaches a minimum temperature, which fits well with X-ray observations of galaxy clusters.

The model also investigates how many dimensions the dark particles move in, as these determine the rate at which the dark halo expands and absorbs and emits heat, and ultimately affect the distribution of dark mass in the system.

“In the context of our model, the observed core sizes of galaxy cluster halos and the observed range of giant black hole masses imply that dark matter particles have between seven and ten degrees of freedom,” said Saxton. “With more than six, the inner region of the dark matter approaches the threshold of gravitational instability, opening up the possibility of dark gulping taking place.”

The findings have been published in the Monthly Notices of the Royal Astronomical Society.

Source: RAS

Cosmologists Search for Gravity Waves to Prove Inflation Theory

The South Pole Telescope under the aurora australis (southern lights). Photo by Keith Vanderlinde

During the next decade, cosmologists will attempt to observe the first moments of the Universe, hoping to prove a popular theory. They’ll be measuring primordial light for the imprint of extremely weak gravity waves, looking for convincing evidence for the Cosmic Inflation Theory, which proposes that a random, microscopic density fluctuation in the fabric of space and time gave birth to the Universe in a hot big bang approximately 13.7 billion years ago. A new instrument called a polarimeter is being attached to the South Pole Telescope (SPT), which operates at submillimeter wavelengths, between microwaves and the infrared on the electromagnetic spectrum. Einstein’s theory of general relativity predicts that Cosmic Inflation should produce the weak gravity waves.

Inflation Theory proposes a period of extremely rapid, exponential expansion of the Universe during its first few moments, prior to the more gradual Big Bang expansion. During that time, the energy density of the universe was dominated by a cosmological-constant-type vacuum energy that later decayed to produce the matter and radiation that fill the Universe today.
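
That “extremely rapid, exponential expansion” has a simple standard parameterization, sketched here for orientation (textbook notation, not specific to the SPT project):

```latex
% During inflation the scale factor a(t) grows exponentially at a
% roughly constant Hubble rate H:
\[
  a(t) \;\propto\; e^{Ht},
  \qquad
  N \;=\; \ln\!\frac{a_{\mathrm{end}}}{a_{\mathrm{start}}} \;=\; H\,\Delta t .
\]
% Around N ~ 60 "e-folds" inflates any region by a factor of
% e^{60} ~ 10^{26}, stretching subatomic fluctuations to cosmic scales.
```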

In 1979, physicist Alan Guth proposed the Cosmic Inflation Theory, which also predicts the existence of an infinite number of universes. Unfortunately, cosmologists have no way of testing that particular prediction.

The South Pole Telescope takes advantage of the clear, dry skies at the National Science Foundation’s South Pole Station to study the cosmic background radiation, the afterglow of the big bang. The SPT measures eight meters (26.4 feet) in diameter. Photo by Jeff McMahon

“Since these are separate universes, by definition that means we can never have any contact with them. Nothing that happens there has any impact on us,” said Scott Dodelson, a scientist at Fermi National Accelerator Laboratory and a Professor in Astronomy & Astrophysics at the University of Chicago.

But there is a way to probe the validity of cosmic inflation. The phenomenon would have produced two classes of perturbations. The first class, fluctuations in the density of subatomic particles, occurs continuously throughout the universe, and scientists have already observed such fluctuations.

“Usually they’re just taking place on the atomic scale. We never even notice them,” Dodelson said. But inflation would instantaneously stretch these perturbations into cosmic proportions. “That picture actually works. We can calculate what those perturbations should look like, and it turns out they are exactly right to produce the galaxies we see in the universe.”

The second class of perturbations would be gravity waves—Einsteinian distortions in space and time. Gravity waves also would get promoted to cosmic proportions, perhaps even strong enough for cosmologists to detect them with sensitive telescopes tuned to the proper frequency of electromagnetic radiation.

If the new polarimeter is sensitive enough, scientists should be able to detect the waves.

“If you detect gravity waves, it tells you a whole lot about inflation for our universe,” said John Carlstrom from the University of Chicago, who developed the new instrument. Carlstrom said detecting the waves would rule out various competing ideas for the origin of the universe. “There are fewer than there used to be, but they don’t predict that you have such an extreme, hot big bang, this quantum fluctuation, to start with,” he said. Nor would they produce gravity waves at detectable levels.

A simulation at this link portrays the distortions in space and time at the subatomic scale, the result of quantum fluctuations occurring continuously throughout the universe. Near the end of the simulation, cosmic inflation begins to stretch space-time to the cosmic proportions of the universe.

Cosmologists also use the SPT in their quest to solve the mystery of dark energy. A repulsive force, dark energy pushes the universe apart and overwhelms gravity, the attractive force exerted by all matter. Dark energy is invisible, but astronomers are able to see its influence on clusters of galaxies that formed within the last few billion years.

NASA’s Wilkinson Microwave Anisotropy Probe collected data that produced this chart of sound waves from the universe. Called a power spectrum, the chart plots the cosmic microwave background radiation as ripples of different sizes across the sky. The data are consistent with predictions of cosmic inflation theory. Courtesy of the WMAP Science Team

The SPT detects the cosmic microwave background (CMB) radiation, the afterglow of the big bang. Cosmologists have mined a fortune of data from the CMB, which represents the forceful drums and horns of the cosmic symphony. But now the scientific community has its ears cocked for the tones of a subtler instrument—gravitational waves—that underlie the CMB.

“We have these key components to our picture of the universe, but we really don’t know what physics produces any of them,” said Dodelson of inflation, dark energy and the equally mysterious dark matter. “The goal of the next decade is to identify the physics.”

Source: University of Chicago

Next-Generation Telescope Gets Team

Artist's rendering of the Giant Magellan Telescope and support facilities at Las Campanas Observatory, Chile, high in the Andes Mountains. Photo by Todd Mason/Mason Productions

Astronomy organizations in the United States, Australia and Korea have signed on to build the largest ground-based telescope in the world – unless another team gets there first. The Giant Magellan Telescope, or GMT, will have the resolving power of a single 24.5-meter (80-foot) primary mirror, which will make it three times more powerful than any of Earth’s existing ground-based optical telescopes. Its domestic partners include the Carnegie Institution for Science, Harvard University, the Smithsonian Institution, Texas A&M University, the University of Arizona, and the University of Texas at Austin. Although the telescope has been in the works since 2003, the formal collaboration was announced Friday.

Charles Alcock, director of the Harvard-Smithsonian Center for Astrophysics, said the Giant Magellan Telescope is being designed to build on the legacy of a rash of smaller telescopes built in the 1990s in California, Hawaii and Arizona. The existing telescopes have mirrors in the range of six to 10 meters (20 to 33 feet), and – while they’re making great headway in the nearby universe – they’re only able to make out the largest planets around other stars and the most luminous distant galaxies.

With a much larger primary mirror, the GMT will be able to detect much smaller and fainter objects in the sky, opening a window to the most distant, and therefore the oldest, stars and galaxies. Formed within the first billion years after the Big Bang, such objects offer tantalizing insight into the universe’s infancy.

Earlier this year, a different consortium including the California Institute of Technology and the University of California, with Canadian and Japanese institutions, unveiled its own next-generation concept: the Thirty Meter Telescope. Whereas the GMT’s 24.5-meter primary mirror will come from a collection of seven smaller mirrors, the TMT will combine 492 segments to achieve the power of a single 30-meter (98-foot) mirror design.

In addition, the European Extremely Large Telescope is in the concept stage.

In terms of science, Alcock acknowledged that the two telescopes with US participation are headed toward redundancy. The main differences, he said, are in the engineering arena.

“They’ll probably both work,” he said. But Alcock thinks the GMT is most exciting from a technological point of view. Each of the GMT’s seven 8.4-meter primary segments will weigh 20 tons, and the telescope enclosure has a height of about 200 feet. The GMT partners aim to complete their detailed design within two years.

The TMT’s segmented concept builds on technology pioneered at the W.M. Keck Observatory in Hawaii, a past project of the Caltech and University of California partnership.

Construction on the GMT is expected to begin in 2012 and to be completed in 2019, at Las Campanas Observatory in the Andes Mountains of Chile. The total cost is projected to be $700 million, with $130 million raised so far.

Artist’s concept of the Thirty Meter Telescope Observatory. Credit: TMT

Construction on the TMT could begin as early as 2011 with an estimated completion date of 2018. The telescope could go to Hawaii or Chile, and final site selection will be announced this summer. The total cost is estimated to be as high as $1 billion, with $300 million raised at last count.

Alcock said the next generation of telescopes is crucial for forward progress in 21st Century astronomy.

“The goal is to start discovering and characterizing planets that might harbor life,” he said. “It’s very clear that we’re going to need the next generation of telescopes to do that.”

And far from being a competition, the real race is to contribute to science, said Charles Blue, a TMT spokesman.

“All next generation observatories would really like to be up and running as soon as possible to meet the scientific demand,” he said.

In the shorter term, long-distance space studies will get help from the James Webb Space Telescope, designed to succeed the Hubble Space Telescope when it launches in 2013. And the Atacama Large Millimeter Array (ALMA), a large interferometer being completed in Chile, could join the effort by 2012.

Sources: EurekAlert and interviews with Charles Alcock, Charles Blue

Profiling Potential Supernovae

Astronomical plate showing Sagittarius. Credit: Ashley Pagnotta

Just as psychologists and detectives try to “profile” serial killers and other criminals, astronomers are trying to determine what type of star system will explode as a supernova. While criminals can sometimes be caught or rehabilitated before they do the crime, supernovae, well, there’s no stopping them. But there’s the potential to learn a great deal in both astronomy and cosmology by theorizing about potential stellar explosions. At the American Astronomical Society meeting last week, Professor Bradley E. Schaefer of Louisiana State University, Baton Rouge, discussed how searching through old astronomical archives can produce unique and front-line science about supernovae – as well as information about dark energy – in ways that no combination of modern telescopes can provide. Additionally, Schaefer said amateur astronomers can help in the search, too.

Schaefer has been studying archived data back to 1890. “Archival data is the only way to see the long-term behavior of stars, unless you want to keep watch nightly for the next century, and this is central to many front-line astronomy questions,” he said.

Bradley E. Schaefer of Louisiana State University, Baton Rouge

The main question Schaefer is trying to answer is what stars are progenitors for Type Ia supernovae. Astronomers have been trying to track down this mystery for over 40 years.

Type Ia supernovae are remarkably bright but also remarkably uniform in their brightness, and therefore are regarded as the best astronomical “standard candles” for measurement across cosmological distances. Type Ia supernovae are also key to the search for dark energy. These blasts have been used as distance markers for measuring how fast the Universe is expanding.

However, a potential problem is that distant supernovae might be different from nearby events, thus confounding the measures. Schaefer said the only way to solve this problem is to identify the type of stars that explode as Type Ia supernovae so that corrections can be calculated. “The upcoming big-money supernova-cosmology programs require the answer to this problem for them to achieve their goal of precision cosmology,” said Schaefer.

Supernova 1994D in the outskirts of the galaxy NGC 4526.

Many types of star systems have been proposed as potential supernova progenitors, such as double white dwarf binaries, which were not discovered until 1988, and symbiotic stars, which are very rare. But the most promising progenitors are recurrent novae (RNe), which are usually binary systems with matter flowing off a companion star onto a white dwarf. The matter accumulates on the white dwarf’s surface until the pressure gets high enough to trigger a thermonuclear reaction (like an H-bomb). RNe can have multiple eruptions every century (as opposed to classical novae, which have only one observed eruption).

To answer the question of whether RNe are supernova progenitors, Schaefer conducted extensive research to determine RN orbital periods, accretion rates, outburst dates, eruption light curves, and the average magnitudes between outbursts.

Artist’s impression of a recurrent nova.

One big question was whether there were enough RN occurrences to supply the observed rate of supernovae. Another question was whether the nova eruption itself blows off more material than is accumulated between eruptions, in which case the white dwarf would not be gaining mass.

In looking at the old sky photos, he was able to count all the discovered eruptions and measure the frequency of RN eruptions back to 1890. He could also measure the mass ejected during an eruption by measuring eclipse times on the archived photos and looking at the change in the orbital period across an eruption.
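
One textbook way to turn a measured period change into an ejected mass, shown here as a hedged sketch (Schaefer’s actual analysis may differ in detail): if an eruption quickly and isotropically expels mass from the white dwarf, Kepler’s third law gives, to first order,

```latex
\[
  \frac{\Delta P}{P} \;\approx\; -\,2\,\frac{\Delta M}{M_{\mathrm{tot}}} ,
\]
% where M_tot is the total binary mass and Delta M < 0 for mass loss, so
% the orbital period lengthens slightly across the eruption. Comparing
% the implied ejected mass with the mass accreted between eruptions tells
% you whether the white dwarf is gaining or losing mass overall.
```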

In doing so, Schaefer was able to answer both questions. There were enough RN occurrences to provide sources for the observed Type Ia supernova rate. “With 10,000 recurrent novae in our Milky Way, their numbers are high enough to account for all of the Type Ia supernovae,” he said.

He also found that the mass of the white dwarf in such a system is increasing, and that its collapse will occur within a million years or so, causing a Type Ia supernova.

Schaefer concluded that roughly one-third of all ‘classical novae’ are really RNe with two or more eruptions in the last century.

With this knowledge, astronomical theorists can now perform the calculations to make subtle corrections in using supernovae to measure the Universe’s expansion, which may help the search for dark energy.

An important result from this archival search is the prediction of an RN that could erupt at any time. An RN named U Scorpii (U Sco) is ready to “blow,” and a large worldwide collaboration (dubbed ‘USCO2009’) has already been formed to make concentrated observations (in X-ray, ultraviolet, optical, and infrared wavelengths) of the upcoming event. This is the first time that a confident prediction has identified which star will go nova and in which year it will blow up.

During this search Schaefer also discovered one new RN (V2487 Oph), six new eruptions, five orbital periods, and two mysterious sudden drops in brightness during eruptions.

Another discovery is that the nova discovery efficiency is “horrifyingly low,” Schaefer said, typically 4%. That is, only one out of every 25 novae is ever spotted. Schaefer said this is an obvious opportunity for amateur astronomers to use digital cameras to monitor the sky and discover all the missing eruptions.

Photo archive at Harvard. Credit: Ashley Pagnotta

Schaefer used archives from around the world, the two primary ones being at the Harvard College Observatory in Cambridge, Massachusetts and at the headquarters of the American Association of Variable Star Observers (AAVSO), also in Cambridge. Harvard has a collection of half a million old sky photos covering the entire sky, with 1,000–3,000 pictures of each star going back to 1890. The AAVSO is the clearinghouse for countless measures of star brightness made by many thousands of amateurs worldwide from 1911 to the present.

Source: Louisiana State University, AAS meeting press conference

More Evidence Earth is Not Center of Universe

Spiral Galaxy NGC 4414

If you’re certain the Universe revolves around you, I have some bad news for you. Researchers from the University of British Columbia say Earth’s location in the Universe is utterly unremarkable, despite recent theories that propose Earth is at the center of a giant void in space. A decade ago, it was discovered that the Universe’s expansion is accelerating. This accelerating expansion was attributed to dark energy, the highly repulsive and mysterious stuff that has yet to be detected. But some scientists came up with an alternate theory in which Earth is near the centre of a giant void or bubble, mostly empty of matter. New calculations, however, solidify the case that dark energy permeates the cosmos.

While dark energy sometimes seems pretty far-fetched – with its mysterious and so far undetectable properties – the alternate “void” theory of why the Universe is ever-expanding contains a problem: it violates the long-held Copernican Principle.

Polish astronomer Nicolaus Copernicus’s 1543 book, On the Revolutions of the Heavenly Spheres, moved Earth from being the center of the Universe to just another planet orbiting the Sun. Since then, astronomers have extended the idea and formed the Copernican Principle, which says that our place in the Universe as a whole is completely ordinary. Although the Copernican Principle has become a pillar of modern cosmology, finding conclusive evidence that our neighborhood of the Universe really isn’t special has proven difficult.

Nicolaus Copernicus

In 1998, studies of distant explosions called “Type Ia supernovae” indicated that the expansion of the Universe is accelerating, an observation attributed to the repulsive force of a mysterious “dark energy.” But some cosmologists proposed that Earth was at the center of a void, and that gravity would create the illusion of acceleration, mimicking the effect of dark energy on the supernova observations.

Now, advanced analysis and modeling performed by UBC post-doctoral fellows Jim Zibin and Adam Moss and Astronomy Professor Douglas Scott shows that this alternate “void theory” just doesn’t add up.

The researchers used data from the Wilkinson Microwave Anisotropy Probe satellite, which includes members from UBC on its international team, as well as data from various ground-based instruments and surveys.

“We tested void models against the latest data, including subtle features in the cosmic microwave background radiation – the afterglow of the Big Bang – and ripples in the large-scale distribution of matter,” says Zibin. “We found that void models do a very poor job of explaining the combination of these data.”

The team’s calculations instead solidify the conventional view that an enigmatic dark energy fills the cosmos and is responsible for the acceleration of the Universe. “Recent advances in data collection have brought us to the era of precision cosmology,” says Zibin. “Void models are terrible at explaining the new data, but the standard dark energy model works very well.

“Since we can only observe the Universe from Earth, it’s really hard to determine if we’re in a ‘special place,'” says Zibin. “But we’ve now learned that our location is much more ordinary than the strange dark energy that fills the Universe.”

The team’s research is available in Physical Review Letters.

Source: EurekAlert

No “Big Rip” in our Future: Chandra Provides Insights Into Dark Energy

Galaxy cluster Abell 85, seen by Chandra, left, and a model of the growth of cosmic structure when the Universe was 0.9 billion, 3.2 billion and 13.7 billion years old (now). Credit: Chandra

When you throw a ball up into the air, you expect gravity will eventually slow the ball, and it will come back down again. But what if you threw a ball up into the air and, instead of coming back down, it accelerated away from you? That’s basically what is happening with our universe: everything is accelerating away from everything else. This acceleration was discovered in 1998, and scientists believe “dark energy” – a form of repulsive gravity that composes about 72% of the universe – is responsible. We don’t know what it is yet, but now, for the first time, astronomers have clearly seen the effects of dark energy. Using the Chandra X-ray Observatory, scientists have tracked how dark energy has stifled the growth of galaxy clusters. Combining this new data with previous studies, scientists have obtained the best clues yet about what dark energy is, confirming its existence. And there’s good news, too: the expanding Universe won’t rip itself apart.

Previous methods of dark energy research measured Type Ia supernovae. The new X-ray results provide a crucial and long-sought independent test of dark energy, one that depends on how gravity competes with accelerated expansion in the growth of cosmic structures.

“This result could be described as ‘arrested development of the universe’,” said Alexey Vikhlinin of the Smithsonian Astrophysical Observatory in Cambridge, Mass., who led the research. “Whatever is forcing the expansion of the universe to speed up is also forcing its development to slow down.”

Vikhlinin and his colleagues used Chandra to observe the hot gas in dozens of galaxy clusters, which are the largest collapsed objects in the universe. Some of these clusters are relatively close and others are more than halfway across the universe.

The results show that the increase in mass of galaxy clusters over time aligns with a universe dominated by dark energy. It is more difficult for objects like galaxy clusters to grow when space is stretched, as caused by dark energy. Vikhlinin and his team see this effect clearly in their data. The results are remarkably consistent with those from the distance measurements, revealing that general relativity applies, as expected, on large scales.

Previously, it wasn’t known for sure whether dark energy is a constant across space, with a strength that never changes with distance or time, or whether it is a function of space itself, so that as space expands, the dark energy would expand and grow stronger with it. In other words, it wasn’t known whether Einstein’s theory of general relativity with his cosmological constant is correct, or whether the theory would have to be modified on large scales.

But the Chandra study strengthens the evidence that dark energy is the cosmological constant, and is not growing in strength with time, which would cause the Universe to eventually rip itself apart.

“Putting all of this data together gives us the strongest evidence yet that dark energy is the cosmological constant, or in other words, that ‘nothing weighs something’,” said Vikhlinin. “A lot more testing is needed, but so far Einstein’s theory is looking as good as ever.”

These results have consequences for predicting the ultimate fate of the universe. If dark energy is explained by the cosmological constant, the expansion of the universe will continue to accelerate, and everything will disappear from the sight of the Milky Way and its gravitationally bound neighbor galaxy, Andromeda. This won’t happen soon, but Vikhlinin said, “Double the age of the Universe from today, and you will see a strong effect. An astronomer would say this may be a good time to fund cosmological research, because further down the road there will be nothing to observe!”

Vikhlinin’s paper can be found here.

Source: Chandra Press Release, press conference

‘Laser Comb’ To Measure the Accelerating Universe

Back in April, UT published an article about using a device called a ‘laser comb’ to search for Earth-like planets. But astronomers also hope to use the device to search for dark energy in an ambitious project that would measure the velocities of distant galaxies and quasars over a 20-year period. This would let astronomers test Einstein’s theory of general relativity and the nature of the mysterious dark energy. The device uses femtosecond (one millionth of one billionth of a second) pulses of laser light coupled with an atomic clock to provide a precise standard for measuring wavelengths of light. Also known as an “astro-comb,” these devices should give astronomers the ability to measure the spectral lines of starlight with the Doppler shift method at a precision up to 60 times greater than any current high-tech method. Astronomers have been testing the device, and hope to use one in conjunction with the new Extremely Large Telescope being designed by ESO, the European Southern Observatory.

Astronomers use instruments called spectrographs to spread the light from celestial objects into its component colors, or frequencies, in the same way water droplets create a rainbow from sunlight. They can then measure the velocities of stars, galaxies and quasars, search for planets around other stars, or study the expansion of the Universe. A spectrograph must be accurately calibrated so that the frequencies of light can be correctly measured. This is similar to how we need accurate rulers to measure lengths correctly. In the present case, a laser provides a sort of ruler, for measuring colors rather than distances, with an extremely accurate and fine grid.
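
The Doppler method itself fits in a few lines. Here is a minimal sketch (illustrative numbers, not from the ESO instrument):

```python
# Radial velocity from a spectral line's wavelength shift, in the
# non-relativistic limit: v ~ c * (delta lambda / lambda).
C = 299_792_458.0  # speed of light, m/s

def radial_velocity(lambda_rest_nm, lambda_obs_nm):
    """Line-of-sight velocity in m/s from rest and observed wavelengths."""
    return C * (lambda_obs_nm - lambda_rest_nm) / lambda_rest_nm

# A 1 cm/s velocity precision means resolving a fractional wavelength
# shift of ~3e-11: for a 500 nm line, a shift of only ~1.7e-8 nm.
print(radial_velocity(500.0, 500.0 + 1.7e-8))  # ~0.01 m/s
```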

New, extremely precise spectrographs will be needed in experiments planned for the future Extremely Large Telescope.

“We’ll need something beyond what current technology can offer, and that’s where the laser frequency comb comes in. It is worth recalling that the kind of precision required, 1 cm/s, corresponds, on the focal plane of a typical high-resolution spectrograph, to a shift of a few tenths of a nanometre, that is, the size of some molecules,” explains PhD student and team member Constanza Araujo-Hauck from ESO.

The new calibration technique comes from the combination of astronomy and quantum optics, in a collaboration between researchers at ESO and the Max Planck Institute for Quantum Optics. It uses ultra-short pulses of laser light to create a ‘frequency comb’ – light at many frequencies separated by a constant interval – to create just the kind of precise ‘ruler’ needed to calibrate a spectrograph.
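
Conceptually, the comb’s calibration lines sit at perfectly regular optical frequencies set by two radio-frequency numbers that an atomic clock can discipline. A hedged sketch with illustrative values:

```python
# A frequency comb's "teeth" lie at f_n = f_ceo + n * f_rep, where f_rep
# is the laser's pulse repetition rate and f_ceo the carrier-envelope
# offset frequency. Both values below are hypothetical.
F_REP = 250e6  # repetition rate, Hz
F_CEO = 40e6   # carrier-envelope offset, Hz

def tooth_frequency(n):
    """Optical frequency of the n-th comb line, in Hz."""
    return F_CEO + n * F_REP

# Near 500 THz (visible light) the comb gives a dense, perfectly regular
# grid of known frequencies against which a spectrograph is calibrated.
n0 = round((500e12 - F_CEO) / F_REP)
print([tooth_frequency(n0 + k) / 1e12 for k in range(3)])  # in THz
```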

The device has been tested on a solar telescope, and a new version of the system is now being built for the HARPS planet-finder instrument on ESO’s 3.6-metre telescope at La Silla in Chile, before being considered for future generations of instruments.

More information on laser combs.

Source: ESO

Cosmic ‘Needle in a Haystack’ Confirms Dark Energy

The bright blue blob is an ancient galaxy cluster. Credits: ESA XMM-Newton/EPIC, LBT/LBC, AIP (J. Kohnert)

A massive cluster of galaxies seen in the distant universe by ESA’s orbiting XMM-Newton X-ray observatory is so big that astronomers believe there can only be a few of them that far away in space and time. “Such massive galaxy clusters are thought to be rare objects in the distant Universe,” said Georg Lamer of the Astrophysikalisches Institut Potsdam in Germany. “They can be used to test cosmological theories. Indeed, the very presence of this cluster confirms the existence of a mysterious component of the Universe called dark energy.” The astronomers compared the rare find to a cosmic ‘needle in a haystack.’

The newly-discovered monster, known by the catalogue number 2XMM J083026+524133, is 7.7 thousand million light-years distant and is estimated to contain as much mass as a thousand large galaxies. Much of it is in the form of 100-million-degree hot gas. The bright blue blob of gas was found during a systematic analysis of catalogued objects, as Lamer and his team were looking for patches of X-rays that could be either nearby galaxies or distant clusters of galaxies.

Based on 3,500 observations performed with XMM-Newton’s European Photon Imaging Camera (EPIC), covering about 1% of the entire sky, the catalogue contains more than 190,000 individual X-ray sources. J083026+524133 stood out because it was so bright. While checking optical images from the Sloan Digital Sky Survey, the team could not find any obvious nearby galaxy at that location. So they turned to the Large Binocular Telescope in Arizona and took a deep exposure, which revealed a cluster of galaxies there.

The astronomers were surprised to find the cluster contains a thousand times the mass of our own Milky Way Galaxy.

No one knows what dark energy is, but it is causing the expansion of the Universe to accelerate. This hampers the growth of massive galaxy clusters in more recent times, indicating that they must have formed earlier in the Universe. “The existence of the cluster can only be explained with dark energy,” says Lamer.

Yet he does not expect to find more of them in the XMM-Newton catalogue. “According to the current cosmological theories, we should only expect to find this one cluster in the 1% of sky that we have searched,” says Lamer.

Source: ESA

The Cosmic Void: Could we be in the Middle of it?

Is our region of space unique? As in there isn't much here? Credit: ESO. Edit: Ian O'Neill

On large scales, the Universe is homogeneous and isotropic. This means that no matter where you are located in the cosmos, give or take the occasional nebula or galactic cluster, the night sky will appear approximately the same. Naturally there is some ‘clumpiness’ in the distribution of the stars and galaxies, but generally the density of any given location will be the same as a location hundreds of millions of light-years away. This assumption is known as the Copernican Principle. By invoking the Copernican Principle, astronomers have predicted the existence of the elusive dark energy, accelerating the galaxies away from one another, thus expanding the Universe. But what if this basic assumption is incorrect? What if our region of the Universe is unique, in that we are sitting in a location where the average density is a lot lower than in other regions of space? Suddenly our observations of light from Type Ia supernovae are not anomalous and can be explained by the local void. If this were the case, dark energy (or any other exotic substance for that matter) wouldn’t be required to explain the nature of our Universe after all…

Dark energy is a hypothetical energy predicted to permeate the Cosmos, causing the observed expansion of the Universe. This strange energy is believed to account for 73% of the total mass-energy (via E = mc²) of the Universe. But where is the evidence for dark energy? One of the main tools for measuring the accelerated expansion of the Universe is to analyse the redshift of a distant object with a known brightness. In a Universe filled with stars, what object provides a “standard” brightness?
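
The answer hinges on the inverse-square law, sketched here in its standard textbook form:

```latex
% If every standard candle peaks at (nearly) the same luminosity L, the
% flux F we measure fixes its distance:
\[
  F \;=\; \frac{L}{4\pi d^{2}}
  \quad\Longrightarrow\quad
  d \;=\; \sqrt{\frac{L}{4\pi F}} .
\]
% A supernova that appears dimmer than expected at a given redshift is
% therefore farther away than expected, which is the key observation
% behind the claimed acceleration.
```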

The progenitor of a Type Ia Supernova. Credit: NASA, ESA, and A. Field (STScI)

Type Ia supernovae are known as ‘standard candles’ for this very reason. No matter where they explode in the observable universe, they will always blow with the same amount of energy. So, in the mid-1990s, astronomers observed distant Type Ia supernovae to be a little dimmer than expected. Given the basic assumption (it may be an accepted view, but it is an assumption all the same) that the Universe obeys the Copernican Principle, this dimming suggested that there was some force in the Universe causing not only an expansion, but an accelerated expansion of the Universe. This mystery force was dubbed dark energy, and it is now a commonly held view that the cosmos must be filled with it to explain these observations. (There are other lines of evidence for the existence of dark energy, but this is a critical one.)

A new publication led by Timothy Clifton, from the University of Oxford, UK, investigates the controversial suggestion that the widely accepted Copernican Principle is wrong. Perhaps we do exist in a unique region of space where the average density is much lower than in the rest of the Universe. The observations of distant supernovae suddenly wouldn’t require dark energy to explain the nature of the expanding Universe. No exotic substances, no modifications to gravity and no extra dimensions required.

Clifton explains that the supernova observations could be accounted for if we live in an extremely rarefied region, right near the centre of a void, and this void could be on a scale of the same order of magnitude as the observable Universe. If this were the case, the geometry of space-time would be different, influencing the passage of light in a different way than we’d expect. What’s more, he even goes as far as saying that any given observer has a high probability of finding themselves in such a location. However, in an inflationary Universe such as ours, the likelihood of the generation of such a void is low, but it should be considered nonetheless. Finding ourselves in the middle of a unique region of space would outright violate the Copernican Principle and would have massive implications for all facets of cosmology. Quite literally, it would be a revolution.

The Copernican Principle is an assumption that forms the bedrock of cosmology. As pointed out by Amanda Gefter at New Scientist, this assumption should be open to scrutiny. After all, good science should not be akin to religion where an assumption (or belief) becomes unquestionable. Although Clifton’s study is speculative for now, it does pose some interesting questions about our understanding of the Universe and whether we are willing to test our fundamental ideas.

Sources: arXiv:0807.1443v1 [astro-ph], New Scientist Blog