Astronomers Now Closer to Understanding Dark Energy

This Hubble Space Telescope image shows the inner regions of the lensing cluster Abell 1689, which lies 2.2 billion light-years away. Light from distant background galaxies is bent by the concentrated dark matter in the cluster (shown in the blue overlay) to produce the plethora of arcs and arclets that were in turn used to constrain dark energy. Image courtesy of NASA/ESA, Jullo (JPL), Natarajan (Yale), Kneib (LAM)

Understanding something we can’t see is a problem astronomers have overcome many times in the past. Now, a group of scientists believes a new technique will meet the challenge of helping to solve one of the biggest mysteries in cosmology today: understanding the nature of dark energy. Using the strong gravitational lensing method — where a massive galaxy cluster acts as a cosmic magnifying lens — an international team of astronomers has been able to study elusive dark energy for the first time. The team reports that when combined with existing techniques, their results significantly improve current measurements of the mass and energy content of the universe.

Using data taken by the Hubble Space Telescope as well as ground-based telescopes, the team analyzed images of 34 extremely distant galaxies situated behind Abell 1689, one of the biggest and most massive known galaxy clusters in the universe.

Through the gravitational lens of Abell 1689, the astronomers, led by Eric Jullo of JPL and Priyamvada Natarajan of Yale University, were able to detect the faint, distant background galaxies—whose light was bent and projected by the cluster’s massive gravitational pull—much as a magnifying glass distorts an object’s image.

Using this method in combination with other techniques, they were able to reduce the overall error in dark energy’s equation-of-state parameter by 30 percent.

The way in which the images were distorted gave the astronomers clues as to the geometry of the space that lies between the Earth, the cluster and the distant galaxies. “The content, geometry and fate of the universe are linked, so if you can constrain two of those things, you learn something about the third,” Natarajan said.

The team narrowed the range of current estimates of dark energy’s effect on the universe, denoted by the value w, by combining their new technique with other methods, including supernovae, X-ray galaxy clusters and data from the Wilkinson Microwave Anisotropy Probe (WMAP) spacecraft.

“Dark energy is characterized by the relationship between its pressure and its density: this is known as its equation of state,” said Jullo. “Our goal was to try to quantify this relationship. It teaches us about the properties of dark energy and how it has affected the development of the Universe.”
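
In symbols, that relationship is w = p/(ρc²): a cosmological constant has w = −1 exactly, and measurements like this one ask how far w may stray from that value. Below is a minimal sketch of the bookkeeping, with an indicative (not measured) density value; it is an illustration, not the team’s analysis code.

```python
# Equation-of-state bookkeeping: w = p / (rho * c**2).
# A cosmological constant has pressure equal to minus its energy density.

C = 299_792_458.0  # speed of light, m/s

def eos_parameter(pressure_pa, energy_density_kg_m3):
    """Dimensionless w = p / (rho c^2)."""
    return pressure_pa / (energy_density_kg_m3 * C**2)

rho_lambda = 6e-27                 # rough dark-energy density, kg/m^3 (assumed)
p_lambda = -rho_lambda * C**2      # cosmological-constant pressure, Pa
print(eos_parameter(p_lambda, rho_lambda))  # -1.0
```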

Dark energy makes up about 72 percent of all the mass and energy in the universe and will ultimately determine its fate. The new results confirm previous findings that the nature of dark energy likely corresponds to a flat universe. In this scenario, the expansion of the universe will continue to accelerate and the universe will expand forever.

The astronomers say the real strength of this new result is that it provides a totally new way to extract information about elusive dark energy, one that offers great promise for future applications.

According to the scientists, their method required multiple, meticulous steps to develop. They spent several years developing specialized mathematical models and precise maps of the matter — both dark and “normal” — that together constitute the Abell 1689 cluster.

The findings appear in the August 20 issue of the journal Science.

Sources: Yale University, Science Express, ESA Hubble.

New Technique Could Track Down Dark Energy

Robert C. Byrd Green Bank Telescope CREDIT: NRAO/AUI/NSF


From an NRAO press release:

Dark energy is the label scientists have given to what is causing the Universe to expand at an accelerating rate, and is believed to make up nearly three-fourths of the mass and energy of the Universe. While the acceleration was discovered in 1998, its cause remains unknown. Physicists have advanced competing theories to explain the acceleration, and believe the best way to test those theories is to precisely measure large-scale cosmic structures. A new technique developed for the Robert C. Byrd Green Bank Telescope (GBT) has given astronomers a new way to map such large cosmic structures and thereby probe dark energy.

Sound waves in the matter-energy soup of the extremely early Universe are thought to have left detectable imprints on the large-scale distribution of galaxies in the Universe. The researchers developed a way to measure such imprints by observing the radio emission of hydrogen gas. Their technique, called intensity mapping, when applied to greater areas of the Universe, could reveal how such large-scale structure has changed over the last few billion years, giving insight into which theory of dark energy is the most accurate.

“Our project mapped hydrogen gas to greater cosmic distances than ever before, and shows that the techniques we developed can be used to map huge volumes of the Universe in three dimensions and to test the competing theories of dark energy,” said Tzu-Ching Chang, of the Academia Sinica in Taiwan and the University of Toronto.

To get their results, the researchers used the GBT to study a region of sky that previously had been surveyed in detail in visible light by the Keck II telescope in Hawaii. This optical survey used spectroscopy to map the locations of thousands of galaxies in three dimensions. With the GBT, instead of looking for hydrogen gas in these individual, distant galaxies — a daunting challenge beyond the technical capabilities of current instruments — the team used their intensity-mapping technique to accumulate the radio waves emitted by the hydrogen gas in large volumes of space including many galaxies.
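
The bookkeeping behind intensity mapping starts from the standard line-redshift relation: neutral hydrogen emits at a rest frequency of 1420.4 MHz, so the observed frequency of the accumulated emission tells you the redshift of the gas. A minimal sketch of that conversion (not the team’s pipeline):

```python
# Convert an observed 21 cm line frequency to redshift: f_obs = f_rest / (1 + z).

F_HI_REST = 1420.405751  # rest frequency of the neutral-hydrogen line, MHz

def redshift_from_observed(f_obs_mhz):
    return F_HI_REST / f_obs_mhz - 1.0

# Example frequencies (chosen for illustration): emission received at
# 700 and 800 MHz left the gas at redshifts of about 1.03 and 0.78.
for f in (700.0, 800.0):
    print(f, "MHz ->", round(redshift_from_observed(f), 2))
```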

“Since the early part of the 20th Century, astronomers have traced the expansion of the Universe by observing galaxies. Our new technique allows us to skip the galaxy-detection step and gather radio emissions from a thousand galaxies at a time, as well as all the dimly-glowing material between them,” said Jeffrey Peterson, of Carnegie Mellon University.

The astronomers also developed new techniques that removed both man-made radio interference and radio emission from nearer astronomical sources, leaving only the extremely faint radio waves coming from the very distant hydrogen gas. The result was a map of part of the “cosmic web” that correlated neatly with the structure shown by the earlier optical study. The team first proposed their intensity-mapping technique in 2008, and their GBT observations were the first test of the idea.

“These observations detected more hydrogen gas than all the previously-detected hydrogen in the Universe, and at distances ten times farther than any radio wave-emitting hydrogen seen before,” said Ue-Li Pen of the University of Toronto.

“This is a demonstration of an important technique that has great promise for future studies of the evolution of large-scale structure in the Universe,” said National Radio Astronomy Observatory Chief Scientist Chris Carilli, who was not part of the research team.

In addition to Chang, Peterson, and Pen, the research team included Kevin Bandura of Carnegie Mellon University. The scientists reported their work in the July 22 issue of the scientific journal Nature.

This is Getting Boring: General Relativity Passes Yet Another Big Test!

Princeton University scientists (from left) Reinabelle Reyes, James Gunn and Rachel Mandelbaum led a team that analyzed more than 70,000 galaxies and demonstrated that the universe - at least up to a distance of 3.5 billion light years from Earth - plays by the rules set out by Einstein in his theory of general relativity. (Photo: Brian Wilson)

Published in 1915, Einstein’s theory of general relativity (GR) passed its first big test just a few years later, when the predicted gravitational deflection of light passing near the Sun was observed during the 1919 solar eclipse.

In 1960, GR passed its first big test in a lab here on Earth: the Pound-Rebka experiment. And over the nine decades since its publication, GR has passed test after test after test, always with flying colors (check out this review for an excellent summary).

But the tests have always been within the solar system, or otherwise indirect.

Now a team led by Princeton University scientists has tested GR to see if it holds true at cosmic scales. And, after two years of analyzing astronomical data, the scientists have concluded that Einstein’s theory works as well over vast distances as in more local regions of space.

A partial map of the distribution of galaxies in the SDSS, going out to a distance of 7 billion light years. The amount of galaxy clustering that we observe today is a signature of how gravity acted over cosmic time, and allows us to test whether general relativity holds over these scales. (M. Blanton, SDSS)

The scientists’ analysis of more than 70,000 galaxies demonstrates that the universe – at least up to a distance of 3.5 billion light years from Earth – plays by the rules set out by Einstein in his famous theory. While GR has been accepted by the scientific community for over nine decades, until now no one had tested the theory so thoroughly and robustly at distances and scales that go way beyond the solar system.

Reinabelle Reyes, a Princeton graduate student in the Department of Astrophysical Sciences, along with co-authors Rachel Mandelbaum, an associate research scholar, and James Gunn, the Eugene Higgins Professor of Astronomy, outlined their assessment in the March 11 edition of Nature.

Other scientists collaborating on the paper include Tobias Baldauf, Lucas Lombriser and Robert Smith of the University of Zurich and Uros Seljak of the University of California-Berkeley.

The results are important, they said, because they shore up current theories explaining the shape and direction of the universe, including ideas about dark energy, and dispel some hints from other recent experiments that general relativity may be wrong.

“All of our ideas in astronomy are based on this really enormous extrapolation, so anything we can do to see whether this is right or not on these scales is just enormously important,” Gunn said. “It adds another brick to the foundation that underlies what we do.”

GR is one of two core theories underlying all of contemporary astrophysics and cosmology (the other is the Standard Model of particle physics, a quantum theory); it explains everything from black holes to the Big Bang.

In recent years, several alternatives to general relativity have been proposed. These modified theories of gravity depart from general relativity on large scales to circumvent the need for dark energy, dark matter, or both. But because these theories were designed to match the predictions of general relativity about the expansion history of the universe, a factor that is central to current cosmological work, it has become crucial to know which theory is correct, or at least represents reality as best as can be approximated.

“We knew we needed to look at the large-scale structure of the universe and the growth of smaller structures composing it over time to find out,” Reyes said. The team used data from the Sloan Digital Sky Survey (SDSS), a long-term, multi-institution telescope project mapping the sky to determine the position and brightness of several hundred million galaxies and quasars.

By calculating the clustering of these galaxies, which stretch nearly one-third of the way to the edge of the universe, and analyzing their velocities and distortion from intervening material – due to weak lensing, primarily by dark matter – the researchers have shown that Einstein’s theory explains the nearby universe better than alternative theories of gravity.

Some of the 70,000 luminous galaxies in SDSS analyzed (Image: SDSS Collaboration)

The Princeton scientists studied the effects of gravity on the SDSS galaxies and clusters of galaxies over long periods of time. They observed how this fundamental force drives galaxies to clump into larger collections of galaxies and how it shapes the expansion of the universe.

Critically, because relativity calls for the curvature of space to be equal to the curvature of time, the researchers could calculate whether light was influenced in equal amounts by both, as it should be if general relativity holds true.
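
This comparison is often packaged as a single statistic, E_G, which in general relativity with a cosmological constant is expected to be roughly Ω_m,0/f(z), where f(z) ≈ Ω_m(z)^0.55 is the growth rate of structure. A hedged back-of-the-envelope version with assumed parameter values (not the paper’s pipeline) reproduces the expected value of about 0.4:

```python
# GR prediction for the E_G statistic: E_G(z) ~ Omega_m0 / f(z),
# with the standard growth-rate approximation f(z) ~ Omega_m(z)**0.55.

OMEGA_M0 = 0.25  # present-day matter density fraction (assumed)
OMEGA_L0 = 0.75  # cosmological-constant fraction (flat universe assumed)

def omega_m(z):
    a3 = (1.0 + z) ** 3
    return OMEGA_M0 * a3 / (OMEGA_M0 * a3 + OMEGA_L0)

def growth_rate(z):
    return omega_m(z) ** 0.55

def e_g(z):
    return OMEGA_M0 / growth_rate(z)

# At the sample's effective redshift, z ~ 0.32:
print(round(e_g(0.32), 2))  # ~0.4
```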

“This is the first time this test was carried out at all, so it’s a proof of concept,” Mandelbaum said. “There are other astronomical surveys planned for the next few years. Now that we know this test works, we will be able to use it with better data that will be available soon to more tightly constrain the theory of gravity.”

Firming up the predictive powers of GR can help scientists better understand whether current models of the universe make sense, the scientists said.

“Any test we can do in building our confidence in applying these very beautiful theoretical things but which have not been tested on these scales is very important,” Gunn said. “It certainly helps when you are trying to do complicated things to understand fundamentals. And this is a very, very, very fundamental thing.”

“The nice thing about going to the cosmological scale is that we can test any full, alternative theory of gravity, because it should predict the things we observe,” said co-author Uros Seljak, a professor of physics and of astronomy at UC Berkeley and a faculty scientist at Lawrence Berkeley National Laboratory who is currently on leave at the Institute of Theoretical Physics at the University of Zurich. “Those alternative theories that do not require dark matter fail these tests.”

Sources: “Princeton scientists say Einstein’s theory applies beyond the solar system” (Princeton University), “Study validates general relativity on cosmic scale, existence of dark matter” (University of California Berkeley), “Confirmation of general relativity on large scales from weak lensing and galaxy velocities” (Nature, arXiv preprint)

Using Gravitational Lensing to Measure Age and Size of Universe

A gravitational lens image of the B1608+656 system. Image courtesy of Sherry Suyu of the Argelander Institut für Astronomie in Bonn, Germany.


Handy little tool, this gravitational lensing! Astronomers have used it to measure the shape of stars, look for exoplanets, and measure dark matter in distant galaxies. Now it’s being used to measure the size and age of the Universe. Researchers say this new use of gravitational lensing provides a very precise way to measure how rapidly the universe is expanding. The measurement determines a value for the Hubble constant, which indicates the size of the universe, and confirms the age of the Universe as 13.75 billion years, give or take 170 million years. The results also confirm the strength of dark energy, which is responsible for accelerating the expansion of the universe.

Gravitational lensing occurs when two galaxies happen to be aligned with one another along our line of sight in the sky. The gravitational field of the nearer galaxy distorts the image of the more distant galaxy into multiple arc-shaped images. Sometimes this effect even creates a complete ring, known as an “Einstein Ring.”
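
For a sense of the angular scales involved, the size of such a ring is set by the lens’s Einstein radius. A rough point-mass sketch, with made-up mass and distances and ignoring the expansion of the universe (real lens modelling uses angular-diameter distances):

```python
# Point-mass Einstein radius: theta_E = sqrt(4GM/c^2 * D_ls / (D_l * D_s)).

import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 299_792_458.0    # speed of light, m/s
MPC = 3.0857e22      # metres per megaparsec
M_SUN = 1.989e30     # solar mass, kg

def einstein_radius_arcsec(mass_kg, d_lens_m, d_source_m):
    d_ls = d_source_m - d_lens_m  # crude; exact only without expansion
    theta = math.sqrt(4 * G * mass_kg / C**2 * d_ls / (d_lens_m * d_source_m))
    return math.degrees(theta) * 3600.0

# A 10^12 solar-mass galaxy halfway to a source at 2000 Mpc (invented values):
print(round(einstein_radius_arcsec(1e12 * M_SUN, 1000 * MPC, 2000 * MPC), 2))
# ~2 arcseconds: the right ballpark for galaxy-scale lenses.
```
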
Researchers at the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC) used gravitational lensing to measure the distances light traveled from a bright, active galaxy to Earth along different paths. By understanding the time it took to travel along each path and the effective speeds involved, researchers could infer not just how far away the galaxy lies but also the overall scale of the universe and some details of its expansion.

Distinguishing distances in space is difficult. A bright light far away and a dimmer source lying much closer can look like they are at the same distance. A gravitational lens circumvents this problem by providing multiple clues as to the distance light travels. That extra information allows astronomers to determine the size of the universe, often expressed by astrophysicists in terms of a quantity called Hubble’s constant.

“We’ve known for a long time that lensing is capable of making a physical measurement of Hubble’s constant,” KIPAC’s Phil Marshall said. However, gravitational lensing had never before been used in such a precise way. The new measurement of Hubble’s constant is as precise as those from long-established tools such as supernova observations and the cosmic microwave background. “Gravitational lensing has come of age as a competitive tool in the astrophysicist’s toolkit,” Marshall said.

When a large nearby object, such as a galaxy, blocks a distant object, such as another galaxy, the light can detour around the blockage. But instead of taking a single path, light can bend around the object along two, or even four, different routes, doubling or quadrupling the amount of information scientists receive. As the brightness of the background galaxy nucleus fluctuates, physicists can measure the ebb and flow of light from the distinct paths, such as in the B1608+656 system that was the subject of this study. Lead author Sherry Suyu, of the University of Bonn, said, “In our case, there were four copies of the source, which appear as a ring of light around the gravitational lens.”

Though researchers do not know when light left its source, they can still compare arrival times. Marshall likens it to four cars taking four different routes between places on opposite sides of a large city, such as from Stanford University to Lick Observatory, through or around San Jose. And like automobiles facing traffic snarls, light can encounter delays, too.

“The traffic density in a big city is like the mass density in a lens galaxy,” Marshall said. “If you take a longer route, it need not lead to a longer delay time. Sometimes the shorter distance is actually slower.”

The gravitational lens equations account for all the variables such as distance and density, and provide a better idea of when light left the background galaxy and how far it traveled.
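
The reason a delay pins down the expansion rate is a simple scaling: for a fixed lens model, every predicted delay is proportional to 1/H0, so comparing a delay predicted under a fiducial H0 with the measured one rescales H0 directly. A toy illustration with invented numbers:

```python
# Time-delay scaling: delay ~ 1/H0 for a fixed lens model,
# so H0_true = H0_fiducial * (model delay / measured delay).

H0_FIDUCIAL = 70.0          # km/s/Mpc, assumed when modelling the lens
model_delay_days = 32.0     # delay predicted at the fiducial H0 (invented)
measured_delay_days = 30.5  # delay observed between two images (invented)

h0_inferred = H0_FIDUCIAL * model_delay_days / measured_delay_days
print(round(h0_inferred, 1), "km/s/Mpc")  # ~73.4
```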

In the past, this method of distance estimation was plagued by errors, but physicists now believe it is comparable with other measurement methods. With this technique, the researchers have come up with a more accurate lensing-based value for Hubble’s constant, and a better estimation of the uncertainty in that constant. By both reducing and understanding the size of error in calculations, they can achieve better estimations on the structure of the lens and the size of the universe.

There are several factors scientists still need to account for in determining distances with lenses. For example, dust in the lens can skew the results. The Hubble Space Telescope has infrared filters useful for eliminating dust effects. The images also contain information about the number of galaxies lying along the line of sight; these contribute to the lensing effect at a level that needs to be taken into account.

Marshall says several groups are working on extending this research, both by finding new systems and further examining known lenses. Researchers are already aware of more than twenty other astronomical systems suitable for analysis with gravitational lensing.

The results of this study were published in the March 1 issue of The Astrophysical Journal. The researchers used data collected by the NASA/ESA Hubble Space Telescope, and showed the improved precision these data provide in combination with the Wilkinson Microwave Anisotropy Probe (WMAP).

Source: SLAC

Dark Matter in Distant Galaxy Groups Mapped for the First Time

X-ray emission in the COSMOS field (XMM-Newton/ESA)

Galaxy density in the Cosmic Evolution Survey (COSMOS) field, with colors representing the redshift of the galaxies, ranging from a redshift of 0.2 (blue) to 1 (red). Pink x-ray contours show the extended x-ray emission as observed by XMM-Newton.

Dark matter (actually cold, dark – non-baryonic – matter) can be detected only by its gravitational influence. In clusters and groups of galaxies, that influence shows up as weak gravitational lensing, which is difficult to nail down. One way to much more accurately estimate the degree of gravitational lensing – and so the distribution of dark matter – is to use the x-ray emission from the hot intra-cluster plasma to locate the center of mass.

And that’s just what a team of astronomers has recently done … and they have, for the first time, given us a handle on how dark matter has evolved over the last several billion years.

COSMOS is an astronomical survey designed to probe the formation and evolution of galaxies as a function of cosmic time (redshift) and large scale structure environment. The survey covers a 2 square degree equatorial field with imaging by most of the major space-based telescopes (including Hubble and XMM-Newton) and a number of ground-based telescopes.

Understanding the nature of dark matter is one of the key open questions in modern cosmology. In one of the approaches used to address this question, astronomers use the mass-luminosity relationship that has been found for clusters of galaxies, which links their x-ray emission (an indication of the mass of the ordinary “baryonic” matter alone; of course, baryonic matter includes electrons, which are leptons!) to their total masses (baryonic plus dark matter) as determined by gravitational lensing.

To date the relationship has only been established for nearby clusters. New work by an international collaboration, including the Max Planck Institute for Extraterrestrial Physics (MPE), the Laboratory of Astrophysics of Marseilles (LAM), and Lawrence Berkeley National Laboratory (Berkeley Lab), has made major progress in extending the relationship to more distant and smaller structures than was previously possible.

To establish the link between x-ray emission and underlying dark matter, the team used one of the largest samples of x-ray-selected groups and clusters of galaxies, produced by ESA’s x-ray observatory, XMM-Newton.

Groups and clusters of galaxies can be effectively found using their extended x-ray emission on sub-arcminute scales. As a result of its large effective area, XMM-Newton is the only x-ray telescope that can detect the faint level of emission from distant groups and clusters of galaxies.

“The ability of XMM-Newton to provide large catalogues of galaxy groups in deep fields is astonishing,” said Alexis Finoguenov of the MPE and the University of Maryland, a co-author of the recent Astrophysical Journal (ApJ) paper which reported the team’s results.

Since x-rays are the best way to find and characterize clusters, most follow-up studies have until now been limited to relatively nearby groups and clusters of galaxies.

“Given the unprecedented catalogues provided by XMM-Newton, we have been able to extend measurements of mass to much smaller structures, which existed much earlier in the history of the Universe,” says Alexie Leauthaud of Berkeley Lab’s Physics Division, the first author of the ApJ study.

COSMOS-XCL095951+014049 (Subaru/NAOJ, XMM-Newton/ESA)

Gravitational lensing occurs because mass curves the space around it, bending the path of light: the more mass (and the closer it is to the center of mass), the more space bends, and the more the image of a distant object is displaced and distorted. Thus measuring distortion, or ‘shear’, is key to measuring the mass of the lensing object.

In the case of weak gravitational lensing (as used in this study) the shear is too subtle to be seen directly, but faint additional distortions in a collection of distant galaxies can be calculated statistically, and the average shear due to the lensing of some massive object in front of them can be computed. However, in order to calculate the lens’ mass from average shear, one needs to know its center.
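
A toy illustration of that statistical averaging, with invented numbers: each galaxy’s measured ellipticity is the coherent shear plus much larger intrinsic-shape noise, and the noise averages away as the sample grows.

```python
# Weak-lensing averaging: intrinsic shapes dominate per galaxy,
# but the mean over many galaxies converges to the shear.

import random

random.seed(42)
TRUE_SHEAR = 0.02    # coherent distortion from the lens (assumed)
SHAPE_NOISE = 0.3    # r.m.s. intrinsic ellipticity per galaxy (typical value)

def observed_ellipticity():
    return TRUE_SHEAR + random.gauss(0.0, SHAPE_NOISE)

for n in (10, 1_000, 100_000):
    estimate = sum(observed_ellipticity() for _ in range(n)) / n
    print(f"{n:>7} galaxies -> shear estimate {estimate:+.4f}")
```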

“The problem with high-redshift clusters is that it is difficult to determine exactly which galaxy lies at the centre of the cluster,” says Leauthaud. “That’s where x-rays help. The x-ray luminosity from a galaxy cluster can be used to find its centre very accurately.”

Knowing the centers of mass from the analysis of x-ray emission, Leauthaud and colleagues could then use weak lensing to estimate the total mass of the distant groups and clusters with greater accuracy than ever before.

The final step was to determine the x-ray luminosity of each galaxy cluster and plot it against the mass determined from the weak lensing, with the resulting mass-luminosity relation for the new collection of groups and clusters extending previous studies to lower masses and higher redshifts. Within calculable uncertainty, the relation follows the same straight slope from nearby galaxy clusters to distant ones; a simple consistent scaling factor relates the total mass (baryonic plus dark) of a group or cluster to its x-ray brightness, the latter measuring the baryonic mass alone.
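
A power law M = A·L^s is a straight line in log-log space, so the fit itself is ordinary least squares on the logarithms. A sketch on synthetic data (the numbers are made up; they are not the paper’s measurements):

```python
# Fit a mass-luminosity power law, M = A * L**s, as a line in log-log space.

import random

random.seed(1)
TRUE_SLOPE, TRUE_INTERCEPT = 0.75, 0.5  # used only to generate the toy data
log_L = [random.uniform(42.0, 45.0) for _ in range(50)]  # log10 luminosity
log_M = [TRUE_INTERCEPT + TRUE_SLOPE * x + random.gauss(0.0, 0.1)
         for x in log_L]

n = len(log_L)
mean_L, mean_M = sum(log_L) / n, sum(log_M) / n
slope = (sum((x - mean_L) * (y - mean_M) for x, y in zip(log_L, log_M))
         / sum((x - mean_L) ** 2 for x in log_L))
print("fitted slope:", round(slope, 2))  # recovers ~0.75
```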

“By confirming the mass-luminosity relation and extending it to high redshifts, we have taken a small step in the right direction toward using weak lensing as a powerful tool to measure the evolution of structure,” says Jean-Paul Kneib, a co-author of the ApJ paper from LAM and France’s National Center for Scientific Research (CNRS).

The origin of galaxies can be traced back to slight differences in the density of the hot, early Universe; traces of these differences can still be seen as minute temperature differences in the cosmic microwave background (CMB) – hot and cold spots.

“The variations we observe in the ancient microwave sky represent the imprints that developed over time into the cosmic dark-matter scaffolding for the galaxies we see today,” says George Smoot, director of the Berkeley Center for Cosmological Physics (BCCP), a professor of physics at the University of California at Berkeley, and a member of Berkeley Lab’s Physics Division. Smoot shared the 2006 Nobel Prize in Physics for measuring anisotropies in the CMB and is one of the authors of the ApJ paper. “It is very exciting that we can actually measure with gravitational lensing how the dark matter has collapsed and evolved since the beginning.”

One goal in studying the evolution of structure is to understand dark matter itself, and how it interacts with the ordinary matter we can see. Another goal is to learn more about dark energy, the mysterious phenomenon that is pushing matter apart and causing the Universe to expand at an accelerating rate. Many questions remain unanswered: Is dark energy constant, or is it dynamic? Or is it merely an illusion caused by a limitation in Einstein’s General Theory of Relativity?

The tools provided by the extended mass-luminosity relationship will do much to answer these questions about the opposing roles of gravity and dark energy in shaping the Universe, now and in the future.

Sources: ESA, and a paper published in the 20 January, 2010 issue of the Astrophysical Journal (arXiv:0910.5219 is the preprint)

ESA’s Tough Choice: Dark Matter, Sun Close Flyby, Exoplanets (Pick Two)

Thales Alenia Space and EADS Astrium concepts for Euclid (ESA)


Three candidate missions address key questions in fundamental physics and cosmology: the nature of the mysterious dark energy and dark matter (Euclid); the frequency of exoplanets around other stars, including Earth analogs (PLATO); and the closest look at our Sun yet possible, approaching to just 62 solar radii (Solar Orbiter). But only two can fly. What would be your picks?

These three mission concepts have been chosen by the European Space Agency’s Science Programme Committee (SPC) as candidates for two medium-class missions to be launched no earlier than 2017. They now enter the definition phase, the next step required before the final decision is taken as to which missions are implemented.

These three missions are the finalists from 52 proposals that were either made or carried forward in 2007. They were whittled down to just six mission proposals in 2008 and sent for industrial assessment. Now that the reports from those studies are in, the missions have been pared down again. “It was a very difficult selection process. All the missions contained very strong science cases,” says Lennart Nordh, Swedish National Space Board and chair of the SPC.

And the tough decisions are not yet over. Only two of the three missions (Euclid, PLATO and Solar Orbiter) can be selected for the M-class launch slots. All three missions present challenges that will have to be resolved during the definition phase. A specific challenge, of which the SPC was conscious, is the ability of these missions to fit within the available budget. The final decision about which missions to implement will be taken after the definition activities are completed, which is foreseen for mid-2011.
Euclid is an ESA mission to map the geometry of the dark Universe. The mission would investigate the distance-redshift relationship and the evolution of cosmic structures. It would achieve this by measuring shapes and redshifts of galaxies and clusters of galaxies out to redshifts ~2, or equivalently to a look-back time of 10 billion years. It would therefore cover the entire period over which dark energy played a significant role in accelerating the expansion.

By approaching as close as 62 solar radii, Solar Orbiter would view the solar atmosphere with high spatial resolution and combine this with measurements made in-situ. Over the extended mission periods Solar Orbiter would deliver images and data that would cover the polar regions and the side of the Sun not visible from Earth. Solar Orbiter would coordinate its scientific mission with NASA’s Solar Probe Plus within the joint HELEX program (Heliophysics Explorers) to maximize their combined science return.

Thales Alenia Space concept, from assessment phase (ESA)

PLATO (PLAnetary Transit and Oscillations of stars) would discover and characterize a large number of close-by exoplanetary systems, with a precision in the determination of mass and radius of 1%.

In addition, the SPC has decided to consider at its next meeting in June whether to also select a European contribution to the SPICA mission.

SPICA would be an infrared space telescope led by the Japanese Space Agency JAXA. It would provide ‘missing-link’ infrared coverage in the region of the spectrum between that seen by the ESA-NASA Webb telescope and the ground-based ALMA telescope. SPICA would focus on the conditions for planet formation and distant young galaxies.

“These missions continue the European commitment to world-class space science,” says David Southwood, ESA Director of Science and Robotic Exploration, “They demonstrate that ESA’s Cosmic Vision programme is still clearly focused on addressing the most important space science.”

Source: ESA chooses three scientific missions for further study

Universe to WMAP: ΛCDM Rules, OK?

Temperature and polarization around hot and cold spots (Credit: NASA / WMAP Science Team)

The Wilkinson Microwave Anisotropy Probe (WMAP) science team has finished analyzing seven full years of data from the little probe that could, and once again it seems we can sum up the universe in six parameters and a model.

Using the seven-year WMAP data, together with recent results on the large-scale distribution of galaxies, and an updated estimate of the Hubble constant, the present-day age of the universe is 13.75 (plus-or-minus 0.11) billion years, dark energy comprises 72.8% (+/- 1.5%) of the universe’s mass-energy, baryons 4.56% (+/- 0.16%), non-baryonic matter (CDM) 22.7% (+/- 1.4%), and the redshift of reionization is 10.4 (+/- 1.2).
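
As a quick sanity check, in the flat ΛCDM model assumed here those central values should sum to 100 percent, and they do within the quoted uncertainties:

```python
# Mass-energy budget check: central values and quoted (symmetric)
# uncertainties from the seven-year WMAP results, combined in quadrature.

dark_energy = (72.8, 1.5)       # percent, +/- percent
baryons = (4.56, 0.16)
cold_dark_matter = (22.7, 1.4)

components = (dark_energy, baryons, cold_dark_matter)
total = sum(value for value, _ in components)
sigma = sum(err**2 for _, err in components) ** 0.5
print(f"total = {total:.1f}% +/- {sigma:.1f}%")  # 100.1% +/- 2.1%
```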

In addition, the team reports several new cosmological constraints – the primordial abundance of helium (this rules out various alternative, ‘cold big bang’ models), and an estimate of a parameter which describes a feature of density fluctuations in the very early universe precisely enough to rule out a whole class of inflation models (those predicting the Harrison-Zel’dovich-Peebles spectrum), to take just two – as well as tighter limits on many others (number of neutrino species, mass of the neutrino, parity violations, axion dark matter, …).

The best eye-candy from the team’s six papers is the stacked temperature and polarization maps for hot and cold spots; if these spots are due to sound waves in matter, frozen in when radiation (photons) and baryons parted company – the cosmic microwave background (CMB) encodes all the details of this separation – then there should be nicely circular rings, of rather exact sizes, around the spots. Further, the polarization directions should switch from radial to tangential, from the center out (for cold spots; vice versa for hot spots).

And that’s just what the team found!

Concerning Dark Energy. Since the Five-Year WMAP results were published, several independent studies with direct relevance to cosmology have been published. The WMAP team took those from observations of the baryon acoustic oscillations (BAO) in the distribution of galaxies; of Cepheids, supernovae, and a water maser in local galaxies; of time-delay in a lensed quasar system; and of high redshift supernovae, and combined them to reduce the nooks and crannies in parameter space in which non-cosmological constant varieties of dark energy could be hiding. At least some alternative kinds of dark energy may still be possible, but for now Λ, the cosmological constant, rules.

Concerning Inflation. Very, very, very early in the life of the universe – so the theory of cosmic inflation goes – there was a period of dramatic expansion, and the tiny quantum fluctuations before inflation became the giant cosmic structures we see today. “Inflation predicts that the statistical distribution of primordial fluctuations is nearly a Gaussian distribution with random phases. Measuring deviations from a Gaussian distribution,” the team reports, “is a powerful test of inflation, as how precisely the distribution is (non-) Gaussian depends on the detailed physics of inflation.” While the limits on non-Gaussianity (as it is called), from analysis of the WMAP data, only weakly constrain various models of inflation, they do leave almost nowhere for cosmological models without inflation to hide.

Concerning ‘cosmic shadows’ (the Sunyaev-Zel’dovich (SZ) effect). While many researchers have looked for cosmic shadows in WMAP data before – perhaps the best known to the general public is the 2006 Lieu, Mittaz, and Zhang paper (the SZ effect: hot electrons in the plasma which pervades rich clusters of galaxies interact with CMB photons, via inverse Compton scattering) – the WMAP team’s recent analysis is their first to investigate this effect. They detect the SZ effect directly in the nearest rich cluster (Coma; Virgo is behind the Milky Way foreground), and also statistically by correlation with the location of some 700 relatively nearby rich clusters. While the WMAP team’s finding is consistent with data from x-ray observations, it is inconsistent with theoretical models. Back to the drawing board for astrophysicists studying galaxy clusters.

Seven Year Microwave Sky (Credit: NASA/WMAP Science Team)

I’ll wrap up by quoting Komatsu et al. “The standard ΛCDM cosmological model continues to be an exquisite fit to the existing data.”

Primary source: Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Cosmological Interpretation (arXiv:1001.4738). The five other Seven-Year WMAP papers are: Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Are There Cosmic Microwave Background Anomalies? (arXiv:1001.4758), Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Planets and Celestial Calibration Sources (arXiv:1001.4731), Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Sky Maps, Systematic Errors, and Basic Results (arXiv:1001.4744), Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Power Spectra and WMAP-Derived Parameters (arXiv:1001.4635), and Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Galactic Foreground Emission (arXiv:1001.4555). Also check out the official WMAP website.

New Search for Dark Energy Goes Back in Time

A previous optical image of one of the approximately 200 quasars captured in the Baryon Oscillation Spectroscopic Survey (BOSS) "first light" exposure is shown at top, with the BOSS spectrum of the object at bottom. The spectrum allows astronomers to determine the object's redshift. With millions of such spectra, BOSS will measure the geometry of the Universe. Credit: David Hogg, Vaishali Bhardwaj, and Nic Ross of SDSS-III

Baryon acoustic oscillation (BAO) sounds like it could be technobabble from a Star Trek episode. But BAO is real, and astronomers are searching for these ancient density fluctuations to do what seems like science fiction: look back in time for clues about dark energy. The Baryon Oscillation Spectroscopic Survey (BOSS), a part of the Sloan Digital Sky Survey III (SDSS-III), took its “first light” of astronomical data last month, and will map the expansion history of the Universe.

“Baryon oscillation is a fast-maturing method for measuring dark energy in a way that’s complementary to the proven techniques of supernova cosmology,” said David Schlegel from the Lawrence Berkeley National Laboratory (Berkeley Lab), the Principal Investigator of BOSS. “The data from BOSS will be some of the best ever obtained on the large-scale structure of the Universe.”

BOSS uses the same telescope as the original Sloan Digital Sky Survey — the 2.5-meter telescope at Apache Point Observatory in New Mexico — but equipped with new, specially-built spectrographs to measure the spectra.

Senior Operations Engineer Dan Long loads the first cartridge of the night into the Sloan Digital Sky Survey telescope. The cartridge holds a “plug-plate” at the top which then holds a thousand optical fibers shown in red and blue. These cartridges are locked into the base of the telescope and are changed many times during a night. Photo credit: D. Long

Baryon oscillations began when pressure waves traveled through the early universe. The same density variations left their mark as the Universe evolved, in the periodic clustering of visible matter in galaxies, quasars, and intergalactic gas, as well as in the clumping of invisible dark matter.

Comparing these scales at different eras makes it possible to trace the details of how the Universe has expanded throughout its history – information that can be used to distinguish among competing theories of dark energy.

“Like sound waves passing through air, the waves push some of the matter closer together as they travel,” said Nikhil Padmanabhan, a BOSS researcher who recently moved from Berkeley Lab to Yale University. “In the early universe, these waves were moving at half the speed of light, but when the universe was only a few hundred thousand years old, the universe cooled enough to halt the waves, leaving a signature 500 million light-years in length.”

“We can see these frozen waves in the distribution of galaxies today,” said Daniel Eisenstein of the University of Arizona, the Director of the SDSS-III. “By measuring the length of the baryon oscillations, we can determine how dark energy has affected the expansion history of the universe. That in turn helps us figure out what dark energy could be.”
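
That is the “standard ruler” idea: a feature of known physical length subtends an angle on the sky that depends on how far away it is. A small-angle sketch, with the distance assumed rather than computed from a full cosmological model:

```python
# Standard-ruler geometry: angle (radians) ~ ruler length / distance.

import math

RULER_MPC = 150.0   # comoving BAO scale, roughly 500 million light-years

def angle_deg(ruler_mpc, comoving_distance_mpc):
    return math.degrees(ruler_mpc / comoving_distance_mpc)

# At a comoving distance of ~3400 Mpc (roughly redshift 1 in a LambdaCDM
# cosmology; assumed here, not derived):
print(round(angle_deg(RULER_MPC, 3400.0), 1), "degrees")  # ~2.5
```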

“Studying baryon oscillations is an exciting method for measuring dark energy in a way that’s complementary to techniques in supernova cosmology,” said Kyle Dawson of the University of Utah, who is leading the commissioning of BOSS. “BOSS’s galaxy measurements will be a revolutionary dataset that will provide rich insights into the universe,” added Martin White of Berkeley Lab, BOSS’s survey scientist.

On Sept. 14-15, 2009, astronomers used BOSS to measure the spectra of a thousand galaxies and quasars. The goal of BOSS is to measure 1.4 million luminous red galaxies at redshifts up to 0.7 (when the Universe was roughly seven billion years old) and 160,000 quasars at redshifts between 2.0 and 3.0 (when the Universe was only about three billion years old). BOSS will also measure variations in the density of hydrogen gas between the galaxies. The observation program will take five years.

Source: Sloan Digital Sky Survey

Variability in Type 1A Supernovae Has Implications for Studying Dark Energy

A Hubble Space Telescope image of Supernova 1994D (SN 1994D) in the galaxy NGC 4526 (SN 1994D is the bright spot at lower left). Image Credit: HST


The discovery of dark energy, a mysterious force that is accelerating the expansion of the universe, was based on observations of type 1a supernovae, and these stellar explosions have long been used as “standard candles” for measuring the expansion. But not all type 1a supernovae are created equal. A new study reveals sources of variability in these supernovae, and to accurately probe the nature of dark energy and determine if it is constant or variable over time, scientists will have to find a way to measure cosmic distances with much greater precision than they have in the past.

“As we begin the next generation of cosmology experiments, we will want to use type 1a supernovae as very sensitive measures of distance,” said Daniel Kasen, lead author of a study published in Nature this week. “We know they are not all the same brightness, and we have ways of correcting for that, but we need to know if there are systematic differences that would bias the distance measurements. So this study explored what causes those differences in brightness.”

Kasen and his coauthors–Fritz Röpke of the Max Planck Institute for Astrophysics in Garching, Germany, and Stan Woosley, professor of astronomy and astrophysics at UC Santa Cruz–used supercomputers to run dozens of simulations of type 1a supernovae. The results indicate that much of the diversity observed in these supernovae is due to the chaotic nature of the processes involved and the resulting asymmetry of the explosions.

For the most part, this variability would not produce systematic errors in measurement studies as long as researchers use large numbers of observations and apply the standard corrections, Kasen said. The study did find a small but potentially worrisome effect that could result from systematic differences in the chemical compositions of stars at different times in the history of the universe. But researchers can use the computer models to further characterize this effect and develop corrections for it.

A type 1a supernova occurs when a white dwarf star acquires additional mass by siphoning matter away from a companion star. When it reaches a critical mass–1.4 times the mass of the Sun, packed into an object the size of the Earth–the heat and pressure in the center of the star spark a runaway nuclear fusion reaction, and the white dwarf explodes. Since the initial conditions are about the same in all cases, these supernovae tend to have the same luminosity, and their “light curves” (how the luminosity changes over time) are predictable.

Some are intrinsically brighter than others, but these flare and fade more slowly, and this correlation between the brightness and the width of the light curve allows astronomers to apply a correction to standardize their observations. So astronomers can measure the light curve of a type 1a supernova, calculate its intrinsic brightness, and then determine how far away it is, since the apparent brightness diminishes with distance (just as a candle appears dimmer at a distance than it does up close).
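
The final step is ordinary distance-modulus arithmetic. A hedged sketch, assuming a typical standardized peak absolute magnitude of about −19.3 for a type 1a supernova:

```python
# Distance modulus: m - M = 5 * log10(d / 10 pc), inverted for d.

def luminosity_distance_pc(m_apparent, M_absolute):
    return 10.0 ** ((m_apparent - M_absolute + 5.0) / 5.0)

# A standardized type 1a (M ~ -19.3) observed at apparent magnitude 21:
d_pc = luminosity_distance_pc(21.0, -19.3)
print(round(d_pc / 1e9, 2), "Gpc")  # ~1.15 Gpc
```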

The computer models used to simulate these supernovae in the new study are based on current theoretical understanding of how and where the ignition process begins inside the white dwarf and where it makes the transition from slow-burning combustion to explosive detonation.

The simulations showed that the asymmetry of the explosions is a key factor determining the brightness of type 1a supernovae. “The reason these supernovae are not all the same brightness is closely tied to this breaking of spherical symmetry,” Kasen said.

The dominant source of variability is the synthesis of new elements during the explosions, which is sensitive to differences in the geometry of the first sparks that ignite a thermonuclear runaway in the simmering core of the white dwarf. Nickel-56 is especially important, because the radioactive decay of this unstable isotope creates the afterglow that astronomers are able to observe for months or even years after the explosion.

“The decay of nickel-56 is what powers the light curve. The explosion is over in a matter of seconds, so what we see is the result of how the nickel heats the debris and how the debris radiates light,” Kasen said.
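
A toy version of that power source: a two-step decay chain, nickel-56 to cobalt-56 (half-life about 6.1 days) to stable iron-56 (half-life about 77 days), whose instantaneous decay rate traces the shape of the late-time light curve. The half-lives are the standard laboratory values; the normalization is arbitrary:

```python
# Two-step radioactive decay (Bateman solution) as a light-curve skeleton.

import math

T_NI, T_CO = 6.1, 77.3                                   # half-lives, days
LAM_NI, LAM_CO = math.log(2) / T_NI, math.log(2) / T_CO  # decay constants

def n_nickel(t, n0=1.0):
    return n0 * math.exp(-LAM_NI * t)

def n_cobalt(t, n0=1.0):
    # intermediate species of the chain Ni-56 -> Co-56 -> Fe-56
    return n0 * LAM_NI / (LAM_CO - LAM_NI) * (
        math.exp(-LAM_NI * t) - math.exp(-LAM_CO * t))

def relative_power(t):
    # proportional to decays per day; per-decay energies set to 1
    return LAM_NI * n_nickel(t) + LAM_CO * n_cobalt(t)

for day in (0, 20, 60, 120):
    print(f"day {day:>3}: relative power {relative_power(day):.3f}")
```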

Kasen developed the computer code to simulate this radiative transfer process, using output from the simulated explosions to produce visualizations that can be compared directly to astronomical observations of supernovae.

The good news is that the variability seen in the computer models agrees with observations of type 1a supernovae. “Most importantly, the width and peak luminosity of the light curve are correlated in a way that agrees with what observers have found. So the models are consistent with the observations on which the discovery of dark energy was based,” Woosley said.

Another source of variability is that these asymmetric explosions look different when viewed at different angles. This can account for differences in brightness of as much as 20 percent, Kasen said, but the effect is random and creates scatter in the measurements that can be statistically reduced by observing large numbers of supernovae.

The potential for systematic bias comes primarily from variation in the initial chemical composition of the white dwarf star. Heavier elements are synthesized during supernova explosions, and debris from those explosions is incorporated into new stars. As a result, stars formed recently are likely to contain more heavy elements (higher “metallicity,” in astronomers’ terminology) than stars formed in the distant past.

“That’s the kind of thing we expect to evolve over time, so if you look at distant stars corresponding to much earlier times in the history of the universe, they would tend to have lower metallicity,” Kasen said. “When we calculated the effect of this in our models, we found that the resulting errors in distance measurements would be on the order of 2 percent or less.”

Further studies using computer simulations will enable researchers to characterize the effects of such variations in more detail and limit their impact on future dark-energy experiments, which might require a level of precision that would make errors of 2 percent unacceptable.

Source: EurekAlert

New Cosmic “Yardstick” Could Help Understand Dark Energy

This visible-light image shows the galaxy dubbed UGC 3789, which is 160 million light-years from Earth. Credit: STScI

A new method for measuring large astronomical distances is providing researchers with a cosmic yardstick to determine precisely how far away distant galaxies are. This could also offer a way to help determine how fast the Universe is expanding, as well as the nature of the mysterious Dark Energy that pervades the Universe. “We measured a direct, geometric distance to the galaxy, independent of the complications and assumptions inherent in other techniques. The measurement highlights a valuable method that can be used to determine the local expansion rate of the Universe, which is essential in our quest to find the nature of Dark Energy,” said James Braatz, of the National Radio Astronomy Observatory (NRAO), who spoke today at the American Astronomical Society’s meeting in Pasadena, California.

Braatz and his colleagues used the National Science Foundation’s Very Long Baseline Array (VLBA) and Robert C. Byrd Green Bank Telescope (GBT), and the Effelsberg Radio Telescope of the Max Planck Institute for Radioastronomy (MPIfR) in Germany to determine that a galaxy dubbed UGC 3789 is 160 million light-years from Earth. To do this, they precisely measured both the linear and angular size of a disk of material orbiting the galaxy’s central black hole. Water molecules in the disk act as masers to amplify, or strengthen, radio waves the way lasers amplify light waves.
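
The geometry reduces to measuring the same disk radius two ways: a physical radius from the maser spots’ orbital speed and centripetal acceleration (R = v²/a), and an angular radius from the radio interferometry; their ratio is the distance. A sketch with invented numbers chosen only to land in a plausible ballpark, not the UGC 3789 measurements:

```python
# Geometric maser distance: D = (v**2 / a) / theta (small-angle).

import math

MPC = 3.0857e22                   # metres per megaparsec
MAS = math.radians(1.0) / 3.6e6   # one milliarcsecond, in radians

v = 800.0e3          # orbital speed of maser spots, m/s (assumed)
a = 3.0e-4           # measured centripetal acceleration, m/s^2 (assumed)
theta = 0.285 * MAS  # angular radius of the disk (assumed)

R = v**2 / a     # physical radius of the disk, metres
D = R / theta    # distance to the galaxy
print(round(D / MPC, 1), "Mpc")  # ~50 Mpc, about 160 million light-years
```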

The observation is a key element of a major effort to measure the expansion rate of the Universe, known as the Hubble Constant, with greatly improved precision. That effort, cosmologists say, is the best way to narrow down possible explanations for the nature of Dark Energy. “The new measurement is important because it demonstrates a one-step, geometric technique for measuring distances to galaxies far enough to infer the expansion rate of the Universe,” said Braatz.

Dark Energy was discovered in 1998 with the observation that the expansion of the Universe is accelerating. It constitutes 70 percent of the matter and energy in the Universe, but its nature remains unknown. Determining its nature is one of the most important problems in astrophysics.

“Measuring precise distances is one of the oldest problems in astronomy, and applying a relatively new radio-astronomy technique to this old problem is vital to solving one of the greatest challenges of 21st Century astrophysics,” said team member Mark Reid of the Harvard-Smithsonian Center for Astrophysics (CfA).

The work on UGC 3789 follows a landmark measurement done with the VLBA in 1999, in which the distance to the galaxy NGC 4258 — 23 million light-years — was directly measured by observing water masers in a disk of material orbiting its central black hole. That measurement allowed refinement of other, indirect distance-measuring techniques using variable stars as “standard candles.”

The measurement to UGC 3789 adds a new milepost seven times more distant than NGC 4258, which itself is too close to measure the Hubble Constant directly. The speed at which NGC 4258 is receding from the Milky Way can be influenced by local effects. “UGC 3789 is far enough that the speed at which it is moving away from the Milky Way is more indicative of the expansion of the Universe,” said team member Elizabeth Humphreys of the CfA.

Following the achievement with NGC 4258, astronomers used the highly-sensitive GBT to search for other galaxies with similar water-molecule masers in disks orbiting their central black holes. Once candidates were found, astronomers then used the VLBA and the GBT together with the Effelsberg telescope to make images of the disks and measure their detailed rotational structure, needed for the distance measurements. This effort requires multi-year observations of each galaxy. UGC 3789 is the first galaxy in the program to yield such a precise distance.

Team member Cheng-Yu Kuo of the University of Virginia presented an image of the maser disk in NGC 6323, a galaxy even more distant than UGC 3789. This is a step toward using this galaxy to provide another valuable cosmic milepost. “The very high sensitivity of the telescopes allows making such images of galaxies even beyond 300 million light years,” said Kuo.

Source: AAS