Hubble Confirms Cosmic Acceleration with Weak Lensing

This image shows a smoothed reconstruction of the total (mostly dark) matter distribution in the COSMOS field, created from data taken by the NASA/ESA Hubble Space Telescope and ground-based telescopes. Credit: NASA, ESA, P. Simon (University of Bonn) and T. Schrabback (Leiden Observatory)


Need more evidence that the expansion of the Universe is accelerating? Just look to the Hubble Space Telescope. An international team of astronomers has indeed confirmed that the expansion of the universe is accelerating. The team, led by Tim Schrabback of the Leiden Observatory, conducted an intensive study of over 446,000 galaxies within the COSMOS (Cosmological Evolution Survey) field, the result of the largest survey ever conducted with Hubble. In making the COSMOS survey, Hubble photographed 575 slightly overlapping views of the same part of the Universe using the Advanced Camera for Surveys (ACS) onboard the orbiting telescope. It took nearly 1,000 hours of observations.

In addition to the Hubble data, researchers used redshift data from ground-based telescopes to assign distances to 194,000 of the galaxies surveyed (out to a redshift of 5). “The sheer number of galaxies included in this type of analysis is unprecedented, but more important is the wealth of information we could obtain about the invisible structures in the Universe from this exceptional dataset,” said co-author Patrick Simon from Edinburgh University.

In particular, the astronomers could “weigh” the large-scale matter distribution in space over large distances. To do this, they made use of the fact that this information is encoded in the distorted shapes of distant galaxies, a phenomenon referred to as weak gravitational lensing. Using complex algorithms, the team led by Schrabback has improved the standard method and obtained galaxy shape measurements to an unprecedented precision. The results of the study will be published in an upcoming issue of Astronomy and Astrophysics.

The meticulousness and scale of this study enable an independent confirmation that the expansion of the Universe is accelerated by an additional, mysterious component named dark energy. A handful of other such independent confirmations exist. Scientists need to know how the formation of clumps of matter evolved in the history of the Universe to determine how the gravitational force, which holds matter together, and dark energy, which pulls it apart by accelerating the expansion of the Universe, have affected them. “Dark energy affects our measurements for two reasons. First, when it is present, galaxy clusters grow more slowly, and secondly, it changes the way the Universe expands, leading to more distant — and more efficiently lensed — galaxies. Our analysis is sensitive to both effects,” says co-author Benjamin Joachimi from the University of Bonn. “Our study also provides an additional confirmation for Einstein’s theory of general relativity, which predicts how the lensing signal depends on redshift,” adds co-investigator Martin Kilbinger from the Institut d’Astrophysique de Paris and the Excellence Cluster Universe.
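
Joachimi's first point, that clumps of matter grow more slowly once dark energy takes over, can be illustrated with the standard linear growth factor for a flat universe. The following is a minimal sketch with assumed round density values, not the team's analysis:

```python
# Minimal sketch (assumed round numbers, not the paper's pipeline): the linear
# growth factor D(a) ∝ H(a) * ∫_0^a da' / (a' H(a'))**3 for a flat universe.
# Structure grows more slowly once dark energy dominates, one of the two
# effects on the lensing signal described above.
import numpy as np
from scipy.integrate import quad

def growth_factor(a, omega_m, omega_lambda):
    """Linear growth factor D(a), normalized so that D -> a in the matter era."""
    H = lambda x: np.sqrt(omega_m / x**3 + omega_lambda)  # H(a) / H0
    integral, _ = quad(lambda x: 1.0 / (x * H(x))**3, 1e-4, a)
    return 2.5 * omega_m * H(a) * integral

for label, om, ol in [("with dark energy", 0.27, 0.73),
                      ("matter only     ", 1.00, 0.00)]:
    print(label, " D(today) =", round(growth_factor(1.0, om, ol), 3))
# The dark-energy case comes out noticeably below 1: growth has been suppressed.
```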

The large number of galaxies included in this study, along with information on their redshifts, is leading to a clearer map of how, exactly, this part of the Universe is laid out; it helps us see its galactic inhabitants and how they are distributed. “With more accurate information about the distances to the galaxies, we can measure the distribution of the matter between them and us more accurately,” notes co-investigator Jan Hartlap from the University of Bonn. “Before, most of the studies were done in 2D, like taking a chest X-ray. Our study is more like a 3D reconstruction of the skeleton from a CT scan. On top of that, we are able to watch the skeleton of dark matter mature from the Universe’s youth to the present,” comments William High from Harvard University, another co-author.

The astronomers specifically chose the COSMOS survey because it is thought to be a representative sample of the Universe. With thorough studies such as the one led by Schrabback, astronomers will one day be able to apply their technique to wider areas of the sky, forming a clearer picture of what is truly out there.

Source: EurekAlert

Paper: Schrabback et al., ‘Evidence for the accelerated expansion of the Universe from weak lensing tomography with COSMOS’, Astronomy and Astrophysics, March 2010.

This is Getting Boring: General Relativity Passes Yet another Big Test!

Princeton University scientists (from left) Reinabelle Reyes, James Gunn and Rachel Mandelbaum led a team that analyzed more than 70,000 galaxies and demonstrated that the universe - at least up to a distance of 3.5 billion light years from Earth - plays by the rules set out by Einstein in his theory of general relativity. (Photo: Brian Wilson)

Published in 1915, Einstein’s theory of general relativity (GR) passed its first big test just a few years later, when the predicted gravitational deflection of light passing near the Sun was observed during the 1919 solar eclipse.

In 1960, GR passed its first big test in a lab, here on Earth; the Pound-Rebka experiment. And over the nine decades since its publication, GR has passed test after test after test, always with flying colors (check out this review for an excellent summary).

But the tests have always been within the solar system, or otherwise indirect.

Now a team led by Princeton University scientists has tested GR to see if it holds true at cosmic scales. And, after two years of analyzing astronomical data, the scientists have concluded that Einstein’s theory works as well over vast distances as in more local regions of space.

A partial map of the distribution of galaxies in the SDSS, going out to a distance of 7 billion light years. The amount of galaxy clustering that we observe today is a signature of how gravity acted over cosmic time, and allows us to test whether general relativity holds over these scales. (M. Blanton, SDSS)

The scientists’ analysis of more than 70,000 galaxies demonstrates that the universe – at least up to a distance of 3.5 billion light years from Earth – plays by the rules set out by Einstein in his famous theory. While GR has been accepted by the scientific community for over nine decades, until now no one had tested the theory so thoroughly and robustly at distances and scales that go way beyond the solar system.

Reinabelle Reyes, a Princeton graduate student in the Department of Astrophysical Sciences, along with co-authors Rachel Mandelbaum, an associate research scholar, and James Gunn, the Eugene Higgins Professor of Astronomy, outlined their assessment in the March 11 edition of Nature.

Other scientists collaborating on the paper include Tobias Baldauf, Lucas Lombriser and Robert Smith of the University of Zurich and Uros Seljak of the University of California-Berkeley.

The results are important, they said, because they shore up current theories explaining the shape and direction of the universe, including ideas about dark energy, and dispel some hints from other recent experiments that general relativity may be wrong.

“All of our ideas in astronomy are based on this really enormous extrapolation, so anything we can do to see whether this is right or not on these scales is just enormously important,” Gunn said. “It adds another brick to the foundation that underlies what we do.”

GR is one of two core theories underlying all of contemporary astrophysics and cosmology (the other is the Standard Model of particle physics, a quantum theory); it explains everything from black holes to the Big Bang.

In recent years, several alternatives to general relativity have been proposed. These modified theories of gravity depart from general relativity on large scales to circumvent the need for dark energy, dark matter, or both. But because these theories were designed to match the predictions of general relativity about the expansion history of the universe, a factor that is central to current cosmological work, it has become crucial to know which theory is correct, or at least represents reality as best as can be approximated.

“We knew we needed to look at the large-scale structure of the universe and the growth of smaller structures composing it over time to find out,” Reyes said. The team used data from the Sloan Digital Sky Survey (SDSS), a long-term, multi-institution telescope project mapping the sky to determine the position and brightness of several hundred million galaxies and quasars.

By calculating the clustering of these galaxies, which stretch nearly one-third of the way to the edge of the universe, and analyzing their velocities and distortion from intervening material – due to weak lensing, primarily by dark matter – the researchers have shown that Einstein’s theory explains the nearby universe better than alternative theories of gravity.

Some of the 70,000 luminous galaxies in SDSS analyzed (Image: SDSS Collaboration)

The Princeton scientists studied the effects of gravity on the SDSS galaxies and clusters of galaxies over long periods of time. They observed how this fundamental force drives galaxies to clump into larger collections of galaxies and how it shapes the expansion of the universe.

Critically, because relativity calls for the curvature of space to be equal to the curvature of time, the researchers could calculate whether light was influenced in equal amounts by both, as it should be if general relativity holds true.
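
The quantity usually built for this kind of test, often written E_G, compares the lensing signal to galaxy clustering and velocities; in general relativity with a cosmological constant it reduces to roughly the present-day matter density divided by the growth rate, Ω_m,0/f(z). Below is a minimal sketch of the GR prediction using assumed round numbers (the matter density and sample redshift here are illustrative inputs, not the paper's exact values):

```python
# Minimal sketch (assumed values, not the paper's pipeline): the GR prediction
# for the E_G statistic, which compares gravitational lensing to galaxy
# velocities. In GR + cosmological constant, E_G ≈ Ω_m,0 / f(z), with the
# growth rate approximated by f(z) ≈ Ω_m(z)**0.55.
omega_m0 = 0.27    # assumed present-day matter density
z = 0.32           # roughly the mean redshift of the galaxy sample

omega_m_z = omega_m0 * (1 + z)**3 / (omega_m0 * (1 + z)**3 + 1 - omega_m0)
f = omega_m_z**0.55               # growth rate in GR
E_G = omega_m0 / f
print(f"GR prediction: E_G ≈ {E_G:.2f}")   # ≈ 0.4; the published measurement is about 0.39 ± 0.06
```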

“This is the first time this test was carried out at all, so it’s a proof of concept,” Mandelbaum said. “There are other astronomical surveys planned for the next few years. Now that we know this test works, we will be able to use it with better data that will be available soon to more tightly constrain the theory of gravity.”

Firming up the predictive powers of GR can help scientists better understand whether current models of the universe make sense, the scientists said.

“Any test we can do in building our confidence in applying these very beautiful theoretical things but which have not been tested on these scales is very important,” Gunn said. “It certainly helps when you are trying to do complicated things to understand fundamentals. And this is a very, very, very fundamental thing.”

“The nice thing about going to the cosmological scale is that we can test any full, alternative theory of gravity, because it should predict the things we observe,” said co-author Uros Seljak, a professor of physics and of astronomy at UC Berkeley and a faculty scientist at Lawrence Berkeley National Laboratory who is currently on leave at the Institute of Theoretical Physics at the University of Zurich. “Those alternative theories that do not require dark matter fail these tests.”

Sources: “Princeton scientists say Einstein’s theory applies beyond the solar system” (Princeton University), “Study validates general relativity on cosmic scale, existence of dark matter” (University of California Berkeley), “Confirmation of general relativity on large scales from weak lensing and galaxy velocities” (Nature, arXiv preprint)

Dark Matter in Distant Galaxy Groups Mapped for the First Time

X-ray emission in the COSMOS field (XMM-Newton/ESA)

Galaxy density in the Cosmic Evolution Survey (COSMOS) field, with colors representing the redshift of the galaxies, ranging from redshift of 0.2 (blue) to 1 (red). Pink x-ray contours show the extended x-ray emission as observed by XMM-Newton.

Dark matter (actually cold, dark – non-baryonic – matter) can be detected only by its gravitational influence. In clusters and groups of galaxies, that influence shows up as weak gravitational lensing, which is difficult to nail down. One way to much more accurately estimate the degree of gravitational lensing – and so the distribution of dark matter – is to use the x-ray emission from the hot intra-cluster plasma to locate the center of mass.

And that’s just what a team of astronomers have recently done … and they have, for the first time, given us a handle on how dark matter has evolved over the past several billion years.

COSMOS is an astronomical survey designed to probe the formation and evolution of galaxies as a function of cosmic time (redshift) and large scale structure environment. The survey covers a 2 square degree equatorial field with imaging by most of the major space-based telescopes (including Hubble and XMM-Newton) and a number of ground-based telescopes.

Understanding the nature of dark matter is one of the key open questions in modern cosmology. One approach to the question uses the mass-luminosity relationship found for clusters of galaxies: their x-ray emission traces the mass of the ordinary (“baryonic”) matter alone (of course, baryonic matter includes electrons, which are leptons!), while gravitational lensing measures their total mass (baryonic plus dark matter).

To date the relationship has only been established for nearby clusters. New work by an international collaboration, including the Max Planck Institute for Extraterrestrial Physics (MPE), the Laboratory of Astrophysics of Marseilles (LAM), and Lawrence Berkeley National Laboratory (Berkeley Lab), has made major progress in extending the relationship to more distant and smaller structures than was previously possible.

To establish the link between x-ray emission and underlying dark matter, the team used one of the largest samples of x-ray-selected groups and clusters of galaxies, produced by ESA’s x-ray observatory, XMM-Newton.

Groups and clusters of galaxies can be effectively found using their extended x-ray emission on sub-arcminute scales. As a result of its large effective area, XMM-Newton is the only x-ray telescope that can detect the faint level of emission from distant groups and clusters of galaxies.

“The ability of XMM-Newton to provide large catalogues of galaxy groups in deep fields is astonishing,” said Alexis Finoguenov of the MPE and the University of Maryland, a co-author of the recent Astrophysical Journal (ApJ) paper which reported the team’s results.

Since x-rays are the best way to find and characterize clusters, most follow-up studies have until now been limited to relatively nearby groups and clusters of galaxies.

“Given the unprecedented catalogues provided by XMM-Newton, we have been able to extend measurements of mass to much smaller structures, which existed much earlier in the history of the Universe,” says Alexie Leauthaud of Berkeley Lab’s Physics Division, the first author of the ApJ study.

COSMOS-XCL095951+014049 (Subaru/NAOJ, XMM-Newton/ESA)

Gravitational lensing occurs because mass curves the space around it, bending the path of light: the more mass (and the closer it is to the center of mass), the more space bends, and the more the image of a distant object is displaced and distorted. Thus measuring distortion, or ‘shear’, is key to measuring the mass of the lensing object.

In the case of weak gravitational lensing (as used in this study) the shear is too subtle to be seen directly, but faint additional distortions in a collection of distant galaxies can be calculated statistically, and the average shear due to the lensing of some massive object in front of them can be computed. However, in order to calculate the lens’ mass from average shear, one needs to know its center.

“The problem with high-redshift clusters is that it is difficult to determine exactly which galaxy lies at the centre of the cluster,” says Leauthaud. “That’s where x-rays help. The x-ray luminosity from a galaxy cluster can be used to find its centre very accurately.”

Knowing the centers of mass from the analysis of x-ray emission, Leauthaud and colleagues could then use weak lensing to estimate the total mass of the distant groups and clusters with greater accuracy than ever before.
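
In practice, weak-lensing mass estimates of this kind are built by stacking the tangential component of background-galaxy shapes around the adopted centre. Here is a minimal illustrative sketch (not the team's pipeline) of that averaging step; the function and toy data are invented for illustration:

```python
# Minimal sketch (illustrative only): stacking the tangential component of
# background-galaxy ellipticities around an assumed lens centre. The mean
# tangential ellipticity in radial bins estimates the weak-lensing shear
# profile; a poorly chosen centre dilutes the signal.
import numpy as np

def tangential_shear_profile(x, y, e1, e2, x0, y0, r_bins):
    """Mean tangential ellipticity of galaxies at (x, y) about centre (x0, y0)."""
    dx, dy = x - x0, y - y0
    r = np.hypot(dx, dy)
    phi = np.arctan2(dy, dx)
    e_t = -(e1 * np.cos(2 * phi) + e2 * np.sin(2 * phi))   # tangential component
    which = np.digitize(r, r_bins)
    return np.array([e_t[which == i].mean() for i in range(1, len(r_bins))])

# Toy usage with random (noise-only) ellipticities: the profile averages to ~0.
rng = np.random.default_rng(0)
n = 5000
x, y = rng.uniform(-10, 10, n), rng.uniform(-10, 10, n)
e1, e2 = rng.normal(0, 0.3, n), rng.normal(0, 0.3, n)
print(tangential_shear_profile(x, y, e1, e2, 0.0, 0.0, np.linspace(0.5, 10, 6)))
```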

The final step was to determine the x-ray luminosity of each galaxy cluster and plot it against the mass determined from the weak lensing, with the resulting mass-luminosity relation for the new collection of groups and clusters extending previous studies to lower masses and higher redshifts. Within calculable uncertainty, the relation follows the same straight slope from nearby galaxy clusters to distant ones; a simple consistent scaling factor relates the total mass (baryonic plus dark) of a group or cluster to its x-ray brightness, the latter measuring the baryonic mass alone.
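
The mass-luminosity relation is a power law, so "the same slope" means a straight line in log-log space. A minimal sketch of such a fit with synthetic numbers follows; the slope and normalization are assumptions for illustration, not the paper's values:

```python
# Minimal sketch (synthetic data, not the paper's): fitting a power-law
# mass-luminosity relation M = A * L_x**alpha as a straight line in log space.
import numpy as np

rng = np.random.default_rng(1)
L_x = 10**rng.uniform(42.5, 45.0, 40)             # toy x-ray luminosities [erg/s]
true_alpha, true_A = 0.6, 10**(14 - 0.6 * 44)     # assumed slope / normalization
M = true_A * L_x**true_alpha * 10**rng.normal(0, 0.1, L_x.size)   # toy scatter

alpha, logA = np.polyfit(np.log10(L_x), np.log10(M), 1)
print(f"fitted slope alpha ≈ {alpha:.2f}, normalization logA ≈ {logA:.2f}")
```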

“By confirming the mass-luminosity relation and extending it to high redshifts, we have taken a small step in the right direction toward using weak lensing as a powerful tool to measure the evolution of structure,” says Jean-Paul Kneib a co-author of the ApJ paper from LAM and France’s National Center for Scientific Research (CNRS).

The origin of galaxies can be traced back to slight differences in the density of the hot, early Universe; traces of these differences can still be seen as minute temperature differences in the cosmic microwave background (CMB) – hot and cold spots.

“The variations we observe in the ancient microwave sky represent the imprints that developed over time into the cosmic dark-matter scaffolding for the galaxies we see today,” says George Smoot, director of the Berkeley Center for Cosmological Physics (BCCP), a professor of physics at the University of California at Berkeley, and a member of Berkeley Lab’s Physics Division. Smoot shared the 2006 Nobel Prize in Physics for measuring anisotropies in the CMB and is one of the authors of the ApJ paper. “It is very exciting that we can actually measure with gravitational lensing how the dark matter has collapsed and evolved since the beginning.”

One goal in studying the evolution of structure is to understand dark matter itself, and how it interacts with the ordinary matter we can see. Another goal is to learn more about dark energy, the mysterious phenomenon that is pushing matter apart and causing the Universe to expand at an accelerating rate. Many questions remain unanswered: Is dark energy constant, or is it dynamic? Or is it merely an illusion caused by a limitation in Einstein’s General Theory of Relativity?

The tools provided by the extended mass-luminosity relationship will do much to answer these questions about the opposing roles of gravity and dark energy in shaping the Universe, now and in the future.

Sources: ESA, and a paper published in the 20 January, 2010 issue of the Astrophysical Journal (arXiv:0910.5219 is the preprint)

ESA’s Tough Choice: Dark Matter, Sun Close Flyby, Exoplanets (Pick Two)

Thales Alenia Space and EADS Astrium concepts for Euclid (ESA)


Probe key questions in fundamental physics and cosmology, namely the nature of the mysterious dark energy and dark matter (Euclid); determine how common exoplanets are around other stars, including Earth-analogs (PLATO); or take the closest look at our Sun yet possible, approaching to just 62 solar radii (Solar Orbiter) … but only two can be chosen! What would be your picks?

These three mission concepts have been chosen by the European Space Agency’s Science Programme Committee (SPC) as candidates for two medium-class missions to be launched no earlier than 2017. They now enter the definition phase, the next step required before the final decision is taken as to which missions are implemented.

These three missions are the finalists from 52 proposals that were either made or carried forward in 2007. They were whittled down to just six mission proposals in 2008 and sent for industrial assessment. Now that the reports from those studies are in, the missions have been pared down again. “It was a very difficult selection process. All the missions contained very strong science cases,” says Lennart Nordh, Swedish National Space Board and chair of the SPC.

And the tough decisions are not yet over. Only two of the three missions (Euclid, PLATO and Solar Orbiter) can be selected for the M-class launch slots. All three missions present challenges that will have to be resolved during the definition phase. A specific challenge, of which the SPC was conscious, is the ability of these missions to fit within the available budget. The final decision about which missions to implement will be taken after the definition activities are completed, which is foreseen to be in mid-2011.
Euclid is an ESA mission to map the geometry of the dark Universe. The mission would investigate the distance-redshift relationship and the evolution of cosmic structures. It would achieve this by measuring shapes and redshifts of galaxies and clusters of galaxies out to redshifts ~2, or equivalently to a look-back time of 10 billion years. It would therefore cover the entire period over which dark energy played a significant role in accelerating the expansion.

By approaching as close as 62 solar radii, Solar Orbiter would view the solar atmosphere with high spatial resolution and combine this with measurements made in-situ. Over the extended mission periods Solar Orbiter would deliver images and data that would cover the polar regions and the side of the Sun not visible from Earth. Solar Orbiter would coordinate its scientific mission with NASA’s Solar Probe Plus within the joint HELEX program (Heliophysics Explorers) to maximize their combined science return.

Thales Alenia Space concept, from assessment phase (ESA)

PLATO (PLAnetary Transit and Oscillations of stars) would discover and characterize a large number of close-by exoplanetary systems, with a precision in the determination of mass and radius of 1%.

In addition, the SPC has decided to consider, at its next meeting in June, whether to also select a European contribution to the SPICA mission.

SPICA would be an infrared space telescope led by the Japanese Space Agency JAXA. It would provide ‘missing-link’ infrared coverage in the region of the spectrum between that seen by the ESA-NASA Webb telescope and the ground-based ALMA telescope. SPICA would focus on the conditions for planet formation and distant young galaxies.

“These missions continue the European commitment to world-class space science,” says David Southwood, ESA Director of Science and Robotic Exploration, “They demonstrate that ESA’s Cosmic Vision programme is still clearly focused on addressing the most important space science.”

Source: ESA chooses three scientific missions for further study

Seven-Year WMAP Results: No, They’re NOT Anomalies

CMB cool fingers, cold spots I and II (red; credit: NASA/WMAP science team)

Since the day the first Wilkinson Microwave Anisotropy Probe (WMAP) data were released, in 2003, all manner of cosmic microwave background (CMB) anomalies have been reported; there’s been the cold spot that might be a window into a parallel universe, the “Axis of Evil”, pawprints of local interstellar neutral hydrogen, and much, much more.

But do the WMAP data really, truly, absolutely contain evidence of anomalies, things that just do not fit within the six-parameters-and-a-model the WMAP team recently reported?

In a word, no.

Seven Year Microwave Sky (Credit: NASA/WMAP Science Team)

Every second year since 2003 the WMAP science team has published a set of papers on their analyses of the cumulative data, and their findings (with the mission due to end later this year, their next set will, sadly, be their last). With time and experience – not to mention inputs from the thousands of other researchers who have picked over the data – the team has not only amassed a lot more data, but has also come to understand how WMAP operates far better. As a consequence, not only are the published results – such as limits on the nature of dark energy, and the number of different kinds of neutrinos – more stringent and robust, but the team has also become very au fait with the various anomalies reported.

For the first time, the team has examined these anomalies in detail and has concluded that the answer to the question, in their words, “are there potential deviations from ΛCDM within the context of the allowed parameter ranges of the existing WMAP observations?” is “no”.

The reported anomalies the team examined are many – two prominent cold spots, strength of the quadrupole, lack of large angular scale CMB power, alignment of the quadrupole and octupole components, hemispherical or dipole power asymmetry, to name but a handful – but the reasons for the apparent anomalies are few.

“Human eyes and brains are excellent at detecting visual patterns, but poor at assessing probabilities. Features seen in the WMAP maps, such as the large Cold Spot I near the Galactic center region, can stand out as unusual. However, the likelihood of such features can not be discerned by visual inspection of our particular realization of the universe,” they write, and “Monte Carlo simulations are an invaluable way to determine the expected deviations within the ΛCDM model. Claims of anomalies without Monte Carlo simulations are necessarily weak claims”.

Stephen Hawking’s initials in the CMB (Credit: NASA/WMAP Science Team)

An amusing example: Stephen Hawking’s initials (“SH”) can be clearly seen in the WMAP sky map. “The “S” and “H” are in roughly the same font size and style, and both letters are aligned neatly along a line of fixed Galactic latitude,” the team says; “A calculation would show that the probability of this particular occurrence is vanishingly small. Yet, there is no case to be made for a non-standard cosmology despite this extraordinarily low probability event,” they dryly note.

Many of the reports of WMAP CMB anomalies would likely make for good teaching material, as they illustrate well the many traps that you can so easily fall into when doing after-the-fact (a posteriori) statistical analyses. Or, as the team puts it in regard to the Stephen Hawking initials: “It is clear that the combined selection of looking for initials, these particular initials, and their alignment and location are all a posteriori choices. For a rich data set, as is the case with WMAP, there are a lot of data and a lot of ways of analyzing the data.”

And what happens when you have a lot of data? Low probability events are guaranteed to occur! “For example, it is not unexpected to find a 2σ feature when analyzing a rich data set in a number of different ways. However, to assess whether a particular 2σ feature is interesting, one is often tempted to narrow in on it to isolate its behavior. That process involves a posteriori choices that amplify the apparent significance of the feature.”
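
A toy Monte Carlo (mine, not the WMAP team's code) makes the point concrete: in pure noise, scanning many sub-regions for "unusual" behaviour almost guarantees an apparent 2σ feature somewhere.

```python
# Toy Monte Carlo (illustrative, not the WMAP pipeline): in noise-only data,
# scanning many sub-regions for "unusual" averages almost always turns up at
# least one apparent >2-sigma feature, which is exactly the a posteriori trap
# described above.
import numpy as np

rng = np.random.default_rng(42)
n_sims, n_regions, n_pix = 2000, 50, 100
hits = 0
for _ in range(n_sims):
    sky = rng.normal(0, 1, (n_regions, n_pix))       # noise-only "sky"
    region_means = sky.mean(axis=1)                  # one statistic per region
    sigma = 1 / np.sqrt(n_pix)                       # expected scatter of a mean
    if np.any(np.abs(region_means) > 2 * sigma):     # any "2-sigma anomaly"?
        hits += 1
print(f"fraction of noise-only skies with a >2-sigma 'anomaly': {hits / n_sims:.2f}")
# With 50 regions scanned, roughly 90% of purely random skies show one.
```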

So, does the team conclude that all this anomaly hunting is a waste of effort? Absolutely not! I’ll quote from the team’s own conclusion: “The search for oddities in the data is essential for testing the model. The success of the model makes these searches even more important. A detection of any highly significant a posteriori feature could become a serious challenge for the model. The less significant features discussed in this paper provided the motivation for considering alternative models and developing new analysis of WMAP (and soon Planck) data. The oddities have triggered proposed new observations that can further test the models. It is often difficult to assess the statistical claims. It may well be that an oddity could be found that motivates a new theory, which then could be tested as a hypothesis against ΛCDM. The data support these comparisons. Of course, other cosmological measurements must also play a role in testing new hypotheses. No CMB anomaly reported to date has caused the scientific community to adopt a new standard model of cosmology, but claimed anomalies have been used to provoke thought and to search for improved theories.”

Primary source: Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Are There Cosmic Microwave Background Anomalies? (arXiv:1001.4758). The five other Seven-Year WMAP papers are: Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Cosmological Interpretation (arXiv:1001.4538), Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Planets and Celestial Calibration Sources (arXiv:1001.4731), Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Sky Maps, Systematic Errors, and Basic Results (arXiv:1001.4744), Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Power Spectra and WMAP-Derived Parameters (arXiv:1001.4635), and Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Galactic Foreground Emission (arXiv:1001.4555). Also check out the official WMAP website.

Universe to WMAP: ΛCDM Rules, OK?

Temperature and polarization around hot and cold spots (Credit: NASA / WMAP Science Team)

The Wilkinson Microwave Anisotropy Probe (WMAP) science team has finished analyzing seven full years of data from the little probe that could, and once again it seems we can sum up the universe in six parameters and a model.

Using the seven-year WMAP data, together with recent results on the large-scale distribution of galaxies, and an updated estimate of the Hubble constant, the present-day age of the universe is 13.75 (plus-or-minus 0.11) billion years, dark energy comprises 72.8% (+/- 1.5%) of the universe’s mass-energy, baryons 4.56% (+/- 0.16%), non-baryonic matter (CDM) 22.7% (+/- 1.4%), and the redshift of reionization is 10.4 (+/- 1.2).
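
For readers who want to see where the quoted age comes from: in a flat ΛCDM universe the age is the integral of da/(a·H(a)). Here is a minimal sketch using the densities above and an assumed Hubble constant of about 70.4 km/s/Mpc (a value not quoted in this article); radiation is neglected:

```python
# Minimal sketch: the quoted age follows from the Friedmann equation,
#   t0 = ∫_0^1 da / (a * H(a)),   H(a) = H0 * sqrt(Ω_m / a**3 + Ω_Λ)
# using the densities above and an assumed H0 ≈ 70.4 km/s/Mpc.
import numpy as np
from scipy.integrate import quad

H0 = 70.4 * 1.0e3 / 3.0857e22          # km/s/Mpc -> 1/s (assumed value)
omega_m, omega_lambda = 0.227 + 0.0456, 0.728   # CDM + baryons, dark energy

integrand = lambda a: 1.0 / (a * H0 * np.sqrt(omega_m / a**3 + omega_lambda))
t0_s, _ = quad(integrand, 1e-8, 1.0)
print(f"age ≈ {t0_s / (3.156e7 * 1e9):.2f} Gyr")   # ≈ 13.75 Gyr, matching the quoted age
```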

In addition, the team report several new cosmological constraints – primordial abundance of helium (this rules out various alternative, ‘cold big bang’ models), and an estimate of a parameter which describes a feature of density fluctuations in the very early universe sufficiently precisely to rule out a whole class of inflation models (the Harrison-Zel’dovich-Peebles spectrum), to take just two – as well as tighter limits on many others (number of neutrino species, mass of the neutrino, parity violations, axion dark matter, …).

The best eye-candy from the team’s six papers are the stacked temperature and polarization maps for hot and cold spots; if these spots are due to sound waves in matter frozen in when radiation (photons) and baryons parted company – the cosmic microwave background (CMB) encodes all the details of this separation – then there should be nicely circular rings, of rather exact sizes, around the spots. Further, the polarization directions should switch from radial to tangential, from the center out (for cold spots; vice versa for hot spots).

And that’s just what the team found!

Concerning Dark Energy. Since the Five-Year WMAP results were published, several independent studies with direct relevance to cosmology have been published. The WMAP team took those from observations of the baryon acoustic oscillations (BAO) in the distribution of galaxies; of Cepheids, supernovae, and a water maser in local galaxies; of time-delay in a lensed quasar system; and of high redshift supernovae, and combined them to reduce the nooks and crannies in parameter space in which non-cosmological constant varieties of dark energy could be hiding. At least some alternative kinds of dark energy may still be possible, but for now Λ, the cosmological constant, rules.

Concerning Inflation. Very, very, very early in the life of the universe – so the theory of cosmic inflation goes – there was a period of dramatic expansion, and the tiny quantum fluctuations before inflation became the giant cosmic structures we see today. “Inflation predicts that the statistical distribution of primordial fluctuations is nearly a Gaussian distribution with random phases. Measuring deviations from a Gaussian distribution,” the team reports, “is a powerful test of inflation, as how precisely the distribution is (non-) Gaussian depends on the detailed physics of inflation.” While the limits on non-Gaussianity (as it is called), from analysis of the WMAP data, only weakly constrain various models of inflation, they do leave almost nowhere for cosmological models without inflation to hide.
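
As a toy illustration of what "testing Gaussianity" means in practice (the WMAP team's actual estimator is built from the bispectrum, not this simple statistic), one can compare the skewness of a Gaussian field with that of a field containing a small quadratic, "local-type" term:

```python
# Toy illustration (not the WMAP estimator): a Gaussian field has vanishing
# skewness, so the sample skewness of map values is one crude probe of
# primordial non-Gaussianity.
import numpy as np

rng = np.random.default_rng(3)
gaussian_map = rng.normal(0, 1, 100_000)                         # Gaussian "pixels"
non_gaussian_map = gaussian_map + 0.05 * (gaussian_map**2 - 1)   # small local-type term

def skewness(x):
    x = (x - x.mean()) / x.std()
    return np.mean(x**3)

print("Gaussian map skewness:     ", round(skewness(gaussian_map), 3))      # ~0
print("non-Gaussian map skewness: ", round(skewness(non_gaussian_map), 3))  # clearly nonzero
```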

Concerning ‘cosmic shadows’ (the Sunyaev-Zel’dovich (SZ) effect). While many researchers have looked for cosmic shadows in WMAP data before – perhaps the best known to the general public is the 2006 Lieu, Mittaz, and Zhang paper (the SZ effect: hot electrons in the plasma which pervades rich clusters of galaxies interact with CMB photons, via inverse Compton scattering) – the WMAP team’s recent analysis is their first to investigate this effect. They detect the SZ effect directly in the nearest rich cluster (Coma; Virgo is behind the Milky Way foreground), and also statistically by correlation with the location of some 700 relatively nearby rich clusters. While the WMAP team’s finding is consistent with data from x-ray observations, it is inconsistent with theoretical models. Back to the drawing board for astrophysicists studying galaxy clusters.

Seven Year Microwave Sky (Credit: NASA/WMAP Science Team)

I’ll wrap up by quoting Komatsu et al. “The standard ΛCDM cosmological model continues to be an exquisite fit to the existing data.”

Primary source: Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Cosmological Interpretation (arXiv:1001.4738). The five other Seven-Year WMAP papers are: Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Are There Cosmic Microwave Background Anomalies? (arXiv:1001.4758), Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Planets and Celestial Calibration Sources (arXiv:1001.4731), Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Sky Maps, Systematic Errors, and Basic Results (arXiv:1001.4744), Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Power Spectra and WMAP-Derived Parameters (arXiv:1001.4635), and Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Galactic Foreground Emission (arXiv:1001.4555). Also check out the official WMAP website.

Searching for Life in the Multiverse

Multiverse Theory
Artist concept of the multiverse. Credit: Florida State University

Other intelligent and technologically capable alien civilizations may exist in our Universe, but the problem with finding and communicating with them is that they are simply too far away for any meaningful two-way conversation. But what about the prospect of finding out whether life exists in other universes outside our own?

Theoretical physics has brought us the notion that our single universe is not necessarily all there is. The “multiverse” idea is a hypothetical mega-universe full of numerous smaller universes, including our own.

In this month’s Scientific American, Alejandro Jenkins from Florida State University and Gilad Perez, a theorist at the Weizmann Institute of Science in Israel, discuss how multiple other universes—each with its own laws of physics—may have emerged from the same primordial vacuum that gave rise to ours. Assuming they exist, many of those universes may contain intricate structures and perhaps even some forms of life. But the latest theoretical research suggests that our own universe may not be as “finely tuned” for the emergence of life as previously thought.

Jenkins and Perez write about a provocative hypothesis known as the anthropic principle, which states that the existence of intelligent life (capable of studying physical processes) imposes constraints on the possible form of the laws of physics.

Alejandro Jenkins. Credit: Florida State University

“Our lives here on Earth — in fact, everything we see and know about the universe around us — depend on a precise set of conditions that makes us possible,” Jenkins said. “For example, if the fundamental forces that shape matter in our universe were altered even slightly, it’s conceivable that atoms never would have formed, or that the element carbon, which is considered a basic building block of life as we know it, wouldn’t exist. So how is it that such a perfect balance exists? Some would attribute it to God, but of course, that is outside the realm of physics.”

The theory of “cosmic inflation,” which was developed in the 1980s in order to solve certain puzzles about the structure of our universe, predicts that ours is just one of countless universes to emerge from the same primordial vacuum. We have no way of seeing those other universes, although many of the other predictions of cosmic inflation have recently been corroborated by astrophysical measurements.

Given some of science’s current ideas about high-energy physics, it is plausible that those other universes might each have different physical interactions. So perhaps it’s no mystery that we would happen to occupy the rare universe in which conditions are just right to make life possible. This is analogous to how, out of the many planets in our universe, we occupy the rare one where conditions are right for organic evolution.

“What theorists like Dr. Perez and I do is tweak the calculations of the fundamental forces in order to predict the resulting effects on possible, alternative universes,” Jenkins said. “Some of these results are easy to predict; for example, if there was no electromagnetic force, there would be no atoms and no chemical bonds. And without gravity, matter wouldn’t coalesce into planets, stars and galaxies.

“What is surprising about our results is that we found conditions that, while very different from those of our own universe, nevertheless might allow — again, at least hypothetically — for the existence of life. (What that life would look like is another story entirely.) This actually brings into question the usefulness of the anthropic principle when applied to particle physics, and might force us to think more carefully about what the multiverse would actually contain.”

A brief overview of the article is available for free on Scientific American’s website.

Source: Florida State University

The Extremely Large Telescope

The European Southern Observatory (ESO) is planning to build a massive – and I do mean massive – telescope in the next decade. The European Extremely Large Telescope (E-ELT) is a 42-meter telescope in its final planning stages. Weighing in at 5,000 tonnes, and made up of 984 individual mirrors, it will be able to image the discs of extrasolar planets and resolve individual stars in galaxies beyond the Local Group! By 2018, ESO hopes to be using this gargantuan scope to stare so deep into space that astronomers can actually see the Universe expanding!

The E-ELT is currently scheduled for completion around 2018 and when built it will be four times larger than anything currently looking at the sky in optical wavelengths and 100 times more powerful than the Hubble Space Telescope – despite being a ground-based observatory.

With advanced adaptive optics systems, the E-ELT will use up to 6 laser guide stars to analyse the twinkling caused by the motion of the atmosphere. Computer systems move the 984 individual mirrored panels up to a thousand times a second to cancel out this blurring effect in real time. The result is an image almost as crisp as if the telescope were in space.
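
To put the promised sharpness in numbers, the diffraction limit θ ≈ 1.22 λ/D works out to roughly 6 milliarcseconds at a wavelength of 1 micron for a 42 m aperture, compared with about 100 milliarcseconds for Hubble's 2.4 m mirror. A quick back-of-the-envelope sketch (the wavelength is chosen purely for illustration):

```python
# Back-of-the-envelope sketch (assumed wavelength): diffraction-limited angular
# resolution θ ≈ 1.22 λ/D for a 42 m aperture versus Hubble's 2.4 m mirror.
import math

RAD_TO_MAS = 180 / math.pi * 3600 * 1000      # radians -> milliarcseconds

def resolution_mas(wavelength_m, diameter_m):
    return 1.22 * wavelength_m / diameter_m * RAD_TO_MAS

for name, D in [("E-ELT (42 m)", 42.0), ("Hubble (2.4 m)", 2.4)]:
    print(name, f"at 1 micron: {resolution_mas(1e-6, D):.1f} mas")
```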

This combination of incredible technological power and gigantic size means that the E-ELT will be able to not only detect the presence of planets around other stars but also begin to make images of them. It could potentially make a direct image of a Super Earth (a rocky planet just a few times larger than Earth). It would be capable of observing planets around stars within 15-30 light years of the Earth – there are almost 400 stars within that distance!

The E-ELT will be able to resolve stars within distant galaxies and as such begin to understand the history of such galaxies. This method of using the chemical composition, age and mass of stars to unravel the history of the galaxy is sometimes called galactic archaeology and instruments like the E-ELT would lead the way in such research.

Incredibly, by measuring the redshift of distant galaxies over many years with a telescope as sensitive as the E-ELT it should be possible to detect the gradual change in their doppler shift. As such the E-ELT could allow humans to watch the Universe itself expand!
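
This is the "redshift drift" test: over an observing baseline Δt, the redshift of a distant source changes by Δz ≈ [(1+z)H0 − H(z)]Δt, which amounts to velocity shifts of only a few centimetres per second per decade. Here is a rough sketch with assumed ΛCDM parameters (these are illustrative values, not ESO's numbers):

```python
# Rough sketch (assumed ΛCDM parameters, not ESO figures): the redshift drift
# Δz ≈ [(1+z)·H0 − H(z)]·Δt that an ultra-stable spectrograph on the E-ELT
# could accumulate over a decade of monitoring.
import math

H0 = 70.0 * 1.0e3 / 3.0857e22          # assumed Hubble constant, in 1/s
omega_m, omega_lambda = 0.3, 0.7
c = 2.998e8                            # speed of light, m/s
dt = 10 * 3.156e7                      # ten years, in seconds

def H(z):
    return H0 * math.sqrt(omega_m * (1 + z)**3 + omega_lambda)

for z in (1.0, 2.0, 4.0):
    dz = ((1 + z) * H0 - H(z)) * dt
    dv = c * dz / (1 + z)              # equivalent velocity shift
    print(f"z = {z}: Δz ≈ {dz:+.1e}, i.e. a velocity drift of {100*dv:+.2f} cm/s per decade")
```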

ESO has already spent millions on developing the E-ELT concept. If it is completed as planned then it will eventually cost about €1 billion. The technology required to make the E-ELT happen is being developed right now all over the world – in fact it is creating new technologies, jobs and industry as it goes along. The telescope’s enclosure alone presents a huge engineering conundrum – how do you build something the size of a modern sports stadium at high altitude and without any existing roads? Engineers will need to keep 5,000 tonnes of metal and glass slewing around smoothly and easily once it’s operating – as well as figuring out how to mass-produce more than 1200 1.4m hexagonal mirrors.

The E-ELT has the capacity to transform our view not only of the Universe but of telescopes and the technology to build them as well. It will be a huge leap forward in telescope engineering and for European astronomy it will be a massive 42m jewel in the crown.

Early Galaxy Pinpoints Reionization Era

This is a composite of false color images of the galaxies found at the early epoch around 800 million years after the Big Bang. The upper left panel presents the galaxy confirmed in the 787 million year old universe. These galaxies are in the Subaru Deep Field. Credit: M. Ouchi et al.

Astronomers looking to pinpoint when the reionization of the Universe took place have found some of the earliest galaxies, about 800 million years after the Big Bang. Twenty-two early galaxies were found using a method that looks for far-away redshifting sources that disappear or “drop out” at a specific wavelength. The age of one galaxy was confirmed by a characteristic neutral hydrogen signature at 787 million years after the Big Bang. The finding is the first age-confirmation of a so-called dropout galaxy at that distant time and pinpoints when the reionization epoch likely began.

The reionization period is about the farthest back in time that astronomers can observe. The Big Bang, 13.7 billion years ago, created a hot, murky universe. Some 400,000 years later, temperatures cooled, electrons and protons joined to form neutral hydrogen, and the murk cleared. Some time before 1 billion years after the Big Bang, neutral hydrogen began to form stars in the first galaxies, which radiated energy and changed the hydrogen back to being ionized. Although not the thick plasma soup of the earlier period just after the Big Bang, this star formation started the reionization epoch.

Astronomers know that this era ended about 1 billion years after the Big Bang, but when it began has eluded them.

“We look for ‘dropout’ galaxies,” said Masami Ouchi, who led a US and Japanese team of astronomers looking back at the reionization epoch. “We use progressively redder filters that reveal increasing wavelengths of light and watch which galaxies disappear from or ‘dropout’ of images made using those filters. Older, more distant galaxies ‘dropout’ of progressively redder filters and the specific wavelengths can tell us the galaxies’ distance and age. What makes this study different is that we surveyed an area that is over 100 times larger than previous ones and, as a result, had a larger sample of early galaxies (22) than past surveys. Plus, we were able to confirm one galaxy’s age,” he continued. “Since all the galaxies were found using the same dropout technique, they are likely to be the same age.”
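
The dropout technique works because neutral hydrogen absorbs essentially everything blueward of Lyman-alpha (121.6 nm in the galaxy's rest frame), so a distant galaxy's light vanishes from any filter bluer than (1 + z) × 121.6 nm. A minimal sketch of where that break lands for redshifts roughly in the range probed here (the redshift values are illustrative):

```python
# Minimal sketch (illustrative): why high-redshift galaxies "drop out" of
# bluer filters. Neutral hydrogen absorbs essentially all light blueward of
# Lyman-alpha (121.6 nm rest frame), so the observed break sits at
# (1 + z) * 121.6 nm; filters bluer than that see nothing.
LYMAN_ALPHA_NM = 121.6

def observed_break_nm(z):
    return (1 + z) * LYMAN_ALPHA_NM

for z in (5.7, 6.6, 7.0):                  # roughly the redshift range probed here
    print(f"z = {z}: break at ~{observed_break_nm(z):.0f} nm "
          f"-> invisible in filters bluer than this")
```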

Ouchi’s team was able to conduct such a large survey because they used a custom-made, super-red filter and other unique technological advancements in red sensitivity on the wide-field camera of the 8.3-meter Subaru Telescope. They made their observations from 2006 to 2009 in the Subaru Deep Field and Great Observatories Origins Deep Survey North field. They then compared their observations with data gathered in other studies.

Astronomers have wondered whether the universe underwent reionization instantaneously or gradually over time, but more importantly, they have tried to isolate when the universe began reionization. Galaxy density and brightness measurements are key to calculating star-formation rates, which tell a lot about what happened when. The astronomers looked at star-formation rates and the rate at which hydrogen was ionized.

Using data from their study and others, they determined that the star-formation rates were dramatically lower from 800 million years to about one billion years after the Big Bang than thereafter. Accordingly, they calculated that the rate of ionization would be very slow during this early time, because of this low star-formation rate.

“We were really surprised that the rate of ionization seems so low, which would constitute a contradiction with the claim of NASA’s WMAP satellite. It concluded that reionization started no later than 600 million years after the Big Bang,” remarked Ouchi. “We think this riddle might be explained by more efficient ionizing photon production rates in early galaxies. The formation of massive stars may have been much more vigorous then than in today’s galaxies. Fewer, massive stars produce more ionizing photons than many smaller stars,” he explained.

The research will be published in a December issue of the Astrophysical Journal.

Source: EurekAlert

New CMB Measurements Support Standard Model

The measure of polarized light from the early Universe allowed researchers to better plot the location of matter - the left image - which later became the stars and galaxies we have today. Image Credit: Sarah Church/Walter Gear

New measurements of the cosmic microwave background (CMB) – the leftover light from the Big Bang – lend further support to the Standard Cosmological Model and the existence of dark matter and dark energy, limiting the possibility of alternative models of the Universe. Researchers from Stanford University and Cardiff University produced a detailed map of the composition and structure of matter as it would have looked shortly after the Big Bang, which shows that the Universe would not look as it does today if it were made up solely of ‘normal matter’.

By measuring the way the light of the CMB is polarized, a team led by Sarah Church of the Kavli Institute for Particle Astrophysics and Cosmology at Stanford University and by Walter Gear, head of the School of Physics and Astronomy at Cardiff University in the United Kingdom, was able to construct a map of the way the Universe would have looked shortly after matter came into existence after the Big Bang. Their findings lend evidence to the predictions of the Standard Model, in which the Universe is composed of 95% dark matter and energy, and only 5% ordinary matter.

Polarization is a feature of light rays in which the oscillation of the light wave lies at right angles to the direction in which the light is traveling. Though most light is unpolarized, light that has interacted with matter can become polarized. The leftover light from the Big Bang – the CMB – has now cooled to a few degrees above absolute zero, but it still retains the same polarization it had in the early Universe, once the Universe had cooled enough to become transparent to light. By measuring this polarization, the researchers were able to extrapolate the location, structure, and velocity of matter in the early Universe with unprecedented precision. The gravitational collapse of large clumps of matter in the early universe creates certain resonances in the polarization that allowed the researchers to create a map of the matter composition.

Dr. Gear said, “The pattern of oscillations in the power spectra allow us to discriminate, as “real” and “dark” matter affect the position and amplitudes of the peaks in different ways. The results are also consistent with many other pieces of evidence for dark matter, such as the rotation rate of galaxies, and the distribution of galaxies in clusters.”

The measurements made by the QUaD experiment further constrain those made by previous experiments to measure properties of the CMB, such as WMAP and ACBAR. In comparison to these previous experiments, the measurements come closer to fitting what is predicted by the Standard Cosmological Model by more than an order of magnitude, said Dr. Gear. This is a very important step on the path to verifying whether our model of the Universe is correct.

The QUaD experiment, located at the South Pole, allowed researchers to measure the polarization of the CMB with very high precision. Image Credit: Sarah Church

The researchers used the QUaD experiment at the South Pole to make their observations. The QUaD telescope’s detectors are bolometers, essentially thermometers that measure how incoming radiation raises the temperature of the absorbing material in the detector. The detectors have to be kept near 1 kelvin to eliminate noise radiation from the surrounding environment, which is why the instrument is located at the frigid South Pole and housed inside a cryostat.

Paper co-author Walter Gear said in an email interview:

“The polarization is imprinted at the time the Universe becomes transparent to light, about 400,000 years after the big bang, rather than right after the big bang before matter existed. There are major efforts now to try to find what is called the “B-mode” signal, which is a more complicated polarization pattern that IS imprinted right after the big bang. QuaD places the best current upper limit on this but is still more than an order of magnitude away in sensitivity from even optimistic predictions of what that signal might be. That is the next generation of experiments’ goal.”

The results, published in a paper titled Improved Measurements of the Temperature and Polarization of the Cosmic Microwave Background from QUaD in the November 1st Astrophysical Journal, fit the predictions of the Standard Model remarkably well, providing further evidence for the existence of dark matter and energy, and constraining alternative models of the Universe.

Source: SLAC, email interview with Dr. Walter Gear