GALEX Confirms Nature of Dark Energy


From a JPL press release:

A five-year survey of 200,000 galaxies, stretching back seven billion years in cosmic time, has led to one of the best independent confirmations that dark energy is driving our universe apart at accelerating speeds. The survey used data from NASA’s space-based Galaxy Evolution Explorer and the Anglo-Australian Telescope on Siding Spring Mountain in Australia.

The findings offer new support for the favored theory of how dark energy works — as a constant force, uniformly affecting the universe and propelling its runaway expansion. They contradict an alternative theory in which gravity, not dark energy, is the force pushing space apart. According to that theory, Albert Einstein’s concept of gravity is wrong, and gravity becomes repulsive instead of attractive when acting at great distances.

“The action of dark energy is as if you threw a ball up in the air, and it kept speeding upward into the sky faster and faster,” said Chris Blake of the Swinburne University of Technology in Melbourne, Australia. Blake is lead author of two papers describing the results that appeared in recent issues of the Monthly Notices of the Royal Astronomical Society. “The results tell us that dark energy is a cosmological constant, as Einstein proposed. If gravity were the culprit, then we wouldn’t be seeing these constant effects of dark energy throughout time.”

Dark energy is thought to dominate our universe, making up about 74 percent of it. Dark matter, a slightly less mysterious substance, accounts for 22 percent. So-called normal matter, anything with atoms, or the stuff that makes up living creatures, planets and stars, is only approximately four percent of the cosmos.

The idea of dark energy was proposed during the previous decade, based on studies of distant exploding stars called supernovae. These supernovae flare with a consistent, measurable peak brightness, making them so-called “standard candles,” which allows calculation of their distance from Earth. Observations revealed dark energy was flinging the objects out at accelerating speeds.

This diagram illustrates two ways to measure how fast the universe is expanding -- the "standard candle" method, which involves exploded stars in galaxies, and the "standard ruler" method, which involves pairs of galaxies. Image credit: NASA/JPL-Caltech

Dark energy is in a tug-of-war contest with gravity. In the early universe, gravity took the lead, dominating dark energy. About 8 billion years after the Big Bang, as space expanded and matter became diluted, gravitational attraction weakened and dark energy gained the upper hand. Billions of years from now, dark energy will be even more dominant. Astronomers predict our universe will be a cosmic wasteland, with galaxies spread so far apart that any intelligent beings living inside them wouldn’t be able to see other galaxies.

The new survey provides two separate methods for independently checking the supernovae results. This is the first time astronomers performed these checks across the whole cosmic timespan dominated by dark energy. The team began by assembling the largest three-dimensional map of galaxies in the distant universe, spotted by the Galaxy Evolution Explorer. The ultraviolet-sensing telescope has scanned about three-quarters of the sky, observing hundreds of millions of galaxies.

“The Galaxy Evolution Explorer helped identify bright, young galaxies, which are ideal for this type of study,” said Christopher Martin, principal investigator for the mission at the California Institute of Technology in Pasadena. “It provided the scaffolding for this enormous 3-D map.”

The astronomers acquired detailed information about the light from each galaxy using the Anglo-Australian Telescope and studied the pattern of distances between them. Sound waves from the very early universe left imprints in the patterns of galaxies, causing pairs of galaxies to be preferentially separated by approximately 500 million light-years.

This “standard ruler” was used to determine the distance from the galaxy pairs to Earth — the closer a galaxy pair is to us, the farther apart the galaxies will appear from each other on the sky. As with the supernovae studies, these distance data were combined with information about the speeds at which the pairs are moving away from us, revealing, yet again, that the fabric of space is stretching apart faster and faster.
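
The geometry behind the ruler is just the small-angle relation: distance equals physical size divided by angular size. A minimal sketch (the angular separations here are hypothetical, and the real analysis uses the cosmological angular-diameter distance rather than this simple ratio):

```python
import math

def standard_ruler_distance(ruler_size_mly, angular_sep_deg):
    """Small-angle estimate: distance = physical size / angle (radians).
    The closer a galaxy pair is to us, the larger the angle it
    subtends on the sky, so the inferred distance is smaller."""
    theta_rad = math.radians(angular_sep_deg)
    return ruler_size_mly / theta_rad  # millions of light-years

# Hypothetical separations: the same 500-Mly ruler seen at two angles.
near_pair = standard_ruler_distance(500.0, 4.0)  # wider pair -> closer
far_pair = standard_ruler_distance(500.0, 1.0)   # tighter pair -> farther
```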

The team also used the galaxy map to study how clusters of galaxies grow over time like cities, eventually containing many thousands of galaxies. The clusters attract new galaxies through gravity, but dark energy tugs the clusters apart, slowing the process down. That slowdown is what allows scientists to measure dark energy’s repulsive force.

“Observations by astronomers over the last 15 years have produced one of the most startling discoveries in physical science: the expansion of the universe, triggered by the Big Bang, is speeding up,” said Jon Morse, astrophysics division director at NASA Headquarters in Washington. “Using entirely independent methods, data from the Galaxy Evolution Explorer have helped increase our confidence in the existence of dark energy.”

For more information see the Australian Astronomical Observatory

Antigravity Could Replace Dark Energy as Cause of Universe’s Expansion

Annihilation


Since the late 20th century, astronomers have been aware of data that suggest the universe is not only expanding, but expanding at an accelerating rate. According to the currently accepted model, this accelerated expansion is due to dark energy, a mysterious repulsive force that makes up about 73% of the energy density of the universe. Now, a new study proposes an alternative theory: that the expansion of the universe is actually due to the relationship between matter and antimatter. According to this study, matter and antimatter gravitationally repel each other, creating a kind of “antigravity” that could do away with the need for dark energy in the universe.

Massimo Villata, a scientist from the Observatory of Turin in Italy, began the study with two major assumptions. First, he posited that both matter and antimatter have positive mass and energy density. Traditionally, the gravitational influence of a particle is determined solely by its mass. A positive mass value indicates that the particle will attract other particles gravitationally. Under Villata’s assumption, this applies to antiparticles as well. So under the influence of gravity, particles attract other particles and antiparticles attract other antiparticles. But what kind of force occurs between particles and antiparticles?

To resolve this question, Villata needed to introduce the second assumption – that general relativity is CPT invariant. This means that the laws governing an ordinary matter particle in an ordinary field in spacetime can be applied equally well to scenarios in which charge (electric charge and internal quantum numbers), parity (spatial coordinates) and time are reversed, as they are for antimatter. When you reverse the equations of general relativity in charge, parity and time for either the particle or the field the particle is traveling in, the result is a change of sign in the gravity term, making it negative instead of positive and implying so-called antigravity between the two.
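
In the Newtonian limit, the claimed sign flip can be written schematically (this is an illustration of the idea, not Villata’s full general-relativistic treatment):

```latex
% Same-type pairs (matter-matter, antimatter-antimatter): the usual
% attraction. Mixed pairs: the CPT reversal flips the sign of the
% gravitational coupling, turning attraction into repulsion.
F_{\mathrm{same}} = -\,\frac{G m_1 m_2}{r^2}\,\hat{r}
\qquad\longrightarrow\qquad
F_{\mathrm{mixed}} = +\,\frac{G m_1 m_2}{r^2}\,\hat{r}
```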

Villata cited the quaint example of an apple falling on Isaac Newton’s head. If an anti-apple falls on an anti-Earth, the two will attract and the anti-apple will hit anti-Newton on the head; however, an anti-apple cannot “fall” on regular old Earth, which is made of regular old matter. Instead, the anti-apple will fly away from Earth because of gravity’s change in sign. In other words, if general relativity is, in fact, CPT invariant, antigravity would cause particles and antiparticles to mutually repel. On a much larger scale, Villata claims that the universe is expanding because of this powerful repulsion between matter and antimatter.

What about the fact that matter and antimatter are known to annihilate each other? Villata resolved this paradox by placing antimatter far away from matter, in the enormous voids between galaxy clusters. These voids are believed to have stemmed from tiny negative fluctuations in the primordial density field and do seem to possess a kind of antigravity, repelling all matter away from them. Of course, the reason astronomers don’t actually observe any antimatter in the voids is still up in the air. In Villata’s words, “There is more than one possible answer, which will be investigated elsewhere.” The research appears in this month’s edition of Europhysics Letters.

Hubble Rules Out One Alternative to Dark Energy


From a NASA press release:

Astronomers using NASA’s Hubble Space Telescope have ruled out an alternate theory on the nature of dark energy after recalculating the expansion rate of the universe to unprecedented accuracy.

The universe appears to be expanding at an increasing rate. Some believe that is because the universe is filled with a dark energy that works in the opposite way of gravity. One alternative to that hypothesis is that an enormous bubble of relatively empty space eight billion light-years across surrounds our galactic neighborhood. If we lived near the center of this void, observations of galaxies being pushed away from each other at accelerating speeds would be an illusion.

This hypothesis has been invalidated because astronomers have refined their understanding of the universe’s present expansion rate. Adam Riess of the Space Telescope Science Institute (STScI) and Johns Hopkins University in Baltimore, Md., led the research. The Hubble observations were conducted by the SHOES (Supernova H0 for the Equation of State) team that works to refine the accuracy of the Hubble constant to a precision that allows for a better characterization of dark energy’s behavior. The observations helped determine a figure for the universe’s current expansion rate to an uncertainty of just 3.3 percent. The new measurement reduces the error margin by 30 percent over Hubble’s previous best measurement in 2009. Riess’s results appear in the April 1 issue of The Astrophysical Journal.

“We are using the new camera on Hubble like a policeman’s radar gun to catch the universe speeding,” Riess said. “It looks more like it’s dark energy that’s pressing the gas pedal.”

Riess’ team first had to determine accurate distances to galaxies near and far from Earth. The team compared those distances with the speed at which the galaxies are apparently receding because of the expansion of space. They used those two values to calculate the Hubble constant, the number that relates the speed at which a galaxy appears to recede to its distance from the Milky Way. Because astronomers cannot physically measure the distances to galaxies, researchers had to find stars or other objects that serve as reliable cosmic yardsticks: objects whose intrinsic brightness (brightness not dimmed by distance, an atmosphere, or stellar dust) is known. Their distances, therefore, can be inferred by comparing their true brightness with their apparent brightness as seen from Earth.
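
The two steps described above — inferring a standard candle’s distance from the inverse-square law, then dividing recession velocity by distance — can be sketched as follows. The numbers are purely illustrative, not values from the Riess study:

```python
import math

def luminosity_distance(intrinsic_luminosity_w, observed_flux_w_m2):
    """Inverse-square law: observed flux falls off as 1/d^2,
    so d = sqrt(L / (4 * pi * F)). Returns distance in meters."""
    return math.sqrt(intrinsic_luminosity_w / (4 * math.pi * observed_flux_w_m2))

def hubble_constant(recession_velocity_km_s, distance_mpc):
    """H0 = v / d, in the conventional units of km/s per megaparsec."""
    return recession_velocity_km_s / distance_mpc

# A source whose measured flux is 1/(4*pi) of its luminosity sits 1 m away:
d = luminosity_distance(4 * math.pi, 1.0)   # -> 1.0 meter
h0 = hubble_constant(7400.0, 100.0)         # -> 74.0 km/s/Mpc
```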

To calculate longer distances, Riess’ team chose a special class of exploding stars called Type Ia supernovae. These stellar explosions all flare with similar luminosity and are brilliant enough to be seen far across the universe. By comparing the apparent brightness of Type Ia supernovae and pulsating Cepheid stars, the astronomers could accurately measure their intrinsic brightness and therefore calculate distances to Type Ia supernovae in far-flung galaxies.

Using the sharpness of the new Wide Field Camera 3 (WFC3) to study more stars in visible and near-infrared light, scientists eliminated systematic errors introduced by comparing measurements from different telescopes.

“WFC3 is the best camera ever flown on Hubble for making these measurements, improving the precision of prior measurements in a small fraction of the time it previously took,” said Lucas Macri, a collaborator on the SHOES Team from Texas A&M in College Station.

Knowing the precise value of the universe’s expansion rate further restricts the range of dark energy’s strength and helps astronomers tighten up their estimates of other cosmic properties, including the universe’s shape and its roster of neutrinos, the ghostly particles that filled the early universe.

“Thomas Edison once said ‘every wrong attempt discarded is a step forward,’ and this principle still governs how scientists approach the mysteries of the cosmos,” said Jon Morse, astrophysics division director at NASA Headquarters in Washington. “By falsifying the bubble hypothesis of the accelerating expansion, NASA missions like Hubble bring us closer to the ultimate goal of understanding this remarkable property of our universe.”

Science Paper by: Adam G. Riess et al. (PDF document)

Astronomers Now Closer to Understanding Dark Energy

Dark Energy

Understanding something we can’t see has been a problem that astronomers have overcome in the past. Now, a group of scientists believe a new technique will meet the challenge of helping to solve one of the biggest mysteries in cosmology today: understanding the nature of dark energy. Using the strong gravitational lensing method — where a massive galaxy cluster acts as a cosmic magnifying lens — an international team of astronomers has been able to study elusive dark energy for the first time. The team reports that when combined with existing techniques, their results significantly improve current measurements of the mass and energy content of the universe.

Using data taken by the Hubble Space Telescope as well as ground-based telescopes, the team analyzed images of 34 extremely distant galaxies situated behind Abell 1689, one of the biggest and most massive known galaxy clusters in the universe.

Through the gravitational lens of Abell 1689, the astronomers, led by Eric Jullo from JPL and Priyamvada Natarajan from Yale University, were able to detect the faint, distant background galaxies—whose light was bent and projected by the cluster’s massive gravitational pull—in much the same way that a magnifying glass distorts an object’s image.

Using this method, and combining it with other techniques, they were able to reduce the overall error in dark energy’s equation-of-state parameter by 30 percent.

The way in which the images were distorted gave the astronomers clues as to the geometry of the space that lies between the Earth, the cluster and the distant galaxies. “The content, geometry and fate of the universe are linked, so if you can constrain two of those things, you learn something about the third,” Natarajan said.

The team was able to narrow the range of current estimates about dark energy’s effect on the universe, denoted by the value w, by 30 percent. The team combined their new technique with other methods, including using supernovae, X-ray galaxy clusters and data from the Wilkinson Microwave Anisotropy Probe (WMAP) spacecraft, to constrain the value for w.

“Dark energy is characterized by the relationship between its pressure and its density: this is known as its equation of state,” said Jullo. “Our goal was to try to quantify this relationship. It teaches us about the properties of dark energy and how it has affected the development of the Universe.”
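
The relationship Jullo describes can be sketched in a few lines (taking c = 1 and using illustrative values only):

```python
def equation_of_state(pressure, energy_density):
    """Dark energy's equation of state, w = p / rho (with c = 1).
    A cosmological constant has pressure exactly opposite its energy
    density, giving w = -1; analyses like this one narrow the allowed
    range of w around that value."""
    return pressure / energy_density

# A cosmological constant: pressure = -energy density.
w_lambda = equation_of_state(-1.0, 1.0)  # -> -1.0
```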

Dark energy makes up about 72 percent of all the mass and energy in the universe and will ultimately determine its fate. The new results confirm previous findings that the properties of dark energy are consistent with a flat universe. In this scenario, the expansion of the universe will continue to accelerate and the universe will expand forever.

The astronomers say the real strength of this new result is that it opens a totally new way to extract information about elusive dark energy, one that offers great promise for future applications.

According to the scientists, their method required multiple, meticulous steps to develop. They spent several years developing specialized mathematical models and precise maps of the matter — both dark and “normal” — that together constitute the Abell 1689 cluster.

The findings appear in the August 20 issue of the journal Science.

Sources: Yale University, Science Express. ESA Hubble.

New Technique Could Track Down Dark Energy


From an NRAO press release:

Dark energy is the label scientists have given to whatever is causing the Universe to expand at an accelerating rate, and it is believed to make up nearly three-fourths of the mass and energy of the Universe. While the acceleration was discovered in 1998, its cause remains unknown. Physicists have advanced competing theories to explain the acceleration and believe the best way to test those theories is to precisely measure large-scale cosmic structures. A new technique developed for the Robert C. Byrd Green Bank Telescope (GBT) has given astronomers a new way to map such large-scale structure and, through it, to probe dark energy.

Sound waves in the matter-energy soup of the extremely early Universe are thought to have left detectable imprints on the large-scale distribution of galaxies in the Universe. The researchers developed a way to measure such imprints by observing the radio emission of hydrogen gas. Their technique, called intensity mapping, when applied to greater areas of the Universe, could reveal how such large-scale structure has changed over the last few billion years, giving insight into which theory of dark energy is the most accurate.

“Our project mapped hydrogen gas to greater cosmic distances than ever before, and shows that the techniques we developed can be used to map huge volumes of the Universe in three dimensions and to test the competing theories of dark energy,” said Tzu-Ching Chang, of the Academia Sinica in Taiwan and the University of Toronto.

To get their results, the researchers used the GBT to study a region of sky that previously had been surveyed in detail in visible light by the Keck II telescope in Hawaii. This optical survey used spectroscopy to map the locations of thousands of galaxies in three dimensions. With the GBT, instead of looking for hydrogen gas in these individual, distant galaxies — a daunting challenge beyond the technical capabilities of current instruments — the team used their intensity-mapping technique to accumulate the radio waves emitted by the hydrogen gas in large volumes of space including many galaxies.

“Since the early part of the 20th Century, astronomers have traced the expansion of the Universe by observing galaxies. Our new technique allows us to skip the galaxy-detection step and gather radio emissions from a thousand galaxies at a time, as well as all the dimly-glowing material between them,” said Jeffrey Peterson, of Carnegie Mellon University.

The astronomers also developed new techniques that removed both man-made radio interference and radio emission from nearer astronomical sources, leaving only the extremely faint radio waves coming from the very distant hydrogen gas. The result was a map of part of the “cosmic web” that correlated neatly with the structure shown by the earlier optical study. The team first proposed their intensity-mapping technique in 2008, and their GBT observations were the first test of the idea.

“These observations detected more hydrogen gas than all the previously-detected hydrogen in the Universe, and at distances ten times farther than any radio wave-emitting hydrogen seen before,” said Ue-Li Pen of the University of Toronto.

“This is a demonstration of an important technique that has great promise for future studies of the evolution of large-scale structure in the Universe,” said National Radio Astronomy Observatory Chief Scientist Chris Carilli, who was not part of the research team.

In addition to Chang, Peterson, and Pen, the research team included Kevin Bandura of Carnegie Mellon University. The scientists reported their work in the July 22 issue of the scientific journal Nature.

Using Gravitational Lensing to Measure Age and Size of Universe


Handy little tool, this gravitational lensing! Astronomers have used it to measure the shape of stars, look for exoplanets, and measure dark matter in distant galaxies. Now it’s being used to measure the size and age of the Universe. Researchers say this new use of gravitational lensing provides a very precise way to measure how rapidly the universe is expanding. The measurement determines a value for the Hubble constant, which indicates the size of the universe, and confirms the age of the Universe as 13.75 billion years, to within 170 million years. The results also confirm the strength of dark energy, which is responsible for accelerating the expansion of the universe.
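
The Hubble constant sets a natural timescale for the universe, the “Hubble time” 1/H0, which for concordance values lands near the quoted age. A rough sketch (assuming H0 ≈ 71 km/s/Mpc; the exact age also depends on the mix of matter and dark energy, so this is only an estimate):

```python
KM_PER_MPC = 3.0857e19      # kilometers in one megaparsec
SECONDS_PER_GYR = 3.156e16  # seconds in a billion years

def hubble_time_gyr(h0_km_s_mpc):
    """Convert H0 (km/s/Mpc) into the Hubble time 1/H0, in Gyr."""
    h0_per_second = h0_km_s_mpc / KM_PER_MPC
    return 1.0 / h0_per_second / SECONDS_PER_GYR

age_estimate = hubble_time_gyr(71.0)  # ~13.8 Gyr, near the quoted 13.75
```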

Gravitational lensing occurs when two galaxies happen to be aligned with one another along our line of sight in the sky. The gravitational field of the nearer galaxy distorts the image of the more distant galaxy into multiple arc-shaped images. Sometimes this effect even creates a complete ring, known as an “Einstein Ring.”

Researchers at the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC) used gravitational lensing to measure the distances light traveled from a bright, active galaxy to the Earth along different paths. By understanding the time it took to travel along each path and the effective speeds involved, researchers could infer not just how far away the galaxy lies but also the overall scale of the universe and some details of its expansion.

Distinguishing distances in space is difficult. A bright light far away and a dimmer source lying much closer can look like they are at the same distance. A gravitational lens circumvents this problem by providing multiple clues as to the distance light travels. That extra information allows researchers to determine the size of the universe, often expressed by astrophysicists in terms of a quantity called Hubble’s constant.

“We’ve known for a long time that lensing is capable of making a physical measurement of Hubble’s constant,” KIPAC’s Phil Marshall said. However, gravitational lensing had never before been used in such a precise way. This measurement of Hubble’s constant is as precise as those from long-established tools such as observations of supernovae and the cosmic microwave background. “Gravitational lensing has come of age as a competitive tool in the astrophysicist’s toolkit,” Marshall said.

When a large nearby object, such as a galaxy, blocks a distant object, such as another galaxy, the light can detour around the blockage. But instead of taking a single path, light can bend around the object along two or even four different routes, thus doubling or quadrupling the amount of information scientists receive. As the brightness of the background galaxy nucleus fluctuates, physicists can measure the ebb and flow of light from the four distinct paths, as in the B1608+656 system that was the subject of this study. Lead author on the study Sherry Suyu, from the University of Bonn, said, “In our case, there were four copies of the source, which appear as a ring of light around the gravitational lens.”

Though researchers do not know when light left its source, they can still compare arrival times. Marshall likens it to four cars taking four different routes between places on opposite sides of a large city, such as Stanford University to Lick Observatory, through or around San Jose. And like automobiles facing traffic snarls, light can encounter delays, too.

“The traffic density in a big city is like the mass density in a lens galaxy,” Marshall said. “If you take a longer route, it need not lead to a longer delay time. Sometimes the shorter distance is actually slower.”

The gravitational lens equations account for all the variables such as distance and density, and provide a better idea of when light left the background galaxy and how far it traveled.
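
The key scaling behind this is that, for a fixed lens model, the predicted time delays are inversely proportional to Hubble’s constant: a bigger universe means longer light paths and longer delays. A toy illustration (the function name and numbers are hypothetical, not the B1608+656 analysis itself):

```python
def h0_from_time_delay(model_delay_days, observed_delay_days, model_h0=70.0):
    """For a fixed lens mass model, predicted time delays scale as 1/H0.
    If a model computed assuming model_h0 predicts one delay and the sky
    shows another, rescale: H0 = model_h0 * (model delay / observed delay)."""
    return model_h0 * model_delay_days / observed_delay_days

# If the observed delay is 25% longer than the H0 = 70 model predicts,
# the inferred expansion rate is correspondingly lower:
h0 = h0_from_time_delay(80.0, 100.0)  # -> 56.0 km/s/Mpc
```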

In the past, this method of distance estimation was plagued by errors, but physicists now believe it is comparable with other measurement methods. With this technique, the researchers have come up with a more accurate lensing-based value for Hubble’s constant, and a better estimation of the uncertainty in that constant. By both reducing and understanding the size of error in calculations, they can achieve better estimations on the structure of the lens and the size of the universe.

There are several factors scientists still need to account for in determining distances with lenses. For example, dust in the lens can skew the results. The Hubble Space Telescope has infrared filters useful for eliminating dust effects. The images also contain information about the number of galaxies lying along the line of sight; these contribute to the lensing effect at a level that needs to be taken into account.

Marshall says several groups are working on extending this research, both by finding new systems and further examining known lenses. Researchers are already aware of more than twenty other astronomical systems suitable for analysis with gravitational lensing.

The results of this study were published in the March 1 issue of The Astrophysical Journal. The researchers used data collected by the NASA/ESA Hubble Space Telescope and showed the improved precision these data provide in combination with the Wilkinson Microwave Anisotropy Probe (WMAP).

Source: SLAC

Quintessence

Quintessence is one idea – one hypothesis – for what dark energy is (remember that dark energy is shorthand for the apparent acceleration of the expansion of the universe, or for the form of mass-energy which causes this observed acceleration in cosmological models built with Einstein’s theory of general relativity).

The word quintessence means fifth essence, and is kinda cute … remember Earth, Water, Fire, and Air, the ‘four essences’ of the Ancient Greeks? Well, in modern cosmology, there are also four essences: normal matter, radiation (photons), cold dark matter, and neutrinos (which are hot dark matter!).

Quintessence covers a range of hypotheses (or models); the main difference between quintessence as a (possible) explanation for dark energy and the cosmological constant Λ (which harks back to Einstein and the early years of the 20th century) is that quintessence varies with time (albeit slooowly), and can also vary with location (space). One version of quintessence is phantom energy, in which the energy density increases with time, and leads to a Big Rip end of the universe.
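
The distinctions above are usually phrased through the dark-energy equation of state, w = p/ρ (with c = 1):

```latex
w \equiv \frac{p}{\rho}, \qquad
\begin{cases}
w = -1 \ \text{(constant)} & \text{cosmological constant } \Lambda \\[2pt]
w > -1, \ \text{slowly varying} & \text{quintessence} \\[2pt]
w < -1 & \text{phantom energy (Big Rip)}
\end{cases}
```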

Quintessence, as a scalar field, is not the least bit unusual in physics (the Newtonian gravitational potential field is one example of a real scalar field; the Higgs field of the Standard Model of particle physics is an example of a complex scalar field); however, it shares some difficulties with the cosmological constant (in a nutshell: how can it be so small?).

Can quintessence be observed; or, rather, can quintessence be distinguished from a cosmological constant? In astronomy, yes … by finding a way to observe (and measure) the acceleration of the universe at widely different times (quintessence and Λ predict different results). Another way might be to observe variations in the fundamental constants (e.g. the fine structure constant) or violations of Einstein’s equivalence principle.

One project seeking to measure the acceleration of the universe more accurately was ESSENCE (“Equation of State: SupErNovae trace Cosmic Expansion”).

In 1999, CERN Courier published a nice summary of cosmology as it was understood then, a year after the discovery of dark energy: The quintessence of cosmology (it’s well worth a read, though a lot has happened in the past decade).

Universe Today articles? Yep! For example Will the Universe Expand Forever?, More Evidence for Dark Energy, and Hubble Helps Measure the Pace of Dark Energy.

Astronomy Cast episodes relevant to quintessence include What is the universe expanding into?, and A Universe of Dark Energy.

Source: NASA

New Search for Dark Energy Goes Back in Time

Baryon acoustic oscillation (BAO) sounds like it could be technobabble from a Star Trek episode. BAO is real, though, and astronomers are searching for these density fluctuations to do what seems like science fiction: look back in time to find clues about dark energy. The Baryon Oscillation Spectroscopic Survey (BOSS), a part of the Sloan Digital Sky Survey III (SDSS-III), took its “first light” of astronomical data last month, and will map the expansion history of the Universe.

“Baryon oscillation is a fast-maturing method for measuring dark energy in a way that’s complementary to the proven techniques of supernova cosmology,” said David Schlegel from the Lawrence Berkeley National Laboratory (Berkeley Lab), the Principal Investigator of BOSS. “The data from BOSS will be some of the best ever obtained on the large-scale structure of the Universe.”

BOSS uses the same telescope as the original Sloan Digital Sky Survey — the 2.5-meter telescope at Apache Point Observatory in New Mexico — but equipped with new, specially built spectrographs to measure the spectra.

Senior Operations Engineer Dan Long loads the first cartridge of the night into the Sloan Digital Sky Survey telescope. The cartridge holds a “plug-plate” at the top which then holds a thousand optical fibers shown in red and blue. These cartridges are locked into the base of the telescope and are changed many times during a night. Photo credit: D. Long

Baryon oscillations began when pressure waves traveled through the early universe. The same density variations left their mark as the Universe evolved, in the periodic clustering of visible matter in galaxies, quasars, and intergalactic gas, as well as in the clumping of invisible dark matter.

Comparing these scales at different eras makes it possible to trace the details of how the Universe has expanded throughout its history – information that can be used to distinguish among competing theories of dark energy.

“Like sound waves passing through air, the waves push some of the matter closer together as they travel” said Nikhil Padmanabhan, a BOSS researcher who recently moved from Berkeley Lab to Yale University. “In the early universe, these waves were moving at half the speed of light, but when the universe was only a few hundred thousand years old, the universe cooled enough to halt the waves, leaving a signature 500 million light-years in length.”

“We can see these frozen waves in the distribution of galaxies today,” said Daniel Eisenstein of the University of Arizona, the Director of the SDSS-III. “By measuring the length of the baryon oscillations, we can determine how dark energy has affected the expansion history of the universe. That in turn helps us figure out what dark energy could be.”
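
Padmanabhan’s description can be turned into an order-of-magnitude sketch: a wave moving at roughly half the speed of light for a few hundred thousand years, with the resulting imprint stretched by the expansion of the universe ever since. The numbers below are illustrative; the real calculation integrates the sound speed over the expansion history, which is how the full 500-million-light-year figure emerges.

```python
wave_speed_fraction_c = 0.5   # roughly half the speed of light
travel_time_yr = 4.0e5        # a few hundred thousand years
stretch_factor = 1100.0       # expansion of space since the waves froze

# distance (light-years) = speed (fraction of c) * time (years),
# then stretched by the expansion; expressed in millions of light-years:
frozen_scale_mly = wave_speed_fraction_c * travel_time_yr * stretch_factor / 1e6
# -> a few hundred Mly, the same order as the quoted 500-Mly signature
```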

“Studying baryon oscillations is an exciting method for measuring dark energy in a way that’s complementary to techniques in supernova cosmology,” said Kyle Dawson of the University of Utah, who is leading the commissioning of BOSS. “BOSS’s galaxy measurements will be a revolutionary dataset that will provide rich insights into the universe,” added Martin White of Berkeley Lab, BOSS’s survey scientist.

On Sept. 14-15, 2009, astronomers used BOSS to measure the spectra of a thousand galaxies and quasars. The goal of BOSS is to measure 1.4 million luminous red galaxies at redshifts up to 0.7 (when the Universe was roughly seven billion years old) and 160,000 quasars at redshifts between 2.0 and 3.0 (when the Universe was only about three billion years old). BOSS will also measure variations in the density of hydrogen gas between the galaxies. The observation program will take five years.

Source: Sloan Digital Sky Survey

Variability in Type 1A Supernovae Has Implications for Studying Dark Energy


The discovery of dark energy, a mysterious force that is accelerating the expansion of the universe, was based on observations of type 1a supernovae, and these stellar explosions have long been used as “standard candles” for measuring the expansion. But not all type 1a supernovae are created equal. A new study reveals sources of variability in these supernovae, and to accurately probe the nature of dark energy and determine if it is constant or variable over time, scientists will have to find a way to measure cosmic distances with much greater precision than they have in the past.

“As we begin the next generation of cosmology experiments, we will want to use type 1a supernovae as very sensitive measures of distance,” said Daniel Kasen, lead author of a study published this week in Nature. “We know they are not all the same brightness, and we have ways of correcting for that, but we need to know if there are systematic differences that would bias the distance measurements. So this study explored what causes those differences in brightness.”

Kasen and his coauthors–Fritz Röpke of the Max Planck Institute for Astrophysics in Garching, Germany, and Stan Woosley, professor of astronomy and astrophysics at UC Santa Cruz–used supercomputers to run dozens of simulations of type 1a supernovae. The results indicate that much of the diversity observed in these supernovae is due to the chaotic nature of the processes involved and the resulting asymmetry of the explosions.

For the most part, this variability would not produce systematic errors in measurement studies as long as researchers use large numbers of observations and apply the standard corrections, Kasen said. The study did find a small but potentially worrisome effect that could result from systematic differences in the chemical compositions of stars at different times in the history of the universe. But researchers can use the computer models to further characterize this effect and develop corrections for it.

A type 1a supernova occurs when a white dwarf star acquires additional mass by siphoning matter away from a companion star. When it reaches a critical mass–1.4 times the mass of the Sun, packed into an object the size of the Earth–the heat and pressure in the center of the star spark a runaway nuclear fusion reaction, and the white dwarf explodes. Since the initial conditions are about the same in all cases, these supernovae tend to have the same luminosity, and their “light curves” (how the luminosity changes over time) are predictable.

Some are intrinsically brighter than others, but these flare and fade more slowly, and this correlation between the brightness and the width of the light curve allows astronomers to apply a correction to standardize their observations. So astronomers can measure the light curve of a type 1a supernova, calculate its intrinsic brightness, and then determine how far away it is, since the apparent brightness diminishes with distance (just as a candle appears dimmer at a distance than it does up close).
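The inverse-square dimming behind this standard-candle method is usually written as the astronomer's distance-modulus relation. A minimal sketch, with illustrative magnitudes (a standardized type 1a supernova peaks near absolute magnitude -19.3):

```python
import math

def luminosity_distance_pc(M_abs, m_app):
    """Distance in parsecs from the distance modulus
    m - M = 5 * log10(d / 10 pc), a restatement of
    inverse-square dimming in the magnitude system."""
    return 10 ** ((m_app - M_abs + 5) / 5)

# Suppose a standardized supernova (M = -19.3) is observed
# at apparent magnitude m = 20.7:
d = luminosity_distance_pc(-19.3, 20.7)
print(f"{d / 1e6:.0f} Mpc")  # → 1000 Mpc
```

The correction for light-curve width adjusts M_abs before this step; once the intrinsic brightness is pinned down, the distance follows directly.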

The computer models used to simulate these supernovae in the new study are based on current theoretical understanding of how and where the ignition process begins inside the white dwarf and where it makes the transition from slow-burning combustion to explosive detonation.

The simulations showed that the asymmetry of the explosions is a key factor determining the brightness of type 1a supernovae. “The reason these supernovae are not all the same brightness is closely tied to this breaking of spherical symmetry,” Kasen said.

The dominant source of variability is the synthesis of new elements during the explosions, which is sensitive to differences in the geometry of the first sparks that ignite a thermonuclear runaway in the simmering core of the white dwarf. Nickel-56 is especially important, because the radioactive decay of this unstable isotope creates the afterglow that astronomers are able to observe for months or even years after the explosion.

“The decay of nickel-56 is what powers the light curve. The explosion is over in a matter of seconds, so what we see is the result of how the nickel heats the debris and how the debris radiates light,” Kasen said.
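The decay chain Kasen describes (nickel-56 to cobalt-56 to stable iron-56) can be sketched with the standard two-step decay solution. The half-lives below are commonly quoted textbook values, roughly 6 days for Ni-56 and 77 days for Co-56:

```python
import math

T_NI, T_CO = 6.1, 77.1  # assumed half-lives in days

def decay_rates(t_days, n0=1.0):
    """Two-step decay chain Ni-56 -> Co-56 -> Fe-56: returns the
    decay rate (decays per day) of each unstable species at time t,
    which sets the radioactive heating of the supernova debris."""
    lam_ni = math.log(2) / T_NI
    lam_co = math.log(2) / T_CO
    n_ni = n0 * math.exp(-lam_ni * t_days)
    # Bateman solution for the intermediate species Co-56
    n_co = n0 * lam_ni / (lam_ni - lam_co) * (
        math.exp(-lam_co * t_days) - math.exp(-lam_ni * t_days))
    return lam_ni * n_ni, lam_co * n_co
```

At early times the nickel decays dominate the heating; after a few weeks the longer-lived cobalt takes over, which is why the afterglow remains observable for months.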

Kasen developed the computer code to simulate this radiative transfer process, using output from the simulated explosions to produce visualizations that can be compared directly to astronomical observations of supernovae.

The good news is that the variability seen in the computer models agrees with observations of type 1a supernovae. “Most importantly, the width and peak luminosity of the light curve are correlated in a way that agrees with what observers have found. So the models are consistent with the observations on which the discovery of dark energy was based,” Woosley said.

Another source of variability is that these asymmetric explosions look different when viewed at different angles. This can account for differences in brightness of as much as 20 percent, Kasen said, but the effect is random and creates scatter in the measurements that can be statistically reduced by observing large numbers of supernovae.
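The statistical averaging that tames this random scatter can be illustrated with a short Monte Carlo sketch (the 20 percent Gaussian scatter and the sample sizes here are illustrative, not values from the study):

```python
import random
import statistics

random.seed(42)

def mean_of_sample(n, sigma=0.2):
    """Average of n noisy brightness measurements, each with
    random scatter of fractional width sigma around the true
    value 1.0 (mimicking viewing-angle variation)."""
    return statistics.fmean(random.gauss(1.0, sigma) for _ in range(n))

# The spread of the averaged estimate shrinks roughly as 1/sqrt(n):
trials = 2000
for n in (1, 25, 100):
    spread = statistics.stdev(mean_of_sample(n) for _ in range(trials))
    print(n, round(spread, 3))
```

Because the viewing-angle effect is random rather than systematic, it adds scatter but no bias, so surveys with large samples can beat it down; the metallicity effect discussed next is more troublesome precisely because averaging cannot remove a systematic shift.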

The potential for systematic bias comes primarily from variation in the initial chemical composition of the white dwarf star. Heavier elements are synthesized during supernova explosions, and debris from those explosions is incorporated into new stars. As a result, stars formed recently are likely to contain more heavy elements (higher “metallicity,” in astronomers’ terminology) than stars formed in the distant past.

“That’s the kind of thing we expect to evolve over time, so if you look at distant stars corresponding to much earlier times in the history of the universe, they would tend to have lower metallicity,” Kasen said. “When we calculated the effect of this in our models, we found that the resulting errors in distance measurements would be on the order of 2 percent or less.”

Further studies using computer simulations will enable researchers to characterize the effects of such variations in more detail and limit their impact on future dark-energy experiments, which might require a level of precision that would make errors of 2 percent unacceptable.

Source: EurekAlert