Astronomers Closing in on Dark Energy with Refined Hubble Constant



The name “dark energy” is just a placeholder for the force — whatever it is — that is causing the expansion of the Universe to accelerate. But astronomers are perhaps getting closer to understanding this force. New observations of several Cepheid variable stars by the Hubble Space Telescope have refined the measurement of the Universe’s present expansion rate to a precision where the error is smaller than five percent. The new value for the expansion rate, known as the Hubble constant, or H0 (after Edwin Hubble, who first measured the expansion of the universe nearly a century ago), is 74.2 kilometers per second per megaparsec, with an uncertainty of ± 3.6. The result agrees closely with an earlier Hubble measurement of 72 ± 8 km/sec/megaparsec, but is now more than twice as precise.
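As a sanity check on what that number means, here is a quick back-of-the-envelope sketch in Python (the unit conversions and the 100-megaparsec example are ours, not the article's): Hubble's law, v = H0 × d, gives the recession velocity of a galaxy at a given distance, and 1/H0 sets the characteristic "Hubble time" of the expansion.

```python
# Illustrative arithmetic, not from the SHOES analysis itself.
KM_PER_MPC = 3.0857e19   # kilometres in one megaparsec
SEC_PER_YR = 3.156e7     # seconds in one year

H0 = 74.2  # km/s per Mpc, the value quoted above

# Hubble's law: recession velocity of a galaxy 100 Mpc away
v = H0 * 100  # km/s
print(f"v at 100 Mpc: {v:.0f} km/s")          # 7420 km/s

# Hubble time: 1/H0 converted from (s/km * km/Mpc) to years
hubble_time_yr = KM_PER_MPC / H0 / SEC_PER_YR
print(f"Hubble time: {hubble_time_yr/1e9:.1f} billion years")  # ~13.2 billion years
```

The Hubble time of roughly 13 billion years is only a crude age estimate, but it is reassuringly close to the Universe's measured age.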

The Hubble measurement, conducted by the SHOES (Supernova H0 for the Equation of State) Team and led by Adam Riess, of the Space Telescope Science Institute and the Johns Hopkins University, uses a number of refinements to streamline and strengthen the construction of a cosmic “distance ladder,” a billion light-years in length, that astronomers use to determine the universe’s expansion rate.

Hubble observations of the pulsating Cepheid variables in a nearby cosmic mile marker, the galaxy NGC 4258, and in the host galaxies of recent supernovae, directly link these distance indicators. The use of Hubble to bridge these rungs in the ladder eliminated the systematic errors that are almost unavoidably introduced by comparing measurements from different telescopes.

Steps to the Hubble Constant. Credit: NASA, ESA, and A. Feild (STScI)

Riess explains the new technique: “It’s like measuring a building with a long tape measure instead of moving a yardstick end over end. You avoid compounding the little errors you make every time you move the yardstick. The higher the building, the greater the error.”

Lucas Macri, professor of physics and astronomy at Texas A&M, and a significant contributor to the results, said, “Cepheids are the backbone of the distance ladder because their pulsation periods, which are easily observed, correlate directly with their luminosities. Another refinement of our ladder is the fact that we have observed the Cepheids in the near-infrared parts of the electromagnetic spectrum where these variable stars are better distance indicators than at optical wavelengths.”
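The distance-ladder logic Macri describes can be sketched in a few lines of Python. The period-luminosity coefficients below are one published V-band calibration, used purely for illustration (the SHOES team's near-infrared calibration differs), and the 30-day, 25th-magnitude Cepheid is a hypothetical example, not one of the observed stars.

```python
import math

def cepheid_abs_mag(period_days):
    """Leavitt-law absolute magnitude for a classical Cepheid.
    Coefficients are an illustrative V-band calibration."""
    return -2.43 * (math.log10(period_days) - 1.0) - 4.05

def distance_pc(apparent_mag, absolute_mag):
    """Distance from the distance modulus: m - M = 5 log10(d / 10 pc)."""
    return 10 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)

M = cepheid_abs_mag(30.0)    # a 30-day Cepheid: its period fixes its luminosity
d = distance_pc(25.0, M)     # seen at apparent magnitude 25
print(f"M = {M:.2f}, d = {d/1e6:.1f} Mpc")   # M = -5.21, d = 11.0 Mpc
```

A distance of about 11 Mpc is typical of the nearby supernova host galaxies on the ladder's next rung.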

This new, more precise value of the Hubble constant was used to test and constrain the properties of dark energy, the form of energy that produces a repulsive force in space, which is causing the expansion rate of the universe to accelerate.

By bracketing the expansion history of the universe between today and when the universe was only approximately 380,000 years old, the astronomers were able to place limits on the nature of the dark energy that is causing the expansion to speed up. (The measurement for the far, early universe is derived from fluctuations in the cosmic microwave background, as resolved by NASA’s Wilkinson Microwave Anisotropy Probe, WMAP, in 2003.)

Their result is consistent with the simplest interpretation of dark energy: that it is mathematically equivalent to Albert Einstein’s hypothesized cosmological constant, introduced a century ago to push on the fabric of space and prevent the universe from collapsing under the pull of gravity. (Einstein, however, removed the constant once the expansion of the universe was discovered by Edwin Hubble.)

Detail from NGC 3021. Credit: NASA, ESA, and A. Riess (STScI/JHU)

“If you put in a box all the ways that dark energy might differ from the cosmological constant, that box would now be three times smaller,” says Riess. “That’s progress, but we still have a long way to go to pin down the nature of dark energy.”

Though the cosmological constant was conceived of long ago, observational evidence for dark energy didn’t come along until 11 years ago, when two studies, one led by Riess and Brian Schmidt of Mount Stromlo Observatory, and the other by Saul Perlmutter of Lawrence Berkeley National Laboratory, discovered dark energy independently, in part with Hubble observations. Since then astronomers have been pursuing observations to better characterize dark energy.

Riess’s approach to narrowing alternative explanations for dark energy—whether it is a static cosmological constant or a dynamical field (like the repulsive force that drove inflation after the big bang)—is to further refine measurements of the universe’s expansion history.

Before Hubble was launched in 1990, estimates of the Hubble constant varied by a factor of two. In the late 1990s the Hubble Space Telescope Key Project on the Extragalactic Distance Scale refined the value of the Hubble constant to an error of only about ten percent. This was accomplished by observing Cepheid variables at optical wavelengths out to greater distances than previously possible and comparing those to similar measurements from ground-based telescopes.

The SHOES team used Hubble’s Near Infrared Camera and Multi-Object Spectrometer (NICMOS) and the Advanced Camera for Surveys (ACS) to observe 240 Cepheid variable stars across seven galaxies. One of these galaxies was NGC 4258, whose distance was very accurately determined through observations with radio telescopes. The other six galaxies recently hosted Type Ia supernovae that are reliable distance indicators for even farther measurements in the universe. Type Ia supernovae all explode with nearly the same amount of energy and therefore have almost the same intrinsic brightness.
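The payoff of calibrating Type Ia supernovae this way is that each distant supernova then yields a value of H0 directly: its apparent magnitude gives a distance, and its redshift gives a velocity. A toy calculation (our hypothetical supernova, not SHOES data; the peak absolute magnitude of roughly -19.3 is the commonly cited canonical value):

```python
# Toy H0 estimate from a single, hypothetical Type Ia supernova.
C_KM_S = 299792.458   # speed of light, km/s
M_SN = -19.3          # canonical SN Ia peak absolute magnitude (illustrative)

m, z = 15.2, 0.02     # hypothetical supernova: apparent magnitude and redshift

d_pc = 10 ** ((m - M_SN + 5.0) / 5.0)   # distance modulus -> distance in parsecs
d_mpc = d_pc / 1e6
v = C_KM_S * z        # recession velocity, low-redshift approximation

print(f"d = {d_mpc:.0f} Mpc, v = {v:.0f} km/s, H0 = {v/d_mpc:.0f} km/s/Mpc")
# ends up close to the measured 74.2 km/s/Mpc
```

A real measurement averages over many supernovae and corrects for light-curve shape and dust, which is where the hard work lies.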

By observing Cepheids with very similar properties at near-infrared wavelengths in all seven galaxies, and using the same telescope and instrument, the team was able to more precisely calibrate the luminosity of supernovae. With Hubble’s powerful capabilities, the team was able to sidestep some of the shakiest rungs along the previous distance ladder involving uncertainties in the behavior of Cepheids.

Riess would eventually like to see the Hubble constant refined to a value with an error of no more than one percent, to put even tighter constraints on the nature of dark energy.

Source: Space Telescope Science Institute

New Hubble Survey Supports Cold Dark Matter in Early Universe

NICMOS Image of the GOODS North field. Credit: C Conselice, A Bluck, GOODS NICMOS Team.


A new survey is revealing how the most massive galaxies formed in the early Universe, and the findings support the theory that Cold Dark Matter played a role. A team of scientists from six countries used the NICMOS near infrared camera on the Hubble Space Telescope to carry out the deepest ever survey of its type at near infrared wavelengths. Early results show that the most massive galaxies, which have masses roughly 10 times larger than the Milky Way, were involved in significant levels of galaxy mergers and interactions when the Universe was just 2-3 billion years old.

“As almost all of these massive galaxies are invisible in the optical wavelengths, this is the first time that most of them have been observed,” said Dr. Chris Conselice, who is the Principal Investigator for the survey. “To assess the level of interaction and mergers between the massive galaxies, we searched for galaxies in pairs, close enough to each other to merge within a given time-scale. While the galaxies are very massive and at first sight may appear fully formed, the results show that they have experienced an average of two significant merging events during their life-times.”

The results show that these galaxies did not form in a single rapid collapse in the early universe; rather, their formation was more gradual over the course of the Universe’s evolution, taking about 5 billion years.

NICMOS image of merging galaxies. Credit: C Conselice, A Bluck, GOODS NICMOS Team

“The findings support a basic prediction of the dominant model of the Universe, known as Cold Dark Matter,” said Conselice, “so they reveal not only how the most massive galaxies are forming, but also that the model that’s been developed to describe the Universe, based on the distribution of galaxies that we’ve observed overall, applies in its basic form to galaxy formation.”

The Cold Dark Matter theory is a refinement of the Big Bang theory. It assumes that most of the matter in the Universe consists of material that cannot be observed through its electromagnetic radiation and hence is dark, while the particles that make up this matter are slow-moving and therefore cold.

The preliminary results are based on a paper led by PhD student Asa Bluck at the University of Nottingham, and were presented this week at the European Week of Astronomy and Space Science at the University of Hertfordshire.

The observations are part of the Great Observatories Origins Deep Survey (GOODS), a campaign that is using NASA’s Spitzer, Hubble and Chandra space telescopes together with ESA’s XMM Newton X-ray observatory to study the most distant Universe.

Source: RAS

Kepler Will Be Used to Measure the Size of the Universe

Artist's rendering of the Kepler Mission (NASA)


On April 7th, commands were sent to NASA’s exoplanet-hunting Kepler telescope to eject the 1.3×1.7 metre lens cap so the unprecedented mission could begin its hunt for Earth-like alien worlds orbiting distant stars. However, one UK astronomer won’t be using the Kepler data to detect the faint transits of rocky exoplanets in front of their host stars. He’ll be using it to monitor the light from a special class of variable star, and through the extreme precision of Kepler’s optics he will be joining an international team of collaborators to redefine the size of the Universe…

Kepler is carrying the largest camera ever launched into space. The camera has 42 charge-coupled devices (CCDs) to monitor the very slight changes in star brightness as an exoplanet passes in front of its host star. Given that Kepler is expected to detect exoplanets only a little larger than our planet (known as super-Earths), the instrument is extremely sensitive. For this reason, exoplanet hunters are not the only ones interested in using Kepler’s sensitive eye.

Using Kepler data, Dr Alan Penny, a researcher at the University of St Andrews, will join a 200-strong team of astronomers to analyse light not from exoplanet-harbouring stars but from a smaller group of variable stars that fluctuate in brightness with striking regularity and precision. These stars are Cepheid variables, also known as “standard candles” because of the strong correlation between their period of variability and their absolute luminosity. This means that no matter where Cepheids are observed, in galaxies or clusters, astronomers can deduce the distance from the Earth to the Cepheid with great precision. The only thing limiting astronomers is the precision of their instrumentation, so when Kepler left Earth, carrying the most advanced and sensitive camera ever taken into space, Penny and his collaborators jumped at the chance to use it to refine the measurement of the Universe.

“While Kepler is doing its exciting planet-hunting, we will be using its extreme precision to resolve a possible problem with our measurement of the size of the Universe,” said Penny. “These variable stars known as ‘Cepheids’ form the base of a series of steps by which we measure the distance to distant galaxies and, through them, we can measure the size of the Universe.”

Current estimates place the size of the Universe at 93 billion light-years across, but Penny believes Kepler observations of a small selection of Cepheids may change this value by a few percent. When making precision observations of a very precise stellar period-brightness relationship, it helps to use the most precise instrument available. However, our understanding of the “standard candles” themselves is still incomplete, and small-scale, dynamic changes on the stars themselves can go unnoticed from the ground. Kepler should shed light on these gaps in our knowledge of Cepheids as well as give us the best-yet measurement of the scale of our Universe.

“These Cepheid stars which get brighter and fainter by some tens of percent every ten to a hundred days are mostly understood. But recently it has become clear that our theories of what happens in the outer layers of these stars which cause the variations in brightness do not totally agree with what we see. The exquisite accuracy of Kepler in measuring star brightness, one hundred times better than we can do from the ground, means we can get such good measurements that we should be able to match theory with observation. Resolving the issue may only change estimates of the size of the Universe by a small amount, but we won’t rest easy until the problem is solved.” — Dr Alan Penny

Source: Physorg.com

Cosmologists Search for Gravity Waves to Prove Inflation Theory

The South Pole Telescope under the aurora australis (southern lights). Photo by Keith Vanderlinde


During the next decade, cosmologists will attempt to observe the first moments of the Universe, hoping to prove a popular theory. They’ll be searching for extremely weak gravity waves to measure primordial light, looking for convincing evidence for the Cosmic Inflation Theory, which proposes that a random, microscopic density fluctuation in the fabric of space and time gave birth to the Universe in a hot big bang approximately 13.7 billion years ago. A new instrument called a polarimeter is being attached to the South Pole Telescope (SPT), which operates at submillimeter wavelengths, between microwaves and the infrared on the electromagnetic spectrum. Einstein’s theory of general relativity predicts that Cosmic Inflation should produce the weak gravity waves.

Inflation Theory proposes a period of extremely rapid, exponential expansion of the Universe during its first few moments, prior to the more gradual Big Bang expansion. During this time the energy density of the universe was dominated by a cosmological constant-type of vacuum energy that later decayed to produce the matter and radiation that fill the Universe today.

In 1979, physicist Alan Guth proposed the Cosmic Inflation Theory, which also predicts the existence of an infinite number of universes. Unfortunately, cosmologists have no way of testing that particular prediction.

The South Pole Telescope takes advantage of the clear, dry skies at the National Science Foundation’s South Pole Station to study the cosmic background radiation, the afterglow of the big bang. The SPT measures eight meters (26.4 feet) in diameter. Photo by Jeff McMahon

“Since these are separate universes, by definition that means we can never have any contact with them. Nothing that happens there has any impact on us,” said Scott Dodelson, a scientist at Fermi National Accelerator Laboratory and a Professor in Astronomy & Astrophysics at the University of Chicago.

But there is a way to probe the validity of cosmic inflation. The phenomenon would have produced two classes of perturbations. The first class, fluctuations in the density of subatomic particles, occurs continuously throughout the universe, and scientists have already observed such fluctuations.

“Usually they’re just taking place on the atomic scale. We never even notice them,” Dodelson said. But inflation would instantaneously stretch these perturbations into cosmic proportions. “That picture actually works. We can calculate what those perturbations should look like, and it turns out they are exactly right to produce the galaxies we see in the universe.”

The second class of perturbations would be gravity waves—Einsteinian distortions in space and time. Gravity waves also would get promoted to cosmic proportions, perhaps even strong enough for cosmologists to detect them with sensitive telescopes tuned to the proper frequency of electromagnetic radiation.

If the new polarimeter is sensitive enough, scientists should be able to detect the waves.

“If you detect gravity waves, it tells you a whole lot about inflation for our universe,” said John Carlstrom from the University of Chicago, who developed the new instrument. Carlstrom said detecting the waves would rule out various competing ideas for the origin of the universe. “There are fewer than there used to be, but they don’t predict that you have such an extreme, hot big bang, this quantum fluctuation, to start with,” he said. Nor would they produce gravity waves at detectable levels.

A simulation at this link portrays the distortions in space and time at the subatomic scale, the result of quantum fluctuations occurring continuously throughout the universe. Near the end of the simulation, cosmic inflation begins to stretch space-time to the cosmic proportions of the universe.

Cosmologists also use the SPT in their quest to solve the mystery of dark energy. A repulsive force, dark energy pushes the universe apart and overwhelms gravity, the attractive force exerted by all matter.
Dark energy is invisible, but astronomers are able to see its influence on clusters of galaxies that formed within the last few billion years.

NASA’s Wilkinson Microwave Anisotropy Probe collected data that produced this chart of sound waves from the universe. Called a power spectrum, the chart plots the cosmic microwave background radiation as ripples of different sizes across the sky. The data are consistent with predictions of cosmic inflation theory. Courtesy of the WMAP Science Team

The SPT detects the cosmic microwave background (CMB) radiation, the afterglow of the big bang. Cosmologists have mined a wealth of data from the CMB, which represents the forceful drums and horns of the cosmic symphony. But now the scientific community has its ears cocked for the tones of a subtler instrument — gravitational waves — that underlie the CMB.

“We have these key components to our picture of the universe, but we really don’t know what physics produces any of them,” said Dodelson of inflation, dark energy and the equally mysterious dark matter. “The goal of the next decade is to identify the physics.”

Source: University of Chicago

Cosmologists Look Back to Cosmic Dawn

The Universe 590 million years after the Big Bang. Credit: Alvaro Orsi, Institute for Computational Cosmology, Durham University.

What did the Universe look like early in its history, only 500 million years after the Big Bang? Currently, we have no way of actually “looking” back that far with our telescopes, but cosmologists from Durham University in the UK have used a computer simulation to predict how the very early Universe would have appeared. The images portray the “Cosmic Dawn,” and calculate the formation of the first big galaxies. The simulation also attempts to discern the role that dark matter played in galaxy formation. “We are effectively looking back in time and by doing so we hope to learn how galaxies like our own were made and to understand more about dark matter,” said Alvaro Orsi, lead author of the study from Durham University’s Institute for Computational Cosmology (ICC). “The presence of dark matter is the key to building galaxies – without dark matter we wouldn’t be here today.”

In the images produced by the computer simulation, the green swirls represent dark matter, which the scientists say is an essential ingredient in galaxy formation, while the circles show the star formation rate in galaxies. The different color circles represent the varying luminosity of star formation with yellow being brightest. The top image portrays the Universe as it was 590 million years after the Big Bang, and the image below shows the Universe 1 billion years after the Big Bang, as star formation rates begin to ramp up.

The Universe 1 billion years after the Big Bang. Credit: Alvaro Orsi, Institute for Computational Cosmology, Durham University.

The very first galaxies were created from the debris of massive stars which died explosively shortly after the beginning of the Universe. The Durham calculation predicts where these galaxies appear and how they evolve to the present day, over 13 billion years later. Although the galaxies today are bigger, they are not forming stars as quickly now as they were in the past. “Our research predicts which galaxies are growing through the formation of stars at different times in the history of the Universe and how these relate to the dark matter,” said co-author Dr. Carlton Baugh. “We give the computer what we think is the recipe for galaxy formation and we see what is produced which is then tested against observations of real galaxies.”

The massive simulation shows how structures grow in dark matter with a model showing how normal matter, such as gas, behaves to predict how galaxies grow. Gas feels the pull of gravity from dark matter and is heated up before cooling by releasing radiation and turning into stars. The simulation images show which galaxies are forming stars most vigorously at a given time. The image below shows the Universe 1.9 billion years after the Big Bang, a very active time of star formations in galaxies.

The Universe 1.9 billion years after the Big Bang. Credit: Alvaro Orsi, Institute for Computational Cosmology, Durham University.

The calculations of the Durham team, supported by scientists at the Universidad Catolica in Santiago, Chile, can be tested against new observations reaching back to early stages in the history of the Universe almost one billion years after the Big Bang. Professor Keith Mason, Chief Executive of the Science and Technology Facilities Council, said: “Computational cosmology plays an important part in our understanding of the Universe. Not only do these simulations allow us to look back in time to the early Universe but they complement the work and observations of our astronomers.”

This image shows the Universe today, 13.6 billion years after the Big Bang. Galaxies are not forming stars as quickly now as they were in the past.

The Universe today. Credit: Alvaro Orsi, Institute for Computational Cosmology, Durham University.

The team hopes that further study and simulations of effects of dark matter on galaxies will help astronomers learn more about what this ubiquitous substance is.

Source: Science and Technology Facilities Council

Institute for Computational Cosmology, Durham University
Department of Physics, Durham University

Next-Generation Telescope Gets Team

Artist's rendering of the Giant Magellan Telescope and support facilities at Las Campanas Observatory, Chile, high in the Andes Mountains. Photo by Todd Mason/Mason Productions

 


Astronomy organizations in the United States, Australia and Korea have signed on to build the largest ground-based telescope in the world – unless another team gets there first. The Giant Magellan Telescope, or GMT, will have the resolving power of a single 24.5-meter (80-foot) primary mirror, which will make it three times more powerful than any of the Earth’s existing ground-based optical telescopes. Its domestic partners include the Carnegie Institution for Science, Harvard University, the Smithsonian Institution, Texas A&M University, the University of Arizona, and the University of Texas at Austin. Although the telescope has been in the works since 2003, the formal collaboration was announced Friday.

Charles Alcock, director of the Harvard-Smithsonian Center for Astrophysics, said the Giant Magellan Telescope is being designed to build on the legacy of the generation of smaller telescopes erected in the 1990s in California, Hawaii and Arizona. Those telescopes have mirrors in the range of six to 10 meters (18 to 32 feet), and, while they’re making great headway in the nearby universe, they’re only able to make out the largest planets around other stars and the most luminous distant galaxies.

With a much larger primary mirror, the GMT will be able to detect much smaller and fainter objects in the sky, opening a window to the most distant, and therefore the oldest, stars and galaxies. Formed within the first billion years after the Big Bang, such objects reveal tantalizing insight into the universe’s infancy.

Earlier this year, a different consortium including the California Institute of Technology and the University of California, with Canadian and Japanese institutions, unveiled its own next-generation concept: the Thirty Meter Telescope. Whereas the GMT’s 24.5-meter primary mirror will come from a collection of eight smaller mirrors, the TMT will combine 492 segments to achieve the power of a single 30-meter (98-foot) mirror design.
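A rough way to compare these apertures is the diffraction limit, theta ≈ 1.22 λ/D; the sketch below (our arithmetic, at an assumed 500 nm observing wavelength) shows how the GMT and TMT designs would sharpen resolution over a 10-meter-class telescope. Real performance also depends on adaptive optics correcting for the atmosphere.

```python
import math

RAD_TO_MAS = 180 / math.pi * 3600 * 1000   # radians -> milliarcseconds

def diffraction_limit_mas(diameter_m, wavelength_m=500e-9):
    """Rayleigh criterion: smallest resolvable angle for a filled aperture."""
    return 1.22 * wavelength_m / diameter_m * RAD_TO_MAS

for name, d in [("10 m (Keck-class)", 10.0), ("GMT 24.5 m", 24.5), ("TMT 30 m", 30.0)]:
    print(f"{name}: {diffraction_limit_mas(d):.1f} mas at 500 nm")
# 12.6, 5.1 and 4.2 milliarcseconds respectively
```

Roughly a factor of 2.5 to 3 in sharpness, which, combined with the factor of 6 to 9 in collecting area, is what "three times more powerful" buys.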

In addition, the European Extremely Large Telescope is in the concept stage.

In terms of science, Alcock acknowledged that the two telescopes with US participation are headed toward redundancy. The main differences, he said, are in the engineering arena.

“They’ll probably both work,” he said. But Alcock thinks the GMT is most exciting from a technological point of view. Each of the GMT’s seven 8.4-meter primary segments will weigh 20 tons, and the telescope enclosure has a height of about 200 feet. The GMT partners aim to complete their detailed design within two years.

The TMT’s segmented concept builds on technology pioneered at the W.M. Keck Observatory in Hawaii, a past project of the Caltech and University of California partnership.

Construction on the GMT is expected to begin in 2012 and be completed in 2019, at Las Campanas Observatory in the Andes Mountains of Chile. The total cost is projected to be $700 million, with $130 million raised so far.

Artist's concept of the Thirty Meter Telescope Observatory. Credit: TMT

Construction on the TMT could begin as early as 2011 with an estimated completion date of 2018. The telescope could go to Hawaii or Chile, and final site selection will be announced this summer. The total cost is estimated to be as high as $1 billion, with $300 million raised at last count.

 

Alcock said the next generation of telescopes is crucial for forward progress in 21st Century astronomy.

“The goal is to start discovering and characterizing planets that might harbor life,” he said. “It’s very clear that we’re going to need the next generation of telescopes to do that.”

And far from being a competition, the real race is to contribute to science, said Charles Blue, a TMT spokesman.

“All next generation observatories would really like to be up and running as soon as possible to meet the scientific demand,” he said.

In the shorter term, long-distance space studies will get help from the James Webb Space Telescope, designed to replace the Hubble Space Telescope when it launches in 2013. And the Atacama Large Millimeter Array (ALMA), a large interferometer being completed in Chile, could join these efforts by 2012.

Sources: EurekAlert and interviews with Charles Alcock, Charles Blue

Space Telescope of the Future: SIM

Artist's concept of the current mission configuration. Credit: JPL

Two of the hottest and most engaging topics in space and astronomy these days are 1.) exoplanets – planets orbiting other stars – and 2.) dark matter—that unknown stuff that seemingly makes up a considerable portion of our universe. There’s a spacecraft currently in development that could help answer our questions about whether there really are other Earth-like planets out there, as well as provide clues to the nature of dark matter. The spacecraft is called SIM – the Space Interferometry Mission. “We’ll be looking for other Earths around other stars,” said Stephen Edberg, System Scientist for the mission, “and by making accurate mass measurements of galaxies, we should be able to measure dark matter, as well.”

Listen to the January 20, 2009 “365 Days of Astronomy” Podcast and my interview with Steve Edberg, and/or read more about the SIM Lite mission below!

The concept for this mission has been around for a while, and it has changed over time, with the telescope going through different incarnations. Currently the mission is called SIM Lite, as the spacecraft itself has gotten smaller while the mirrors for the interferometer have gotten bigger.

While interferometry at radio wavelengths has been done for over 50 years, optical interferometry has only matured recently. Optical interferometry combines the light of multiple telescopes to perform as a single, much larger telescope. SIM Lite will have two visible-wavelength stellar interferometer sensors, as well as other advanced detectors, that will work together to create an extremely sensitive telescope orbiting outside of Earth’s atmosphere.

“These are instruments that can measure positions in the sky to almost unbelievable accuracy,” said Edberg. “Envision Buzz Aldrin standing on the moon. Pretend he’s holding a nickel between thumb and forefinger. SIM can measure the thickness of that nickel as seen by someone standing on the surface of the Earth. That is one micro-arcsecond, a very tiny fraction of the sky.”
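Edberg's nickel analogy checks out numerically. A quick sketch with our own numbers (a US nickel is about 1.95 mm thick; the Moon is about 384,400 km away):

```python
import math

ARCSEC_PER_RAD = 180 / math.pi * 3600   # radians -> arcseconds

thickness_m = 1.95e-3    # nickel thickness, ~1.95 mm
moon_dist_m = 3.844e8    # mean Earth-Moon distance, ~384,400 km

# Small-angle approximation: angle = size / distance
angle_uas = thickness_m / moon_dist_m * ARCSEC_PER_RAD * 1e6
print(f"{angle_uas:.2f} micro-arcseconds")   # 1.05, i.e. ~1 uas as quoted
```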

With the ability to make measurements like that, SIM will make it possible to infer the presence of planets within about 30 light-years of Earth, including planets as small and low-mass as Earth. As of now, the SIM team anticipates studying between 65 and 100 stars over a five-year mission, looking for Earth analogs: planets of roughly the same mass as Earth orbiting their stars in the habitable zone, where liquid water could exist.

So, for example, SIM Lite would be able to detect a habitable planet around the star 40 Eridani A, 16 light-years away, known to fans of the “Star Trek” television series as the location of Mr. Spock’s home planet, Vulcan.

SIM will not detect planets directly; instead, it will infer their presence from the motion each planet causes in its parent star. “That’s a difficult task, there’s no question,” said Edberg, “but it gets complicated, based on what we see with our own solar system and what we’ve seen in other planetary systems. We know there are other systems out there that have more than one planet. Multiple planets can confound the measurements.”

But SIM should be able to distinguish different-sized planets orbiting other stars. SIM Lite recently passed a double-blind study in which four separate teams confirmed that SIM’s technology will allow the detection of Earth-mass planets in multiple-planet systems, with the ability to measure planetary masses down to that of Earth.

“With a few exceptions, all the planets we know about were detected using a method called radial velocity,” said Edberg, “where we look at the periodic motion of the star coming toward us and moving away from us on a regular basis. But when you make measurements like that, when you have no other information, you don’t know the orientation of the planet’s orbit with respect to the star, or the mass of either the star or the planet.”
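The radial-velocity signal Edberg describes can be sketched with the standard circular-orbit approximation for the velocity semi-amplitude K. The coefficient and the Jupiter/Earth figures below are textbook approximations, not values from the article:

```python
def rv_semi_amplitude(mp_mjup_sini, period_yr, mstar_msun=1.0):
    """Approximate stellar wobble K in m/s for a circular orbit.

    mp_mjup_sini: planet mass times sin(inclination), in Jupiter masses --
    the sin(i) factor is exactly the orientation ambiguity Edberg mentions.
    """
    return 28.4 * mp_mjup_sini * period_yr ** (-1 / 3) * mstar_msun ** (-2 / 3)

print(rv_semi_amplitude(1.0, 11.86))      # Jupiter analog: ~12 m/s
print(rv_semi_amplitude(1 / 317.8, 1.0))  # Earth analog: ~0.09 m/s
```

An Earth analog’s signal of roughly nine centimeters per second is far below what radial-velocity instruments could reach, while astrometry measures the wobble in the plane of the sky and so does not suffer the sin(i) ambiguity.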

Radial velocity can’t be used to look for planets around the hottest stars. But SIM Lite will be able to look at stars clear across the Hertzsprung-Russell diagram, from the coolest to the hottest.

“So far, we haven’t found any other Earth-sized planets,” said Edberg. (See our article from 1/19/2009 about a planet that could possibly be 1.4 times the mass of Earth.) “So, finding Earth analogs around stars like the sun is really the big goal.”

“It’s a big question mark in the other planets we know about now – I believe we know only about 10% of the masses of extrasolar planets,” said Edberg.

A second planet search program, called the “broad survey,” will probe roughly 2,000 stars in our galaxy to determine the prevalence of planets the size of Neptune and larger.

Graphic illustrating the mass and quantity of planets SIM Lite could potentially detect. Number of terrestrial planets assumes 40% of mission time divided evenly between 1-Earth mass and 2-Earth mass surveys.  Credit:  JPL

SIM will also be used to measure the sizes and distances of stars several hundred times more accurately than previously possible. SIM Lite will also measure the motion of nearby galaxies, in most cases for the first time. These measurements will help provide the first total mass measurements of individual galaxies. All of this will enable scientists to estimate the distribution of dark matter in our own galaxy and in the universe.

“Dark matter is known for its gravitational effects,” said Edberg. “It doesn’t seem to interact with normal matter as we know it. To get more clues about it, we want to know where it is.”

SIM will measure on two different scales. One is within the Milky Way Galaxy, making measurements of stars and globular clusters, and making measurements of stars that have been torn out of smaller galaxies that orbit the Milky Way.

“We can do a mass model of our galaxy and find out where that mass is, including what has to be a lot of dark matter,” said Edberg. “When we make measurements of how our galaxy rotates, you find that it rotates like a solid. Instead of being Keplerian, where you think of Mercury going around the sun faster than Pluto, from all the way inside the galaxy as close as we can measure to the center, out to beyond the sun’s distance, the Milky Way rotates like it’s a solid body. It’s not a solid body, but that means it must have a density that is constant all the way through and that means there is far more matter than we can see.”
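The connection between rotation and unseen mass can be illustrated with the usual enclosed-mass estimate, M(&lt;r) ≈ v²r/G (a spherical-symmetry simplification; the speed and radius below are commonly quoted approximate values for the Milky Way, not measurements from this mission):

```python
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass in kg
KPC = 3.086e19       # meters per kiloparsec

v = 220e3            # Milky Way rotation speed near the Sun, ~220 km/s
r = 8.0 * KPC        # Sun's approximate galactocentric radius

enclosed_msun = v**2 * r / G / M_SUN   # M(<r) = v^2 * r / G, in solar masses
print(f"mass inside the solar circle ~ {enclosed_msun:.1e} solar masses")
```

This gives roughly 10^11 solar masses inside the solar circle, and because the rotation speed stays high well beyond the Sun’s radius, the enclosed mass keeps growing with r. That is the quantitative form of Edberg’s point that there must be far more matter than we can see.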

“Another thing we’d like to know is the concentration of dark matter in clusters of galaxies,” Edberg continued. “The Milky Way is part of the Local Group of galaxies, and SIM has the capability to measure stars within the individual galaxies, which in turn can be modeled to tell us where the dark matter is within the Local Group. This is cutting edge. This is one of the big mysteries right now in astrophysics and cosmology.”

Extrasolar planets and dark matter may seem like two completely different things for one spacecraft to be looking for, but Edberg said this is an example of how everything is tied together.

“To get planet masses we need to know the masses of the parent stars,” he said. “SIM will make measurements of stars, particularly binary stars, and determine the masses of stars for a wide variety of star types, and be able to estimate the sizes of the planets that are causing the reflex motion. To make the measurements, and because stars with planets are going to be scattered around the sky, we need to have a grid of stars that are the fixed points to give us latitude and longitude, so to speak. If you know exactly where St. Louis and Los Angeles are, then it’s much easier to triangulate where things between them are. We need to do this all around the sky, and to do that we tie that down to the stars, and SIM can do that. These are fundamental questions that we don’t know the answers to, but SIM will help us find the answers.”

So, SIM Lite will be searching from within our neighborhood to the edge of the universe.

What’s the status of this future spacecraft?

“We’re on hold right now,” said Edberg. “We recently passed the double blind test to show that SIM can find Earth-like planets in systems that have multiple planets. SIM is also undergoing a decadal review to make the case that the astronomical science community needs to have a mission like SIM to strengthen the foundations enormously.”

Technical work is being done to prepare to build the actual instruments, but due to budgetary reasons, NASA has not set a launch date. “We think we could be ready to launch by 2015 once we get the go-ahead from NASA,” said Edberg, “and the go ahead depends on the decadal review, and the reports should be out in about a year.”

SIM Lite would provide an entirely new measurement capability in astronomy. Its findings would likely stand firmly on their own while complementing the capabilities of current and planned future space observatories.

For more information about SIM check out the mission website.

Profiling Potential Supernovae

Astronomical plate showing Sagittarius. Credit: Ashley Pagnotta


Just as psychologists and detectives try to “profile” serial killers and other criminals, astronomers are trying to determine what type of star system will explode as a supernova. While criminals can sometimes be caught or rehabilitated before they do the crime, supernovae, well, there’s no stopping them. But there’s the potential of learning a great deal in both astronomy and cosmology by theorizing about potential stellar explosions. At the American Astronomical Society meeting last week, Professor Bradley E. Schaefer of Louisiana State University, Baton Rouge, discussed how searching through old astronomical archives can produce unique and front-line science about supernovae – as well as providing information about dark energy — in ways that no combination of modern telescopes can provide. Additionally, Schaefer said amateur astronomers can help in the search, too.

Schaefer has been studying archived data back to 1890. “Archival data is the only way to see the long-term behavior of stars, unless you want to keep watch nightly for the next century, and this is central to many front-line astronomy questions,” he said.

Bradley E. Schaefer of Louisiana State University, Baton Rouge

The main question Schaefer is trying to answer is which stars are the progenitors of Type Ia supernovae. Astronomers have been trying to track down this mystery for over 40 years.

Type Ia supernovae are remarkably bright but also remarkably uniform in their brightness, and therefore are regarded as the best astronomical “standard candles” for measurement across cosmological distances. Type Ia supernovae are also key to the search for dark energy. These blasts have been used as distance markers for measuring how fast the Universe is expanding.
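The standard-candle logic is just the distance modulus: given the roughly uniform peak absolute magnitude of a Type Ia (about -19.3) and a measured apparent magnitude, the distance follows directly. A minimal sketch, with an illustrative apparent magnitude and extinction ignored:

```python
def standard_candle_distance_pc(m_apparent, M_absolute=-19.3):
    """Distance in parsecs from m - M = 5*log10(d_pc) - 5 (no extinction)."""
    return 10 ** ((m_apparent - M_absolute + 5) / 5)

# A Type Ia supernova observed to peak at apparent magnitude 19.0:
d_pc = standard_candle_distance_pc(19.0)
print(f"distance ~ {d_pc / 1e6:.0f} Mpc")  # ~457 Mpc
```

Deviations of the true peak brightness from the assumed standard value translate directly into distance errors, which is why identifying the progenitors, and any evolution in them, matters for precision cosmology.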

However, a potential problem is that distant supernovae might be different from nearby events, thus confounding the measures. Schaefer said the only way to solve this problem is to identify the type of stars that explode as Type Ia supernovae so that corrections can be calculated. “The upcoming big-money supernova-cosmology programs require the answer to this problem for them to achieve their goal of precision cosmology,” said Schaefer.

Supernova 1994D in the outskirts of the galaxy NGC 4526.

Many types of star systems have been proposed as potential supernova progenitors, such as double white dwarf binaries, which were not discovered until 1988, and symbiotic stars, which are very rare. But the most promising progenitors are recurrent novae (RNe), usually binary systems in which matter flows off a companion star onto a white dwarf. The matter accumulates on the white dwarf’s surface until the pressure gets high enough to trigger a thermonuclear explosion (like an H-bomb). RNe can erupt multiple times per century (as opposed to classical novae, which have only one observed eruption).

To answer the question of whether RNe are supernova progenitors, Schaefer conducted extensive research to determine RN orbital periods, accretion rates, outburst dates, eruption light curves, and average magnitudes between outbursts.

Artist’s impression of a recurrent nova.

One big question was whether there were enough RN eruptions to supply the observed rate of supernovae. Another was whether the nova eruption itself blows off more material than is accumulated between eruptions, in which case the white dwarf would not be gaining mass.

In looking at the old sky photos, he was able to count all the discovered eruptions and measure the frequency of RN eruptions back to 1890. He could also measure the mass ejected during an eruption by measuring eclipse times on the archived photos and then looking at the change in the orbital period across an eruption.

In doing so, Schaefer was able to answer both questions: there were enough RN occurrences to supply the observed Type Ia supernova rate. “With 10,000 recurrent novae in our Milky Way, their numbers are high enough to account for all of the Type Ia supernovae,” he said.

He also found that the white dwarf’s mass is increasing, and that its collapse, which will trigger a Type Ia supernova, will occur within a million years or so.
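That million-year figure is consistent with simple bookkeeping: divide the mass still needed to reach the Chandrasekhar limit by a net accretion rate. The white dwarf mass and accretion rate below are illustrative assumptions, not Schaefer’s measured values for any particular system:

```python
M_CHANDRASEKHAR = 1.4   # solar masses, collapse threshold for a white dwarf
m_wd = 1.3              # assumed current white-dwarf mass (solar masses)
net_gain_per_yr = 1e-7  # assumed net mass gain in solar masses per year

years_to_collapse = (M_CHANDRASEKHAR - m_wd) / net_gain_per_yr
print(f"~{years_to_collapse:.0e} years")  # ~1e+06 years
```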

Schaefer concluded that roughly one-third of all ‘classical novae’ are really RNe with two-or-more eruptions in the last century.

With this knowledge, astronomical theorists can now perform the calculations to make subtle corrections in using supernovae to measure the Universe’s expansion, which may help the search for dark energy.

An important result from this archival search is the prediction of an RN that could erupt at any time. An RN named U Scorpii (U Sco) is ready to “blow,” and a large worldwide collaboration (dubbed ‘USCO2009’) has already been formed to make concentrated observations (in X-ray, ultraviolet, optical, and infrared wavelengths) of the upcoming event. This is the first time a confident prediction has identified which star will go nova and in which year it will erupt.

During this search Schaefer also discovered one new RN (V2487 Oph), six new eruptions, five orbital periods, and two mysterious sudden drops in brightness during eruptions.

Another discovery is that the nova discovery efficiency is “horrifyingly low,” Schaefer said, typically around 4%; that is, only about one out of every 25 novae is ever spotted. Schaefer said this is an obvious opportunity for amateur astronomers to use digital cameras to monitor the sky and discover all the missing eruptions.

Photo archive at Harvard.  Credit: Ashley Pagnotta

Schaefer used archives from around the world, the two primary ones being at the Harvard College Observatory and at the headquarters of the American Association of Variable Star Observers (AAVSO), both in Cambridge, Massachusetts. Harvard has a collection of half a million old sky photos covering the entire sky, with 1,000-3,000 pictures of each star going back to 1890. The AAVSO is the clearinghouse for countless measurements of star brightness made by many thousands of amateurs worldwide from 1911 to the present.

Source: Louisiana State University, AAS meeting press conference

Did Dark Matter Power Early Stars?

The galaxy cluster Cl 0024+17 (ZwCl0024+1652) as seen by Hubble’s Advanced Camera for Surveys. Credit: NASA, ESA, M.J. Jee and H. Ford (Johns Hopkins University)

The first stars to light the early universe may have been powered by dark matter, according to a new study. Researchers from the University of Michigan, Ann Arbor call these very first stars “Dark Stars,” and propose that dark matter heating, rather than fusion, provided their energy. With a high concentration of dark matter in the early Universe, theoretical particles called Weakly Interacting Massive Particles (WIMPs) would have collected inside the first stars and annihilated to produce a heat source that powered them. “We studied the behavior of WIMPs in the first stars,” said Katherine Freese and her team in their paper, “and found that they can radically alter the stellar evolution. The annihilation products of the dark matter inside the star can be trapped and deposit enough energy to heat the star and prevent it from further collapse.”

The motivation behind this research is that 95% of the mass-energy in galaxies and clusters of galaxies is in the form of unknown types of matter and energy. The researchers say, “The first stars to form in the universe are a natural place to look for significant amounts of dark matter annihilation, because they form at the right place and the right time. They form at high redshifts, when the universe was still substantially denser than it is today, and at the high density centers of dark matter haloes.”

The concentration of dark matter at that time would have been extremely high, meaning that any ordinary stars would naturally contain large amounts of dark matter.

Dark Stars would have been driven by heat released in the annihilation of dark matter particles, but only in stars larger than about 400 solar masses. That turns out to be quite feasible, since stars containing smaller amounts of dark matter would naturally grow as they swept up dark matter from nearby space.

The stars would have continued, and may still continue, to be powered by dark matter annihilation as long as there is dark matter for fuel. When the dark matter runs out, they simply collapse to form black holes.

If they exist, Dark Stars should be detectable with future telescopes. Finding them would enable the study of WIMPs and thereby help prove the existence of dark matter.

Sources: arXiv, arXiv blog

More Evidence Earth is Not Center of Universe

Spiral Galaxy NGC 4414

If you’re certain the Universe revolves around you, I have some bad news for you. Researchers from the University of British Columbia say Earth’s location in the Universe is utterly unremarkable, despite recent theories that propose Earth is at the center of a giant void in space. A decade ago, it was discovered that the Universe’s expansion is accelerating. This accelerating expansion was attributed to dark energy, the highly repulsive and mysterious stuff that has yet to be detected directly. But some scientists came up with an alternate theory in which Earth is near the center of a giant void, or bubble, mostly empty of matter. New calculations now solidify the case that dark energy permeates the cosmos.

While dark energy sometimes seems pretty far-fetched, with its mysterious and so far undetectable properties, the alternate “void” theory of why the Universe is ever-expanding has a problem: it violates the long-held Copernican Principle.

Polish astronomer Nicolaus Copernicus’s 1543 book, On the Revolutions of the Heavenly Spheres, moved Earth from being the center of the Universe to just another planet orbiting the Sun. Since then, astronomers have extended the idea and formed the Copernican Principle, which says that our place in the Universe as a whole is completely ordinary. Although the Copernican Principle has become a pillar of modern cosmology, finding conclusive evidence that our neighborhood of the Universe really isn’t special has proven difficult.

Nicolaus Copernicus

In 1998, studies of distant explosions called Type Ia supernovae indicated that the expansion of the Universe is accelerating, an observation attributed to the repulsive force of a mysterious “dark energy.” But some cosmologists proposed that Earth was at the center of a void, and that gravity would create the illusion of acceleration, mimicking the effect of dark energy on the supernova observations.

Now, advanced analysis and modeling by UBC post-doctoral fellows Jim Zibin and Adam Moss and astronomy professor Douglas Scott shows that this alternate “void theory” just doesn’t add up.

The researchers used data from the Wilkinson Microwave Anisotropy Probe satellite, which includes members from UBC on its international team, as well as data from various ground-based instruments and surveys.

“We tested void models against the latest data, including subtle features in the cosmic microwave background radiation – the afterglow of the Big Bang – and ripples in the large-scale distribution of matter,” says Zibin. “We found that void models do a very poor job of explaining the combination of these data.”

The team’s calculations instead solidify the conventional view that an enigmatic dark energy fills the cosmos and is responsible for the acceleration of the Universe. “Recent advances in data collection have brought us to the era of precision cosmology,” says Zibin. “Void models are terrible at explaining the new data, but the standard dark energy model works very well.

“Since we can only observe the Universe from Earth, it’s really hard to determine if we’re in a ‘special place,'” says Zibin. “But we’ve now learned that our location is much more ordinary than the strange dark energy that fills the Universe.”

The team’s research was published in Physical Review Letters.

Source: EurekAlert