It Doesn’t Get Much Hotter Than Io

Image credit: NASA/JPL
The hottest spot in the solar system is neither Mercury, Venus, nor St. Louis in the summer. Io, one of the four satellites that the Italian astronomer Galileo discovered orbiting Jupiter almost 400 years ago, takes that prize. The Voyager spacecraft discovered volcanic activity on Io more than 20 years ago, and subsequent observations show that Io is the most volcanically active body in the solar system. The Galileo spacecraft, named in honor of the astronomer, found volcanic hot spots with temperatures as high as 2,910 degrees Fahrenheit (1,610 degrees Celsius).

Now computer models of volcanic eruptions on Io, performed by researchers at Washington University in St. Louis, show that the lavas are so hot that they are vaporizing sodium, potassium, silicon and iron, and probably other elements as well, into Io's atmosphere.

Using an updated version of MAGMA, a versatile computer program he developed 15 years ago with a Harvard University colleague, Bruce Fegley, Jr., Ph.D., professor of earth and planetary sciences in Arts & Sciences at Washington University in St. Louis, found that some of these elements are vaporized at least partly as single-atom gases. Others are vaporized in different molecular forms, for instance, silicon monoxide, silicon dioxide and iron monoxide.
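
MAGMA itself solves the full chemical equilibrium between the molten lava and the gas above it, but the basic behaviour it captures, vapor pressure climbing steeply with lava temperature, can be illustrated with a simple Clausius-Clapeyron style fit. The coefficients below are illustrative placeholders, not values from MAGMA or from Fegley and Schaefer's paper.

```python
def vapor_pressure_atm(T_kelvin, A, B):
    """Toy Clausius-Clapeyron style fit: log10(P/atm) = A - B/T.
    A and B are illustrative placeholders, not MAGMA coefficients."""
    return 10 ** (A - B / T_kelvin)

# Hypothetical coefficients for a volatile species over silicate lava.
A, B = 5.0, 12000.0

# Roughly Kilauea-like versus Io hot-spot temperatures, in kelvin.
for T in (1000.0, 1400.0, 1880.0):
    print(f"T = {T:6.0f} K  ->  P ~ {vapor_pressure_atm(T, A, B):.2e} atm")
```

The point of the sketch is the trend: raising the lava temperature from roughly Kilauea-like values to Io's hot-spot values raises the vapor pressure of volatile species by several orders of magnitude, which is why the hottest lavas dominate the supply of gas to the atmosphere.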

“Reaction of these gases with sulfur and chlorine species in volcanic gases could lead to the formation of such unusual gases as sodium chloride, potassium chloride, magnesium dichloride and iron dichloride,” Fegley said.

In 2000, Fegley and former Washington University colleague Mikhail Zolotov, Ph.D., now at Arizona State University, predicted the formation of sodium chloride and potassium chloride vapor in volcanic gases on Io. Three years later, astronomers found sodium chloride gas on Io. However, those observations were not sensitive enough to detect the less abundant potassium chloride vapor.

Now Fegley has found that sodium and potassium in Ionian volcanic gases are being vaporized from the hot lavas. Fegley and research assistant Laura Schaefer of Washington University used data from the Galileo mission and Earth-based observations from high-powered telescopes in their NASA-funded research. They published their results in the May 2004 issue of Icarus, the leading planetary science journal.

“We’re basically doing geology on Io using data from telescopes on Earth, which shows that observations like this can compete with expensive space missions,” said Fegley. “It’s amazing how hot and how volcanically active Io is. It is 30 times more active than Earth. It’s the hottest body outside of the sun in the solar system.”

The innermost of the four major satellites of Jupiter – there are at least 16 – Io gets its high rate of volcanism from tidal interactions with Jupiter, which has the strongest magnetic field of all the planets. Over 100 active volcanoes have been identified on Io. Hotspots there have temperatures as high as 1,600 degrees Celsius. This is several hundred degrees hotter than terrestrial volcanoes like Kilauea in Hawaii, which has a temperature of about 1,000 Celsius (1,830 Fahrenheit).

Fegley and Schaefer found that silicon monoxide is the major silicon-bearing gas over the lavas.

“The interesting thing about this is that astronomers have observed silicon monoxide in other environments in interstellar space, most notably in the atmospheres of cool stars,” said Fegley.

Astronomical observations of actively erupting volcanoes on Io may be able to detect the silicon monoxide gas in its atmosphere.

Fegley and Schaefer recommend an Io volcanic probe mission to directly measure the pressure, temperature and composition of gases of Pele, one of Io’s most active volcanoes. Such an endeavor is “feasible using present technology,” Fegley said. “It would vastly expand our knowledge of the most volcanically active body in the solar system.”

The volcanic probe mission would represent an advance in the effort to unveil some of Io’s mysteries, such as how the satellite, about the size of our own Moon, can maintain its high magma temperatures without being nearly totally molten, and how it maintains a lithosphere strong enough to support mountains higher than Mount Everest.

Original Source: WUSTL News Release

Wallpaper: Flood Plains on Mars

Image credit: ESA
These images of fluvial surface features at Mangala Valles on Mars were obtained by the High Resolution Stereo Camera (HRSC) on board the ESA Mars Express spacecraft.

The HRSC has imaged structures several times which are related to fluvial events in the past on Mars.

The region seen here is situated on the south-western Tharsis bulge and shows the mouth of the Mangala Valles and Minio Vallis outflow channels.

The source of the outflow channel is related to the Mangala Fossa, a fissure running east-west for several hundred kilometres.

One theory about its formation is related to a process known on Earth as ‘dyke emplacement’.

This is when hot molten rock finds its way to the surface through a fissure, releasing large amounts of water by the melting of subsurface ice.

It is still unclear for how long and to what extent water, mud or even ice masses and wind have carved the channel here.

This theory on its formation has several analogues on Earth. Events like the one proposed for Mangala Valles occur on Earth, for example in Iceland, where volcanic activity causes episodic releases of water from subsurface reservoirs, causing catastrophic floods.

Along the channel troughs, areas with so-called ‘chaotic terrain’ features favour the idea of the existence of subsurface ice.

The small-scale chaotic terrain is characterised by isolated blocks of surface material which have been randomly arranged during the release of subsurface water and subsequent collapse of the surface.

Huge areas of chaotic terrain can be found near the source areas of the outflow channels around Chryse Planitia, such as Kasei, Maja and Ares Valles.

Besides the large outflow channels, a variety of smaller ‘dendritic’ valley networks with a number of tributary valleys can be seen near the main channels. This indicates possible precipitation.

The images were taken during orbit 299 with a resolution of 28 metres per pixel. The image centre is located at 209°E longitude and 5°S latitude. For practical use on the internet, the images have been reduced in resolution.

The red/cyan 3D anaglyph image was created using the stereo- and nadir channels of the HRSC. The perspective view was calculated from the digital terrain model derived from the stereo and colour information of the image data.

Original Source: ESA News Release

Gemini Goes Silver

Image credit: Gemini
To investors looking for the next sure thing, the silver coating on the Gemini South 8-meter telescope mirror might seem like an insider’s secret tip-off to invest in this valuable metal for a huge profit. However, it turns out that this immense mirror required less than two ounces (50 grams) of silver, not nearly enough to register on the precious metals markets. The real return on Gemini’s shiny investment is the way it provides unprecedented sensitivity from the ground when studying warm objects in space.

The new coating, the first of its kind ever to line the surface of a very large astronomical mirror, is among the final steps in making Gemini the most powerful infrared telescope on our planet. “There is no question that with this coating, the Gemini South telescope will be able to explore regions of star and planet formation, black holes at the centers of galaxies and other objects that have eluded other telescopes until now,” said Charlie Telesco of the University of Florida who specializes in studying star- and planet-formation regions in the mid-infrared.

Covering the Gemini mirror with silver utilizes a process developed over several years of testing and experimentation to produce a coating that meets the stringent requirements of astronomical research. Gemini’s lead optical engineer, Maxime Boccas who oversaw the mirror-coating development said, “I guess you could say that after several years of hard work to identify and tune the best coating, we have found our silver lining!”

Most astronomical mirrors are coated with aluminum using an evaporation process, and require recoating every 12-18 months. Since the twin Gemini mirrors are optimized for viewing objects in both optical and infrared wavelengths, a different coating was specified. Planning and implementing the silver coating process for Gemini began with the design of twin 9-meter-wide coating chambers located at the observatory facilities in Chile and Hawaii. Each coating plant (originally built by the Royal Greenwich Observatory in the UK) incorporates devices called magnetrons to “sputter” a coating on the mirror. The sputtering process is necessary when applying multi-layered coatings on the Gemini mirrors in order to accurately control the thickness of the various materials deposited on the mirror’s surface. A similar coating process is commonly used for architectural glass to reduce air-conditioning costs and produce an aesthetic reflection and color to glass on buildings, but this is the first time it has been applied to a large astronomical telescope mirror.

The coating is built up in a stack of four individual layers to assure that the silver adheres to the glass base of the mirror and is protected from environmental elements and chemical reactions. As anyone with silverware knows, tarnish on silver reduces the reflection of light. The degradation of an unprotected coating on a telescope mirror would have a profound impact on its performance. Tests done at Gemini with dozens of small mirror samples over the past few years show that the silvered coating applied to the Gemini mirror should remain highly reflective and usable for at least a year between recoatings.

In addition to the large primary mirror, the telescope’s 1-meter secondary mirror and a third mirror that directs light into scientific instruments were also coated using the same protected silver coatings. The combination of these three mirror coatings as well as other design considerations are all responsible for the dramatic increase in Gemini’s sensitivity to thermal infrared radiation.

A key measure of a telescope’s performance in the infrared is its emissivity (how much heat it actually emits compared to the total amount it could theoretically emit) in the thermal or mid-infrared part of the spectrum. These emissions produce background noise against which astronomical sources must be measured. Gemini has the lowest total thermal emissivity of any large astronomical telescope on the ground, with values under 4% prior to receiving its silver coating. With this new coating, Gemini South’s emissivity will drop to about 2%. At some wavelengths this has the same effect on sensitivity as increasing the diameter of the Gemini telescope from 8 to more than 11 meters! The result is a significant increase in the quality and amount of Gemini’s infrared data, which allows detection of objects that would otherwise be lost in the noise generated by heat radiating from the telescope. By comparison, other ground-based telescopes commonly have emissivity values in excess of 10%.
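
The quoted jump from an 8-meter to an effectively 11-meter telescope follows from a common background-limited approximation in which signal-to-noise scales as the square root of collecting area divided by emissivity, so halving the emissivity buys the same sensitivity as doubling the area. A minimal sketch under that assumption:

```python
import math

def effective_diameter(d_m, eps_old, eps_new):
    """Background-limited approximation: S/N scales as sqrt(area / emissivity),
    so matching sensitivity means scaling the collecting area by eps_old / eps_new."""
    return d_m * math.sqrt(eps_old / eps_new)

# Gemini South: 8-m primary, ~4% emissivity before silvering, ~2% after.
print(f"equivalent aperture: {effective_diameter(8.0, 0.04, 0.02):.1f} m")  # ~11.3 m
```

Under this scaling the 4%-to-2% improvement is equivalent to an aperture of roughly 11.3 meters, consistent with the "more than 11 meters" quoted above.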

The recoating procedure was successfully performed on May 31, and the newly coated Gemini South mirror has been re-installed and calibrated in the telescope. Engineers are currently testing the systems before returning the telescope to full operations. The Gemini North mirror on Mauna Kea will undergo the same coating process before the end of this year.

Why Silver?
The reason astronomers wish to use silver as the surface on a telescope mirror lies in its ability to reflect some types of infrared radiation more effectively than aluminum. However, it is not just the amount of infrared light that is reflected but also the amount of radiation actually emitted from the mirror (its thermal emissivity) that makes silver so attractive. This is a significant issue when observing in the mid-infrared (thermal) region of the spectrum, which is essentially the study of heat from space. “The main advantage of silver is that it reduces the total thermal emission of the telescope. This in turn increases the sensitivity of the mid-infrared instruments on the telescope and allows us to see warm objects like stellar and planetary nurseries significantly better,” said Scott Fisher, a mid-infrared astronomer at Gemini.

The advantage comes at a price, however. To use silver, the coating must be applied in several layers, each with a very precise and uniform thickness. To do this, devices called magnetrons are used to apply the coating. They work by surrounding an extremely pure metal plate (called the target) with a plasma cloud of gas (argon or nitrogen) that knocks atoms out of the target and deposits them uniformly on the mirror (which rotates slowly under the magnetron). Each layer is extremely thin: the silver layer is only about 0.1 microns thick, or about 1/200 the thickness of a human hair. The total amount of silver deposited on the mirror is approximately 50 grams.
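
The roughly 50-gram figure can be checked with a back-of-the-envelope estimate: coating mass is mirror area times film thickness times the density of silver. The 8.1-meter mirror diameter and the density of silver below are standard values rather than numbers from the release.

```python
import math

diameter_m = 8.1                  # approximate Gemini primary mirror diameter
thickness_m = 0.1e-6              # silver layer thickness quoted in the text
silver_density_kg_m3 = 10_490     # density of silver

area_m2 = math.pi * (diameter_m / 2) ** 2
mass_g = area_m2 * thickness_m * silver_density_kg_m3 * 1000

print(f"mirror area ~ {area_m2:.1f} m^2")
print(f"silver mass ~ {mass_g:.0f} g")   # ~54 g, consistent with "about 50 grams"
```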

Studying Heat Originating from Space
Some of the most intriguing objects in the universe emit radiation in the infrared part of the spectrum. Often described as “heat radiation,” infrared light is redder than the red light we see with our eyes. Sources that emit at these wavelengths are sought after by astronomers because most of their infrared radiation can pass through obscuring clouds of gas and dust and reveal secrets otherwise shrouded from view. The infrared wavelength regime is split into three main regions: near-, mid- and far-infrared. Near-infrared is just beyond what the human eye can see (redder than red), mid-infrared (often called thermal infrared) represents longer wavelengths of light usually associated with heat sources in space, and far-infrared represents cooler regions.
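
Which of these bands a given warm object dominates can be estimated with Wien's displacement law, which says the peak emission wavelength in micrometres is roughly 2898 divided by the temperature in kelvin. A quick sketch with representative temperatures (illustrative values, not from the release):

```python
def peak_wavelength_um(T_kelvin):
    """Wien's displacement law: wavelength of peak blackbody emission, in micrometres."""
    return 2898.0 / T_kelvin

for label, T in [("warm circumstellar dust, ~300 K", 300),
                 ("cool star photosphere, ~3000 K", 3000),
                 ("Sun-like star, ~5800 K", 5800)]:
    print(f"{label:32s} peaks near {peak_wavelength_um(T):5.1f} um")
```

Room-temperature dust peaks near 10 micrometres, squarely in the mid-infrared band that the silver coating is designed to exploit, while stars themselves peak in the near-infrared or visible.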

Gemini’s silver coating will enable the most significant improvements in the thermal infrared part of the spectrum. Studies in this wavelength range include star- and planet-formation regions, with intense research that seeks to understand how our own solar system formed some five billion years ago.

Original Source: Gemini News Release

Molecular Nitrogen Found Outside our Solar System

Image credit: Orbital Sciences
Using NASA’s Far Ultraviolet Spectroscopic Explorer (FUSE) satellite, researchers have for the first time detected molecular nitrogen in interstellar space, giving them their first detailed look into how the universe’s fifth most-abundant element behaves in an environment outside the Solar System.

This discovery, made by astronomers at The Johns Hopkins University, Baltimore, promises to enhance understanding not only of the dense regions between the stars, but also of the very origins of life on Earth.

“Detecting molecular nitrogen is vital for improved understanding of interstellar chemistry,” said David Knauth, a post-doctoral fellow at Johns Hopkins and first author of a paper in the June 10 issue of Nature. “And because stars and planets form from the interstellar medium, this discovery will lead to an improved understanding of their formation, as well.”

Nitrogen is the most prevalent element of Earth’s atmosphere. Its molecular form, known as N2, consists of two combined nitrogen atoms. A team of researchers led by Knauth and physics and astronomy research scientist and co-author B-G Andersson continued investigations of N2 that began in the 1970s with the Copernicus satellite. At least 10,000 times more sensitive than Copernicus, FUSE – a satellite-telescope designed at and operated by Johns Hopkins for NASA – allowed the astronomers to probe the dense interstellar clouds where molecular nitrogen was expected to be a dominant player.

“Astronomers have been searching for molecular nitrogen in interstellar clouds for decades,” said Dr. George Sonneborn, FUSE Project Scientist at NASA Goddard Space Flight Center, Greenbelt, Md. “Its discovery by FUSE will greatly improve our knowledge of molecular chemistry in space.”

The astronomers faced several challenges along the way, including the fact that they were peering through dusty, dense interstellar clouds which blocked a substantial amount of the star’s light. In addition, the researchers confronted a classic Catch-22: Only the brightest stars emitted enough of a signal to allow FUSE to detect molecular nitrogen’s presence, but many of those stars were so bright they threatened to damage the satellite’s exquisitely-sensitive detectors.

HD 124314, a moderately reddened star in the southern constellation of Centaurus, ended up being the first sight line where researchers could verify molecular nitrogen’s presence. This discovery is an important step toward working out how much molecular nitrogen exists in the interstellar medium and how its abundance varies in different environments.

“For nitrogen, most models say that a major part of the element should be in the form of N2, but as we had not been able to measure this molecule, it’s been very hard to test whether those models and theories are right or not. The big deal here is that now we have a way to test and constrain those models,” Andersson said.

Launched on June 24, 1999, FUSE seeks to understand several fundamental questions about the Universe. What were the conditions shortly after the Big Bang? What are the properties of interstellar gas clouds that form stars and planetary systems? How are the chemical elements made and dispersed throughout our galaxy?

FUSE is a NASA Explorer mission. Goddard manages the Explorers Program for the Office of Space Science at NASA Headquarters in Washington, D.C. For more on the FUSE mission, go to the website at: http://fuse.pha.jhu.edu

Original Source: NASA News Release

Europeans Agree to Build Instrument for Webb Telescope

Image credit: ESA
An agreement between ESA and seven Member States to jointly build a major part of the MIRI instrument, which will considerably extend the capability of the James Webb Space Telescope (JWST), was signed on 8 June 2004.

This agreement also marks a new kind of partnership between ESA and its Member States for the funding and implementation of payloads for scientific space missions.

MIRI, the Mid-Infrared Instrument, is one of the four instruments on board the JWST, the mission scheduled to follow on the heritage of Hubble in 2011. MIRI will be built in cooperation between Europe and the United States (NASA), both contributing equally to its funding. MIRI’s optics, the core of the instrument, will be provided by a consortium of European institutes. Under this formal agreement, ESA will manage and co-ordinate the whole development of the European part of MIRI and act as the sole interface with NASA, which is leading the JWST project.

This marks a difference with respect to previous ESA scientific missions. In the past, the funding and development of the scientific instruments were agreed by the participating ESA Member States on the basis of purely informal arrangements with ESA. In this case, the Member States involved in MIRI have agreed to formally guarantee the required level of funding on the basis of a multilateral international agreement, which still keeps scientists in key roles.

Over the past years, missions have become more complex and demanding, and more costly within ever tighter budgets. They also require increasingly specialised expertise, which is spread throughout the vast European scientific community. As a result, a new management procedure for co-ordinating payload development has become a necessity to secure the successful and timely completion of scientific space projects. ESA’s co-ordination of the MIRI European consortium represents the first time such an approach has been used; it will also be applied to the future missions of ESA’s long-term Science Programme, the ‘Cosmic Vision’. The technology package for LISA (LTP), an ESA/NASA mission to detect gravitational waves, is already being prepared under the same scheme.

Sergio Volonte, ESA Co-ordinator for Astrophysics and Fundamental Physics Missions, comments: “I’m delighted for such an achievement between ESA and its Member States. With MIRI we will start an even more effective co-ordination on developing our scientific instruments, setting a new framework to further enhance their excellence.”

The James Webb Space Telescope (JWST) is a partnership between ESA, NASA and the Canadian Space Agency. Formerly known as the Next Generation Space Telescope (NGST), it is due to be launched in August 2011 and is considered the successor of the NASA/ESA Hubble Space Telescope. It is three times larger and more powerful than its predecessor, and it is expected to shed light on the ‘Dark Ages of the Universe’ by studying the very distant Universe, observing infrared light from the first stars and galaxies that ever emerged.

MIRI (Mid-Infrared Camera-Spectrograph) is essential for the study of the old and distant stellar population; regions of obscured star formation; hydrogen emission from previously unthinkable distances; the physics of protostars; and the sizes of ‘Kuiper belt’ objects and faint comets.

In addition to the contribution to MIRI, Europe, through ESA, is contributing to JWST with the NIRSPEC (Near-Infrared multi-object Spectrograph) instrument (fully funded and managed by ESA) and, as agreed in principle with NASA, with the Ariane 5 launcher. The ESA financial contribution to JWST will be about 300 million Euros, including the launcher. The European institutions involved in MIRI will contribute about 70 million Euros overall.

The European institutions that signed the MIRI agreement with ESA are: the Centre National d’Etudes Spatiales (CNES), the Danish Space Research Institute (DSRI), the German Aerospace Centre (DLR), the Spanish Ministerio de Educación y Ciencia (MEC), the Nederlandse Onderzoekschool voor Astronomie (NOVA), the UK Particle Physics and Astronomy Research Council (PPARC) and the Swedish National Space Board (SNSB).

Four European countries, Belgium, Denmark, Ireland and Switzerland, contribute to MIRI through their participation in ESA’s Scientific Experiment Development programme (PRODEX). This is an optional programme, mainly used by smaller countries, by which they delegate to ESA the management of funding to develop scientific instruments.

Delivery of the MIRI instrument to NASA is due in March 2009.

Original Source: ESA News Release

New Estimate for the Mass of Higgs Boson

Image credit: Berkeley Lab
In a case of the plot thickening as the mystery unfolds, the Higgs boson has just gotten heavier, even though the subatomic particle has yet to be found. In a letter to the scientific journal Nature, published in the June 10, 2004 issue, an international collaboration of scientists working at the Tevatron accelerator of the Fermi National Accelerator Laboratory (Fermilab) reports the most precise measurements yet of the mass of the top quark, a subatomic particle that has been found, and this requires an upward revision of the expected mass of the long-postulated but still undetected Higgs boson.

“Since the top quark mass we are reporting is a bit higher than previously measured, it means the most likely value of the Higgs mass is also higher,” says Ron Madaras, a physicist with the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), who heads the local participation in the D-Zero experiment at the Tevatron. “The most likely Higgs mass has now been increased from 96 to 117 GeV/c2” (GeV/c2 is a common particle-physics unit of mass; the mass of the proton is about 1 GeV/c2) “which means it’s probably beyond the sensitivity of current experiments, but very likely to be found in future experiments at the Large Hadron Collider being built at CERN.”
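
For readers unfamiliar with the unit, GeV/c2 converts to kilograms through E = mc2. The quick sketch below uses the roughly 1 GeV/c2 proton mass and the 117 GeV/c2 most-likely Higgs value quoted above:

```python
GEV_TO_JOULES = 1.602176634e-10   # 1 GeV expressed in joules
C = 2.99792458e8                  # speed of light, m/s

def gev_c2_to_kg(m_gev):
    """Convert a particle mass from GeV/c^2 to kilograms via E = m c^2."""
    return m_gev * GEV_TO_JOULES / C ** 2

print(f"proton (~1 GeV/c^2):   {gev_c2_to_kg(1.0):.3e} kg")    # ~1.78e-27 kg
print(f"Higgs estimate (117):  {gev_c2_to_kg(117.0):.3e} kg")
```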

The Higgs boson has been called the missing link in the Standard Model of Particles and Fields, the theory that’s been used to explain fundamental physics since the 1970s. Prior to 1995 the top quark was also missing, but then the experimental teams working at the Tevatron’s two large detector systems, D-Zero and CDF, were able to discover it independently.

Scientists believe that the Higgs boson, named for Scottish physicist Peter Higgs, who first theorized its existence in 1964, is responsible for particle mass, the amount of matter in a particle. According to the theory, a particle acquires mass through its interaction with the Higgs field, which is believed to pervade all of space and has been compared to molasses that sticks to any particle rolling through it. The Higgs field would be carried by Higgs bosons, just as the electromagnetic field is carried by photons.

“In the Standard Model, the Higgs boson mass is correlated with top quark mass,” says Madaras, “so an improved measurement of the top quark mass gives more information about the possible value of the Higgs boson mass.”

According to the Standard Model, at the beginning of the universe there were six different types of quarks. Top quarks exist only for an instant before decaying into a bottom quark and a W boson, which means those created at the birth of the universe are long gone. However, at Fermilab’s Tevatron, the most powerful collider in the world, collisions between billions of protons and antiprotons yield an occasional top quark. Despite their brief appearances, these top quarks can be detected and characterized by the D-Zero and CDF experiments.

In announcing the D-Zero results, experiment cospokesperson John Womersley said, “An analysis technique that allows us to extract more information from each top quark event that occurred in our detector has yielded a greatly improved precision of plus or minus 5.3 GeV/c2 in the top mass measurement, compared with previous measurements. The new measurement is comparable to the precision of all previous top quark mass measurements put together. When this new result is combined with all other measurements from both the D-Zero and CDF experiments, the new world average for the top mass becomes 178.0 plus or minus 4.3 GeV/c2.”
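
A world average of this kind is typically an inverse-variance weighted combination of the individual measurements. A minimal sketch of that standard procedure, with purely illustrative inputs rather than the actual D-Zero and CDF numbers:

```python
def combine(measurements):
    """Inverse-variance weighted average of (value, uncertainty) pairs."""
    weights = [1.0 / sigma ** 2 for _, sigma in measurements]
    mean = sum(w * x for w, (x, _) in zip(weights, measurements)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5
    return mean, sigma

# Illustrative top-mass measurements in GeV/c^2, as (value, uncertainty):
inputs = [(180.1, 5.3), (176.1, 6.6)]
mean, sigma = combine(inputs)
print(f"combined: {mean:.1f} +/- {sigma:.1f} GeV/c^2")
```

The combined uncertainty is always smaller than that of any single input, which is why adding each new precise measurement tightens the constraint on the Higgs mass.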

The D-Zero detector system consists of a central tracking detector array, a hermetic calorimeter for measuring energy, and a large solid-angle muon detector system. Berkeley Lab designed and built the two electromagnetic end-cap calorimeters and also the initial vertex detector, the innermost component of the tracking system. Tracking detectors supplement calorimeters by measuring particle trajectories. Only when trajectory and energy measurements are combined can scientists identify and characterize particles.

While raising the central value for the top quark mass appears to diminish the possibility that the Higgs boson could be discovered at the Tevatron, it does open a wider door for new discoveries in supersymmetry, also known as SUSY, an extension of the Standard Model that unites particles of force and matter through the existence of superpartners (sometimes referred to as “sparticles”). Supersymmetry seeks to fill gaps left by the Standard Model.

“The current mass limits or bounds that exclude supersymmetric particles are very sensitive to the top quark mass,” says Madaras. “Since the top quark mass is now higher, these limits or bounds are not as severe, which increases the chance of seeing supersymmetric particles at the Tevatron.”

Scientists from nearly 40 US universities and 40 foreign institutions contributed to the data analysis reported in the letter to Nature by the D-Zero experimental group. Berkeley Lab co-authors of the letter in addition to Madaras were Mark Strovink, Al Clark, Tom Trippe, and Daniel Whiteson.

Fermilab Director Michael Witherell said in a statement that these results do not end the story of precision measurements of the top quark mass. “The two collider detectors, D-Zero and CDF, are recording large amounts of data in Run II of the Tevatron. The CDF collaboration has recently reported preliminary new measurements of the top mass based on Run II data. The precision of the world average will improve further when their results are final. Over the next few years, both experiments will make increasingly precise measurements of the top quark mass.”

Fermilab, like Berkeley Lab, is funded by the Department of Energy’s Office of Science. In response to the Nature letter from the D-Zero group, Raymond L. Orbach, Director of the Office of Science, said: “These important results demonstrate how our scientists are applying new techniques to existing data, producing new estimates for the mass of the Higgs boson. We eagerly await the next round of results from the vast quantities of data that are generated today at the Fermilab Tevatron.”

Berkeley Lab is a U.S. Department of Energy national laboratory located in Berkeley, California. It conducts unclassified scientific research and is managed by the University of California. Fermilab is a national laboratory funded by the Office of Science of the U.S. Department of Energy, operated by Universities Research Association, Inc.

Original Source: Berkeley Lab News Release

Phoebe: Cassini’s First Target

Image credit: NASA/JPL/Space Science Institute
The Cassini spacecraft is closing in fast on its first target of observation in the Saturn system: the small, mysterious moon Phoebe, only 220 kilometers (137 miles) across.

The three images shown here, the latest of which is twice as good as any image returned by the Voyager 2 spacecraft in 1981, were captured in the past week on approach to this outer moon of Saturn. Phoebe’s surface is already showing a great deal of contrast, most likely indicative of topography, such as tall sunlit peaks and deep shadowy craters, as well as genuine variation in the reflectivity of its surface materials. Left to right, the three views were captured at a Sun-Saturn-spacecraft, or phase, angle of 87 degrees between June 4 and June 7, from distances ranging from 4.1 million km (2.6 million miles) to 2.5 million km (1.5 million miles). The image scale ranges from 25 to 15 km (16 to 9 miles) per pixel.
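
Those image scales follow directly from the spacecraft range multiplied by the camera's angular pixel size. A minimal sketch, assuming a roughly 6-microradian-per-pixel field of view for Cassini's narrow-angle camera (an assumed figure, not stated in the release):

```python
IFOV_RAD = 6.0e-6   # assumed angular size of one narrow-angle-camera pixel, in radians

def pixel_scale_km(range_km):
    """Linear size on the target covered by one pixel at the given range."""
    return range_km * IFOV_RAD

for range_km in (4.1e6, 2.5e6, 2000):
    print(f"range {range_km:>10,.0f} km  ->  {pixel_scale_km(range_km):8.3f} km/pixel")
```

With that assumption the quoted approach distances give roughly 25 and 15 km per pixel, and the 2,000-km flyby distance gives pixels of order ten meters, consistent with the figures in this article.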

The images have been magnified eight times using a linear interpolation scheme; the contrast has been untouched. Phoebe rotates once every 9 hours and 16 minutes; each of these images shows a different region on Phoebe.

Cassini’s powerful cameras will provide the best-ever look at this moon on Friday, June 11, when the spacecraft will streak past Phoebe at a distance of only about 2,000 kilometers (1,240 miles) from the moon’s surface. The current images, and the presence of large craters, promise a heavily cratered surface that will come into sharp view over the next few days, when image scales should shrink to a few tens of meters, the size of office buildings.

Because of its small size and retrograde orbit – Phoebe orbits Saturn in a direction opposite to that of the larger inner Saturnian moons – and because of the presence of water ice on its surface, Phoebe is believed to be a body from the distant outer solar system, one of the building blocks of the outer planets that was captured into orbit around Saturn. If true, the little moon will provide a windfall of precious information about a primitive piece of the solar system that has never before been explored up close.

Phoebe, discovered in 1898, was the first moon to be found using photography, and it has a very dark surface. It has long been believed that material coming from Phoebe’s surface and impacting the very dark leading hemisphere of Iapetus may play some role in the latter’s extreme albedo asymmetry, though the precise relationship is unclear. Cassini should help solve this and other mysteries during its exciting encounter with Phoebe.

The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Cassini-Huygens mission for NASA’s Office of Space Science, Washington, D.C. The imaging team is based at the Space Science Institute, Boulder, Colorado.

For more information about the Cassini-Huygens mission, visit http://saturn.jpl.nasa.gov and the Cassini imaging team home page, http://ciclops.org.

Original Source: CICLOPS News Release

New Horizons Mission Will Measure the Solar Wind out at Pluto

Image credit: NASA/JHUAPL/SwRI
The Solar Wind Around Pluto (SWAP) instrument aboard the New Horizons spacecraft is designed to measure the interactions of Pluto and Charon with the solar wind, the high-speed stream of charged particles flowing out from the sun. Understanding these interactions will expand researchers’ knowledge of the astrophysical processes affecting these bodies and that part of the solar system.

The space science community understands the extremes (called the bounding states) of solar wind interactions with planets, comets and other bodies, but no one knows what kind of interaction is present at Pluto. Comet Borrelly represents a strong interaction with the solar wind, while Venus represents a weak one.

“We expect solar wind interactions at Pluto to lie somewhere between the strong and weak extremes,” says SWAP Principal Investigator Dr. David J. McComas, a senior executive director at Southwest Research Institute (SwRI).

After taking measurements at Pluto, researchers plan to use the SWAP data to define basic parameters about the system. For example, once researchers know how such material comes off Pluto, they can then estimate the amount of Pluto’s atmosphere that escapes into space. This will reveal insights into the structure and destiny of the atmosphere itself.

SWAP would go on to take similar measurements at Charon and at least one Kuiper belt object; however, the team expects those interactions to be much weaker simply because the atmospheres of these objects are expected to be less extensive and not likely to emit much material.

Another of the many Pluto mysteries is where the interactions of the solar wind will occur around the planet, so science plans call for SWAP to take continuous measurements as it nears and passes Pluto.

“We know when and where to use some of the instruments to take an image or a measurement at Pluto,” says McComas. “Solar wind interactions, however, present quite a challenge because we’re trying to measure this invisible thing surrounding Pluto at an uncertain distance from it.”

“The science we expect SWAP to perform is impossible to accomplish without actually going to Pluto-Charon and directly sampling its environment. That capability is something that NASA pioneered and which, to this day, only the United States can do,” says Dr. Alan Stern, principal investigator of New Horizons and an executive director at SwRI.

The incredible distance of Pluto from the sun required that the SWAP team build the largest-aperture instrument ever used to measure the solar wind. This allows SWAP to make measurements even when the solar wind is very tenuous. The instrument also combines a retarding potential analyzer (RPA) with an electrostatic analyzer (ESA) to enable extremely fine, accurate energy measurements of the solar wind.

“Should the interaction between Pluto and the solar wind turn out to be very small, the RPA and ESA combination will allow us to measure minute changes in solar wind speed,” says Scott Weidner, the SWAP instrument manager and an SwRI principal scientist.
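
The reason small speed changes are measurable is that a proton's kinetic energy per unit charge maps directly onto the retarding voltage that just stops it, so a shift of a few kilometres per second moves the cutoff by tens of volts. A minimal sketch of that relationship (the solar-wind speeds below are typical values, not SWAP specifications):

```python
PROTON_MASS = 1.6726e-27        # kg
ELEMENTARY_CHARGE = 1.602e-19   # C

def cutoff_voltage(speed_m_s):
    """Retarding voltage that just stops a proton at the given speed: e*V = (1/2) m v^2."""
    return 0.5 * PROTON_MASS * speed_m_s ** 2 / ELEMENTARY_CHARGE

for v_km_s in (400, 410, 800):   # typical slow and fast solar-wind speeds
    print(f"{v_km_s} km/s proton stops at ~{cutoff_voltage(v_km_s * 1e3):.0f} V")
```

A 400 km/s proton is stopped near 835 V, while 410 km/s needs roughly 878 V, so sweeping the retarding potential finely translates into a fine measurement of the wind speed.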

The various instruments aboard New Horizons were designed and are being built independently, yet they are expected to work together to reveal significant new insights about Pluto, Charon and their Kuiper belt neighbors. SWAP measures low energy interactions, such as those caused by the solar wind. Its complement, the Pluto Energetic Particle Spectrometer Science Investigation, or PEPSSI, will look at higher energy particles, such as pickup ions. The top of SWAP’s energy range can measure some pickup ions, and PEPSSI picks up where SWAP leaves off to see the highest energy interactions.

The sun and its solar wind affect the entire solar system and should create interesting science opportunities for SWAP throughout its planned nine-year voyage to Pluto. SWAP will operate for more than a month each year and will sample heliospheric pickup ions, ions that originate in interstellar space and become ionized when they come near the sun. Other pickup ions come from material inside the solar system. Researchers have shown that even collisions between Kuiper belt objects produce tiny grains that drift toward the sun, evaporate and become ionized. The Cassini spacecraft, when it reaches Saturn this July, will allow researchers to observe these so-called "outer source" pickup ions out to 10 astronomical units (AU, the distance from the Earth to the sun), the region where pickup ions from the outer source are believed to begin.

“We’ll be out to 30 AU before New Horizons even reaches Pluto. While we’re targeting a Kuiper belt object, we could be anywhere from 30 to 50 AU, where the influence of heliospheric pickup ions becomes greater and greater in the solar wind,” says McComas. “On the journey out to Pluto, we’ll be able to validate or disprove the outer source theory, which is an exciting warm up to reaching Pluto itself.”

Original Source: SWRI News Release

New Simulation Improves Ideas of Galaxy Formation

Image credit: U of Chicago
Astrophysicists led by the University of Chicago’s Andrey Kravtsov have resolved an embarrassing contradiction between a favored theory of how galaxies form and what astronomers see in their telescopes.

Astrophysicists base their understanding of how galaxies form on an extension of the big bang theory called the cold dark matter theory. In this latter theory, small galaxies collide and merge, inducing bursts of star formation that create the different types of massive and bright galaxies that astronomers see in the sky today. (Dark matter takes its name from the idea that 85 percent of the total mass of the universe is made of unknown matter that is invisible to telescopes, but whose gravitational effects can be measured on luminous galaxies.)

This theory fits some key data that astrophysicists have collected in recent years. Unfortunately, when astrophysicists ran supercomputer simulations several years ago, they ended up with 10 times more dark matter satellites (clumps of dark matter orbiting a large galaxy) than they expected.

“The problem has been that the simulations don’t match the observations of galaxy properties,” said David Spergel, professor of astrophysics at Princeton University. “What Andrey’s work represents is a very plausible solution to this problem.”

Kravtsov and his collaborators found the potential solution in new supercomputer simulations they will describe in a paper that will appear in the July 10 issue of the Astrophysical Journal. “The solution to the problem is likely to be in the way the dwarf galaxies evolve,” Kravtsov said, referring to the small galaxies that inhabit the fringes of large galaxies.

In general, astrophysicists believe that the formation of very small dwarf galaxies should be suppressed. This is because the gas required for continued star formation can be heated and expelled by the first generation of exploding supernovae. In addition, ultraviolet radiation from galaxies and quasars that began to fill the universe approximately 12 billion years ago heats the intergalactic gas, shutting down the supply of fresh gas to dwarf galaxies.

In the simulations, Kravtsov, along with Oleg Gnedin of the Space Telescope Science Institute and Anatoly Klypin of New Mexico State University, found that some of the dwarf galaxies that are small today were more massive in the past and could gravitationally collect the gas they needed to form stars and become a galaxy.

“The systems that appear rather feeble and anemic today could, in their glory days, form stars for a relatively brief period,” Kravtsov said. “After a period of rapid mass growth, they lost the bulk of their mass when they experienced strong tidal forces from their host galaxy and other galaxies surrounding them.”

This galactic “cannibalism” persists even today, with many of the “cannibalized” dwarf galaxies becoming satellites orbiting in the gravitational pull of larger galaxies.

“Just like the planets in the solar system surrounding the sun, our Milky Way galaxy and its nearest neighbor, the Andromeda galaxy, are surrounded by about a dozen faint ‘dwarf’ galaxies,” Kravtsov said. “These objects were pulled in by the gravitational attraction of the Milky Way and Andromeda some time ago during their evolution.”

The simulations had succeeded where others had failed because Kravtsov’s team analyzed simulations that were closely spaced in time at high resolution. This allowed the team to track the evolution of individual objects in the simulations. “This is rather difficult and is not often done in analyses of cosmological simulations. But in this case it was the key to recognize what was going on and get the result,” Kravtsov said.

The result puts the cold dark matter scenario on more solid ground. Scientists had attempted to modify the main tenets of the scenario and the properties of dark matter particles to eliminate the glaring discrepancy between theory and observation of dwarf galaxies. “It turns out that the proposed modifications introduced more problems than they solved,” Kravtsov said.

The simulations were performed at the National Center for Supercomputer Applications, University of Illinois at Urbana-Champaign, with grants provided by the National Science Foundation and the National Aeronautics and Space Administration.

Original Source: University of Chicago News Release

How Deforestation in Brazil is Affecting Local Climate

Image credit: NASA
NASA satellite data are giving scientists insight into how large-scale deforestation in the Amazon Basin in South America is affecting regional climate. Researchers found that during the Amazon dry season last August, there was a distinct pattern of higher rainfall and warmer temperatures over deforested regions.

Researchers analyzed multiple years of data from NASA’s Tropical Rainfall Measuring Mission (TRMM). They also used data from the Department of Defense Special Sensor Microwave Imager and the National Oceanic and Atmospheric Administration’s Geostationary Operational Environmental Satellites.

The study appeared in a recent issue of the American Meteorological Society’s Journal of Climate. Lead authors, Andrew Negri and Robert Adler, are research meteorologists at NASA’s Goddard Space Flight Center (GSFC), Greenbelt, Md. Other authors include Liming Xu, formerly of the University of Arizona, Tucson, and Jason Surratt, North Carolina State University, Raleigh.

“In deforested areas, the land heats up faster and reaches a higher temperature, leading to localized upward motions that enhance the formation of clouds and ultimately produce more rainfall,” Negri said.

The researchers caution that the rainfall increases were most pronounced in August, during the transition from dry to wet seasons. In this transition period, the effects of land cover, such as evaporation, are not overwhelmed by the large-scale weather disturbances that are common during the rest of the year. While the study, based on satellite data analysis, focused on climate changes in the deforested areas, large increases in cloud cover and rainfall were also observed in the naturally unforested savanna region and around the urban area of Porto Velho, Brazil, particularly in August and September.

Recent studies by Dr. Marshall Shepherd cited similar findings, including an average rain-rate increase of 28 percent downwind of urban areas and associated changes in the daily timing of cloud formation and precipitation. He is also a research meteorologist at GSFC.

This research confirmed the Amazon savanna region experienced a shift in the onset of cloudiness and rainfall toward the morning hours. The shift was likely initiated by the contrast in surface heating across the deforested and savanna region.

The varied heights of plants and trees in the region change the aerodynamics of the atmosphere, creating more circulation and rising air. When the rising air reaches the dew point in the cooler, upper atmosphere, it condenses into water droplets and forms clouds.
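
How high the air must rise before it condenses can be estimated with a common rule of thumb: the cloud base sits roughly 125 metres above the surface for every degree Celsius that the temperature exceeds the dew point. A minimal sketch with illustrative tropical surface values (not numbers from the study):

```python
def lifting_condensation_level_m(temp_c, dewpoint_c):
    """Rule-of-thumb cloud-base height: roughly 125 m of lift per degree C
    of surface dew-point depression (T - Td)."""
    return 125.0 * (temp_c - dewpoint_c)

# Illustrative tropical surface conditions:
print(f"T=32 C, Td=23 C  ->  cloud base near {lifting_condensation_level_m(32, 23):.0f} m")
print(f"T=28 C, Td=24 C  ->  cloud base near {lifting_condensation_level_m(28, 24):.0f} m")
```

Whether clouds actually form at that level then depends on how vigorously the surface heating and turbulence described above can lift air to it, which is the mechanism the study links to the deforested areas.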

Negri acknowledged other factors are involved. The savanna in this study is approximately 100 kilometers (62 miles) wide, an ideal size to influence precipitation, such as rain showers and thunderstorms. Earlier studies hypothesized that certain land surfaces, such as bands of vegetation 50 to 100 kilometers (31-62 miles) wide in semiarid regions, could result in enhanced precipitation.

This research is in agreement with the recent and sophisticated computer models developed by the Massachusetts Institute of Technology. The models concluded small-scale circulations, including the mixing and rising of air induced by local land surfaces, could enhance cloudiness and rainfall. Many earlier studies that relied on models developed in the 1990s or earlier concluded widespread deforestation of the Amazon Basin would lead to decreased rainfall.

“The effects here are rather subtle and appear to be limited to the dry season. The overall effect of this deforestation on annual and daily rainfall cycles is probably small and requires more study,” Negri said. Future research will use numerical models for investigating the linkage between deforested land surface and the cloud-precipitation components of the water cycle.

NASA’s Earth Science Enterprise is dedicated to understanding the Earth as an integrated system and applying Earth System Science to improve prediction of climate, weather, and natural hazards using the unique vantage point of space.

Original Source: NASA News Release