No Life Possible at Edges of the Pinwheel Galaxy

The bright red spots at the edge of the Pinwheel Galaxy mean bad news for life. Image credit: NASA/JPL-Caltech/STScI

Another beautiful image from the Spitzer Space Telescope; in this case, it’s Messier 101, more commonly known as the Pinwheel Galaxy. But the pretty red highlights at the edges of the galaxy are bad news for anyone looking for evidence of life. “If you were going to look for life in Messier 101, you would not want to look at its edges,” said Karl Gordon of the Space Telescope Science Institute. “The organics can’t survive in these regions, most likely because of high amounts of harsh radiation.” The red color highlights a zone where organic molecules called polycyclic aromatic hydrocarbons (PAHs), which are present throughout most of the galaxy, suddenly disappear.

PAHs are dusty, carbon-containing molecules found in star nurseries. They’re also found on Earth in barbeque pits, exhaust pipes and anywhere combustion reactions take place. Scientists believe this space dust has the potential to be converted into the stuff of life.

The Pinwheel galaxy is located about 27 million light-years away in the constellation Ursa Major. It has one of the highest known gradients of metals (elements heavier than helium) of all nearby galaxies in our universe. In other words, its concentrations of metals are highest at its center, and decline rapidly with distance from the center. This is because stars, which produce metals, are squeezed more tightly into the galaxy’s central quarters.

Gordon’s team also wanted to learn more about the gradient of the PAHs. Using Spitzer’s Infrared Array Camera and the Infrared Spectrograph to carefully analyze the spectra of the PAHs, astronomers can more precisely identify the PAH features, and even deduce information about their chemistry and temperature. The astronomers found that, like the metals, the polycyclic aromatic hydrocarbons decrease in concentration toward the outer portion of the galaxy. But, unlike the metals, these organic molecules quickly drop off and are no longer detected at the very outer rim.

“There’s a threshold at the rim of this galaxy, where the organic material is getting destroyed,” said Gordon.

The findings also provide a better understanding of the conditions under which the very first stars and galaxies arose. In the early universe, there were not a lot of metals or PAHs around. The outskirt of the Pinwheel galaxy therefore serves as a close-up example of what the environment might look like in a distant galaxy.

In this image, infrared light with a wavelength of 3.6 microns is colored blue; 8-micron light is green; and 24-micron light is red. All three of Spitzer’s instruments were used in the study: the infrared array camera, the multiband imaging photometer and the infrared spectrograph.

Original News Source: JPL

NASA to Develop GPS-Like System for the Moon

Future astronauts may use GPS-like system. Credit: The Ohio State University

During the second moonwalk of the Apollo 14 mission, Alan Shepard and Edgar Mitchell were hoping to walk to the 300 meter (1,000 feet) wide Cone Crater on the moon, not far from their landing site. However, the two astronauts were not able to find the crater’s rim amid the rolling, repetitive terrain. Later analysis using pictures the two astronauts took determined they had come within 20 meters (65 feet) of the crater. People are used to having certain visual cues to judge distances, such as the size of a building or another car on the horizon, said Ron Li, who has been awarded a $1.2 million grant to develop a navigation system to be used on the moon. Since the moon has no landmarks or cues to help determine distance, getting lost, or misjudging a distant object’s size and location would be easy, and extremely dangerous. New technology like sensors, inertial navigation systems, cameras, computer processors, and image processors will make the next trip to the moon easier for astronauts.

Li, from The Ohio State University, developed software for the Mars rovers Spirit and Opportunity, which has helped him learn a lot about navigation. The navigation system to help future astronauts find their way around the moon won’t use satellites; instead the system will rely on signals from a set of sensors including lunar beacons, stereo cameras, and orbital imaging sensors.

Images taken from orbit will be combined with images from the surface to create maps of lunar terrain. Motion sensors on lunar vehicles and on the astronauts themselves will allow computers to calculate their locations. Signals from lunar beacons, the lunar lander, and base stations will give astronauts a picture of their surroundings similar to what drivers see when using a GPS device on Earth. The researchers have named the entire system the Lunar Astronaut Spatial Orientation and Information System (LASOIS).
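
To get a feel for the beacon-ranging piece of a system like LASOIS, here is a toy position fix from three beacons at known positions. Everything in this sketch is illustrative — the flat 2-D geometry, the beacon layout, and the distances are invented, not details from the LASOIS design:

```python
import math

def trilaterate(beacons, ranges):
    """Solve for (x, y) given three beacon positions and measured ranges.

    Subtracting the three circle equations pairwise cancels the x^2 and y^2
    terms, leaving a 2x2 linear system solved here by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = ranges
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((c1 * a22 - c2 * a12) / det, (a11 * c2 - a21 * c1) / det)

beacons = [(0, 0), (100, 0), (0, 100)]             # metres, invented layout
ranges = [50.0, math.sqrt(6500), math.sqrt(4500)]  # exact ranges to (30, 40)
x, y = trilaterate(beacons, ranges)
print(round(x, 6), round(y, 6))  # -> 30.0 40.0
```

A real system would fuse these fixes with the inertial and camera data mentioned above, since range measurements alone are noisy.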

Astronauts will have a keypad and screen, possibly right on their spacesuits, to view their location and search for new destinations.

Keeping astronauts safe will be a top priority for Li’s team, which includes experts in psychology and human-computer interaction as well as engineering.

“We will help with navigation, but also with astronauts’ health as well,” Li said. “We want them to avoid the stress of getting lost, or getting frustrated with the equipment. Lunar navigation isn’t just a technology problem, it’s also biomedical.”

News Source: The Ohio State University

NASA’s Use of Cadavers to Test the Orion Capsule

Orion Crew Capsule. Credit: Howstuffworks.com

NASA is debating whether the new Orion capsule should land in the water, like Apollo, or on land, similar to how the Russian Soyuz capsule returns to Earth. To help them determine the potential for human injuries with each possible landing scenario, NASA has used human cadavers during their tests. At first, this revelation may seem quite morbid or even gruesome. But as Keith Cowing said in his exposé on Space Ref and NASA Watch on this subject, “Given the potentially hazardous nature of the tests required, cadavers must be used in the place of living persons.” Sometimes, crash-test dummies or computer simulations don’t provide the crucial information needed, such as the forces on the spinal cord or internal organs. If NASA doesn’t have that information, they can’t get accurate test results. Living test subjects could possibly be killed during the landing tests. Imagine the headlines if that happened. So they have used cadavers. The cadavers NASA used were donated to science to be used for exactly this type of purpose, and NASA, of course, went through the proper channels to obtain the cadavers and treated them in an ethical manner. So while this may seem a little grisly, NASA is doing the right thing.

Marc Carreau from the Houston Chronicle also wrote an article on this subject, and he interviewed David Steitz, a spokesman for NASA’s medical division. “It’s a socially awkward topic,” Steitz said. “The bodies are all carefully handled through all of the tests. We follow ethical medical procedures with these bodies that have been donated for science.”

Three human bodies were used during testing last year, said NASA seat engineer Dustin Gohmert, to help determine the potential for serious human injury during descent and landing. “The interface between the spacesuit and the seats is relatively complex, much more so than in an automobile, even one from the racing industry,” Gohmert said. “The (forces) we anticipate have never been studied before. We are using this research to help define and refine the suits and the seats.”

Tests using human bodies have been done for previous spacecraft as well.

Cowing received this statement from NASA on the use of cadavers:

“In limited cases, postmortem human subject tests may be performed when insufficient data are available from simulations that use dummies or from mathematical modeling of the human body responses. This is particularly critical where the dynamic responses of internal organs and soft tissue must be evaluated. Using a combination of test methods, the engineering and scientific teams at NASA are able to enhance astronaut safety by designing landing attenuation systems that will minimize accelerations imparted to the crew and significantly reduce the potential for injuries.”

Personally, I could imagine donating my body for this type of research. Even if I never get to fly to space when I’m alive, I’d be proud to help the rest of the human race get there and return safely by giving my body for tests such as this.

News Sources: NASA Watch, Space Ref, Houston Chronicle

Hubble Survey of Gravitational Lenses Yields Measure of Dark Matter in Distant Galaxies

Hubble Space Telescope image shows Einstein ring of one of the SLACS gravitational lenses, with the lensed background galaxy enhanced in blue. A. Bolton (UH/IfA) for SLACS and NASA/ESA.

An international team of astronomers has compiled the largest-ever single collection of “gravitational lens” galaxies, and their survey yielded information on the masses of galaxies, including an inference of the amount of dark matter. Gravitational lensing occurs when two galaxies happen to be aligned with one another along our line of sight in the sky. The gravitational field of the nearer galaxy distorts the image of the more distant galaxy into multiple arc-shaped images. Sometimes this effect even creates a complete ring, known as an “Einstein Ring.” The findings of this survey help settle a long-standing debate over the relationship between mass and luminosity in galaxies.

Using the Advanced Camera for Surveys on the Hubble Space Telescope to image galaxies that had been identified as gravitational lens galaxies by the Sloan Digital Sky Survey, the team was able to measure the distances to both galaxies in each “lensing” set, as well as measure the masses of each galaxy.

Gravitational lensing creates a “mirage” of a ring, and the Einstein ring images can be up to 30 times brighter than the image of the distant galaxy would be in the absence of the lensing effect. By combining Hubble and Sloan data into the Sloan Lens ACS (or SLACS) Survey, the team was able to make a mathematical model describing the lensing effect and use that model to illustrate what we would see if we could remove the lensing effect.

Animation of the lensing effect.

“The SLACS collection of lenses is especially powerful for science,” said Adam Bolton from the University of Hawaii, lead author of two papers describing these latest results. “For each lens, we measured the apparent sizes of the Einstein rings on the sky using the Hubble images, and we measured the distances to the two galaxies of the aligned pair using Sloan data. By combining these measurements, we were able to deduce the mass of the nearer galaxy.”
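
The mass deduction Bolton describes follows from the standard point-lens relation θ_E² = (4GM/c²)·(D_ls / (D_l·D_s)); solving for M gives the mass enclosed within the Einstein radius. Here is a minimal sketch of that calculation — the ring size and distances below are illustrative round numbers, not values from the SLACS papers:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
MPC = 3.086e22     # metres per megaparsec

def lens_mass(theta_e_arcsec, d_l_mpc, d_s_mpc, d_ls_mpc):
    """Mass enclosed within the Einstein radius, in solar masses.

    Inverts theta_E^2 = (4GM/c^2) * D_ls / (D_l * D_s).
    """
    theta_e = theta_e_arcsec * math.pi / (180 * 3600)  # arcsec -> radians
    d_l, d_s, d_ls = (d * MPC for d in (d_l_mpc, d_s_mpc, d_ls_mpc))
    mass_kg = theta_e**2 * C**2 * d_l * d_s / (4 * G * d_ls)
    return mass_kg / M_SUN

# A 1-arcsecond ring with illustrative distances gives roughly 1.4e11
# solar masses -- a plausible galaxy-scale lens:
print(f"{lens_mass(1.0, 800, 2000, 1400):.2e} M_sun")
```

The enclosed mass comes out around 10¹¹ solar masses for arcsecond-scale rings, which is why strong lensing is such a clean probe of galaxy masses.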

By considering these galaxy masses along with measurements of their sizes, brightnesses, and stellar velocities, the SLACS astronomers were able to infer the presence of “dark matter” in addition to the visible stars within the galaxies. Dark matter is the mysterious, unseeable material that is the majority of matter in the universe. And with such a large number of lens galaxies across a range of masses, they found that the fraction of dark matter relative to stars increases systematically when going from galaxies of average mass to galaxies of high mass.

Mosaic of the SLACS galaxies. Credit: SLACS and NASA/ESA.

Albert Einstein predicted the existence of gravitational lenses in the 1930s, but the first example was not discovered until the late 1970s. Since then, many more lenses have been discovered, but their scientific potential has been limited by the disparate assortment of known examples. The SLACS Survey has significantly changed this situation by discovering a single large and uniformly selected sample of strong lens galaxies. The SLACS collection promises to form the basis of many further scientific studies.

Original News Source: University of Hawaii

Mars Arctic in 3D from Phoenix

OK, everyone: get out your funky 3-D glasses for a whole new look at Mars! We’ve seen the smooth plains of Meridiani from Opportunity in 3-D; we’ve gazed upon the rocky terrain of Gusev Crater from Spirit in more than two dimensions. But now it’s time to feast your eyes on Mars’ arctic tundra as it’s never been seen before: in super frozen 3-D from the Phoenix lander! The image above shows a color, stereoscopic 3-D view of the Martian surface near the lander, at one of Phoenix’s workplaces, called “Wonderland.” But wait! There’s more…


This 3-D view is from an image acquired by Phoenix’s Surface Stereo Imager on Sol 33, the 33rd Martian day of the mission (June 28, 2008). Phoenix’s solar panel is seen in the bottom right corner of the image.


Here’s a close-up view of where all the action has been taking place recently: the trench called “Snow White.” The hole to the left of the trench, seen in the upper left of the image, is informally called “Burned Alive.” This image was taken on Sol 22, but recently, Phoenix has scooped and rasped the area in an effort to get “shaved ice” samples.

Here’s a great touchy-feely 3-D image (don’t you just want to reach out and touch that rock?) The largest rock seen in this image is called “Midgard.” The edge of Phoenix’s deck is seen in the bottom right corner of the image.

There’s lots more 3-D loveliness at the Phoenix Image Gallery. Have fun!

Super-Sensitive, Ultra-Small Device Heightens Infrared Capabilities

Physics Prof. Michael Gershenson with laboratory equipment used to fabricate ultra-sensitive, nano-sized infrared light detector. Credit: Carl Blesch

A tiny new circuit could make a big difference in the way astronomers can see infrared light. This newly developed nano-sized electronic device is 100 times smaller than the thickness of a human hair, and is sensitive to faint traces of light in the far-infrared spectrum, well beyond the colors humans see. Infrared light makes up 98% of the light emitted since the Big Bang. Better detection methods with this new device should provide insights into the earliest stages of star and galaxy formation almost 14 billion years ago.


“In the expanding universe, the earliest stars move away from us at a speed approaching the speed of light,” said Michael Gershenson, professor of physics at Rutgers and one of the lead investigators. “As a result, their light is strongly red-shifted when it reaches us, appearing infrared.”
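
The shift Gershenson describes is just λ_obs = λ_emit × (1 + z). A one-line illustration — the wavelength and redshift are example values, not figures from the article:

```python
# Cosmological redshift stretches an emitted wavelength by a factor (1 + z).
def observed_wavelength(lam_emit_nm, z):
    """Observed wavelength (nm) for light emitted at lam_emit_nm at redshift z."""
    return lam_emit_nm * (1 + z)

# Green visible light (500 nm) from a galaxy at z = 7 arrives at 4000 nm,
# well into the infrared:
print(observed_wavelength(500.0, 7))  # -> 4000.0
```

At the even higher redshifts of the first stars, ultraviolet and visible starlight lands in the far infrared, which is why these detectors matter.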

But Earth’s thick atmosphere absorbs far-infrared light, and ground-based radio telescopes cannot detect the very faint light emitted by these far-away stars. So scientists are proposing a new generation of space telescopes to gather this light. But new and better detectors are needed to take the next step in infrared observing.

Currently bolometers are used, which detect infrared and submillimeter waves by measuring the heat generated when photons are absorbed.

“The device we built, which we call a hot-electron nanobolometer, is potentially 100 times more sensitive than existing bolometers,” Gershenson said. “It is also faster to react to the light that hits it.”

The new device is made of titanium and niobium metals. It’s about 500 nanometers long and 100 nanometers wide, and was made using techniques similar to those used in computer chip manufacturing. The device operates at very cold temperatures – about 459 degrees below zero Fahrenheit, or one-tenth of one degree above absolute zero on the Kelvin scale.

Photons striking the nanodetector heat electrons in the titanium section, which is thermally isolated from the environment by superconducting niobium leads. By detecting the infinitesimal amount of heat generated in the titanium section, one can measure the light energy absorbed by the detector. The device can detect as little as a single photon of far infrared light.
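
To get a rough sense of how small that deposited heat is, the energy of a single photon is E = hc/λ. This sketch assumes a 100-micron far-infrared wavelength, which is an illustrative choice rather than a figure from the article:

```python
# Energy of a single photon, E = h * c / wavelength, expressed in electronvolts.
H = 6.626e-34         # Planck constant, J*s
C = 2.998e8           # speed of light, m/s
J_PER_EV = 1.602e-19  # joules per electronvolt

def photon_energy_ev(wavelength_m):
    """Photon energy in eV for a given wavelength in metres."""
    return H * C / wavelength_m / J_PER_EV

# A 100-micron far-infrared photon carries only about 12 meV -- thousands of
# times less than a visible photon, hence the need for such a cold, tiny,
# thermally isolated absorber:
print(f"{photon_energy_ev(100e-6):.4f} eV")  # -> 0.0124 eV
```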

“With this single detector, we have demonstrated a proof of concept,” said Gershenson. “The final goal is to build and test an array of 100 by 100 photodetectors, which is a very difficult engineering job.”

Rutgers and the Jet Propulsion Laboratory are working together to build the new infrared detector.

Gershenson expects the detector technology to be useful for exploring the early universe when satellite-based far-infrared telescopes start flying 10 to 20 years from now. “That will make our new technology useful for examining stars and star clusters at the farthest reaches of the universe,” he said.

The team’s original paper can be found here.

Original News Source: Rutgers University

How Future Missions Could Detect Organisms Inside Rocks on Mars

Jarosite in New Zealand. Credit: Michelle Kotler

For a geologist, looking inside a rock is essential to help determine the makeup and history of the rock sample. That’s why geologists have rock hammers, and also why the Mars Exploration Rovers, Spirit and Opportunity, have their Rock Abrasion Tool. For future missions to Mars, or even for a sample return mission, one of the main goals will be to look for signs of life, past or present, that might be hiding inside the rocks. Scientists are working on a new, simple technique for detecting biological and pre-biotic molecules that become trapped inside the minerals in rocks.

This new technique utilizes a laser-based optical and chemical imager or LOCI. A single laser shot vaporizes a small portion of the surface into individual ions. These pass through a mass spectrometer, which can identify each ion by how much mass and charge it has. The great thing about this technique is that the sample requires no preparation: just shoot and detect.

Previous techniques required that the minerals be dissolved in a solution or mixed in with some other medium, which dilutes the sample and runs the risk of introducing contamination.

Jill Scott of Idaho National Laboratory with the laser-based optical and chemical imager (LOCI). Credit: Idaho National Lab
This procedure was tested on Earth using samples of the mineral jarosite. Jarosite is a yellowish-brown sulfate mineral containing iron, potassium and hydroxide. It is found in places around the world such as southern California beaches and volcanic fields in New Zealand. It forms only in the presence of highly acidic water.

In 2004, jarosite was discovered on Mars by the rover Opportunity. Scientists immediately recognized the find as clear evidence for past water on the red planet.

But there is something else about jarosite that makes it interesting. On Earth, for jarosite to form, oxidation of the rock must occur – usually the rock is pyrite (iron disulfide). And on Earth, the oxidation reaction is usually performed by certain “rock-eating” microorganisms.

Scientists say the rate of the jarosite formation would be extremely slow without microbes, as well as without the presence of water.

Whether jarosite can form without the assistance of these microbes is very difficult to say, since every corner of Earth is occupied by little bugs of some sort or another.

And yet, there remains the tantalizing possibility that jarosite on Mars exists because of some little, rock-eating microbes. If so, remnants of these organisms may be locked in the mineral. And there’s only one way to find out: look inside Mars rocks.

Right now, this method couldn’t be used on the next, bigger Mars rover, the Mars Science Laboratory, which will hopefully launch in 2009. The LOCI instrument is just too big and too complex to use remotely, said David Beaty, chief scientist of the Mars Exploration Directorate at the Jet Propulsion Laboratory.

It could, however, be used on a sample return mission. And hopefully, scientists will be able to develop a smaller, simpler version to be used on future missions to look for signs of life in rocks on Mars.

Original News Source: Astrobiology Magazine

An Alien View of the Moon Transiting Earth

Series of images showing the Moon transiting Earth, captured by NASA's EPOXI spacecraft.


Ever wonder what an approaching alien spacecraft would see as it comes within tracking range of our Earth/Moon system? NASA’s EPOXI mission, which uses the old Deep Impact spacecraft, has created a video of the moon transiting (passing in front of) Earth as seen from the spacecraft’s point of view 50 million kilometers (31 million miles) away. Scientists are using the video to develop techniques to study alien worlds. “Making a video of Earth from so far away helps the search for other life-bearing planets in the Universe by giving insights into how a distant, Earth-like alien world would appear to us,” said astronomer Michael A’Hearn, principal investigator for the Deep Impact extended mission, called EPOXI. The video is pretty amazing, and there are actually two versions of it: the first uses a red-green-blue filter, showing how it looks to our human eyes, and the second uses an infrared-green-blue filter, which makes the vegetation on the land masses show up in red.

And the infrared version:

EPOXI is a combination of the names for the two extended mission components: a search for alien (extrasolar) planets during the cruise to Hartley 2, called Extrasolar Planet Observations and Characterization (EPOCh), and the flyby of comet Hartley 2, called the Deep Impact eXtended Investigation (DIXI).

“To image Earth in a similar fashion, an alien civilization would need technology far beyond what Earthlings can even dream of building,” said Sara Seager, a planetary theorist at the Massachusetts Institute of Technology, Cambridge, Mass., and a co-investigator on EPOXI. “Nevertheless, planet-characterizing space telescopes under study by NASA would be able to observe an Earth twin as a single point of light — a point whose total brightness changes with time as different land masses and oceans rotate in and out of view. The video will help us connect a varying point of planetary light with underlying oceans, continents, and clouds — and finding oceans on extrasolar planets means identifying potentially habitable worlds.”
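
Seager’s varying point of light can be mimicked with a toy model: assign each longitude slice of a planet an albedo, and weight each slice by how directly it faces the observer as the planet rotates. The eight-slice albedo map below is invented purely for illustration:

```python
import math

# Invented albedo map: brighter slices stand in for land/cloud, darker for ocean.
ALBEDO = [0.3, 0.3, 0.1, 0.1, 0.1, 0.25, 0.3, 0.1]

def brightness(phase, albedo):
    """Disk-integrated brightness at a rotation phase in [0, 1).

    Each longitude slice contributes its albedo weighted by the cosine of its
    angle from the sub-observer point; slices on the far side contribute zero.
    """
    n = len(albedo)
    total = 0.0
    for i, a in enumerate(albedo):
        angle = 2 * math.pi * ((i + 0.5) / n - phase)
        total += a * max(math.cos(angle), 0.0)
    return total

# One full rotation sampled at 16 phases: the curve rises and falls as the
# bright and dark slices rotate in and out of view.
curve = [brightness(p / 16, ALBEDO) for p in range(16)]
print(min(curve) < max(curve))  # the single "pixel" visibly varies
```

Inverting a real curve like this into a crude map of continents and oceans is exactly the kind of technique the EPOXI Earth video is meant to help develop.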

Pretty exciting stuff!

Original News Source: NASA Press Release, with a little help from Bad Astronomy for the videos

Problems Surface For Constellation Program

NASA's new Ares V & Ares I Rockets. Credit: NASA

On the heels of news about NASA engineers who feel the Constellation program is using the wrong kind of rockets comes word that efforts to build the spacecraft that will replace the shuttle and return astronauts to the moon are running behind schedule and over budget. NASA Watch published a leaked internal NASA document showing the Constellation Program has encountered financial and technical problems, and the Associated Press quoted Doug Cooke, NASA’s deputy associate administrator for exploration, as saying the first test flights for Orion may be delayed. However, the delay thus far is only of NASA’s internal goal of having the spacecraft ready by 2013. Cooke said they are still on target for NASA’s public commitment of first test flights by 2015, and returning to the moon by 2020. But unless the space agency can receive more funding, further delays may be inevitable.

The 117-page report shows an $80 million cost overrun this year for just one motor and a dozen different technical problems that the space agency put in the top risk zone, meaning the problems are considered severe. The report put the program’s financial performance in that category, as well.

Some experts say it’s too early to be worried, others say NASA’s design is flawed or the space agency is just repeating mistakes made in developing the space shuttle. But almost everyone agrees that NASA isn’t getting enough funding to do what they’ve been asked to do.

Additional funding from Congress is pending, but in an election year, don’t count on it.

News Sources: NASA Watch, Newsweek/AP

A Cold War Meeting in Space 33 Years Ago Today

Deke Slayton and Aleksey Leonov meet in space. Credit: NASA

On July 17, 1975, something momentous and unprecedented happened: two Cold War rivals met in space. The Apollo-Soyuz Test Project saw spacecraft from the United States and the Soviet Union docking together in space, ushering in a new era of cooperative ventures between the two countries that once were rivals in the “space race.” Preparing for the mission, the astronauts and cosmonauts had to visit each other’s countries for training, and the two space agencies had to share classified information with each other in order for the rendezvous and docking to work successfully. A few years ago, Tom Stafford, one of the American astronauts, said the Apollo-Soyuz mission “showed the whole world that if the Soviet Union and America could work together in space, they could work together on the Earth.”

We almost take this cooperation for granted now, as for more than a decade, American astronauts and Russian cosmonauts have been regularly living and working together in Earth orbit, first in the Shuttle-Mir program, and now on the International Space Station. But, before the two Cold War rivals first met in orbit, such a partnership seemed unlikely. Since Sputnik bleeped into orbit in 1957, there had indeed been a Space Race, with the U.S. and then-Soviet Union driven more by competition than cooperation. When President Kennedy called for a manned moon landing in 1961, he spoke of the “battle that is now going on around the world between freedom and tyranny” and referred to the “head start obtained by the Soviets with their large rocket engines.”

But by the mid-70s things had changed. The U.S. had “won” the race to the moon, with six Apollo landings between 1969 and 1972. Both nations had launched space stations, the Russian Salyut and American Skylab. With the space shuttle still a few years off and the diplomatic chill thawing, the time was right for a joint mission.

The Apollo-Soyuz Test Project would send NASA astronauts Tom Stafford, Donald K. “Deke” Slayton and Vance Brand in an Apollo Command and Service Module to meet Russian cosmonauts Aleksey Leonov and Valeriy Kubasov in a Soyuz capsule. A jointly designed docking module fulfilled the main technical goal of the mission, demonstrating that two dissimilar craft could dock in orbit. But the human side of the mission went far beyond that.

Original News Source: NASA Image of the Day