Is Methane Evidence of Life on Mars?

Mars. Image credit: NASA.
Are microbes making the methane that’s been found on Mars, or does the hydrocarbon gas come from geological processes? It’s the question that everybody wants to answer, but nobody can. What will it take to convince the jury?

Many experts told Astrobiology Magazine that the best way to judge whether methane has a biological origin is to look at the ratio of carbon-12 (C-12) to carbon-13 (C-13) in the molecules. Living organisms preferentially take up the lighter C-12 isotope as they assemble methane, and that chemical signature remains until the molecule is destroyed.

“There may be a way of distinguishing the origin of methane, whether biogenic or not, by using stable isotope measurements,” says Barbara Sherwood Lollar, an isotope chemist at the University of Toronto.

But isotope signals are subtle, and such measurements are best made by accurate spectrometers placed on the martian surface rather than aboard an orbiting spacecraft.

And there are complications. For one thing, an average martian methane level of 10 parts per billion (ppb) may be too low for accurate isotope measurement, even for a spectrometer placed on Mars. Also, the C-12 to C-13 ratio of methane alone is not always proof of life. For example, the “Lost City” hydrothermal vent field in the Atlantic Ocean did not show a clear isotope signature, says James Kasting, professor of earth and mineral science at Penn State University.

“The methane is not that strongly fractionated, but they still think it might be biological,” says Kasting. “At Lost City, you can’t figure out if it’s biological or not by the isotopes. How are we going to figure that out on Mars?”

By expanding the search, responds Sherwood Lollar. Instead of measuring only carbon, she suggests measuring hydrogen isotopes, because biological systems also prefer ordinary hydrogen (¹H) to the heavier deuterium (²H).

A second approach would look at the longer, heavier hydrocarbons — ethane, propane and butane — that are related to methane, and that sometimes appear with biogenic or abiogenic methane. Sherwood Lollar detected these hydrocarbons while investigating abiogenic methane trapped in pores in ancient rocks in the Canadian Shield, a large deposit of Precambrian igneous rock. “When the water gets trapped over very, very long time periods,” she says, an abiogenic reaction between water and rock makes methane, ethane, propane and butane.

If the longer-chain abiogenic hydrocarbons are ever detected in the martian atmosphere, how could we distinguish them from similar hydrocarbons that are the breakdown products of kerogen, a remnant of decomposing living matter? The answer, Sherwood Lollar repeats, could be found in the isotopes. Abiogenic hydrocarbon chains would contain a higher proportion of heavier isotopes than the hydrocarbon chains derived from the breakdown of kerogen.

“Future missions to Mars plan to look for the presence of higher hydrocarbons as well as methane,” Sherwood Lollar says. “If this isotopic pattern can be identified in martian methane and ethane for instance, then this type of information could help resolve abiogenic versus biogenic origin.”
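
To make the isotope argument concrete, here is a minimal sketch, in Python, of the standard “delta” notation isotope chemists use to express these ratios. The reference ratios are the usual terrestrial standards; the two sample ratios are invented for illustration and are not measurements of martian methane.

    # Delta notation: how far a sample's isotope ratio deviates from a reference
    # standard, in parts per thousand (per mil). Reference ratios below are the
    # standard terrestrial values; the sample ratios are invented for illustration.
    VPDB_13C_12C = 0.0112372    # Vienna Pee Dee Belemnite, 13C/12C
    VSMOW_D_H = 0.00015576      # Vienna Standard Mean Ocean Water, D/H

    def delta_per_mil(sample_ratio, standard_ratio):
        """Deviation of a measured ratio from a standard, in per mil."""
        return (sample_ratio / standard_ratio - 1.0) * 1000.0

    # Two hypothetical methane samples with different 13C/12C ratios.
    print(delta_per_mil(0.0105, VPDB_13C_12C))   # about -66 per mil, strongly "light" carbon
    print(delta_per_mil(0.0110, VPDB_13C_12C))   # about -21 per mil, much less depleted

On Earth, methane made by microbes typically plots far more negative (strongly depleted in C-13) than thermogenic or abiogenic methane, and a parallel calculation against the D/H standard covers the hydrogen isotopes Sherwood Lollar mentions.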

Isotopes figure prominently in several upcoming space missions that could slake the growing thirst for evidence on the methane mystery:

* The Phoenix lander, scheduled for launch in August 2007, will go to an ice-rich region near the North Pole, and “dig up dirt and analyze the dirt, along with the ice,” says William Boynton of the University of Arizona, who will direct the mission. The lander’s mass spectrometer will measure isotopes in any methane trapped in the soil, if the concentration is sufficient. “We won’t be able to measure the isotope ratio [in the atmosphere], because it won’t be a high enough concentration,” Boynton says.

* Mars Science Laboratory, scheduled for launch sometime between 2009 and 2011, is a 3,000-kilogram, six-wheel rover packed with scientific instruments. The tunable laser spectrometer and mass spectrometer-gas chromatograph may both be able to ferret out isotope ratios of carbon and other elements.

* Beagle 3, a successor to Britain’s lost-in-space Beagle 2, may carry an improved mass spectrometer capable of measuring carbon isotope ratios, but the project has yet to be approved. The craft would not launch until at least 2009.

From these launch dates, it’s clear the jury on this whodunit must remain sequestered for years, until hard data on the source of methane on Mars can be aired in the scientific courtroom. At this point, it’s fair to say that many expert witnesses take the possibility of a biogenic source rather seriously. For example, Vladimir Krasnopolsky, who led one of the teams that found methane on the planet, says, “Bacteria, I think, are plausible sources of methane on Mars, the most likely source.” But he expects the microbes to be found in oases, “because the martian conditions are very hostile to life. I think these bacteria may exist in some locations where conditions are warm and wet.”

That observation points to a possible win-win situation for those who want to find life on Mars, says Timothy Kral of the University of Arkansas, who grows methanogens for a living. If, as calculations suggest, asteroids and comets are not likely to be delivering methane to Mars, then either methane-making organisms must be living in the subsurface, or there is a place where it’s warm enough for abiogenic generation.

“Even though it is not an indication of life directly, it’s an indication that there is warming,” says Kral. In those conditions, “there is heat, energy for organisms to grow.”

A lot has changed in the past year. Kral, who has spent a dozen years growing methanogens in a simulated martian environment, says, “Prior to last year, when people asked if I thought there was life on Mars, I would giggle. I would not be in this business if I did not think it was possible, but there was no real evidence for any life. Then, all of a sudden, last year, they found methane in the atmosphere, and we suddenly have a piece of real scientific evidence saying that it’s possible” that Mars is the second living planet.

Original Source: NASA Astrobiology

10th Planet Discovered

This new planet is larger than Pluto. Image credit: NASA/JPL.
A planet larger than Pluto has been discovered in the outlying regions of the solar system.

The 10th planet was discovered using the Samuel Oschin Telescope at Palomar Observatory near San Diego, Calif. The discovery was announced today by planetary scientist Dr. Mike Brown of the California Institute of Technology in Pasadena, Calif., whose research is partly funded by NASA.

The planet is a typical member of the Kuiper belt, but its sheer size in relation to the nine known planets means that it can only be classified as a planet, Brown said. Currently about 97 times further from the sun than the Earth, the planet is the farthest-known object in the solar system, and the third brightest of the Kuiper belt objects.

“It will be visible with a telescope over the next six months and is currently almost directly overhead in the early-morning eastern sky, in the constellation Cetus,” said Brown, who made the discovery with colleagues Chad Trujillo, of the Gemini Observatory in Mauna Kea, Hawaii, and David Rabinowitz, of Yale University, New Haven, Conn., on January 8.

Brown, Trujillo and Rabinowitz first photographed the new planet with the 48-inch Samuel Oschin Telescope on October 31, 2003. However, the object was so far away that its motion was not detected until they reanalyzed the data in January of this year. In the last seven months, the scientists have been studying the planet to better estimate its size and its motions.

“It’s definitely bigger than Pluto,” said Brown, who is a professor of planetary astronomy.

Scientists can infer the size of a solar system object from its brightness, just as one can infer the size of a faraway light bulb if one knows its wattage. The planet’s reflectance is not yet known, so scientists cannot yet tell how much of the sunlight reaching it is reflected away, but its brightness alone puts a lower limit on its size.

“Even if it reflected 100 percent of the light reaching it, it would still be as big as Pluto,” says Brown. “I’d say it’s probably one and a half times the size of Pluto, but we’re not sure yet of the final size.

“We are 100 percent confident that this is the first object bigger than Pluto ever found in the outer solar system,” Brown added.

The size of the planet is limited by observations using NASA’s Spitzer Space Telescope, which has already proved its mettle in studying the heat of dim, faraway objects such as the Kuiper-belt bodies. Because Spitzer is unable to detect the new planet, the overall diameter must be less than 2,000 miles (about 3,200 kilometers), said Brown.
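
The logic behind those limits can be sketched with the standard photometric relation between absolute magnitude, albedo and diameter used for solar system bodies. This is an illustrative calculation, not part of the release: the absolute magnitude below is an assumed round number, not a measured value for the new planet.

    import math

    # D (km) ~ (1329 / sqrt(albedo)) * 10^(-H / 5), the standard photometric
    # diameter relation, where H is the object's absolute magnitude.
    def diameter_km(abs_mag, albedo):
        return 1329.0 / math.sqrt(albedo) * 10.0 ** (-abs_mag / 5.0)

    H = -1.2   # assumed, illustrative absolute magnitude
    for albedo in (1.0, 0.6, 0.1):
        print(albedo, round(diameter_km(H, albedo)))
    # A perfect reflector (albedo 1.0) gives the smallest diameter consistent with
    # the observed brightness; darker assumed surfaces imply a larger body. The
    # Spitzer non-detection caps the diameter from the other direction.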

A name for the new planet has been proposed by the discoverers to the International Astronomical Union, and they are awaiting the decision of this body before announcing the name.

The Jet Propulsion Laboratory manages the Spitzer Space Telescope mission for NASA’s Science Mission Directorate, Washington. Science operations are conducted at the Spitzer Science Center at Caltech. Caltech manages JPL for NASA.

For more information and images see: http://www.nasa.gov/vision/universe/solarsystem/newplanet-072905-images.html

or http://www.astro.caltech.edu/palomarnew/sot.html

For information about NASA and agency programs on the Web, visit:

http://www.nasa.gov/home/index.

Original Source: NASA/JPL News Release

Supernova, Before and After

Supernova (SN) 2005cs in M51. Image credit: Hubble.
A series of lucky breaks has allowed two University of California, Berkeley, astronomers to track down the identity of a distant star that lit up the majestic Whirlpool Galaxy a month ago.

While astronomers can predict which stars will end their lives in a fiery explosion, surprisingly only five supernovas before now had been traced back to a known star, according to one of the astronomers, UC Berkeley astronomy professor Alex Filippenko. Most supernovas are too distant, or their progenitor stars too faint or in fields too crowded, for astronomers to look back through historical sky photos and pinpoint the location and type of star.

The Space Telescope Science Institute (STScI) today (Thursday, July 28) released photos of the beautiful Whirlpool Galaxy, M51, showing the location of the original star and the bright supernova just 12 days after its explosion was discovered.

The supernova, dubbed SN 2005cs, belongs to a class of exploding stars called “Type II-plateau.” A supernova of this type results from the collapse and subsequent explosion of a massive star, and its light remains at a nearly constant brightness (a “plateau”) for a period of time after the outburst.

This finding is consistent with the idea that the progenitors of such supernova explosions are red supergiant stars with masses eight to 15 times the sun’s mass. The progenitor star of SN 2005cs was found to be at the low end of the mass range for supernova explosions. Stars with masses lower than eight solar masses do not explode as supernovae at all, but rather blow off their outer atmospheres to become planetary nebulae before contracting to white dwarfs.

A German amateur astronomer was the first to note the unusually bright star – perhaps a supernova – in M51, and he asked the staff at the Central Bureau for Astronomical Telegrams to post a note to that effect on June 29. Filippenko, who specializes in supernovae and black holes, received the notice late that afternoon and rushed to get one of his former students to request a spectrum of the brightly burning star from a telescope in Arizona. This spectrum confirmed that it was a Type II supernova.

Filippenko, by chance, was at the very end of a year-long observational program using the Hubble Space Telescope, and he worked during an overnight flight and early the next morning to submit a request to observe the supernova before his opportunity ended at 5 p.m. Eastern time June 30. Since Hubble can easily resolve stars in nearby galaxies, such as the Whirlpool, it was the only chance he had to track down the exploding star’s identity. The new picture was needed for comparison with archival images in order to accurately determine the position of the supernova.

He got in under the wire, convincing the telescope crew to observe the waning supernova on July 11, amidst the hoopla and frequent observations of the Deep Impact probe’s collision with Comet Tempel 1.

“This will be one of Hubble’s many legacies,” Filippenko said. “No other telescope program could observe the exact location of this Type II supernova, yet it was an opportunity not to be missed.”

From the brand new Hubble image and a January 2005 image Hubble had taken of the Whirlpool Galaxy, UC Berkeley research astronomer Weidong Li and Filippenko were able to pinpoint the location of the progenitor star and identify it as a red supergiant whose mass is about seven to 10 times that of the sun.

“This is a great example of the excitement of science, when something happens and you have to jump on it right away,” said Filippenko, who is known for the enthusiasm he brings to teaching. “Some nights you just don’t sleep.”

Filippenko, Li and colleague Schuyler Van Dyk of Caltech’s Spitzer Science Center first reported their findings in IAU circulars 8556 and 8565 on July 3 and July 12, respectively. The team submitted a full paper describing their research to The Astrophysical Journal on July 18.

The Space Telescope Science Institute is operated for NASA by the Association of Universities for Research in Astronomy, Inc., under contract with the Goddard Space Flight Center, Greenbelt, Md. The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency.

Original Source: UC Berkeley News Release

Super Climate Simulation Models Oceans, Ice, Land and Atmosphere

Image depicts the sea surface temperature. Image credit: Shep Smithline, GFDL; Chris Hill, MIT.
Researchers from MIT, NASA’s Goddard Space Flight Center and several other government and academic institutions have created four new supercomputer simulations that for the first time combine mathematical computer models of the atmosphere, ocean, land surface and sea ice.

These simulations are the first field tests of the new Earth System Modeling Framework (ESMF), an innovative software system that promises to improve predictive capability in diverse areas such as short-term weather forecasts and century-long climate-change projections.

Although ESMF is still under development, groups from NASA, the National Science Foundation, the National Oceanic and Atmospheric Administration (NOAA), the Department of Energy, the Department of Defense and research universities are already using it as the standard for coupling their weather and climate models to achieve a realistic representation of the Earth as a system of interacting parts.

ESMF makes it easier to share and compare alternative scientific approaches from multiple sources, makes more efficient use of remote sensing data, and eliminates the need for individual agencies to develop their own coupling software.

“The development of large Earth system applications often spans initiatives, institutions and agencies, and involves the geoscience, physics, mathematics and computer science communities. With ESMF, these diverse groups can leverage common software to simplify model development,” said NASA’s Arlindo da Silva, a scientist in Goddard’s Global Modeling and Assimilation Office.

The newly completed field tests, known as interoperability experiments, show that the new approach can be successful. Although most of the experiments would require exhaustive tuning and validation to be scientifically sound, they already show that ESMF can be used to assemble coupled applications quickly, easily and with technical accuracy.

The MIT experiment combines an atmosphere-land-ice model from NOAA’s Geophysical Fluid Dynamics Laboratory with an MIT ocean-sea ice model known as MITgcm (http://mitgcm.org/). This may ultimately bring new insights into ocean uptake of carbon dioxide and other atmospheric gases and information on how this process affects climate. Christopher Hill, principal research scientist in the MIT Department of Earth, Atmospheric and Planetary Sciences, and a member of the MIT Climate Modeling Initiative, led development of the software at MIT.

The ESMF research team plans to release the software to the scientific community via the Internet later this month.

Original Source: MIT News Release

Mimas and Tethys Circling Saturn

Tethys and Mimas circling Saturn. Image credit: NASA/JPL/SSI.
Far above the howling winds of Saturn, its icy moons circle the planet in silence. Mimas is seen near the upper right, while Tethys hovers at the bottom. Dark shadows cast by the see-through rings slice across the northern hemisphere. Mimas is 397 kilometers (247 miles) across. Tethys is 1,071 kilometers (665 miles) across.
The dark, doughnut-shaped storm near the south pole is at least 1,600 kilometers (1,000 miles) across and could easily swallow any of Saturn’s moons except giant Titan (5,150 kilometers, 3,200 miles across).

The image was taken with the Cassini spacecraft’s wide-angle camera on June 21, 2005, through a filter sensitive to wavelengths of infrared light centered at 752 nanometers. The view was acquired at a distance of approximately 2.2 million kilometers (1.3 million miles) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 53 degrees. The image scale is 125 kilometers (78 miles) per pixel.
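
For readers curious where a “kilometers per pixel” figure comes from, the arithmetic is a simple small-angle calculation: range multiplied by the angular size of one pixel. The per-pixel angle used below is an assumed, approximate value for a wide-angle camera, not an official Cassini calibration number.

    # Image scale = spacecraft range x angular size of one pixel (small-angle
    # approximation). The ~60 microradian pixel is an assumed, rough figure.
    range_km = 2.2e6            # distance from Saturn, from the caption
    pixel_angle_rad = 60e-6     # assumed angular width of one pixel
    print(range_km * pixel_angle_rad)   # ~132 km per pixel, near the quoted 125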

The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the mission for NASA’s Science Mission Directorate, Washington, D.C. The Cassini orbiter and its two onboard cameras were designed, developed and assembled at JPL. The imaging team is based at the Space Science Institute, Boulder, Colo.

For more information about the Cassini-Huygens mission visit http://saturn.jpl.nasa.gov . The Cassini imaging team homepage is at http://ciclops.org .

Original Source: NASA/JPL/SSI News Release

Flying Foam Grounds Shuttle Fleet

Although Discovery made it safely into orbit, potentially catastrophic chunks of foam dislodged from its external fuel tank during Tuesday’s launch. After reviewing launch video and photographs, managers identified a few places where pieces of foam flew off the tank, including one piece as large as 90 cm (35 inches) across. Fortunately it missed the shuttle entirely, but a direct hit could have caused severe damage. NASA has grounded all future shuttle flights until the problem of falling foam can be resolved.

Build Big by Thinking Small

Artist’s conception of a bio-nanorobot. Image credit: NASA.
When it comes to taking the next “giant leap” in space exploration, NASA is thinking small — really small.

In laboratories around the country, NASA is supporting the burgeoning science of nanotechnology. The basic idea is to learn to deal with matter at the atomic scale — to be able to control individual atoms and molecules well enough to design molecule-size machines, advanced electronics and “smart” materials.

If visionaries are right, nanotechnology could lead to robots you can hold on your fingertip, self-healing spacesuits, space elevators and other fantastic devices. Some of these things may take 20+ years to fully develop; others are taking shape in the laboratory today.

Simply making things smaller has its advantages. Imagine, for example, if the Mars rovers Spirit and Opportunity could have been made as small as a beetle, and could scurry over rocks and gravel as a beetle can, sampling minerals and searching for clues to the history of water on Mars. Hundreds or thousands of these diminutive robots could have been sent in the same capsules that carried the two desk-size rovers, enabling scientists to explore much more of the planet’s surface — and increasing the odds of stumbling across a fossilized Martian bacterium!

But nanotech is about more than just shrinking things. When scientists can deliberately order and structure matter at the molecular level, amazing new properties sometimes emerge.

An excellent example is that darling of the nanotech world, the carbon nanotube. Carbon occurs naturally as graphite — the soft, black material often used in pencil leads — and as diamond. The only difference between the two is the arrangement of the carbon atoms. When scientists arrange the same carbon atoms into a “chicken wire” pattern and roll them up into minuscule tubes only 10 atoms across, the resulting “nanotubes” acquire some rather extraordinary traits. Nanotubes:

– have 100 times the tensile strength of steel, but only 1/6 the weight;
– are 40 times stronger than graphite fibers;
– conduct electricity better than copper;
– can be either conductors or semiconductors (like computer chips), depending on the arrangement of atoms;
– and are excellent conductors of heat.

Much of current nanotechnology research worldwide focuses on these nanotubes. Scientists have proposed using them for a wide range of applications: in the high-strength, low-weight cable needed for a space elevator; as molecular wires for nano-scale electronics; embedded in microprocessors to help siphon off heat; and as tiny rods and gears in nano-scale machines, just to name a few.

Nanotubes figure prominently in research being done at the NASA Ames Center for Nanotechnology (CNT). The center was established in 1997 and now employs about 50 full-time researchers.

“[We] try to focus on technologies that could yield useable products within a few years to a decade,” says CNT director Meyya Meyyappan. “For example, we’re looking at how nano-materials could be used for advanced life support, DNA sequencers, ultra-powerful computers, and tiny sensors for chemicals or even sensors for cancer.”

A chemical sensor they developed using nanotubes is scheduled to fly a demonstration mission into space aboard a Navy rocket next year. This tiny sensor can detect as little as a few parts per billion of specific chemicals, such as toxic gases, making it useful for both space exploration and homeland defense. CNT has also developed a way to use nanotubes to cool the microprocessors in personal computers, a major challenge as CPUs get more and more powerful. This cooling technology has been licensed to a Santa Clara, California, start-up called Nanoconduction, and Intel has even expressed interest, Meyyappan says.

If these near-term uses of nanotechnology seem impressive, the long-term possibilities are truly mind-boggling.

The NASA Institute for Advanced Concepts (NIAC), an independent, NASA-funded organization located in Atlanta, Georgia, was created to promote forward-looking research on radical space technologies that will take 10 to 40 years to come to fruition.

For example, one recent NIAC grant funded a feasibility study of nanoscale manufacturing: in other words, using vast numbers of microscopic molecular machines to produce any desired object by assembling it atom by atom!

That NIAC grant was awarded to Chris Phoenix of the Center for Responsible Nanotechnology.

In his 112-page report, Phoenix explains that such a “nanofactory” could produce, say, spacecraft parts with atomic precision, meaning that every atom within the object is placed exactly where it belongs. The resulting part would be extremely strong, and its shape could be within a single atom’s width of the ideal design. Ultra-smooth surfaces would need no polishing or lubrication, and would suffer virtually no “wear and tear” over time. Such high precision and reliability of spacecraft parts are paramount when the lives of astronauts are at stake.

Although Phoenix sketched out some design ideas for a desktop nanofactory in his report, he acknowledges that — short of a big-budget “Nanhattan Project,” as he calls it — a working nanofactory is at least a decade away, and possibly much longer.

Taking a cue from biology, Constantinos Mavroidis, director of the Computational Bionanorobotics Laboratory at Northeastern University in Boston, is exploring an alternative approach to nanotech:

Rather than starting from scratch, the concepts in Mavroidis’s NIAC-funded study employ pre-existing, functional molecular “machines” that can be found in all living cells: DNA molecules, proteins, enzymes, etc.

Shaped by evolution over millions of years, these biological molecules are already very adept at manipulating matter at the molecular scale — which is why a plant can combine air, water, and dirt and produce a juicy red strawberry, and a person’s body can convert last night’s potato dinner into today’s new red blood cells. The rearranging of atoms that makes these feats possible is performed by hundreds of specialized enzymes and proteins, and DNA stores the code for making them.

Making use of these “pre-made” molecular machines — or using them as starting points for new designs — is a popular approach to nanotechnology called “bio-nanotech.”

“Why reinvent the wheel?” Mavroidis says. “Nature has given us all this great, highly refined nanotechnology inside of living things, so why not use it — and try to learn something from it?”

The specific uses of bio-nanotech that Mavroidis proposes in his study are very futuristic. One idea involves draping a kind of “spider’s web” of hair-thin tubes packed with bio-nanotech sensors across dozens of miles of terrain, as a way to map the environment of some alien planet in great detail. Another concept he proposes is a “second skin” for astronauts to wear under their spacesuits that would use bio-nanotech to sense and respond to radiation penetrating the suit, and to quickly seal over any cuts or punctures.

Futuristic? Certainly. Possible? Maybe. Mavroidis admits that such technologies are probably decades away, and that technology so far in the future will probably be very different from what we imagine now. Still, he says he believes it’s important to start thinking now about what nanotechnology might make possible many years down the road.

Considering that life itself is, in a sense, the ultimate example of nanotech, the possibilities are exciting indeed.

Original Source: NASA News Release

Water Ice in a Martian Crater

Perspective view of crater with water ice. Image credit: ESA.
This image, taken by the High Resolution Stereo Camera (HRSC) on board ESA’s Mars Express spacecraft, shows a patch of water ice sitting on the floor of an unnamed crater near the Martian north pole.

The HRSC obtained this image during orbit 1343 with a ground resolution of approximately 15 metres per pixel. The unnamed impact crater is located on Vastitas Borealis, a broad plain that covers much of Mars’s far northern latitudes, at approximately 70.5° North and 103° East.

The crater is 35 kilometres wide and has a maximum depth of approximately 2 kilometres beneath the crater rim. The circular patch of bright material located at the centre of the crater is residual water ice.

This white patch is present all year round, as the temperature and pressure are not high enough to allow sublimation of water ice.

It cannot be frozen carbon dioxide since carbon dioxide ice had already disappeared from the north polar cap at the time the image was taken (late summer in the Martian northern hemisphere).

There is a height difference of 200 metres between the crater floor and the surface of this bright material, which cannot be attributed solely to water ice.

It is probably mostly due to a large dune field lying beneath this ice layer. Indeed, some of these dunes are exposed at the easternmost edge of the ice.

Faint traces of water ice are also visible along the rim of the crater and on the crater walls. The absence of ice along the north-west rim and walls may occur because this area receives more sunlight due to the Sun’s orientation, as highlighted in the perspective view.

Original Source: ESA Mars Express

Martian Fossil Finder in the Works

NUGGET instrument. Image credit: NASA.
Astrobiologists, who search for evidence of life on other planets, may find a proposed Neutron/Gamma ray Geologic Tomography (NUGGET) instrument to be one of the most useful tools in their toolbelt.

As conceived by scientists at the Goddard Space Flight Center (GSFC) in Greenbelt, Md., NUGGET would be able to generate three-dimensional images of fossils embedded in an outcrop of rock or beneath the soil of Mars or another planet. Tomography uses radiation or sound waves to look inside objects. NUGGET could help determine if primitive forms of life took root on Mars when the planet was awash in water eons ago.

Similar to seismic tomography used by the oil industry to locate oil reserves beneath Earth’s surface, NUGGET would look instead for evidence of primitive algae and bacteria that fossilized along the edges of extinct rivers or oceans. As on Earth, these remains could lie just a few centimeters beneath the surface, compressed between layers of silt. If a mechanical rover that explores planet surfaces were equipped with an instrument like NUGGET, capable of peering beneath the surface, then it might be able to reveal evidence of life beyond Earth.

“This is a brand new idea,” said Sam Floyd, the principal investigator on the project, funded this year by Goddard’s Director’s Discretionary Fund. If developed, NUGGET would be able to investigate important biological indicators of life, and quickly and precisely identify areas where scientists might want to take samples of soil or conduct more intensive studies. “It would allow us to do a much faster survey of an area,” Floyd said.

The proposed instrument, which could be carried on a rover or a robot lander, is made up of three fundamentally distinct technologies: a neutron generator, a neutron lens, and a gamma-ray detector.

At the heart of NUGGET is a three-dimensional scanning instrument that beams neutrons into a rock or other object under study. When the nucleus of an atom inside the rock captures a neutron, it produces a gamma-ray signal characteristic of that element, which the gamma-ray detector then analyzes. It’s also possible to plot the location of the elements.

After this process, information can then be turned into an image of the elements within the rock. By seeing images of certain existing elements, scientists could tell whether a certain type of bacteria had become fossilized inside the rock.
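
A rough sketch of that identification step, assuming a small, hand-picked table of neutron-capture gamma-ray lines (a real instrument would consult a full nuclear-data library): a detected peak is matched against the characteristic energies of candidate elements.

    # Match a detected gamma-ray peak against characteristic neutron-capture lines.
    # The table below is a tiny illustrative subset, not flight-software data.
    CAPTURE_LINES_MEV = {
        "hydrogen": [2.223],          # the well-known 2.223 MeV hydrogen capture line
        "iron": [7.631, 7.646],
        "silicon": [3.539, 4.934],
    }

    def identify(peak_mev, tolerance_mev=0.02):
        """Elements with a capture line within tolerance of the detected peak."""
        return [element for element, lines in CAPTURE_LINES_MEV.items()
                if any(abs(peak_mev - line) <= tolerance_mev for line in lines)]

    print(identify(2.22))   # ['hydrogen'] -- e.g. water ice or hydrated minerals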

Although the concept of focusing neutrons is not new, the ability to actually do so is. Thanks to a Russian scientist who devised the method in the 1980s, scientists today can direct a beam of neutrons through a neutron lens made up of thousands of long, slender, hair-thin glass tubes. The bundle of tubes is shaped so that the neutrons flowing down them converge at a central point. Since then, advances in manufacturing have made this type of optical system feasible for space exploration.

The advantage of this technology is that it can create a higher intensity of neutrons at a central point on the object. This increased intensity allows a higher-resolution image to be produced.

Floyd and his co-investigators, Jason Dworkin, John Keller, and Scott Owens, all from NASA GSFC, plan to conduct experiments this summer at the National Institute of Standards and Technology (NIST) using one of NIST’s neutron-beam lines. By focusing neutrons into various samples (one of which is a meteorite), they hope to make a three-dimensional image of the meteorite’s internal structure.

“If we’re successful, we’ll be in a position to say whether a space flight instrument is feasible,” Floyd said, adding that his research should give Goddard the lead role in developing a new class of instruments to support missions in NASA’s future search for life.

Original Source: NASA News Release

NASA’s Prototype Solar Sail Inflates Perfectly

20-meter solar sail. Image credit: NASA/MSFC.
NASA has reached a milestone in the testing of solar sails — a unique propulsion technology that will use sunlight to propel vehicles through space. Engineers have successfully deployed a 20-meter solar sail system that uses an inflatable boom deployment design.

L’Garde, Inc. of Tustin, Calif., deployed the system at the Space Power Facility — the world’s largest space environment simulation chamber — at NASA Glenn Research Center’s Plum Brook Station in Sandusky, Ohio. L’Garde is a technology development contractor for the In-Space Propulsion Technology Office at NASA’s Marshall Space Flight Center in Huntsville, Ala. NASA’s Langley Research Center in Hampton, Va., provided instrumentation and test support for the tests.

Red lights help illuminate the four, outstretched triangular sail quadrants in the chamber. The sail material is supported by an inflatable boom system designed to unfold and become rigid in the space environment. The sail and boom system is extended via remote control from a central stowage container about the size of a suitcase.

L’Garde began testing its sail system at Plum Brook in June. The test series lasted 30 days.

Solar sail technologies use energy from the Sun to power a spacecraft’s journey through space. The technology bounces sunlight off giant, reflective sails made of lightweight material 40 to 100 times thinner than a piece of writing paper. The continuous pressure of sunlight provides sufficient thrust to perform maneuvers, such as hovering at a fixed point in space or rotating the vehicle’s plane of orbit, that would require a significant amount of propellant with conventional rocket systems.

Because the Sun provides the necessary propulsive energy, solar sails require no onboard propellant, thus increasing the range of mobility or the capability to hover at a fixed point for longer periods of time.
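
As a back-of-the-envelope illustration of that point, and not a figure from the release, here is the ideal thrust from sunlight pressure at Earth’s distance from the Sun, assuming a perfectly reflecting sail and a made-up 400-square-meter sail area.

    # Radiation pressure on a perfect reflector is twice the incident power flux
    # divided by the speed of light. The sail area below is an assumed round number.
    SOLAR_CONSTANT = 1361.0      # W/m^2 at 1 AU
    C = 2.998e8                  # speed of light, m/s

    pressure = 2.0 * SOLAR_CONSTANT / C          # ~9.1e-6 N per square meter
    sail_area = 400.0                            # assumed: a 20 m x 20 m square
    print(pressure * sail_area)                  # ~0.0036 N of thrust

A few millinewtons is a tiny push, but because it costs no propellant and never shuts off, it can build up large velocity changes over months and years.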

Solar sail technology was selected for development in August 2002 by NASA’s Science Mission Directorate in Washington. Along with sail system design projects, the Marshall Center and NASA’s Jet Propulsion Laboratory in Pasadena, Calif., are collaborating to investigate the effects of the space environment on advanced solar sail materials. These are just three of a number of efforts undertaken by NASA Centers, industry and academia to develop solar sail technology.

Solar sail technology is being developed by the In-Space Propulsion Technology Program, managed by NASA’s Science Mission Directorate and implemented by the In-Space Propulsion Technology Office at Marshall. The program’s objective is to develop in-space propulsion technologies that can enable or benefit near- or mid-term NASA space science missions by significantly reducing cost, mass and travel times.

For more information about solar sail propulsion, visit:
http://www.inspacepropulsion.com

For more information about L’Garde, Inc. and its solar sail system, visit:
http://www.lgarde.com/

Original Source: NASA News Release