2010 Had Warmest Global June on Record


Was last month warm where you live? If so, you weren’t alone. According to measurements taken by the National Oceanic and Atmospheric Administration (NOAA), June 2010 was the hottest June on record worldwide. Nor was this an isolated spike, at least for this year: March, April, and May 2010 were also the warmest on record. June was also the 304th consecutive month with a global temperature above the 20th century average; the last month with a below-average global temperature was February 1985.
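
The 304-month figure is easy to verify: counting March 1985 (the first month of the streak) through June 2010, inclusive, does give 304. A minimal check:

```python
def months_between(y1, m1, y2, m2):
    """Number of calendar months from (y1, m1) to (y2, m2), inclusive."""
    return (y2 - y1) * 12 + (m2 - m1) + 1

# March 1985 through June 2010:
print(months_between(1985, 3, 2010, 6))  # 304
```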

Here are some of the numbers:

* The combined global land and ocean average surface temperature for June 2010 was the warmest on record at 16.2°C (61.1°F), which is 0.68°C (1.22°F) above the 20th century average of 15.5°C (59.9°F). The previous record for June was set in 2005.

* The June worldwide averaged land surface temperature was 1.07°C (1.93°F) above the 20th century average of 13.3°C (55.9°F)—the warmest on record.

* It was the warmest April–June (three-month period) on record for the global land and ocean temperature and the land-only temperature. The three-month period was the second warmest for the world’s oceans, behind 1998.

* It was the warmest June and April–June on record for the Northern Hemisphere as a whole and all land areas of the Northern Hemisphere.

* It was the warmest January–June on record for the global land and ocean temperature. The worldwide land on average had its second warmest January–June, behind 2007. The worldwide averaged ocean temperature was the second warmest January–June, behind 1998.

* Sea surface temperature (SST) anomalies in the central and eastern equatorial Pacific Ocean continued to decrease during June 2010. According to NOAA’s Climate Prediction Center, La Niña conditions are likely to develop during the Northern Hemisphere summer 2010.
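
A note on the unit conversions in the figures above: absolute temperatures convert between Celsius and Fahrenheit with the familiar +32 offset, but temperature anomalies (differences from an average) convert with the 9/5 scale factor alone. A quick sketch using the numbers from the report:

```python
def c_to_f_absolute(t_c):
    """Convert an absolute temperature from Celsius to Fahrenheit."""
    return t_c * 9 / 5 + 32

def c_to_f_anomaly(dt_c):
    """Convert a temperature difference (anomaly); no +32 offset applies."""
    return dt_c * 9 / 5

# The 20th century average of 15.5°C, and the June 2010 anomaly of 0.68°C:
print(round(c_to_f_absolute(15.5), 1))  # 59.9
print(round(c_to_f_anomaly(0.68), 2))   # 1.22
```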

Some regions of the planet, however, had cool temperatures for a Northern Hemisphere summer. Spain had its coolest June temperatures since 1997, and Guizhou in southern China had its coolest June since its records began in 1951.

Still, despite those pockets of cool weather, the planet as a whole was warmer than average.

Arctic sea ice extent for June 2010 was 10.87 million square kilometers (4.20 million square miles). Credit: NSIDC

Other satellite data, from the US National Snow and Ice Data Center in Colorado, show that the extent of sea ice in the Arctic was at its lowest for any June since satellite records started in 1979. The ice cover on the Arctic Ocean grows each winter and shrinks in summer, reaching its annual low point in September. The monthly average extent for June 2010 was 10.87 million square kilometers. The ice declined at an average of 88,000 square kilometers per day in June, the fastest rate of decline measured for the month.
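
As a sanity check on the figures, the quoted extent converts to the 4.20 million square miles given in the image caption, and the daily decline rate implies roughly 2.6 million square kilometers lost over the month:

```python
KM2_PER_MI2 = 2.58999  # square kilometres per square mile

extent_km2 = 10.87e6
extent_mi2 = extent_km2 / KM2_PER_MI2   # ≈ 4.20 million square miles

daily_loss_km2 = 88_000
june_days = 30
monthly_loss_km2 = daily_loss_km2 * june_days  # 2,640,000 km² over June
```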

During June, ice extent was below average everywhere except in the East Greenland Sea, where it was near average.

Sources: NOAA, NSIDC

Astronomy Without A Telescope – Bringing The Planetology Home

We keep finding all these exoplanets. Our detection methods still only pick out the bigger ones, but we’re getting better at this all the time. One day in the not-too-distant future it is conceivable that we will find one with a surface gravity in the 1G range, orbiting its star in what we anthropocentrically call the Goldilocks zone, where water can exist in the liquid phase.

So let’s say we find such a planet and then direct all our SETI gear towards it. We start detecting faint Morse-code-like beeps: inscrutable, but clearly of artificial origin. Knowing us, we’ll send out a probe. And knowing us, there will be a letter-writing campaign demanding that we adhere to the Prime Directive, so this deep space probe will include some newly developed cloaking technology, allowing it to arrive at the Goldilocks planet invisible and undetectable.

The probe takes quite a while to get there and, in transit, receives indications that the alien civilization is steadily advancing its technology as black and white sitcoms start coming through – and as all that is relayed back to us we are able to begin translating their communications into a range of ‘dialects’.

By the time the probe has arrived and settles into an invisible orbit, it’s apparent a problem is emerging on the planet. Many of its inhabitants have begun expressing concern that their advancing technology is beginning to have planetary effects, with respect to land clearing and atmospheric carbon loading.

From our distant and detached viewpoint we are able to see that anyone on the planet who thinks they live in a stable and unchanging environment just isn’t paying attention. There was a volcano just the other week, and their geologists keep finding ancient impact craters that have overturned whole ecosystems in their planet’s past.

It becomes apparent that the planet’s inhabitants are too close to the issues to be able to make a dispassionate assessment about what’s happening – or what to do about it. They are right that their technological advancement has bumped up the CO2 levels from 280ppm to over 380ppm within only 150 years – and to a level much higher than anything detectable in their ice core data, which goes back half a million years. But that’s about where the definitive data ends.

Credit: Rahmstorf. NASA data is from the GISS Surface Temperature Analysis. Hadley Centre data is from the Met Office Hadley Centre, UK.

Advocates for change draw graphs showing temperatures are rising, while conservatives argue this is just cherry-picking data from narrow time periods. After all, a brief rise might be lost in the background noise of a longer monitoring period – and just how reliable is 150 year old data anyway? Other more pragmatic individuals point to the benefits gained from their advanced technology, noting that you have to break a few eggs to make an omelet (or at least the equivalent alien cuisine).

Back on Earth our future selves smile wryly, having seen it all before. As well as interstellar probes and cloaking devices, we have developed a reliable form of Asimovian psychohistory. With this, it’s easy enough to calculate that the statistical probability of a global population adopting a coordinated risk management strategy in the absence of definitive, face-slapping evidence of an approaching calamity is exactly (datum removed to prevent corrupting the timeline).

Nailing Down Goldilocks: What’s “Just Right” for Exo-Earths?

Crescent Earth

For Goldilocks, the porridge had to be not too hot, and not too cold … the right temperature was all she needed.

For an Earth-like planet to harbor life, or at least multicellular life, temperature is certainly important, but what else matters? And what makes the temperature of an exo-Earth “just right”?

Some recent studies have concluded that answering these questions can be surprisingly difficult, and that some of the answers are surprisingly curious.

Consider the tilt of an exo-Earth’s axis, its obliquity.

In the “Rare Earth” hypothesis, this is a Goldilocks criterion; unless the tilt is kept stable (by a moon like our Moon), and at a “just right” angle, the climates will swing too wildly for multicellular life to form: too many snowball Earths (the whole globe covered in snow and ice with an enhanced albedo effect), or too much risk of a runaway greenhouse.

“We find that planets with small ocean fractions or polar continents can experience very severe seasonal climatic variations,” Columbia University’s David Spiegel writes*, summing up the results of an extensive series of models investigating the effects of obliquity, land/ocean coverage, and rotation on Earth-like planets, “but that these planets also might maintain seasonally and regionally habitable conditions over a larger range of orbital radii than more Earth-like planets.” And the real surprise? “Our results provide indications that the modeled climates are somewhat less prone to dynamical snowball transitions at high obliquity.” In other words, an exo-Earth tilted nearly right over (much like Uranus) may be less likely to suffer snowball Earth events than our own Goldilocks Earth!

Ultraviolet view of the Sun. Image credit: SOHO

Consider ultra-violet radiation.

“Ultraviolet radiation is a double-edged sword to life. If it is too strong, the terrestrial biological systems will be damaged. And if it is too weak, the synthesis of many biochemical compounds cannot go along,” says Jianpo Guo of China’s Yunnan Observatory.** “For the host stars with effective temperatures lower than 4,600 K, the ultraviolet habitable zones are closer than the habitable zones. For the host stars with effective temperatures higher than 7,137 K, the ultraviolet habitable zones are farther than the habitable zones.” This result doesn’t change what we already knew about habitability zones around main sequence stars, but it effectively rules out the possibility of life on planets around post-red giant stars (assuming any could survive their homesun going red giant!).

(Credit: NASA)

Consider the effects of clouds.

Calculations of the habitability zones – the radii of the orbits of an exo-Earth, around its homesun – for main sequence stars usually assume an astronomers’ heaven – permanent clear skies (i.e. no clouds). But Earth has clouds, and clouds most definitely have an effect on average global temperatures! “The albedo effect is only weakly dependent on the incident stellar spectra because the optical properties (especially the scattering albedo) remain almost constant in the wavelength range of the maximum of the incident stellar radiation,” a German team’s recent study*** on the effects of clouds on habitability concludes (they looked at main sequence homesuns of spectral classes F, G, K, and M). This sounds like Gaia is Goldilocks’ friend; however, “The greenhouse effect of the high-level cloud on the other hand depends on the temperatures of the lower atmosphere, which in turn are an indirect consequence of the different types of central stars,” the team concludes (remember that an exo-Earth’s global temperature depends upon both the albedo and greenhouse effects). So, the take-home message? “Planets with Earth-like clouds in their atmospheres can be located closer to the central star or farther away compared to planets with clear sky atmospheres. The change in distance depends on the type of cloud. In general, low-level clouds result in a decrease of distance because of their albedo effect, while the high-level clouds lead to an increase in distance.”

“Just right” is tricky to pin down.

* lead author; Princeton University’s Kristen Menou and Columbia University’s Caleb Scharf are the co-authors (“Habitable Climates: The Influence of Obliquity”, The Astrophysical Journal, Volume 691, Issue 1, pp. 596–610 (2009); arXiv:0807.4180 is the preprint)
** lead author; Fenghui Zhang, Xianfei Zhang, and Zhanwen Han, all also at the Yunnan Observatory, are the co-authors (“Habitable zones and UV habitable zones around host stars”, Astrophysics and Space Science, Volume 325, Number 1, pp. 25-30 (2010))
*** “Clouds in the atmospheres of extrasolar planets. I. Climatic effects of multi-layered clouds for Earth-like planets and implications for habitable zones”, Kitzmann et al., accepted for publication in Astronomy & Astrophysics (2010); arXiv:1002.2927 is the preprint.

Ozone on Mars: Two Windows Better Than One

Understanding the present-day Martian climate gives us insights into its past climate, which in turn provides a science-based context for answering questions about the possibility of life on ancient Mars.

Our understanding of Mars’ climate today is neatly packaged as climate models, which in turn provide powerful consistency checks – and sources of inspiration – for the climate models which describe anthropogenic global warming here on Earth.

But how can we work out what the climate on Mars is, today? A new, coordinated observation campaign to measure ozone in the Martian atmosphere gives us, the interested public, our own window into just how painstaking – yet exciting – the scientific grunt work can be.

The Martian atmosphere has played a key role in shaping the planet’s history and surface. Observations of the key atmospheric components are essential for the development of accurate models of the Martian climate. These in turn are needed to better understand if climate conditions in the past may have supported liquid water, and for optimizing the design of future surface-based assets at Mars.

Ozone is an important tracer of photochemical processes in the atmosphere of Mars. Its abundance, which can be derived from the molecule’s characteristic absorption spectroscopy features in spectra of the atmosphere, is intricately linked to that of other constituents and it is an important indicator of atmospheric chemistry. To test predictions by current models of photochemical processes and general atmospheric circulation patterns, observations of spatial and temporal ozone variations are required.

The Spectroscopy for Investigation of Characteristics of the Atmosphere of Mars (SPICAM) instrument on Mars Express has been measuring ozone abundances in the Martian atmosphere since 2003, gradually building up a global picture as the spacecraft orbits the planet.

These measurements can be complemented by ground-based observations taken at different times and probing different sites on Mars, thereby extending the spatial and temporal coverage of the SPICAM measurements. To quantitatively link the ground-based observations with those by Mars Express, coordinated campaigns are set up to obtain simultaneous measurements.

Infrared heterodyne spectroscopy, such as that provided by the Heterodyne Instrument for Planetary Wind and Composition (HIPWAC), provides the only direct access to ozone on Mars with ground-based telescopes; the very high spectral resolving power (greater than 1 million) allows Martian ozone spectral features to be resolved when they are Doppler shifted away from ozone lines of terrestrial origin.
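
The quoted resolving power translates directly into a velocity: an instrument with resolving power R separates spectral features shifted by at least c/R, so R > 10^6 resolves shifts above roughly 300 m/s, comfortably below typical Earth–Mars line-of-sight velocities. A sketch, with an illustrative (assumed, not quoted) relative velocity of 15 km/s:

```python
C = 299_792_458.0   # speed of light, m/s
R = 1.0e6           # resolving power (greater than 1 million, per the text)
lambda_um = 9.7     # mid-infrared ozone band observed by HIPWAC, in microns

# Smallest velocity shift the instrument can separate:
v_min = C / R  # ≈ 300 m/s

# Doppler shift of the 9.7 µm ozone line at an assumed Earth–Mars
# relative velocity of 15 km/s (illustrative value only):
v_rel = 15_000.0
delta_lambda_um = lambda_um * v_rel / C  # ≈ 4.9e-4 µm, well above λ/R
```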

A coordinated campaign to measure ozone in the atmosphere of Mars, using SPICAM and HIPWAC, has been ongoing since 2006. The most recent element of this campaign was a series of ground-based observations using HIPWAC on the NASA Infrared Telescope Facility (IRTF) on Mauna Kea in Hawai’i. These were obtained between 8 and 11 December 2009 by a team of astronomers led by Kelly Fast from the Planetary Systems Laboratory, at NASA’s Goddard Space Flight Center (GSFC), in the USA.

Credit: Kelly Fast

About the image: HIPWAC spectrum of Mars’ atmosphere over a location on Martian latitude 40°N; acquired on 11 December 2009 during an observation campaign with the IRTF 3 m telescope in Hawai’i. This unprocessed spectrum displays features of ozone and carbon dioxide from Mars, as well as ozone in the Earth’s atmosphere through which the observation was made. Processing techniques will model and remove the terrestrial contribution from the spectrum and determine the amount of ozone at this northern position on Mars.

The observations had been coordinated in advance with the Mars Express science operations team, to ensure overlap with ozone measurements made in this same period with SPICAM.

The main goal of the December 2009 campaign was to confirm that observations made with SPICAM (which measures the broad ozone absorption spectral feature centered at around 250 nm) and HIPWAC (which detects and measures ozone absorption features at 9.7 μm) retrieve the same total ozone abundances, despite being performed at two different parts of the electromagnetic spectrum and having different sensitivities to the ozone profile. A similar campaign in 2008 had largely validated the consistency of the ozone measurement results obtained with SPICAM and the HIPWAC instrument.

The weather conditions and the seeing were very good at the IRTF site during the December 2009 campaign, which allowed for good quality spectra to be obtained with the HIPWAC instrument.

Kelly and her colleagues gathered ozone measurements for a number of locations on Mars, both in the planet’s northern and southern hemisphere. During this four-day campaign the SPICAM observations were limited to the northern hemisphere. Several HIPWAC measurements were simultaneous with observations by SPICAM allowing a direct comparison. Other HIPWAC measurements were made close in time to SPICAM orbital passes that occurred outside of the ground-based telescope observations and will also be used for comparison.

The team also performed measurements of the ozone abundance over the Syrtis Major region, which will help to constrain photochemical models in this region.

Analysis of the data from this recent campaign is ongoing, with another follow-up campaign of coordinated HIPWAC and SPICAM observations already scheduled for March this year.

Putting the compatibility of the data from these two instruments on a firm base will support combining the ground-based infrared measurements with the SPICAM ultraviolet measurements in testing the photochemical models of the Martian atmosphere. The extended coverage obtained by combining these datasets helps to more accurately test predictions by atmospheric models.

It will also quantitatively link the SPICAM observations to longer-term measurements made with the HIPWAC instrument and its predecessor IRHS (the Infrared Heterodyne Spectrometer) that go back to 1988. This will support the study of the long-term behavior of ozone and associated chemistry in the atmosphere of Mars on a timescale longer than the current missions to Mars.

Sources: ESA, a paper published in the 15 September 2009 issue of Icarus

Scientist Discusses Latest Report of Rising Global Temperatures


A new NASA report says the past decade was the warmest on Earth since modern temperature measurements began in 1880. The study analyzed global surface temperatures and also found that 2009 was the second-warmest year on record. Last year was only a small fraction of a degree cooler than 2005, the warmest year yet, putting 2009 in a virtual tie with the other hottest years, which have all occurred since 1998. This annual surface temperature study is one that always generates considerable interest — and some controversy. Gavin Schmidt, a climatologist at NASA’s Goddard Institute for Space Studies (GISS), offered some context on this latest report in an interview with the NASA Earth Science News Team.

NASA’s Earth Science News Team: Every year, some of the same questions come up about the temperature record. What are they?

Gavin Schmidt: First, do the annual rankings mean anything? Second, how should we interpret all of the changes from year to year — or inter-annual variability — the ups and downs that occur in the record over short time periods? Third, why does NASA GISS get a slightly different answer than the Met Office Hadley Centre does? Fourth, is GISS somehow cooking the books in its handling and analysis of the data?

NASA: 2009 just came in as tied as the 2nd warmest on record, which seems notable. What is the significance of the yearly temperature rankings?

The map shows temperature changes for the last decade—January 2000 to December 2009—relative to the 1951-1980 mean. Credit: NASA

Gavin Schmidt: In fact, for any individual year, the ranking isn’t particularly meaningful. The difference between the second warmest and sixth warmest years, for example, is trivial. The media is always interested in the annual rankings, but whether it’s 2003, 2007, or 2009 that’s second warmest doesn’t really mean much because the difference between the years is so small. The rankings are more meaningful as you look at longer averages and decade-long trends.

NASA: Why does GISS get a different answer than the Met Office Hadley Centre [a UK climate research group that works jointly with the Climatic Research Unit at the University of East Anglia to perform an analysis of global temperatures]?

Gavin Schmidt: It’s mainly related to the way the weather station data is extrapolated. The Hadley Centre uses basically the same data sets as GISS, but it doesn’t fill in large areas of the Arctic and Antarctic regions where fixed monitoring stations don’t exist. Instead of leaving those areas out of the analysis, we use numbers from the nearest available stations, as long as they are within 1,200 kilometers. Overall, this gives the GISS product more complete coverage of the polar areas.

NASA: Some might hear the word “extrapolate” and conclude that you’re “making up” data. How would you reply to such criticism?

Gavin Schmidt: The assumption is simply that the Arctic Ocean as a whole is warming at the average of the stations around it. What people forget is that if you don’t put any values in for the areas where stations are sparse, then when you go to calculate the global mean, you’re actually assuming that the Arctic is warming at the same rate as the global mean. So, either way you are making an assumption.

Which one of those is the better assumption? Given all the changes we’ve observed in the Arctic sea ice with satellites, we believe it’s better to assume the Arctic Ocean is changing at the same rate as the other stations around the Arctic. That’s given GISS a slightly larger warming, particularly in the last couple of years, relative to the Hadley Centre.
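The infilling Schmidt describes can be sketched in a few lines. This is an illustration of the idea, not the actual GISTEMP code, and the station coordinates and anomalies below are hypothetical: a grid cell takes the anomaly of the nearest station within 1,200 km, or stays empty (the Hadley approach) if none qualifies.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def fill_cell(cell_lat, cell_lon, stations, max_km=1200.0):
    """Assign a grid cell the anomaly of the nearest station within
    max_km; return None (cell left empty) if no station qualifies."""
    best = None
    for lat, lon, anomaly in stations:
        d = haversine_km(cell_lat, cell_lon, lat, lon)
        if d <= max_km and (best is None or d < best[0]):
            best = (d, anomaly)
    return None if best is None else best[1]

# Hypothetical high-latitude stations: (lat, lon, anomaly in °C)
stations = [(71.3, -156.8, 2.1), (78.2, 15.6, 1.4)]
print(fill_cell(80.0, -150.0, stations))  # nearest station in range -> 2.1
print(fill_cell(60.0, 100.0, stations))   # no station within 1,200 km -> None
```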

NASA: Many have noted that the winter has been particularly cold and snowy in some parts of the United States and elsewhere. Does this mean that climate change isn’t happening?

Gavin Schmidt: No, it doesn’t, though you can’t dismiss people’s concerns and questions about the fact that local temperatures have been cool. Just remember that there’s always going to be variability. That’s weather. As a result, some areas will still have occasionally cool temperatures — even record-breaking cool — as average temperatures are expected to continue to rise globally.

NASA: So what’s happening in the United States may be quite different than what’s happening in other areas of the world?

Gavin Schmidt: Yes, especially for short time periods. Keep in mind that the contiguous United States represents just 1.5 percent of Earth’s surface.

NASA: GISS has been accused by critics of manipulating data. Has this changed the way that GISS handles its temperature data?

Gavin Schmidt: Indeed, there are people who believe that GISS uses its own private data or somehow massages the data to get the answer we want. That’s completely inaccurate. We do an analysis of the publicly available data that is collected by other groups. All of the data is available to the public for download, as are the computer programs used to analyze it. One of the reasons the GISS numbers are used and quoted so widely by scientists is that the process is completely open to outside scrutiny.

NASA: What about the meteorological stations? There have been suggestions that some of the stations are located in the wrong place, are using outdated instrumentation, etc.

Gavin Schmidt: Global weather services gather far more data than we need. To get the structure of the monthly or yearly anomalies over the United States, for example, you’d just need a handful of stations, but there are actually some 1,100 of them. You could throw out 50 percent of the station data or more, and you’d get basically the same answers. Individual stations do get old and break down, since they’re exposed to the elements, but this is just one of the things that NOAA has to deal with. One recent innovation is the setup of a climate reference network alongside the current stations so that they can look for potentially serious issues at the large scale – and they haven’t found any yet.

Sources: NASA, NASA Earth Observatory

Be A Carbon Hero


NASA is quite proud of its spinoffs: technology developed for the space agency’s needs in space that in turn contributes to commercial innovations that improve life here on Earth. And rightly so. Just as a quick example, improvements in spacesuits have led to better protection for firefighters, scuba divers, and people working in cold weather. And the list of NASA spinoffs is quite extensive.

Just like NASA, the European Space Agency (ESA) has a Technology Transfer office to help inventors and businesses use space technology for non-space applications. The latest invention touted as an ESA spinoff is a small hand-held device called a Carbon Hero that might help make people more aware of the carbon footprint they are leaving behind due to vehicle emissions.

Used in conjunction with a cell phone, the Carbon Hero receives data from navigation satellites to determine the mode of transportation being used. The device’s algorithm is able to use the speed and position of the user to determine how they are traveling, and how much CO2 they are generating. The user doesn’t have to enter any information; the data is computed automatically.
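
The kind of inference described might look like the sketch below: classify the transport mode from average speed, then apply a per-kilometre CO2 factor. The thresholds and emission factors here are invented for illustration; they are not Carbon Hero’s actual values or algorithm.

```python
def classify_mode(avg_speed_kmh):
    """Guess the transport mode from average speed (illustrative thresholds)."""
    if avg_speed_kmh < 7:
        return "walking"
    if avg_speed_kmh < 25:
        return "cycling"
    if avg_speed_kmh < 130:
        return "car"
    return "plane"

# Grams of CO2 per kilometre, per mode (hypothetical round numbers):
CO2_G_PER_KM = {"walking": 0, "cycling": 0, "car": 180, "plane": 250}

def trip_emissions_g(avg_speed_kmh, distance_km):
    """Estimated CO2 for one trip, in grams."""
    return CO2_G_PER_KM[classify_mode(avg_speed_kmh)] * distance_km

print(classify_mode(5))            # walking
print(trip_emissions_g(60, 12.5))  # 2250.0 grams for a 12.5 km drive
```

Walking the same 12.5 km trip would score zero, which is exactly the "weighing scale" comparison Zachariah describes.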

The user would get feedback on the environmental impact of different types of transportation – whether by train, plane, bike or by foot. The Carbon Hero lets the user compare one kind of travel with another and calculate the environmental benefits daily, weekly and monthly.

“If you go on a diet you want to see if all that effort has made a difference so you weigh yourself. The beauty of our system is that it’s easy; you have a “weighing scale” on you all the time giving you your carbon footprint. When you make the effort to walk instead of taking the car you can immediately see the result, so it feels more worthwhile doing it and you are more likely to stick with it,” says Andreas Zachariah, a graduate student from the Royal College of Art in London and inventor of Carbon Hero.

The device has been tested using the GPS system, but it will be fully operational once Galileo, the European global navigation system, is fully up and running.

Learn more about ESA’s Technology Transfer Programme Office.

Learn more about NASA Spinoffs.

Original News Source: ESA Press Release

How Deforestation in Brazil is Affecting Local Climate

Image credit: NASA
NASA satellite data are giving scientists insight into how large-scale deforestation in the Amazon Basin in South America is affecting regional climate. Researchers found that during the Amazon dry season last August there was a distinct pattern of higher rainfall and warmer temperatures over deforested regions.

Researchers analyzed multiple years of data from NASA’s Tropical Rainfall Measuring Mission (TRMM). They also used data from the Department of Defense Special Sensor Microwave Imager and the National Oceanic and Atmospheric Administration’s Geostationary Operational Environmental Satellites.

The study appeared in a recent issue of the American Meteorological Society’s Journal of Climate. Lead authors, Andrew Negri and Robert Adler, are research meteorologists at NASA’s Goddard Space Flight Center (GSFC), Greenbelt, Md. Other authors include Liming Xu, formerly of the University of Arizona, Tucson, and Jason Surratt, North Carolina State University, Raleigh.

“In deforested areas, the land heats up faster and reaches a higher temperature, leading to localized upward motions that enhance the formation of clouds and ultimately produce more rainfall,” Negri said.

The researchers caution that the rainfall increases were most pronounced in August, during the transition from dry to wet seasons. In this transition period, the effects of land cover, such as evaporation, are not overwhelmed by the large-scale weather disturbances that are common during the rest of the year. While the study, based on satellite data analysis, focused on climate changes in the deforested areas, large increases in cloud cover and rainfall were also observed in the naturally un-forested savanna region and around the urban area of Porto Velho, Brazil, particularly in August and September.

Recent studies by Dr. Marshall Shepherd, also a research meteorologist at GSFC, reported similar findings, including an average rain-rate increase of 28 percent downwind of urban areas and associated changes in the daily timing of cloud formation and precipitation.

This research confirmed the Amazon savanna region experienced a shift in the onset of cloudiness and rainfall toward the morning hours. The shift was likely initiated by the contrast in surface heating across the deforested and savanna region.

The varied heights of plants and trees in the region change the aerodynamics of the atmosphere, creating more circulation and rising air. When the rising air reaches the dew point in the cooler, upper atmosphere, it condenses into water droplets and forms clouds.

Negri acknowledged other factors are involved. The savanna in this study is approximately 100 kilometers (62 miles) wide, the perfect size to influence precipitation, such as rain showers and thunderstorms. Earlier studies hypothesized certain land surfaces, such as bands of vegetation 50 to 100 kilometers (31-62 miles) wide in semiarid regions, could result in enhanced precipitation.

This research is in agreement with the recent and sophisticated computer models developed by the Massachusetts Institute of Technology. The models concluded small-scale circulations, including the mixing and rising of air induced by local land surfaces, could enhance cloudiness and rainfall. Many earlier studies that relied on models developed in the 1990s or earlier concluded widespread deforestation of the Amazon Basin would lead to decreased rainfall.

“The effects here are rather subtle and appear to be limited to the dry season. The overall effect of this deforestation on annual and daily rainfall cycles is probably small and requires more study,” Negri said. Future research will use numerical models for investigating the linkage between deforested land surface and the cloud-precipitation components of the water cycle.

NASA’s Earth Science Enterprise is dedicated to understanding the Earth as an integrated system and applying Earth System Science to improve prediction of climate, weather, and natural hazards using the unique vantage point of space.

Original Source: NASA News Release

More Details on Water Vapour Feedback

Image credit: NASA
A NASA-funded study found some climate models might be overestimating the amount of water vapor entering the atmosphere as the Earth warms. Since water vapor is the most important heat-trapping greenhouse gas in our atmosphere, some climate forecasts may be overestimating future temperature increases.

In response to human emissions of greenhouse gases, like carbon dioxide, the Earth warms, more water evaporates from the ocean, and the amount of water vapor in the atmosphere increases. Since water vapor is also a greenhouse gas, this leads to a further increase in the surface temperature. This effect is known as “positive water vapor feedback.” Its existence and size have been contentiously argued for several years.

Ken Minschwaner, a physicist at the New Mexico Institute of Mining and Technology, Socorro, N.M., and Andrew Dessler, a researcher with the University of Maryland, College Park, and NASA’s Goddard Space Flight Center, Greenbelt, Md., conducted the study. It appears in the March 15 issue of the American Meteorological Society’s Journal of Climate. The researchers used data on water vapor in the upper troposphere (10–14 km, or 6–9 miles, altitude) from NASA’s Upper Atmosphere Research Satellite (UARS).

Their work verified water vapor is increasing in the atmosphere as the surface warms. They found the increases in water vapor were not as high as many climate-forecasting computer models have assumed. “Our study confirms the existence of a positive water vapor feedback in the atmosphere, but it may be weaker than we expected,” Minschwaner said.

“One of the responsibilities of science is making good predictions of the future climate, because that’s what policy makers use to make their decisions,” Dessler said. “This study is another incremental step toward improving those climate predictions,” he added.

According to Dessler, the size of the positive water vapor feedback is a key debate within climate science circles. Some climate scientists have claimed atmospheric water vapor will not increase in response to global warming, and may even decrease. General circulation models, the primary tool scientists use to predict the future of our climate, forecast the atmosphere will experience a significant increase in water vapor.

NASA’s UARS satellite was used to measure water vapor on a global scale and with unprecedented accuracy in the upper troposphere. Humidity levels in this part of the atmosphere, especially in the tropics, are important for global climate, because this is where the water vapor has the strongest impact as a greenhouse gas.

UARS recorded both specific and relative humidity in the upper troposphere. Specific humidity is the actual amount of water vapor in the air. Relative humidity is the amount of water vapor in the air divided by the maximum amount the air can hold at a given temperature, known as the saturation point. As air temperatures rise, air can hold more water, so the saturation point also increases.
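The relationship between these two humidity measures can be sketched numerically. The example below is not from the study; it uses the common Magnus/Bolton approximation for saturation vapor pressure, and the sample vapor pressure values are illustrative. It shows why the same amount of water vapor means a lower relative humidity in warmer air:

```python
import math

def saturation_vapor_pressure(temp_c: float) -> float:
    """Saturation vapor pressure in hPa (Magnus/Bolton approximation)."""
    return 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))

def relative_humidity(vapor_pressure_hpa: float, temp_c: float) -> float:
    """Relative humidity (%): actual vapor pressure over saturation pressure."""
    return 100.0 * vapor_pressure_hpa / saturation_vapor_pressure(temp_c)

# A fixed amount of water vapor (here, 15 hPa of vapor pressure) yields a
# lower relative humidity in warmer air, because the saturation point rises.
e = 15.0  # actual vapor pressure, hPa (illustrative value)
print(relative_humidity(e, 20.0))  # ~64% at 20 C
print(relative_humidity(e, 25.0))  # ~47% at 25 C
```

Minschwaner and Dessler’s point, in these terms, is that the observed rise in vapor with warming was too small to hold relative humidity constant, as many models assume.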

In most computer models, relative humidity tends to remain fixed at current levels. Models that include water vapor feedback with constant relative humidity predict the Earth’s surface will warm nearly twice as much over the next 100 years as models that contain no water vapor feedback.

Using the UARS data to actually quantify both specific humidity and relative humidity, the researchers found, while water vapor does increase with temperature in the upper troposphere, the feedback effect is not as strong as models have predicted. “The increases in water vapor with warmer temperatures are not large enough to maintain a constant relative humidity,” Minschwaner said. These new findings will be useful for testing and improving global climate models.

NASA plans to launch the Aura satellite in June 2004. Along with the Terra and Aqua satellites already in operation, Aura will monitor changes in Earth’s atmosphere.

Original Source: NASA News Release

The Origins of Oxygen on Earth

Image credit: NASA
Christopher Chyba is the principal investigator for The SETI Institute lead team of the NASA Astrobiology Institute. Chyba formerly headed the SETI Institute’s Center for the Study of Life in the Universe. His NAI team is pursuing a wide range of research activities, looking at both life’s beginnings on Earth and the possibility of life on other worlds. Astrobiology Magazine’s managing editor, Henry Bortman, spoke recently with Chyba about several of his team’s projects that will explore the origin and significance of oxygen in Earth’s atmosphere.

Astrobiology Magazine: Many of the projects that members of your team will be working on have to do with oxygen in Earth’s atmosphere. Today oxygen is a significant component of the air we breathe. But on early Earth, there was very little oxygen in the atmosphere. There is a great deal of debate about just how and when the planet’s atmosphere became oxygenated. Can you explain how your team’s research will approach this question?

Christopher Chyba: The usual story, with which you’re probably familiar, is that after oxygenic photosynthesis evolved, there was then a huge biological source of oxygen on early Earth. That’s the usual view. It may be right, but in these kinds of arguments the question is usually not whether one effect is right or wrong; probably many effects were active. It’s a question of which effect was dominant, or whether several effects were of comparable importance.

SETI Institute researcher Friedemann Freund has a completely non-biological hypothesis about the rise of oxygen, which has some experimental support from laboratory work that he’s done. The hypothesis is that, when rocks solidify from magma, they incorporate small amounts of water. Cooling and subsequent reactions lead to the production of peroxy links (consisting of oxygen and silicon atoms) and molecular hydrogen in the rocks.

Then, when the igneous rock is subsequently weathered, the peroxy links produce hydrogen peroxide, which decomposes into water and oxygen. So, if this is right, simply weathering igneous rocks is going to be a source of free oxygen to the atmosphere. And if you look at the quantities of oxygen that Friedemann is able to release from rocks in well-controlled situations in his initial experiments, it might be that this was a substantial and significant source of oxygen on early Earth.

So even apart from photosynthesis, there might be a kind of natural source of oxygen on any Earth-like world that had igneous activity and liquid water available. This would suggest that the oxidation of the surface might be something that you expect to occur, whether photosynthesis happens early or late. (Of course, the timing of this depends on oxygen sinks as well.) I emphasize that’s all a hypothesis at this point, one that calls for much more careful investigation. Friedemann’s done only pilot experiments so far.

One of the interesting things about Friedemann’s idea is that it suggests there might be an important source of oxygen on planets completely independent of biological evolution. So there might be a natural driver towards the oxidation of the surface of a world, with all the ensuing consequences for evolution. Or maybe not. The point is to do the work and find out.

Another component of his work, which Friedemann will do with the microbiologist Lynn Rothschild of NASA Ames Research Center, asks whether environments associated with weathered igneous rocks and the production of oxygen could have created micro-environments that allowed certain microorganisms living there to pre-adapt to an oxygen-rich atmosphere. They’ll be doing work with microorganisms to try to address that question.

AM: Emma Banks will be looking at chemical interactions in the atmosphere of Saturn’s moon Titan. How does that tie into understanding oxygen on early Earth?

CC: Emma’s looking at another abiotic way that might be important in oxidizing a world’s surface. Emma does chemical computational models, all the way down to the quantum mechanical level. She does them in a number of contexts, but what’s relevant to this proposal has to do with haze formation.

On Titan – and possibly on the early Earth as well, depending on your model for the atmosphere of the early Earth – there’s a polymerization of methane [the combination of methane molecules into larger hydrocarbon-chain molecules] in the upper atmosphere. Titan’s atmosphere is several percent methane; almost all the rest of it is molecular nitrogen. It’s bombarded with ultraviolet light from the sun. It’s also bombarded with charged particles from Saturn’s magnetosphere. The effect of that, acting on the methane, CH4, is to break the methane up and polymerize it into longer-chain hydrocarbons.

If you start polymerizing methane into longer and longer carbon chains, each time you add another carbon onto the chain, you’ve got to get rid of some hydrogen. For example, to go from CH4 (methane) to C2H6 (ethane), you have to get rid of two hydrogens. Hydrogen is an extremely light atom. Even if it makes H2, that’s an extremely light molecule, and that molecule’s lost off the top of Titan’s atmosphere, just as it’s lost off the top of the Earth’s atmosphere. If you bleed hydrogen off the top of your atmosphere, the net effect is to oxidize the surface. So it’s another way that gives you a net oxidation of a world’s surface.
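The hydrogen bookkeeping Chyba describes can be checked with simple stoichiometry: joining two methane molecules into one ethane molecule must shed exactly one H2. A minimal atom-counting sketch (the helper function and molecule dictionaries here are illustrative, not from the article):

```python
from collections import Counter

def atoms(species):
    """Total atom counts for a list of (molecule, multiplicity) pairs."""
    total = Counter()
    for molecule, n in species:
        for element, count in molecule.items():
            total[element] += count * n
    return total

CH4 = {"C": 1, "H": 4}   # methane
C2H6 = {"C": 2, "H": 6}  # ethane
H2 = {"H": 2}            # molecular hydrogen

# 2 CH4 -> C2H6 + H2: both sides carry 2 carbons and 8 hydrogens,
# so each chain-lengthening step releases one light H2 molecule,
# which can escape off the top of the atmosphere.
left = atoms([(CH4, 2)])
right = atoms([(C2H6, 1), (H2, 1)])
assert left == right
```

The same balance holds at every step up the hydrocarbon chain, which is why sustained polymerization implies a steady hydrogen leak and a net oxidation of the surface.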

Emma’s interested in this primarily with respect to what takes place on Titan. But it’s also potentially relevant as a kind of global oxidizing mechanism for the early Earth. And, bringing nitrogen into the picture, she’s interested in the potential production of amino acids out of these conditions.

AM: One of the mysteries about early life on Earth is how it survived the damaging effects of ultraviolet (UV) radiation before there was enough oxygen in the atmosphere to provide an ozone shield. Janice Bishop, Nathalie Cabrol and Edmond Grin, all of whom are with the SETI Institute, are exploring some of the strategies early life may have used to cope.

CC: And there are a lot of potential strategies there. One is just being deep enough below the surface, whether you’re talking about the land or the sea, to be completely shielded. Another is to be shielded by minerals within the water itself. Janice and Lynn Rothschild are working on a project that is examining the role of ferric oxide minerals in water as a kind of UV shield.

In the absence of oxygen, the iron in water would be present in its dissolved, ferrous form. (When you have more oxygen, the iron oxidizes further; it becomes ferric and drops out as ferric oxide.) This iron could potentially have played the role of an ultraviolet shield in the early oceans, or in early ponds or lakes. To investigate how good it is as a potential UV shield, there are some measurements you might want to make, including measurements in natural environments, such as in Yellowstone. And once again there’s a microbiological component to the work, with Lynn’s involvement.

This is related to the project that Nathalie Cabrol and Edmond Grin are pursuing, from a different perspective. Nathalie and Edmond are very interested in Mars. They are both on the Mars Exploration Rover science team. In addition to their Mars work, Nathalie and Edmond explore environments on Earth as Mars analog sites. One of their topics of investigation is strategies for survival in high-UV environments. There’s a lake six kilometers high on Licancabur (a dormant volcano in the Andes). We now know there’s microscopic life in that lake, and we’d like to know its strategies for surviving the high-UV environment there. That’s a different, very empirical way of getting at this question of how life survived the high-UV environment that existed on early Earth.

These four projects are all coupled, because they have to do with the rise of oxygen on early Earth, how organisms survived before there was substantial oxygen in the atmosphere, and then, how all this relates to Mars.

Original Source: Astrobiology Magazine

Early Oceans Might Have Had Little Oxygen

Image credit: NASA
As two rovers scour Mars for signs of water and the precursors of life, geochemists have uncovered evidence that Earth’s ancient oceans were much different from today’s. The research, published in this week’s issue of the journal Science, cites new data that shows that Earth’s life-giving oceans contained less oxygen than today’s and could have been nearly devoid of oxygen for a billion years longer than previously thought. These findings may help explain why complex life barely evolved for billions of years after it arose.

The scientists, funded by the National Science Foundation (NSF) and affiliated with the University of Rochester, have pioneered a new method that reveals how ocean oxygen might have changed globally. Most geologists agree there was virtually no oxygen dissolved in the oceans until about 2 billion years ago, and that they were oxygen-rich during most of the last half-billion years. But there has always been a mystery about the period in between.

Geochemists had developed ways to detect signs of ancient oxygen in particular areas, but not in the Earth’s oceans as a whole. The team’s method, however, can be extrapolated to characterize the nature of the world’s oceans globally.

“This is the best direct evidence that the global oceans had less oxygen during that time,” says Gail Arnold, a doctoral student of earth and environmental sciences at the University of Rochester and lead author of the research paper.

Adds Enriqueta Barrera, program director in NSF’s division of earth sciences, “This study is based on a new approach, the application of molybdenum isotopes, which allows scientists to ascertain global perturbations in ocean environments. These isotopes open a new door to exploring anoxic ocean conditions at times across the geologic record.”

Arnold examined rocks from northern Australia that lay on the ocean floor over a billion years ago, using the new method she had developed with co-authors Jane Barling and Ariel Anbar. Previous researchers had drilled down several meters into the rock and tested its chemical composition, confirming it had kept original information about the oceans safely preserved. The team members brought those rocks back to their labs, where they used newly developed technology, a multiple-collector inductively coupled plasma mass spectrometer, to examine the molybdenum isotopes within the rocks.

The element molybdenum enters the oceans through river runoff, dissolves in seawater, and can stay dissolved for hundreds of thousands of years. By staying in solution so long, molybdenum mixes well throughout the oceans, making it an excellent global indicator. It is then removed from the oceans into two kinds of sediments on the seafloor: those that lie beneath oxygen-rich waters and those that lie beneath oxygen-poor waters.

Working with coauthor Timothy Lyons of the University of Missouri, the Rochester team examined samples from the modern seafloor, including the rare locations that are oxygen-poor today. They learned that the chemical behavior of molybdenum’s isotopes in sediments is different depending on the amount of oxygen in the overlying waters. As a result, the chemistry of molybdenum isotopes in the global oceans depends on how much seawater is oxygen-poor. They also found that the molybdenum in certain kinds of rocks records this information about ancient oceans. Compared to modern samples, measurements of the molybdenum chemistry in the rocks from Australia point to oceans with much less oxygen.

How much less oxygen is the question. A world full of anoxic oceans could have serious consequences for evolution. Eukaryotes, the kind of cells that make up all organisms except bacteria, appear in the geologic record as early as 2.7 billion years ago. But eukaryotes with many cells (the ancestors of plants and animals) did not appear until a half billion years ago, about the time the oceans became rich in oxygen. With paleontologist Andrew Knoll of Harvard University, Anbar previously advanced the hypothesis that an extended period of anoxic oceans may be the key to why the more complex eukaryotes barely eked out a living while their prolific bacterial cousins thrived. Arnold’s study is an important step in testing this hypothesis.

“It’s remarkable that we know so little about the history of our own planet’s oceans,” says Anbar. “Whether or not there was oxygen in the oceans is a straightforward chemical question that you’d think would be easy to answer. It shows just how hard it is to tease information from the rock record and how much more there is for us to learn about our origins.”

Figuring out just how much less oxygen was in the oceans in the ancient past is the next step. The scientists plan to continue studying molybdenum chemistry to answer that question, with continuing support from NSF and NASA, the agencies that supported the initial work. The information will not only shed light on our own evolution, but may help us understand the conditions we should look for as we search for life beyond Earth.

Original Source: NSF News Release