Could Orion be Downgraded from a Six- to a Four-Astronaut Vehicle?

A cutaway graphic of the Orion Crew Module with six seats (NASA)


To save on weight, NASA engineers are considering the option to remove two seats from the Orion crew exploration vehicle. According to the manager of the Constellation Program, a possible redesign option has been discussed with the International Space Station (ISS) partners despite the fact that the initial operational capability (IOC) to deliver crew to the ISS calls for a six-seat version. Although the space station crew will have expanded to six by the end of next month, NASA is confident the loss of two seats on Orion won’t cause any operational problems… at least we’ll still have Soyuz.

Today, the Orlando Sentinel reported that the Constellation Program, due to budget problems, probably won’t be ready for a return trip to the Moon until 2020, two years later than officially planned (NASA hoped for a 2018 mission). Now NASA engineers are concerned that the lunar mission may slip even further behind schedule.

To compound this bad news, NASA is weighing up its options for freeing up some mass on the initial Orion launches atop the Ares I rocket. The issue arose after Jeff Hanley, manager of the Constellation Program (which oversees Orion’s development), said the Orion design was within “plus or minus a couple of hundred pounds” of the 21,000-pound maximum for the crew module, a limit set by a safety requirement that the capsule be able to land with only two of its three main parachutes deployed should one fail after reentry.

“Right now we’re studying and really on the verge of deciding that we’re going to start with four,” Hanley said. “That gives us a common lunar and ISS version, but we’ve sized the system and have a design for six, so we’ll grow our capability as we need it.” So it’s not all bad news: the first launches may carry four astronauts, but Orion could later be modified to cater for six.

Hanley is keen to point out that although the brand new NASA manned space vehicle may be operating at a reduced capacity, at least Roscosmos will be able to help out. “Our Russian partners are always going to fly Soyuz or something derivative to that, so we’ll have the full coverage of being able to get the crew off the station in a pinch on the Soyuz and in the Orion,” he added. Soyuz is a three-crew space vehicle and is currently used by the space station as a “lifeboat” should an emergency crop up in orbit.

Apparently the Orion weight problem has been around for a while, as the design is based on predicted weights rather than actual launch weight; if the actual weight exceeds the safety margin, cutbacks are required. In this case, the cutback may include two crew members.

Hanley points out that although the early Constellation flights may begin with a four-crew IOC, this would stand NASA in good stead, giving the agency a solid understanding of how the vehicle performs with four seats before expanding the capacity to six.

Source: Aviation Week

Young Asteroids Age Fast with a Solar Wind Tan

Young asteroid tanning is big business in the Solar System (ESO)


If you stay out in the Sun too long, you’ll eventually get a suntan (or sunburn); your skin will also get damaged and show signs of ageing faster. This might sound like a sunblock ad, but the same principle holds true for the small chunks of rock floating around in the Solar System. Yes, a young asteroid’s surface will age prematurely, but it’s not caused by the Sun’s ultraviolet rays; it’s caused by the solar wind…

Within a million years, an asteroid left out in the solar wind can turn from lunar grey to Martian red, and a million years is a tiny amount of time relative to the Solar System’s lifetime. Why is this important? European Southern Observatory (ESO) researchers have realized that this finding will not only help astronomers relate an asteroid’s appearance to its history, but also act as an indicator of past collisions with other asteroids.

It turns out that the study of “space weathering” is fairly controversial; scientists have been mulling it over for a long time. Central to the problem is the fact that the interiors of meteorites found on Earth look remarkably different from the asteroids we see in space: asteroids are redder than their meteorite cousins. So what causes this redness?

“Asteroids seem to get a ‘sun tan’ very quickly,” says lead author Pierre Vernazza. “But not, as for people, from an overdose of the Sun’s ultraviolet radiation, but from the effects of its powerful wind.”

This is an interesting discovery in itself, but the speed at which the “tanning” occurs is astonishing. After an asteroid collision, fresh asteroid chunks are created with new surfaces. Within a million years these young surfaces will turn a dirty shade of red as the surface minerals are continuously battered by ionizing solar wind particles. “The charged, fast moving particles in the solar wind damage the asteroid’s surface at an amazing rate,” Vernazza added.

Naturally, a lot depends on the mineral composition of an asteroid’s surface, which influences how red it will become, but most of the tanning effect occurs in the first million years. Afterwards, the tanning continues, just at a slower rate.
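To put rough numbers on that front-loading, here is a minimal sketch in Python of an exponential saturation model, assuming (purely for illustration) a characteristic weathering timescale of one million years; the real curve depends on surface composition and distance from the Sun, and this is not the ESO team’s actual fit.

import math

TAU = 1.0e6  # assumed characteristic weathering timescale, years (illustrative)

def reddening_fraction(t_years):
    # Fraction of the asymptotic "tan" reached after t years,
    # for a simple saturating model: f(t) = 1 - exp(-t / tau)
    return 1.0 - math.exp(-t_years / TAU)

for t in (0.1e6, 0.5e6, 1e6, 2e6, 10e6):
    print(f"{t / 1e6:4.1f} Myr: {reddening_fraction(t):6.1%} of full reddening")

Under this toy model an asteroid picks up about 63% of its final colour within the first million years, and the remaining reddening trickles in ever more slowly, which is the behaviour the observations describe.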

Asteroid observations also reveal that the high proportion of “fresh surfaces” seen on near-Earth asteroids probably isn’t down to asteroid collisions. Collisions are far too infrequent compared with the million-year tanning timescale; if impacts were the only resurfacing mechanism, there shouldn’t be any “fresh surfaces” left to see. It is far more likely that the upper layers of asteroids are renewed during planetary encounters, where the gravitational field of a planet “shakes off” the tanned dust.

Source: ESO

Brown Dwarfs Could Be More Common Than We Thought

In 2007, something strange happened to a distant star near the centre of our galaxy; it underwent what is known as a ‘microlensing’ event. This transient brightening didn’t have anything to do with the star itself; it had everything to do with what passed in front of it. A brown dwarf 1,700 light-years away crossed our line of sight to the distant star. Although one might expect the star to have been blocked by the brown dwarf, its light was actually amplified, generating a flash. This flash was created via a space-time phenomenon known as gravitational lensing.

Although lensing isn’t rare in itself (although this particular event is considered the “most extreme” ever observed), the fact that astronomers had the opportunity to witness a brown dwarf causing it means that either they were very lucky, or we have to think about re-writing the stellar physics textbooks…

“By several measures OGLE-2007-BLG-224 was the most extreme microlensing event (EME) ever observed,” says Andrew Gould of Ohio State University in Columbus in a publication released earlier this month, “having a substantially higher magnification, shorter-duration peak, and faster angular speed across the sky than any previous well-observed event.”

OGLE-2007-BLG-224 revealed the passage of a brown dwarf in front of a distant star. The gravity of this small “failed star” deflected the starlight’s path slightly, very briefly creating a gravitational lens. Fortunately, a number of astronomers were prepared for the event and captured the transient flash of starlight as the brown dwarf focused the light for observers here on Earth.

From these observations, Gould and his team of 65 international collaborators managed to calculate some characteristics of the brown dwarf “lens” itself. The brown dwarf has a mass of 0.056 (+/- 0.004) solar masses, with a distance of 525 (+/- 40) parsecs (~1,700 light years) and a transverse velocity of 113 (+/- 21) km/s.
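If you want to see how such figures translate into an actual lensing event, here is a back-of-the-envelope sketch using the standard point-lens formulas. The source distance is my assumption (a bulge star at roughly 8 kpc; the paper’s terrestrial-parallax technique is what pins the geometry down properly), while the mass, lens distance and transverse velocity are the values quoted above.

import math

G, C   = 6.674e-11, 2.998e8   # SI: gravitational constant, speed of light
M_SUN  = 1.989e30             # kg
PC, AU = 3.086e16, 1.496e11   # metres

M_lens = 0.056 * M_SUN        # brown dwarf mass (from the paper)
D_l    = 525 * PC             # lens distance (from the paper)
D_s    = 8000 * PC            # ASSUMED source distance: a typical bulge star
v_t    = 113e3                # transverse velocity in m/s (from the paper)

# Angular Einstein radius of a point lens:
#   theta_E = sqrt( (4 G M / c^2) * (D_s - D_l) / (D_s * D_l) )
theta_E = math.sqrt(4 * G * M_lens / C**2 * (D_s - D_l) / (D_s * D_l))
r_E = theta_E * D_l           # Einstein radius projected at the lens plane
t_E = r_E / v_t               # Einstein-radius crossing time

print(f"theta_E ~ {math.degrees(theta_E) * 3.6e6:.2f} milliarcseconds")
print(f"r_E     ~ {r_E / AU:.2f} AU")
print(f"t_E     ~ {t_E / 86400:.1f} days")

With these inputs the Einstein ring is under a milliarcsecond across and the whole event lasts only about a week, which is why observers had to be ready and waiting.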

Although getting the chance to see this happen is noteworthy in itself, the fact that it was a brown dwarf that acted as the lens is extremely rare; so rare, in fact, that Gould believes something is awry.

“In this light, we note that two other sets of investigators have concluded that they must have been ‘lucky’ unless old-population brown-dwarfs are more common than generally assumed,” Gould said.

Either serendipity had a huge role to play, or there are far more brown dwarfs out there than we thought. If there are more brown dwarfs, something isn’t right with our understanding of stellar evolution. Brown dwarfs may be a more common feature in our galaxy than we previously calculated…

Sources: “The Extreme Microlensing Event OGLE-2007-BLG-224: Terrestrial Parallax Observation of a Thick-Disk Brown Dwarf,” Gould et al., 2009. arXiv:0904.0249v1 [astro-ph.GA], New Scientist, Astroengine.com

Despite Global Warming, Wildfire Frequency Does Not Increase

An Alaskan wildfire engulfs woodland (John McColgan/BLM Alaska Fire Service)


As global average temperatures rise, it is widely believed the frequency of wildfires will increase. However, this may not be the case. According to analysis of sediment from lake beds in northern Alaska, wildfire frequency didn’t track temperature changes over the last few thousand years. This is strange: surely a warmer climate will dry out vegetation faster, creating more fuel for fires to ignite and spread? Apparently not; there appears to be a far more potent controlling factor at play…

In Southern California, the temperatures easily hit 95°F (35°C) today, and I noticed the entire neighbourhood pumping a small reservoir’s-worth of water into their manicured lawns (creating an impressive river down the street). Our garden looks a little dry in comparison; I refuse to turn the sprinklers on until we really need them (for now, the hose will do). Summer appears to have arrived early, making me slightly nervous; the wildfires that blighted this region over the last few years are sure to return. To make matters worse, we had a surprisingly wet winter, helping the spring growth of vegetation. It may be nice and green now, but all I see is surplus firewood.

However, as the last few thousand years have shown us, no matter how hot it gets, the frequency of wildfires may actually decrease.

Using samples from sediment cores taken from the bottom of Alaskan lakes, climatologist Philip Higuera of Montana State University has discovered that it may be the type of vegetation growing in response to temperature increases that governs the frequency of subsequent wildfires. There is little indication that wildfire frequency increased as global average temperatures rose over the past 15,000 years. This might be counter-intuitive, but it would appear nature has an automatic fire-retardation mechanism.

“Climate is only one control of fire regimes, and if you only considered climate when predicting fire under climate-change scenarios, you would have a good chance of being wrong,” Higuera says. “You wouldn’t be wrong if vegetation didn’t change, but the greater the probability that vegetation will change, the more important it becomes when predicting future fire regimes.”

Using radiocarbon dating techniques, Higuera’s team were able to accurately date the different layers in the metre-long sediment samples. From there, they analysed the charcoal deposits, thereby deriving the wildfire frequency in north Alaskan woodlands. In addition, they analysed pollen content to understand which plant species were predominant over the past 15,000 years. Then, using known climate data for the same period, the researchers were able to correlate fire frequency with plant species and relate the whole lot to trends in climate change. The results are very interesting.
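As a sketch of what that correlation step might look like (this is not the team’s code, and the numbers below are invented placeholders), imagine the charcoal, pollen and climate records binned onto a common radiocarbon-dated timeline:

import numpy as np

# Hypothetical records binned into slices of a dated sediment core.
charcoal_rate = np.array([2.1, 1.8, 0.6, 0.5, 1.9, 2.4])     # fire proxy
shrub_pollen  = np.array([0.7, 0.6, 0.2, 0.1, 0.6, 0.8])     # flammable-shrub fraction
temp_anomaly  = np.array([-1.0, -0.5, 0.5, 1.0, -0.5, -1.2]) # deg C

# Does the fire proxy track vegetation more closely than temperature?
r_veg  = np.corrcoef(charcoal_rate, shrub_pollen)[0, 1]
r_temp = np.corrcoef(charcoal_rate, temp_anomaly)[0, 1]
print(f"fire vs vegetation:  r = {r_veg:+.2f}")
print(f"fire vs temperature: r = {r_temp:+.2f}")

In Higuera’s real analysis, it was the vegetation signal that carried the day.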

One of the key discoveries was that climate change was a less important factor than vegetation change in setting the frequency of wildfires. According to the sediment samples, even during very dry periods in climate history, wildfire frequency decreased sharply. It appears that during periods of rising temperatures, the vegetation shifted from flammable shrubs to fire-resistant deciduous trees.

“Climate affects vegetation, vegetation affects fire, and both fire and vegetation respond to climate change,” Higuera adds. “Most importantly, our work emphasizes the need to consider the multiple drivers of fire regimes when anticipating their response to climate change.”

Although we may not escape the clutches of wildfires in Southern California this year, the last 15,000 years have shown us that this may gradually change as vegetation adapts to hotter conditions, becoming more fire-resistant…

Source: Physorg.com

Should We Really Tell ET Our Problems?

The design etched onto the plaques carried by the Pioneer probes. The first interstellar pornography? (NASA)

So, you have a radio transmitter and you’ve been tasked to send a message into space to try to communicate with a hypothetical alien civilization. Where do you begin? Probably high on your list is to seek out the best candidate stars to send a signal to. As we only have experience of life on Earth, it’s a pretty good idea to look for Sun-like stars, as for all you know, that is the only place where Life As We Know It™ could exist.

So now you have found the potential location of an alien civilization, what message should you send? Firstly you’d probably want to make a good impression; perhaps sending directions to Earth, a universal map with an arrow pointing at the Solar System. Secondly you might want to identify what/who you are (insert some human physiology here). And third? Perhaps you’d consider sending information about our culture, civilization, history, science; all the good stuff that makes us human.

Would it cross your mind to mention there are 23 bloody conflicts going on right now amongst our own kind? Would you think about telling our potential alien neighbours about what you just had for dinner? Would it be a good idea to tell them about the political corruption in your country, the vast poverty worldwide or the ecological damage we are doing to our own home?

In a recent article, the director of interstellar message composition at the SETI Institute asks whether we should communicate honestly with ET, without sanitizing the truth. Should we really tell an alien civilization about our problems?

Communication with potential alien races is a tricky business (Ian O'Neill)
For five decades, the Search for Extraterrestrial Intelligence (SETI) has been scouring the skies for any signal from an intelligent alien civilization. This is a painstaking task that requires much patience and lots of ingenuity. After all, what are we looking for? Assuming extraterrestrial civilizations have worked out how to transmit radio, perhaps we could listen out for that. Unfortunately, apart from the 72-second Wow! signal in 1977, it all seems very quiet out there. If the Drake Equation is to be taken literally, the Milky Way should be teeming with life, some of which should be transmitting their greatest hits right now. There are problems with this reasoning, as some believe that although aliens might be transmitting, their radio signals might never reach us. Perhaps, then, a sufficiently advanced alien race might be using powerful laser beacons or moving stars to communicate with us. Alas, nothing. Yet.
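For reference (this is the standard formulation, not anything specific to SETI’s current searches), the Drake Equation estimates the number of detectable civilizations in the galaxy, N, as a product of seven factors:

N = R* · f_p · n_e · f_l · f_i · f_c · L

where R* is the galaxy’s star-formation rate, f_p the fraction of stars with planets, n_e the number of habitable planets per system, f_l, f_i and f_c the fractions of those that go on to develop life, intelligence and detectable communication, and L the length of time a civilization keeps transmitting. Plug in optimistic values and the galaxy “should” be crowded; plug in pessimistic ones, and the silence is no surprise.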

OK, so let’s turn this around. Perhaps we’ll have more luck if we start transmitting radio signals to Sun-like stars in the hope that an alien race as advanced as ourselves will receive them. This program is known as Messaging Extraterrestrial Intelligence (METI), or “Active SETI.” But what do we say? One of the earliest messaging attempts was the plaque bolted to the side of the Pioneer spacecraft (pictured top), even though the naked human figures representing male and female caused a stir (some groups considered the naked human form interstellar pornography). Despite a few disputes about what we should be sending into space, the messages have generally been upbeat, trying to portray the human race in the best possible light.

Douglas Vakoch, from the SETI Institute in Mountain View, California, disagrees with the policy of sending only positive messages into space via radio transmissions or metal plaques strapped to the sides of spaceships.

“An acknowledgment of our flaws and frailties seems a more honest approach than sending a sanitised, one-sided story,” Vakoch said in a recent New Scientist column. “Honesty is a good starting point for a conversation that could last for generations.”

As the director of interstellar message composition, Vakoch obviously knows a thing or two about sending messages to our potential alien neighbours. However, the question as to whether or not we should sanitize our communications seems a little strange. Of course we should transmit the best mankind has to offer! I don’t believe sending messages of culture, science, mathematics, art and music would be setting us up for a fall. If we are indeed the new kids on the block of extraterrestrial civilizations, I think we’d need to make a great impression (depending on whether ET understands what we are trying to communicate in the first place).

The 1977 Wow! signal (SETI)
Vakoch is keen to point out that a sufficiently advanced alien civilization is going to be savvy as to what it takes to be a galactic race (it’s not all roses, after all). If they receive a message from mankind full of only positive sentiments, perhaps they won’t trust us. Worse still, as they get to know us, they may conclude we were hiding our human flaws, misleading them in some way. Therefore, we need to be honest up-front. We need to send the views and opinions of as many people as possible, for good or bad, so extraterrestrial civilizations know what they are dealing with: a talented, yet flawed, race.

Unfortunately, that goes against human nature. What’s the first thing you do when moving into a new neighbourhood? You might throw a house-warming party as a way to introduce yourself to the new neighbours. You probably wouldn’t tell them about your family/money/alcohol/drug/criminal problems at the party; if you did, you might find the room emptying very quickly. It’s not that you are being dishonest, you’re trying to gain their first-impression trust and interest. This principle holds true for companies trying to sell a product (I have yet to meet a doorstep salesperson who tells me his encyclopaedia collection is actually useless now the world has Wikipedia) and for countries forming new diplomatic ties. We know there’s more to the story than just first impressions, but first impressions are the bonds that help develop the relationship in the future.

Assimilation could be ET's solution to human problems
So, going back to being honest when messaging alien civilizations: if we send “the truth” about our race, we could actually be doing ourselves a disservice. What if the receiving alien civilization doesn’t want to be associated with us because we are considered too aggressive, cruel, greedy or weird?

We can’t second-guess how an extraterrestrial civilization will respond to us; there is no precedent for alien communication. So perhaps we should take the “sanitized” approach. Positive information is probably enough information; too much information could turn us into interstellar outcasts before we’ve even had a chance to receive a message from another star. (I thought it was a little too quiet out there, perhaps they received our commercial TV signals.)

And if the advanced alien race deems us “not worthy” on account of the mixed signals we are sending out, they might turn hostile sooner rather than later…

Where is the Most Remote Location on Earth?

A heat map of travel times to the nearest city


According to a new study, less than 10% of the world’s land is more than 48 hours of travel from the nearest city. This doesn’t include air travel; it covers ground travel only (on foot or by train, car, boat, bike, horse or donkey). So no matter where you are in the world, there’s a good chance you can get somewhere substantially populated within two days. At face value this might not seem very important, but when you look at the maps, you see many wilderness locations aren’t quite as remote as we once thought. The Amazon rainforest, for example, is surprisingly well connected (rivers are quite useful in that respect), and the remote deserts of Africa have a pretty efficient road network.

So, where is the most remote location on Earth? How long would it take to get there?

I can happily say that for five months I lived in one of the most remote places in the world. The Norwegian archipelago of Svalbard in the High Arctic turns out to be a very extreme place even if you put the polar bears and -30°C temperatures to one side. No matter how hard you try, it would take 2-3 days by boat to travel from Longyearbyen (on the main island of Spitsbergen) to the Norwegian mainland city of Tromsø. Unfortunately, the number of places around the globe that can boast this kind of isolation is rapidly shrinking.

The fact is, the ground-travel time from any point to the nearest settlement of over 50,000 people is decreasing rapidly. Transportation infrastructure is spreading and population density is increasing, meaning bigger cities are springing up closer together.

Travel times as used by the researchers
A new set of maps created by researchers at the European Commission’s Joint Research Centre in Ispra, Italy, and the World Bank illustrates just how “connected” our world has become and it also highlights the dwindling number of “true” wildernesses.

The maps are based on a computer model that calculates the journey time to the nearest city of 50,000+ people using only land or water routes. The variables in this complex model include terrain type, road, rail and river network access, altitude, terrain steepness and obstacles (such as border crossings). The key conclusion is that less than 10% of the planet’s landmass is more than 48 hours of ground travel away from the nearest city. Even in the Amazon, only 20% of the land is more than two days away from the nearest Brazilian city (owing primarily to the vast network of rivers).
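At heart, this kind of model is a shortest-path computation over a “friction surface”: each grid cell stores how long it takes to cross, and a multi-source Dijkstra flood from every city yields the travel time for every cell. Here is a minimal sketch of that idea; the grid values and city locations are invented, and the real model’s cost layers (roads, rivers, slopes, borders) are far richer.

import heapq

# Friction grid: hours to cross each cell (invented values):
# 0.1 ~ road, 1.0 ~ open terrain, 5.0 ~ mountain or swamp.
friction = [
    [0.1, 0.1, 1.0, 5.0],
    [1.0, 0.1, 1.0, 5.0],
    [5.0, 0.1, 0.1, 1.0],
    [5.0, 5.0, 0.1, 0.1],
]
cities = [(0, 0), (3, 3)]          # cells holding cities of 50,000+ people

rows, cols = len(friction), len(friction[0])
INF = float("inf")
hours = [[INF] * cols for _ in range(rows)]

# Multi-source Dijkstra: flood outward from every city at once.
heap = [(0.0, r, c) for r, c in cities]
heapq.heapify(heap)
for _, r, c in heap:
    hours[r][c] = 0.0

while heap:
    t, r, c = heapq.heappop(heap)
    if t > hours[r][c]:
        continue                   # stale queue entry
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < rows and 0 <= nc < cols:
            # Step cost: average friction of the two cells.
            nt = t + 0.5 * (friction[r][c] + friction[nr][nc])
            if nt < hours[nr][nc]:
                hours[nr][nc] = nt
                heapq.heappush(heap, (nt, nr, nc))

for row in hours:
    print(" ".join(f"{t:5.1f}" for t in row))

The printed grid is the toy equivalent of the published heat map: every cell’s travel time to its nearest city.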

The most striking maps include the plotting of the busiest waterways (the English Channel, Mediterranean and South China Seas are the most crowded) and the scope of the world’s road network. In fact, it is little wonder the international community is worried about the increasing number of Somali pirate attacks; another very busy shipping lane is sandwiched between Somalia and Yemen (the key route from the Indian Ocean to the Mediterranean).

The most remote point on the entire planet: 34.7°N 85.7°E - the Tibetan plateau
So where is the most remote place on Earth? The Tibetan plateau (pictured left). From 34.7°N 85.7°E, it would take three whole weeks to travel to the cities of Lhasa or Korla: expect to walk for 20 days and travel by car for one. Partly due to the rough terrain and 5,200-metre altitude, Tibet will probably remain the most extreme place on Earth for some time to come.

It is hoped these maps will serve as a baseline for future studies, showing how nations deal with population growth, how nature is being eroded and possibly providing some insight as to how to manage the planet a little better than we are at present…


Source: New Scientist

Constraining the Orbits of Planet X and Nemesis

Artist's impression of the hypothetical star Nemesis (Wikipedia)


If Planet X were out there, where would it be? This question, posed by an Italian researcher, turns out to be a lot more involved than you’d think. As opposed to all the 2012 hype flying around on the internet, this research is actually based on a little thing called science. By analysing the orbital precession of the inner Solar System planets, the researcher has been able to constrain the minimum distance at which a hypothetical object, ranging from the mass of Mars to the mass of the Sun, could be located in the Solar System. As most of the astronomical community already knows, the two purveyors of doom (Planet X and the Sun’s evil twin, Nemesis) exist only in the over-active imaginations of a few misinformed individuals, not in reality…

Today, Planet X and Nemesis are hypothetical objects with more grounding in ancient prophecy and pseudo-scientific doomsday theories than in observation. But Planet X, at least, came from far more rational beginnings.

The name “Planet X” was actually coined by Percival Lowell at the start of the 20th century, when he predicted there might be a massive planet beyond the orbit of Neptune. Then, in 1930, Clyde Tombaugh appeared to confirm Lowell’s theory; a planet had been discovered, and it was promptly named Pluto. However, as time went on, it slowly became apparent that Pluto wasn’t massive enough to explain the original observations of perturbations in Uranus’ orbit (the reason for Lowell’s Planet X prediction in the first place). By the 1970s and ’80s, modern observation techniques had shown that the original perturbations in Uranus’ orbit were down to measurement error, not a massive planetary body. The hunt for Planet X pretty much ended with the discovery of Pluto in 1930, but Pluto never lived up to its promise as a massive planetary body (whatever the woefully erroneous doomsday theories say).

Now an Italian researcher has published results from a study that examines the orbital dynamics of the inner Solar System planets and relates them to the gravitational influence of a massive planetary body orbiting the Sun from afar.

To cut a long story short, if a massive planetary body or a small binary sibling of the Sun were close to us, we would notice their gravitational influence in the orbital dynamics of the planets. There may be some indirect indications that a small planetary body might be shaping the Kuiper Cliff, and that a binary partner of the Sun might be disturbing the Oort Cloud every 25 million years or so (relating to the cyclical mass extinctions in Earth’s history, possibly caused by comet impacts), but hard astronomical proof has yet to be found.

Lorenzo Iorio of the National Institute of Nuclear Physics in Pisa, Italy, has taken orbital data from many years of precise observations and used his computations to predict the closest possible distance at which a massive planet could orbit if it were out there.

It turns out that any planet of Mars mass or above orbiting closer than a certain distance would already have revealed itself in the planetary orbits. Iorio computes that the minimum possible distances at which a Mars-mass, Earth-mass, Jupiter-mass and Sun-mass object could orbit the Sun are 62 AU, 430 AU, 886 AU and 8,995 AU respectively. To put this into perspective, Pluto orbits the Sun at an average distance of 39 AU.
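To get an intuitive feel for why heavier hypothetical bodies get pushed farther out, here is an order-of-magnitude sketch (not Iorio’s actual computation, which fits the observed perihelion precessions): the leading perturbation a distant mass M at distance d exerts on an inner planet of orbital radius a is the tidal acceleration, roughly 2GMa/d³, and it is this kind of tug the precession data constrain.

G      = 6.674e-11       # m^3 kg^-1 s^-2
AU     = 1.496e11        # m
M_SUN  = 1.989e30        # kg
a_mars = 1.524 * AU      # Mars, one of the most precisely tracked orbits

candidates = {           # (mass in kg, quoted minimum distance in AU)
    "Mars-mass":    (6.42e23,   62),
    "Earth-mass":   (5.97e24,  430),
    "Jupiter-mass": (1.90e27,  886),
    "Sun-mass":     (1.99e30, 8995),
}

solar_pull = G * M_SUN / a_mars**2                 # Sun's direct pull on Mars
for name, (m, d_au) in candidates.items():
    tidal = 2 * G * m * a_mars / (d_au * AU)**3    # leading tidal term
    print(f"{name:>12} at {d_au:>5} AU: tidal/solar ~ {tidal / solar_pull:.1e}")

The ratios come out vanishingly small, which is the point: at the quoted distances, each candidate’s tug sinks below what the precession measurements could have detected.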

So if we used our imaginations a bit, we could say that a sufficiently sized Planet X could be patrolling a snail-paced orbit somewhere beyond Pluto. But there’s an additional problem for Planet X conspiracy theorists. If there were any object of sufficient size (and by “sufficient” I mean Pluto-mass; I’m being generous), according to a 2004 publication by David Jewitt of the Institute for Astronomy, University of Hawaii, we would have observed it by now if it orbited within 320 AU of the Sun.

Suddenly, the suggestion that Planet X will be making an appearance in 2012, and the crazy idea that anything larger than a Pluto-sized object is currently 75 AU away, seem silly. Sorry; between here and a few hundred AU away, it’s just us, the known planets and a load of asteroids (and perhaps the odd plutino) for company.

Source: arXiv, Astroengine.com

Kepler Will Be Used to Measure the Size of the Universe

Artist's rendering of the Kepler Mission (NASA)


On April 7th, commands were sent to NASA’s exoplanet-hunting Kepler telescope to eject its 1.3×1.7-metre lens cap so that the unprecedented mission could begin its hunt for Earth-like alien worlds orbiting distant stars. However, one UK astronomer won’t be using the Kepler data to detect the faint transits of rocky exoplanets in front of their host stars. He’ll be using it to monitor the light from a special class of variable star; through the extreme precision of Kepler’s optics, he and an international team of collaborators will work to redefine the size of the Universe…

Kepler is carrying the largest camera ever launched into space, with 42 charge-coupled devices (CCDs) to monitor the very slight changes in star brightness as an exoplanet passes in front of its host star. Given that Kepler is expected to detect exoplanets only a little larger than our own planet (known as super-Earths), the instrument is extremely sensitive. It is for this reason that exoplanet hunters aren’t the only ones interested in using Kepler’s sensitive eye.
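To appreciate the photometric precision involved, note that a transit dims the star by roughly the square of the planet-to-star radius ratio; for an Earth against a Sun-like star, that is a dip of less than one part in ten thousand. A quick sketch:

R_SUN, R_EARTH = 696_000, 6_371    # radii in km

for name, r_planet in [("Earth", R_EARTH),
                       ("super-Earth (2 R_E)", 2 * R_EARTH),
                       ("Jupiter", 69_911)]:
    depth = (r_planet / R_SUN) ** 2        # fractional dip in starlight
    print(f"{name:>20}: {depth:.1e}  (~{depth * 1e6:.0f} ppm)")

An Earth-analogue transit blocks only about 84 parts per million of the star’s light, which is why Kepler’s camera has to be so stable.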

Using Kepler data, Dr Alan Penny, a researcher at the University of St Andrews, will be joining a 200-strong team of astronomers to analyse the light not of exoplanet-harbouring stars, but of a smaller group of variable stars that fluctuate in brightness with striking regularity. These stars are Cepheid variables, also known as “standard candles” because of the tight correlation between their period of variability and their absolute luminosity. This means that no matter where Cepheids are observed, in galaxies or clusters, astronomers can deduce the distance from Earth to the Cepheid with great precision. The only thing limiting astronomers is the precision of their instrumentation, so when Kepler left Earth carrying the most advanced and sensitive camera ever taken into space, Penny and his collaborators jumped at the chance to use it to refine the measurement of the Universe.
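As a sketch of how the “standard candle” step works (the calibration coefficients here are representative textbook values, not the team’s, and interstellar extinction is ignored): measure a Cepheid’s period, convert it to an absolute magnitude via the period-luminosity relation, then compare with the apparent magnitude to get the distance.

import math

def cepheid_distance_pc(period_days, apparent_mag):
    # Illustrative V-band period-luminosity calibration:
    #   M_V = -2.43 * (log10(P) - 1) - 4.05
    # then the distance modulus m - M = 5 * log10(d / 10 pc).
    abs_mag = -2.43 * (math.log10(period_days) - 1.0) - 4.05
    return 10 ** ((apparent_mag - abs_mag + 5) / 5)

# Example: a 10-day Cepheid observed at mean magnitude V = 12.0
print(f"{cepheid_distance_pc(10.0, 12.0):,.0f} parsecs")

A small systematic error in that calibration shifts every rung of the distance ladder above it, which is exactly why pinning down Cepheid behaviour matters for measuring the size of the Universe.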

“While Kepler is doing its exciting planet-hunting, we will be using its extreme precision to resolve a possible problem with our measurement of the size of the Universe,” said Penny. “These variable stars known as ‘Cepheids’ form the base of a series of steps by which we measure the distance to distant galaxies and, through them, we can measure the size of the Universe.”

Current estimates place the size of the observable Universe at 93 billion light-years across, but Penny believes Kepler observations of a small selection of Cepheids may change this value by a few percent. When your measurement rests on a very tight stellar period-brightness relationship, it helps to use the most precise instrument you can lay your hands on. Our understanding of the “standard candles” themselves is still incomplete, however, and small-scale dynamic changes on the star itself can go unnoticed from the ground. Kepler should shed some light on the gaps in our knowledge of Cepheids as well as give us the best-yet measurement of the scale of our Universe.

“These Cepheid stars which get brighter and fainter by some tens of percent every ten to a hundred days are mostly understood. But recently it has become clear that our theories of what happens in the outer layers of these stars which cause the variations in brightness do not totally agree with what we see. The exquisite accuracy of Kepler in measuring star brightness, one hundred times better than we can do from the ground, means we can get such good measurements that we should be able to match theory with observation. Resolving the issue may only change estimates of the size of the Universe by a small amount, but we won’t rest easy until the problem is solved.” — Dr Alan Penny

Source: Physorg.com

NASA Worried: Spirit Reboots Rover Computer, Twice

A panorama by Spirit, taken last month (NASA)


Mars Exploration Rover Spirit is acting a little strange. Over the Easter weekend, it appears the tenacious little Mars explorer rebooted its computer not once but at least twice. Mission scientists were alerted to the problem when some of Spirit’s communication sessions with Earth were irregular, prompting mission control to investigate…

“While we don’t have an explanation yet, we do know that Spirit’s batteries are charged, the solar arrays are producing energy and temperatures are well within allowable ranges. We have time to respond carefully and investigate this thoroughly,” said John Callas, project manager for Spirit at NASA’s Jet Propulsion Laboratory. “The rover is in a stable operations state called automode and taking care of itself. It could stay in this stable mode for some time if necessary while we diagnose the problem.”

Although Spirit successfully communicated with Earth on Friday, Saturday and Sunday, some of the communication sessions were irregular, suggesting that the rover’s computer was resetting during the use of the high-gain dish antenna. The twin rovers, Spirit and Opportunity, have several modes of communication open to them, including a pointable high-gain antenna, a non-pointable low-gain antenna and a separate UHF transceiver that uses the Mars orbiters as relays.

It seems possible that the issue could be software-based, as new flight software was uploaded only last month. However, this was a routine upload, and Opportunity received the identical software without suffering any anomalous reboots. Playing it safe, mission operators are avoiding the high-gain antenna and have opted for the slower low-gain antenna while the investigation into the fault is ongoing. Unfortunately, the thought on everyone’s mind is that the rover may be suffering from age-related problems; after all, the MER mission was originally slated to last three months, not five years.

Although there is concern for Spirit, it is hoped that the weekend’s reboot isn’t a sign of bigger problems. The rover may have had a rough mission (one of its wheels stopped moving three years ago, and it has since been dragged everywhere it has trundled), but the science it is producing has been astounding. Hopefully Spirit will fight through this latest glitch and continue to do battle in the Martian regolith for a while yet…

Source: NASA

Fires Rage Through Central America

Hundreds of fires rage in southern Mexico, on the Yucatan Peninsula, and in northern Guatemala and northern Honduras. Image by NASA's Aqua satellite


Having just read about the deadly wildfires in Texas and Oklahoma, I was interested to see whether one of NASA’s Earth-monitoring satellites has been tracking the situation from orbit. Whether it is too early for observations to come in, or whether the satellites have yet to make a pass directly above those states, is unclear; but along the way I noticed a rather striking image of the Yucatan Peninsula in Central America. In the picture, retrieved by NASA’s Aqua satellite, countless wildfires dot Mexico, Guatemala and Honduras. It looks like a combination of arson, agricultural activity and accidental blazes is gripping the region, fuelled by dry vegetation…

As tornadoes turn parts of Arkansas into “warzones”, Texas and Oklahoma are dealing with wildfires. Although these states are no strangers to fires, the continuing drought in the southern US is causing the fires to rage over larger areas for longer periods. Southern California is also enduring a three-year drought, and in 2008, just north of LA (where I’m located), it seemed a week didn’t go by without the smell of smoke in the air. In fact, at one point the wildfires burned worryingly close to where I live, filling the house with smoke at 5:30 one morning. Fortunately, we were the lucky ones; others fell victim to the direction of the wind, losing property and, in some tragic cases, their lives. If 2009 turns out to be as dry as last year, all predictions are that SoCal will bear the brunt of another round of wildfires, and hearing bad news from Texas and Oklahoma is an uneasy reminder of things to come.

However, on browsing NASA’s Earth Observatory website, it looks like our neighbours are having a hard time with wildfires too. Hundreds of fires were burning at the start of this month, but many didn’t have a natural beginning.

On viewing the imagery from the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument on board the Aqua satellite, it is hard to comprehend the scale of the wildfire-affected region. MODIS can detect areas of intense heat, allowing NASA’s Earth Observatory program to pinpoint the location and scale of burning vegetation. Red spots (fires) cover southern Mexico, the Yucatan Peninsula, northern Guatemala and northern Honduras, and smoke hangs over Campeche Bay in the Gulf of Mexico. This scene isn’t just caused by dry weather; it is a symptom of the pressure being applied to the tropical region by human activity. Land is at a premium in these developing regions, so farmers and loggers clear vast areas, taking advantage of the dry conditions to help the fires spread. November to May is a particularly difficult period, as this is Central America’s dry season.
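The detection principle behind those red spots is contextual thresholding on thermal-infrared brightness temperatures: a pixel is flagged when its ~4 µm reading stands well above both an absolute floor and its local background. The toy sketch below mimics that logic on a made-up grid; the real MODIS algorithm uses multiple bands plus day/night and cloud tests.

import statistics

# Made-up 4-micron brightness temperatures (kelvin) for a tiny scene.
T4 = [
    [295, 296, 294, 293],
    [296, 360, 297, 294],   # the 360 K pixel is a likely fire
    [295, 297, 296, 295],
    [294, 295, 330, 296],   # the 330 K pixel is hot enough to flag too
]

def is_fire(grid, r, c, abs_floor=320.0, k=4.0):
    # Flag pixel (r, c) if it beats an absolute threshold AND sits
    # k standard deviations above its non-fire neighbours.
    centre = grid[r][c]
    if centre < abs_floor:
        return False
    background = [grid[i][j]
                  for i in range(max(0, r - 1), min(len(grid), r + 2))
                  for j in range(max(0, c - 1), min(len(grid[0]), c + 2))
                  if (i, j) != (r, c) and grid[i][j] < abs_floor]
    if len(background) < 3:
        return False
    mean = statistics.mean(background)
    spread = statistics.stdev(background) or 1.0
    return centre > mean + k * spread

for r in range(len(T4)):
    for c in range(len(T4[0])):
        if is_fire(T4, r, c):
            print(f"fire pixel at ({r}, {c}): {T4[r][c]} K")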

Conservationists are on the look-out for accidental fires, but there is always the problem of intentional fires too. As farmland becomes scarce or access to prime logging forests becomes difficult, arson becomes a huge problem.

Fortunately, modern technology is helping countries locate fires in real time before they have a chance to spread. MODIS data is fed into the Fire Information for Resource Management System (FIRMS), which had users in 60 countries only months after it was set up in 2006. Land managers and conservation groups in Central America receive messages via mobile phone and email should a fire be detected in their region, so hopefully the risk of large-scale damage can be limited.

Source: Earth Observatory