New Canadian Radio Telescope is Detecting Fast Radio Bursts

The CHIME Telescope, located at the Dominion Radio Astrophysical Observatory (DRAO) in British Columbia. Credit: chime-experiment.ca

Since they were first detected in 2007, Fast Radio Bursts (FRBs) have been a source of mystery to astronomers. In radio astronomy, this phenomenon refers to transient radio pulses coming from distant sources that typically last just a few milliseconds. Despite the detection of dozens of events since 2007, scientists are still not sure what causes them – though theories range from exploding stars, black holes, and magnetars to alien civilizations!

To shed light on this mysterious phenomenon, astronomers are looking to new instruments to help search for and study FRBs. One of these is the Canadian Hydrogen Intensity Mapping Experiment (CHIME), a revolutionary new radio telescope located at the Dominion Radio Astrophysical Observatory (DRAO) in British Columbia. On July 25th, still in its first year, this telescope made its first-ever detection, an event known as FRB 180725A.

The detection of FRB 180725A was announced online in an “Astronomer’s Telegram” post, which is intended to alert the astronomical community about possible new finds and encourage follow-up observations. The detection is still very preliminary at this point, and more research is needed before its existence as an FRB can be confirmed.

As they stated in the Astronomer’s Telegram announcement, the radio signal was detected on July 25th, at precisely 17:59:43.115 UTC (09:59:43.115 PST), and at a radio frequency of 400 MHz:

“The automated pipeline triggered the recording to disk of ~20 seconds of buffered raw intensity data around the time of the FRB. The event had an approximate width of 2 ms and was found at dispersion measure 716.6 pc/cm^3 with a signal-to-noise ratio S/N ~20.6 in one beam and 19.4 in a neighboring beam. The centers of these, approximately 0.5 deg wide and circular beams, were at RA, Dec = (06:13:54.7, +67:04:00.1; J2000) and RA, Dec = (06:12:53.1, +67:03:59.1; J2000).”
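
The dispersion measure (DM) quoted in the telegram is what allows astronomers to infer that a burst has traveled an enormous distance: free electrons along the line of sight delay lower frequencies more than higher ones. As a minimal illustration (not CHIME’s actual detection pipeline), the standard cold-plasma dispersion formula can be applied to the reported DM:

```python
# Dispersion delay between two observing frequencies for a given
# dispersion measure (DM). K_DM is the standard dispersion constant.
K_DM = 4.149  # ms GHz^2 cm^3 / pc

def dispersion_delay_ms(dm, f_low_ghz, f_high_ghz):
    """Extra arrival delay (ms) of the low frequency relative to the high one."""
    return K_DM * dm * (f_low_ghz**-2 - f_high_ghz**-2)

# FRB 180725A: DM = 716.6 pc/cm^3 across CHIME's 400-800 MHz band
delay_s = dispersion_delay_ms(716.6, 0.4, 0.8) / 1000
print(f"Sweep across the band: ~{delay_s:.1f} s")  # ~13.9 s
```

A burst with this DM takes roughly 14 seconds to sweep from the top of CHIME’s 400–800 MHz band to the bottom, which is consistent with the ~20 seconds of buffered raw data the pipeline recorded.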

Research into Fast Radio Bursts is still in its infancy, being a little more than a decade old. The first ever to be detected was the famous Lorimer Burst, which was named after its discoverer – Duncan Lorimer, from West Virginia University. This burst lasted a mere five milliseconds and appeared to come from a location near the Large Magellanic Cloud on the sky, though its dispersion pointed to a source billions of light-years away.

So far, the only FRB found to be repeating is the mysterious signal known as FRB 121102, which was detected by the Arecibo radio telescope in Puerto Rico in 2012. The repeating nature of this FRB was first noticed by a team of students from McGill University (led by then-PhD student Paul Scholz), who sifted through the Arecibo data and determined that the initial burst was followed by 10 additional bursts consistent with the original signal.

The NSF’s Arecibo Observatory, which is located in Puerto Rico, is the world’s largest radio telescope. Arecibo detected 11 FRBs over the course of 2 months. Credit: NAIC

In addition to being the first possible FRB detected by this Canadian facility, this event marks the first time that an FRB has been detected below the 700 MHz range. However, as the CHIME team indicated in their announcement, other signals of equal intensity may have occurred in the past that were simply not recognized as FRBs at the time.

“Additional FRBs have been found since FRB 180725A and some have flux at frequencies as low as 400 MHz,” they wrote. “These events have occurred during both the day and night and their arrival times are not correlated with known on-site activities or other known sources of terrestrial RFI (Radio Frequency Interference).”

As a result, this most recent detection (if confirmed) could help astronomers shed additional light on what causes FRBs, not to mention place some constraints on the frequencies at which they can occur. Much like the study of gravitational waves, this field of study is new but rapidly growing, made possible by the addition of cutting-edge instruments and facilities around the world.

Further Reading: CNET

It Looks Like Plate Tectonics Aren’t Required to Support Life

Artist's concept of Kepler-69c, a super-Earth-size planet in the habitable zone of a star like our sun, located about 2,700 light-years from Earth in the constellation Cygnus. Credit: NASA

When looking for potentially-habitable extra-solar planets, scientists are somewhat restricted by the fact that we know of only one planet where life exists (i.e. Earth). For this reason, scientists look for planets that are terrestrial (i.e. rocky), orbit within their star’s habitable zones, and show signs of biosignatures such as atmospheric carbon dioxide – which is essential to life as we know it.

This gas, which is largely the result of volcanic activity here on Earth, increases surface heat through the greenhouse effect and cycles between the subsurface and the atmosphere through natural processes. For this reason, scientists have long believed that plate tectonics are essential to habitability. However, according to a new study by a team from Pennsylvania State University, this may not be the case.

The study, titled “Carbon Cycling and Habitability of Earth-Sized Stagnant Lid Planets”, was recently published in the scientific journal Astrobiology. The study was conducted by Bradford J. Foley and Andrew J. Smye, two assistant professors from the department of geosciences at Pennsylvania State University.

The Earth’s Tectonic Plates. Credit: msnucleus.org

On Earth, volcanism is largely the result of plate tectonics, and occurs where two plates collide. This causes subduction, where one plate is pushed beneath the other and deeper into the subsurface. Subduction turns the dense mantle into buoyant magma, which rises through the crust to the Earth’s surface and creates volcanoes. This process can also aid in carbon cycling by pushing carbon into the mantle.

Plate tectonics and volcanism are believed to have been central to the emergence of life here on Earth, as they ensured that our planet had sufficient heat to maintain liquid water on its surface. To test this theory, Professors Foley and Smye created models to determine how habitable an Earth-like planet would be without plate tectonics.

These models took into account thermal evolution, crustal production and CO2 cycling to constrain the habitability of rocky, Earth-sized stagnant lid planets. These are planets where the crust consists of a single, giant spherical plate floating on the mantle, rather than of separate pieces. Such planets are thought to be far more common than planets with plate tectonics, as no planets beyond Earth have yet been confirmed to have tectonic plates. As Prof. Foley explained in a Penn State News press release:

“Volcanism releases gases into the atmosphere, and then through weathering, carbon dioxide is pulled from the atmosphere and sequestered into surface rocks and sediment. Balancing those two processes keeps carbon dioxide at a certain level in the atmosphere, which is really important for whether the climate stays temperate and suitable for life.”

Map of the Earth showing fault lines (blue) and zones of volcanic activity (red). Credit: zmescience.com

Essentially, their models took into account how much heat a stagnant lid planet’s climate could retain based on the amount of heat and heat-producing elements present when the planet formed (aka. its initial heat budget). On Earth, these heat-producing elements include uranium, thorium and potassium, all of which release heat as they undergo radioactive decay.
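
To get a feel for what an initial heat budget implies, radiogenic heating can be modeled as a sum of exponential decays. In the sketch below, the half-lives are standard values, but the relative present-day heat fractions are illustrative assumptions rather than numbers from the Foley and Smye models:

```python
import math

# Half-lives (Gyr) of the main heat-producing isotopes (standard values).
HALF_LIVES = {"U-238": 4.47, "U-235": 0.70, "Th-232": 14.0, "K-40": 1.25}
# Assumed share of today's radiogenic heat from each isotope (illustrative).
HEAT_FRACTION_TODAY = {"U-238": 0.38, "U-235": 0.04, "Th-232": 0.44, "K-40": 0.14}

def relative_heat(t_gyr):
    """Radiogenic heat output at time t (Gyr relative to today), as a multiple of today's."""
    return sum(frac * math.exp(-math.log(2) * t_gyr / HALF_LIVES[iso])
               for iso, frac in HEAT_FRACTION_TODAY.items())

# When an Earth-like planet formed, ~4.5 Gyr ago:
print(f"At formation: ~{relative_heat(-4.5):.1f}x today's heat output")
```

Under these assumptions, a rocky planet starts out with several times its present-day radiogenic heat output, which is why the inventory a planet is born with matters so much for its long-term climate.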

After running hundreds of simulations, which varied the planet’s size and chemical composition, they found that stagnant lid planets would be able to maintain warm enough temperatures that liquid water could exist on their surfaces for billions of years. In extreme cases, they could sustain life-supporting temperatures for up to 4 billion years, which is almost the age of the Earth.

As Smye indicated, this is due in part to the fact that plate tectonics are not always necessary for volcanic activity:

“You still have volcanism on stagnant lid planets, but it’s much shorter lived than on planets with plate tectonics because there isn’t as much cycling. Volcanoes result in a succession of lava flows, which are buried like layers of a cake over time. Rocks and sediment heat up more the deeper they are buried.”

Image of the Sarychev volcano (in Russia’s Kuril Islands) caught during an early stage of eruption on June 12, 2009. Taken by astronauts aboard the International Space Station. Credit: NASA

The researchers also found that, without plate tectonics, stagnant lid planets could still have enough heat and pressure to experience degassing, where carbon dioxide gas escapes from rocks and makes its way to the surface. On Earth, Smye said, the same process occurs with water in subduction fault zones. The rate of degassing increases with the quantity of heat-producing elements present in the planet. As Foley explained:

“There’s a sweet spot range where a planet is releasing enough carbon dioxide to keep the planet from freezing over, but not so much that the weathering can’t pull carbon dioxide out of the atmosphere and keep the climate temperate.”

According to the researchers’ model, the presence and amount of heat-producing elements were far better indicators of a planet’s potential to sustain life. Based on their simulations, they found that a planet’s initial composition and size are very important in determining whether or not it will become habitable. Or as they put it, the potential habitability of a planet is determined at birth.

By demonstrating that stagnant lid planets could still support life, this study has the potential to greatly extend the range of what scientists consider potentially habitable. When the James Webb Space Telescope (JWST) is deployed in 2021, examining the atmospheres of stagnant lid planets to determine the presence of biosignatures (like CO2) will be a major scientific objective.

Knowing that more of these worlds could sustain life is certainly good news for those who are hoping that we find evidence of extra-terrestrial life in our lifetimes.

Further Reading: PennState, Astrobiology

Five Teams Compete to Design a 3D Printed Mars Habitat for NASA

Team Zopherus of Rogers, Arkansas, is the first-place winner in NASA’s 3D-Printed Habitat Challenge, Phase 3: Level 1 competition. Credit: NASA

If and when we decide to go to Mars (and stay there), the Martian settlers will face some serious challenges. For one, the planet is extremely cold compared to Earth, averaging about -63 °C (-82 °F), which is comparable to a cold night in Antarctica. On top of that, there’s the incredibly thin atmosphere, which is unbreathable to humans and terrestrial creatures. Add to that the radiation, and you begin to see why settling Mars will be difficult.

But as the saying goes, necessity is the mother of invention. And to stimulate the invention process, NASA has partnered with Bradley University of Peoria to launch the 3D-Printed Habitat Centennial Challenge competition. As part of NASA’s Centennial Challenges, which are sponsored by the Space Technology Mission Directorate, this competition recently awarded $100,000 in prize money to five teams for their design concepts.

The NASA Centennial Challenges were initiated in 2005 to directly engage the public and produce revolutionary applications for space exploration challenges. The program offers incentive prizes to stimulate innovation in basic and applied research, technology development, and prototype demonstration. To administer the competition, Bradley University also partnered with sponsors Caterpillar, Bechtel and Brick & Mortar Ventures.

For the competition, participants were tasked with creating digital representations of the physical and functional characteristics of a Martian habitat using specialized software tools. A panel of NASA, academic and industry experts awarded the teams points based on various criteria, which determined how much prize money each winning team received. Out of 18 submissions from all over the world, 5 teams were selected.

In order of how much prize money they were awarded, the winning teams were:

  1. Team Zopherus of Rogers, Arkansas – $20,957.95
  2. AI. SpaceFactory of New York – $20,957.24
  3. Kahn-Yates of Jackson, Mississippi – $20,622.74
  4. SEArch+/Apis Cor of New York – $19,580.97
  5. Northwestern University of Evanston, Illinois – $17,881.10

The design competition emphasizes all the challenges that building a life-supporting habitat on Mars would entail, which include the sheer distances involved and the differences in atmosphere and landscape. In short, the teams needed to create habitats that would be insulated and air-tight and could also be built using local materials (aka. in-situ resource utilization).

The competition began in 2014 and has been structured in three phases. For Phase 1, the Design Competition (which was completed in 2015 with a $50,000 prize purse), the teams were required to submit a rendering of their proposed habitat. Phase 2, the Structural Member Competition, focused on material technologies and required teams to create structural components. This phase was completed in 2017 with a $1.1 million prize purse.

For Phase 3, the On-Site Habitat Competition – which is the current phase of the competition – competitors were tasked with fabricating sub-scale versions of their habitats. This phase has five levels of competition, which consist of two virtual levels and three construction levels. For the former, the teams were required to use Building Information Modeling (BIM) software to design a habitat that combines all the structural requirements and systems it must contain.

For the construction levels, the teams will be required to autonomously fabricate 3D-printed elements of the habitat, culminating with a one-third-scale printed habitat for the final level. By the end of this phase, teams will be awarded prize money from a $2 million purse. As Monsi Roman, the program manager for NASA’s Centennial Challenges, said in a recent NASA press statement:

“We are thrilled to see the success of this diverse group of teams that have approached this competition in their own unique styles. They are not just designing structures, they are designing habitats that will allow our space explorers to live and work on other planets. We are excited to see their designs come to life as the competition moves forward.”

The winning entries included Team Zopherus’ concept for a modular habitat that was inspired by biological structures here on Earth. The building process begins with a lander (which is also a mobile print factory) reaching the surface and scanning the environment to find a good “print area”. It then walks over this area and deploys rovers to gather materials, then seals itself to the ground to provide a pressurized print environment.

The main module is then assembled using pre-fabricated components (like airlocks, windows, atmospheric controls, toilets, sinks, etc.), and the structure is printed around it. The printer then walks itself to an adjacent location and prints another module using the same method. In time, a number of habitats are connected to the main module, providing spaces for living, recreation, food production, scientific studies, and other activities.

For their concept, the second-place team (AI. SpaceFactory) selected a vertically-oriented cylinder as the most efficient shape for their Marsha habitat. According to the team, this design is not only the ideal pressure environment, but also maximizes the amount of usable space, allows the structure to be vertically divided based on activities, is well-suited to 3D printing, and takes up less surface space.

The team also designed their habitat to deal with the significant temperature changes on Mars. Their solution was to design the entire structure as a flanged shell that moves on sliding bearings at its foundation in response to temperature changes. The structure is also a double shell, with the outer (pressure) shell entirely separate from the inner habitat. This optimizes air flow and allows light to filter into the entire habitat.

Next up is the Kahn-Yates habitat, which the team designed to be specifically suited to withstand the dust storms and harsh climate of the Red Planet. This coral-like dome consists of a lander that would set down in the equatorial region, then print a foundation and footing layer using local materials. The print arm would then transition vertically to begin printing the shell and the floors.

The outer shell, which is separate from the core, is studded with windows that allow for a well-lit environment, and the shape of the structure is designed to ensure that dust storms flow around it. In fourth place was SEArch+/Apis Cor’s Mars X House, a habitat designed to provide maximum radiation protection while also ensuring natural light and connections to the Martian landscape.

The habitat is constructed by mobile robotic printers, which are deployed from a Hercules Single-Stage Reusable Lander. The design is inspired by Nordic architecture, and uses “light scoops” and floor-level viewing apertures to ensure that sunlight in the northern latitudes makes it into the interior. The two outer (and overlapping) shells house the living areas, which consist of two inflatable spaces with transparent, CO2-inflated window pockets.

Fifth place went to the team from Northwestern University for their Martian 3Design habitat, which consists of a closed inner spherical shell and an outer parabolic dome. According to the team, this habitat provides protection from the Martian elements through three design features. The first is the internal shape of the structure, which consists of a circular foundation, an inflatable pressure vessel that serves as the main living area, and the outer shell.

The second feature is the entryway system, which extends from opposite ends of the structure, serves as both entrance and exit, and could provide junctions with future pods. The third feature is the cross-beams, which form the structural backbone of the dome, are optimized for pressure-loading under Martian gravity and atmospheric conditions, and provide continuous protection from radiation and the elements.

The interior layout is based on the NASA Hawai’i Space Exploration Analog and Simulation (HI-SEAS) habitat, and is divided between “wet areas” and “dry areas”. These areas are placed on opposite sides of the habitat to optimize the use of resources by concentrating them on one side (rather than having them run throughout the habitat), and the space is also divided by a central, retractable wall that separates the interior into public and private areas.

Together, these concepts embody the aim of the 3D-Printed Habitat Centennial Challenge, which is to harness the talents of citizen inventors to develop the technologies necessary to build sustainable shelters that will one day allow humans to live on the Moon, Mars and beyond. As Lex Akers, dean of the Caterpillar College of Engineering and Technology at Bradley University, said of the competition:

“We are encouraging a wide range of people to come up with innovative designs for how they envision a habitat on Mars. The virtual levels allow teams from high schools, universities and businesses that might not have access to large 3D printers to still be a part of the competition because they can team up with those who do have access to such machinery for the final level of the competition.”

Carrying on in the tradition of the Centennial Prizes, NASA is seeking public engagement with this competition to promote interest in space exploration and address future challenges. It also seeks to leverage new technologies in order to solve the many engineering, technical and logistical problems presented by space travel. Someday, if and when human beings are living on the Moon, Mars, and other locations in the Solar System, the habitats they call home could very well be the work of students, citizen inventors and space enthusiasts.

For more information on the 3D-Printed Habitat Challenge, check out the competition web page.

Further Reading: NASA

The Coldest Place in Space Has Been Created. Next Challenge, Coldest Place in the Universe

This series of graphs show the changing density of a cloud of atoms as it is cooled to lower and lower temperatures (going from left to right) approaching absolute zero. Credit: NASA/JPL-Caltech

Despite decades of ongoing research, scientists are still trying to understand how the four fundamental forces of the Universe fit together. Whereas quantum mechanics can explain how three of these forces work together on the smallest of scales (electromagnetism, the weak and strong nuclear forces), General Relativity explains how things behave on the largest of scales (i.e. gravity). In this respect, gravity remains the holdout.

To understand how gravity interacts with matter on the tiniest of scales, scientists have developed some truly cutting-edge experiments. One of these is NASA’s Cold Atom Laboratory (CAL), located aboard the ISS, which recently achieved a milestone by creating clouds of atoms known as Bose-Einstein condensates (BECs). This was the first time that BECs have been created in orbit, and offers new opportunities to probe the laws of physics.

Originally predicted by Satyendra Nath Bose and Albert Einstein over 90 years ago, BECs are essentially ultracold atoms that reach temperatures just above absolute zero, the point at which atoms should (in theory) stop moving entirely. These particles are long-lived and precisely controlled, which makes them an ideal platform for studying quantum phenomena.

The Cold Atom Laboratory (CAL), which consists of two standardized containers that will be installed on the International Space Station. Credit: NASA/JPL-Caltech/Tyler Winn

This is the purpose of the CAL facility: to study ultracold quantum gases in a microgravity environment. The laboratory was installed in the US Science Lab aboard the ISS in late May and is the first of its kind in space. It is designed to advance scientists’ ability to make precision measurements of gravity and to study how it interacts with matter at the smallest of scales.

As Robert Thompson, the CAL project scientist and a physicist at NASA’s Jet Propulsion Laboratory, explained in a recent press release:

“Having a BEC experiment operating on the space station is a dream come true. It’s been a long, hard road to get here, but completely worth the struggle, because there’s so much we’re going to be able to do with this facility.”

About two weeks ago, CAL scientists confirmed that the facility had produced BECs from atoms of rubidium – a soft, silvery-white metallic element in the alkali group. According to their report, they had reached temperatures as low as 100 nanokelvin, one ten-millionth of a kelvin above absolute zero (-273 °C; -459 °F). This is roughly 3 K colder than the average temperature of space, which is about -270 °C (-454 °F).

Because of their unique behavior, BECs are characterized as a fifth state of matter, distinct from gases, liquids, solids and plasma. In BECs, atoms act more like waves than particles on the macroscopic scale, whereas this behavior is usually only observable on the microscopic scale. In addition, the atoms all assume their lowest energy state and take on the same wave identity, making them indistinguishable from one another.
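
The temperature at which condensation sets in can be estimated from the textbook formula for an ideal Bose gas. The sketch below applies it to rubidium-87 at an assumed atom density typical of such experiments – an illustrative figure, not a value reported by the CAL team:

```python
import math

HBAR = 1.0546e-34   # reduced Planck constant (J s)
K_B = 1.3806e-23    # Boltzmann constant (J/K)
ZETA_3_2 = 2.612    # Riemann zeta(3/2)

def bec_critical_temp(mass_kg, density_per_m3):
    """Ideal-gas BEC transition: T_c = (2*pi*hbar^2/(m*k_B)) * (n/zeta(3/2))^(2/3)."""
    return (2 * math.pi * HBAR**2 / (mass_kg * K_B)) * (density_per_m3 / ZETA_3_2) ** (2 / 3)

m_rb87 = 87 * 1.6605e-27  # mass of a rubidium-87 atom (kg)
n = 1e20                  # assumed atom density (m^-3)
print(f"T_c ~ {bec_critical_temp(m_rb87, n) * 1e9:.0f} nK")  # a few hundred nK
```

At densities of this order, the gas must be chilled to a few hundred nanokelvin before a condensate forms, which puts the 100 nK figure reported by the CAL team in context.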

The “physics package” inside the Cold Atom Lab, where ultracold clouds of atoms called Bose-Einstein condensates are produced. Credit: NASA/JPL-Caltech/Tyler Winn

In short, the atom clouds begin to behave like a single “super atom” rather than individual atoms, which makes them easier to study. The first BECs were produced in the lab in 1995 by Eric Cornell, Carl Wieman and Wolfgang Ketterle, who shared the 2001 Nobel Prize in Physics for the accomplishment. Since that time, hundreds of BEC experiments have been conducted on Earth, and some have even been sent into space aboard sounding rockets.

But the CAL facility is unique in that it is the first of its kind on the ISS, where scientists can conduct daily studies over long periods. The facility consists of two standardized containers: the larger “quad locker” and the smaller “single locker”. The quad locker contains CAL’s physics package, the compartment where CAL produces its clouds of ultra-cold atoms.

This is done by using magnetic fields or focused lasers to create frictionless containers known as “atom traps”. As the atom cloud decompresses inside the atom trap, its temperature naturally drops, getting colder the longer it remains in the trap. On Earth, when these traps are turned off, gravity causes the atoms to begin moving again, which means they can only be studied for fractions of a second.

Aboard the ISS, which is a microgravity environment, BECs can decompress to colder temperatures than is possible with any instrument on Earth, and scientists are able to observe individual BECs for five to ten seconds at a time and repeat these measurements for up to six hours per day. And since the facility is controlled remotely from the Earth Orbiting Missions Operation Center at JPL, day-to-day operations require no intervention from astronauts aboard the station.

JPL scientists and members of the Cold Atom Lab’s atomic physics team (left to right) David Aveline, Ethan Elliott and Jason Williams. Credit: NASA/JPL-Caltech

Robert Shotwell, the chief engineer of JPL’s astronomy and physics directorate, has overseen the project since February 2017. As he indicated in a recent NASA press release:

“CAL is an extremely complicated instrument. Typically, BEC experiments involve enough equipment to fill a room and require near-constant monitoring by scientists, whereas CAL is about the size of a small refrigerator and can be operated remotely from Earth. It was a struggle and required significant effort to overcome all the hurdles necessary to produce the sophisticated facility that’s operating on the space station today.”

Looking ahead, the CAL scientists want to go even further and achieve temperatures lower than anything achieved on Earth. In addition to rubidium, the CAL team is also working towards making BECs using two different isotopes of potassium atoms. At the moment, CAL is still in a commissioning phase, which consists of the operations team conducting a long series of tests to see how the CAL facility will operate in microgravity.

However, once it is up and running, five science groups – including groups led by Cornell and Ketterle – will conduct experiments at the facility during its first year. The science phase is expected to begin in early September and will last three years. As Kamal Oudrhiri, JPL’s mission manager for CAL, put it:

“There is a globe-spanning team of scientists ready and excited to use this facility. The diverse range of experiments they plan to perform means there are many techniques for manipulating and cooling the atoms that we need to adapt for microgravity, before we turn the instrument over to the principal investigators to begin science operations.”

Given time, the Cold Atom Lab (CAL) may help scientists to understand how gravity works on the tiniest of scales. Combined with high-energy experiments conducted by CERN and other particle physics laboratories around the world, this could eventually lead to a Theory of Everything (ToE) and a complete understanding of how the Universe works.

And be sure to check out this cool video (no pun intended!) of the CAL facility as well, courtesy of NASA:

Further Reading: NASA

Look at This Adorable Pen-Sized Booster, Perfect for Tiny Satellites

The Fenix propulsion system, a concept for a CubeSat booster developed by Italian tech company D-Orbit. Credit: D-Orbit

When it comes to space exploration, the motto “keep it simple” isn’t always followed! For the most part, satellites, spacecraft, telescopes, and the many other technologies that allow humans to study and explore the Universe are the result of highly-technical and complex feats of engineering. But sometimes, it is the simplest ideas that offer the most innovative solutions.

This is especially true when it comes to today’s space agencies, which are concerned with cutting costs and increasing accessibility to space. A good example is the Fenix propulsion system, a proposal created by the Italian tech company D-Orbit. Entered in last year’s Space Exploration Masters, this pen-sized booster will allow CubeSats to maneuver and accomplish more in space.

The Space Exploration Masters, which the European Space Agency (ESA) initiated in 2017, seeks to encourage space-based innovation and provide opportunities for commercial development. As such, this annual competition has become central to the implementation of the ESA Space Exploration strategy. For their application last year, D-Orbit was jointly awarded the ESA and Space Application Services prize.

The Fenix propulsion system, as it would be fitted to a CubeSat. Credit: D-Orbit

The thruster prototype itself measures only 10 cm long and 2 cm wide (~4 by 0.8 inches) and contains solid propellant that is triggered by a simple electrical ignition system. The boosters are designed to be placed at each corner of a 10 x 10 x 10 cm CubeSat, or can be doubled up for added thrust. Thanks to their light weight and compact size, they do not take up much instrument space or add significantly to a CubeSat’s mass.

Currently, CubeSats are deployed directly into space and have no means of changing their orbits, simply decaying out of orbit at the end of their missions. But with this simple, chemical-propellant thruster, CubeSats could function for longer periods and take on more complicated missions. For instance, if they can maneuver in orbit, they will be able to study the Moon and asteroids from different angles.

In addition, boosters will allow CubeSats to deorbit themselves once they have finished their missions, thus reducing the threat of space debris. According to the latest report from the Space Debris Office at the European Space Operations Center (ESOC), an estimated 19,894 bits of space junk were circling our planet by the end of 2017, with a combined mass of at least 8135 metric tons (8967 US tons). This problem is only expected to get worse.
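
The velocity change needed for such a deorbit maneuver is modest, which is part of what makes a pen-sized solid booster plausible. As a rough sketch using the vis-viva equation (the article gives neither the Fenix thruster’s actual performance nor a target orbit, so the altitudes here are assumptions):

```python
import math

MU = 3.986e14       # Earth's gravitational parameter (m^3/s^2)
R_EARTH = 6.371e6   # Earth's mean radius (m)

def deorbit_dv(alt_circular_m, alt_perigee_m):
    """Delta-v (m/s) of a single retrograde burn that lowers perigee from a circular orbit."""
    r1 = R_EARTH + alt_circular_m
    r2 = R_EARTH + alt_perigee_m
    v_circ = math.sqrt(MU / r1)                        # speed in the circular orbit
    v_apo = math.sqrt(MU * (2 / r1 - 2 / (r1 + r2)))   # speed at apogee of the transfer ellipse
    return v_circ - v_apo

# Assumed: drop perigee from a 500 km circular orbit to 100 km.
print(f"Deorbit burn: ~{deorbit_dv(500e3, 100e3):.0f} m/s")  # ~115 m/s
```

A burn on the order of 100 m/s is enough to drop a small satellite’s perigee into the upper atmosphere, where drag finishes the job.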

In fact, the small satellite market is estimated to grow by $5.3 billion in the next decade (according to Space Works and Eurostat), and many private companies are looking to provide regular launch services to accommodate that growth. As such, a propulsion system that not only presents opportunities to do more with CubeSats, but does so in a way that will not add to the problem of space debris, will be highly sought after.

Artist’s impression of a series of CubeSats orbiting Earth. Credit: ESA/Medialab

In addition to the ESA and Space Application Services prize, D-Orbit won a four-month ticket to test their prototype on the newly-installed ICE Cubes facility, which is located in the Columbus module aboard the International Space Station. This facility is the first European commercial research center to operate aboard the ISS, and the D-Orbit team will use it to test the booster’s safe ignition mechanism inside an ICE Cubes experiment.

This experiment, which will not involve firing the actual propulsion system, will help ensure that the booster can operate safely and effectively in space. Sensors and cameras will record the sparks, triggered by an electrical impulse, while the team relies on the ICE Cubes facility’s dedicated control center to provide remote viewing opportunities from the ground.

The Fenix boosters are set to launch for the ISS by the end of next year and, if successful, D-Orbit will likely secure permission to test their propulsion system in space. And if all goes well, future generations of CubeSats – which have already made Low Earth Orbit (LEO) accessible to private companies and research institutes – will be capable of performing far more tasks in orbit.

For this year’s Space Exploration Masters, the ESA is partnering with the United Nations World Health Organization (WHO) to address health and food. For the main challenge, participants will be tasked with coming up with applications that promote nutritious food and food security, both on- and off-planet. Among other challenges, this year’s SEM will also be looking for ideas that make missions more sustainable and for new ways to use future spacecraft.

For more information on this year’s Space Exploration Masters, check out the ESA website page.

Further Reading: ESA

Uh oh, Mars Doesn’t Have Enough Carbon Dioxide to be Terraformed

Artist's conception of a terraformed Mars. Credit: Ittiz/Wikimedia Commons

For almost a century now, the concept of terraforming has been explored at length by both science fiction writers and scientists alike. Much like setting foot on another planet or traveling to the nearest star, the idea of altering an uninhabitable planet to make it suitable for humans is a dream many hope to see accomplished someday. At present, much of that hope and speculation is aimed at our neighboring planet, Mars.

But is it actually possible to terraform Mars using our current technology? According to a new NASA-sponsored study by a pair of scientists who have worked on many NASA missions, the answer is no. Put simply, they argue that there is not enough carbon dioxide gas (CO2) that could practically be put back into Mars’ atmosphere in order to warm Mars, a crucial step in any proposed terraforming process.

The study, titled “Inventory of CO2 available for terraforming Mars”, recently appeared in the journal Nature Astronomy. The study was conducted by Bruce Jakosky – a professor of geological sciences and the associate director of the Laboratory for Atmospheric and Space Physics (LASP) at the University of Colorado, Boulder – and Christopher S. Edwards, an assistant professor of planetary science at Northern Arizona University and the leader of the Edwards Research Group.

The study was supported in part by NASA through the Mars Atmospheric and Volatile EvolutioN (MAVEN) and Mars Odyssey THEMIS (Thermal Emission Imaging System) projects. Whereas Professor Jakosky was the Principal Investigator on the MAVEN mission, Professor Edwards is a participating scientist on the Mars Science Laboratory Curiosity Rover (MSL), and worked on the Mars Odyssey THEMIS mission (among other Mars missions).

As we explored in a previous article, “How Do We Terraform Mars?”, many methods have been suggested for turning the Red Planet green. Many of these methods call for warming the surface in order to melt the polar ice caps, which would release an abundant amount of CO2 to thicken the atmosphere and trigger a greenhouse effect. This would in turn cause additional CO2 to be released from the soil and minerals, reinforcing the cycle further.

According to many proposals, this would be followed by the introduction of photosynthetic organisms such as cyanobacteria, which would slowly convert the atmospheric CO2 into oxygen gas and elemental carbon. This very method was suggested in a 1976 NASA study, titled “On the Habitability of Mars: An Approach to Planetary Ecosynthesis”. Since that time, multiple studies and even student teams have proposed using cyanobacteria to terraform Mars.

However, after conducting their analysis, Professors Jakosky and Edwards concluded that triggering a greenhouse effect on Mars would not be as simple as all that. For the sake of their study, Jakosky and Edwards relied on about 20 years of data accumulated by multiple spacecraft observations of Mars. As Edwards indicated in a recent NASA press release:

“These data have provided substantial new information on the history of easily vaporized (volatile) materials like CO2 and H2O on the planet, the abundance of volatiles locked up on and below the surface, and the loss of gas from the atmosphere to space.”

Scientists were able to gauge the rate of water loss on Mars by measuring the ratio of water and HDO from today and 4.3 billion years ago. Credit: Kevin Gill

To determine if Mars has enough gases for a greenhouse effect, Jakosky and Edwards analyzed data from NASA’s Mars Reconnaissance Orbiter (MRO) and Mars Odyssey spacecraft to measure the abundance of carbon-bearing minerals in Martian soil and of CO2 in the polar ice caps. They then used data from NASA’s MAVEN mission to estimate the loss of the Martian atmosphere to space. As Prof. Jakosky explained:

“Carbon dioxide (CO2) and water vapor (H2O) are the only greenhouse gases that are likely to be present on Mars in sufficient abundance to provide any significant greenhouse warming… Our results suggest that there is not enough CO2 remaining on Mars to provide significant greenhouse warming were the gas to be put into the atmosphere; in addition, most of the CO2 gas is not accessible and could not be readily mobilized. As a result, terraforming Mars is not possible using present-day technology.”

Although Mars has significant quantities of water ice, previous analyses have shown that water vapor would not be able to sustain a greenhouse effect by itself. In essence, the planet is too cold and the atmosphere too thin for the water to remain in a vaporous or liquid state for very long. According to the team, this means that significant warming would need to take place involving CO2 first.

However, Mars’ atmospheric pressure averages about 0.636 kPa, which is the equivalent of about 0.6% of Earth’s air pressure at sea level. Since Mars is also roughly 52% further away from the Sun than Earth (1.523 AU compared to 1 AU), researchers estimate that a CO2 pressure similar to Earth’s total atmospheric pressure would be needed to raise temperatures enough to allow water to exist in a liquid state.

Artist’s rendering of a solar storm hitting Mars and stripping ions from the planet’s upper atmosphere. Credits: NASA/GSFC

According to the team’s analysis, melting the polar ice caps (which is the most accessible source of carbon dioxide) would only contribute enough CO2 to double the Martian atmospheric pressure to 1.2% that of Earth’s. Another source is the dust particles in Martian soil, which the researchers estimate would provide up to 4% of the needed pressure. Other possible sources of carbon dioxide are those that are locked in mineral deposits and water-ice molecule structures known as “clathrates”.

However, using the recent NASA spacecraft observations of mineral deposits, Jakosky and Edwards estimate that these would likely yield less than 5% of the required pressure each. What’s more, accessing even the minerals closest to the surface would require significant strip mining, and accessing all the CO2 attached to dust particles would require strip mining the entire planet to a depth of around 90 meters (100 yards).
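
Tallying the reservoirs cited above makes the shortfall plain. The sketch below simply sums them, expressing each as a percentage of Earth’s sea-level pressure and taking the “less than 5%” figures at their upper bounds:

```python
# CO2 sources on Mars and the pressure each could contribute, expressed as
# a percentage of Earth's sea-level pressure (figures as cited in the study).
current_atmosphere = 0.6
reservoirs = {
    "polar ice caps": 0.6,       # doubles the atmosphere to 1.2%
    "soil dust particles": 4.0,  # requires strip mining the planet to ~90 m depth
    "mineral deposits": 5.0,     # upper bound; "likely less than 5%"
    "clathrates": 5.0,           # upper bound; "likely less than 5%"
}

accessible = current_atmosphere + sum(reservoirs.values())
print(f"Accessible CO2: ~{accessible:.0f}% of the ~100% required")  # ~15%
```

Even with these generous assumptions, the accessible inventory tops out around 15% of the pressure required, far short of what a meaningful greenhouse effect would demand.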

Accessing carbon-bearing minerals deep in the Martian crust could be a possible solution, but the depth of these deposits is currently unknown. In addition, recovering them with current technology would be incredibly expensive and energy-intensive, making extraction highly impractical. Other methods have been suggested, however, which include importing fluorine-based compounds and volatiles like ammonia.

The former was proposed in 1984 by James Lovelock and Michael Allaby in their book, The Greening of Mars. In it, Lovelock and Allaby described how Mars could be warmed by importing chlorofluorocarbons (CFCs) to trigger global warming. While very effective at triggering a greenhouse effect, these compounds are short-lived and would need to be introduced in significant amounts (which is why the team did not consider them).

NASA’s MAVEN spacecraft is depicted in orbit around an artistic rendition of planet Mars, which is shown in transition from its ancient, water-covered past, to the cold, dry, dusty world that it has become today. Credit: NASA

The idea of importing volatiles like ammonia is an even more time-honored concept, and was proposed by Dandridge M. Cole and Donald Cox in their 1964 book, “Islands in Space: The Challenge of the Planetoids, the Pioneering Work”. Here, Cole and Cox indicated how ammonia ices could be transported from the outer Solar System (in the form of iceteroids and comets) and then impacted on the surface.

However, Jakosky and Edwards’ calculations reveal that many thousands of these icy objects would be required, and the sheer distance involved in transporting them make this an impractical solution using today’s technology. Last, but not least, the team considered how atmospheric loss could be prevented (which could be done using a magnetic shield). This would allow for the atmosphere to build up naturally due to outgassing and geologic activity.

Unfortunately, the team estimates that at the current rate at which outgassing occurs, it would take about 10 million years just to double Mars’ current atmosphere. In the end, it appears that any effort to terraform Mars will have to wait for the development of future technologies and more practical methods.

These technologies would most likely involve more cost-effective means for conducting deep-space missions, like nuclear-thermal or nuclear-electric propulsion. The establishment of permanent outposts on Mars would also be an important first step, which could be dedicated to thickening the atmosphere by producing greenhouse gases – something humans have already proven to be very good at here on Earth!

Project Nomad, a concept for terraforming Mars using mobile, factory-skyscrapers from the 2013 Skyscraper Competition. Credit: evolo.com/Antonio Ares Sainz, Joaquin Rodriguez Nuñez, Konstantino Tousidonis Rial

There’s also the possibility of importing methane gas – another super-greenhouse gas, and one that is indigenous to Mars – from the outer Solar System. While it constitutes only a tiny percentage of the atmosphere, significant plumes have been detected in the past during the summer months. This includes the “tenfold spike” detected by the Curiosity rover in 2014, which pointed to a subterranean source. If these sources could be mined, methane gas might not even need to be imported.

For some time, scientists have known that Mars was not always the cold, dry, and inhospitable place that it is today. As evidenced by the presence of dry riverbeds and mineral deposits that only form in the presence of liquid water, scientists have concluded that billions of years ago, Mars was a warmer, wetter place. However, between 4.2 and 3.7 billion years ago, Mars’ atmosphere was slowly stripped away by the solar wind.

This discovery has led to renewed interest in the colonizing and terraforming of Mars. And while transforming the Red Planet to make it suitable for human needs may not be doable in the near-future, it may be possible to get the process started in just a few decades’ time. It may not happen in our lifetime, but that does not mean that the dream of one-day making “Earth’s Twin” truly live up to its name won’t come true.

Further Reading: NASA

With All These New Planets Found in the Habitable Zone, Maybe it’s Time to Fine Tune the Habitable Zone

Artist’s impression of how an Earth-like exoplanet might look. Credit: ESO.

In the past few decades, thousands of extra-solar planets have been discovered within our galaxy. As of July 28th, 2018, a total of 3,374 extra-solar planets have been confirmed in 2,814 planetary systems. While the majority of these planets have been gas giants, an increasing number have been terrestrial (i.e. rocky) in nature and were found to be orbiting within their stars’ respective habitable zones (HZ).

However, as the case of the Solar System shows, HZs do not necessarily mean a planet can support life. Even though Venus and Mars are at the inner and outer edges of the Sun’s HZ (respectively), neither is capable of supporting life on its surface. And with more potentially-habitable planets being discovered all the time, a new study suggests that it might be time to refine our definition of habitable zones.

The study, titled “A more comprehensive habitable zone for finding life on other planets”, recently appeared online. The study was conducted by Dr. Ramses M. Ramirez, a research scientist with the Earth-Life Science Institute at the Tokyo Institute of Technology. For years, Dr. Ramirez has been involved in the study of potentially-habitable worlds and built climate models to assess the processes that make planets habitable.

A diagram depicting the Habitable Zone (HZ) boundaries, and how the boundaries are affected by star type. Credit: Wikipedia Commons/Chester Harman

As Dr. Ramirez indicated in his study, the most generic definition of a habitable zone is the circular region around a star where surface temperatures on an orbiting body would be sufficient to maintain water in a liquid state. However, this alone does not mean a planet is habitable, and additional considerations need to be taken into account to determine if life could truly exist there. As Dr. Ramirez told Universe Today via email:

“The most popular incarnation of the HZ is the classical HZ. This classical definition assumes that the most important greenhouse gases in potentially habitable planets are carbon dioxide and water vapor. It also assumes that habitability on such planets is sustained by the carbonate-silicate cycle, as is the case for the Earth. On our planet, the carbonate-silicate cycle is powered by plate tectonics.

“The carbonate-silicate cycle regulates the transfer of carbon dioxide between the atmosphere, surface, and interior of the Earth. It acts as a planetary thermostat over long timescales and ensures that there is not too much CO2 in the atmosphere (the planet gets too hot) or too little (the planet gets too cold). The classical HZ also (typically) assumes that habitable planets possess total water inventories (e.g. total water in the oceans and seas) similar in size to that on the Earth.”
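
For a sense of how the classical HZ is computed, its edges scale with the square root of a star’s luminosity. The sketch below uses approximate effective-flux limits for the runaway- and maximum-greenhouse boundaries, in the spirit of published climate-model estimates – illustrative values, not figures from Dr. Ramirez’s paper:

```python
import math

# Approximate stellar flux (in units of the flux Earth receives) at the
# classical HZ edges; treat these as illustrative assumptions.
S_EFF = {
    "runaway greenhouse (inner edge)": 1.11,
    "maximum greenhouse (outer edge)": 0.36,
}

def hz_boundary_au(luminosity_solar, s_eff):
    """Distance (AU) at which a star of given luminosity delivers s_eff Earth-fluxes."""
    return math.sqrt(luminosity_solar / s_eff)

for edge, s in S_EFF.items():
    print(f"{edge}: {hz_boundary_au(1.0, s):.2f} AU")
# For a Sun-like star: inner edge ~0.95 AU, outer edge ~1.67 AU.
```

Under these limits, Venus (0.72 AU) lies interior to the inner edge while Mars (1.52 AU) sits within the outer one – yet neither is habitable, underscoring the point that HZ membership alone says little about actual habitability.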

This is what can be referred to as the “low-hanging fruit” approach, where scientists have looked for signs of habitability based on what we as humans are most familiar with. Given that the only example we have of habitability is planet Earth, exoplanet studies have been focused on finding planets that are “Earth-like” in composition (i.e. rocky), orbit, and size.

Diagram showing GJ 625’s habitable zone in comparison to the Sun’s. Credit: IAC

However, in recent years this definition has come to be challenged by newer studies. As exoplanet research has moved away from merely detecting and confirming the existence of bodies around other stars and moved into characterization, newer formulations of HZs have emerged that have attempted to capture the diversity of potentially-habitable worlds.

As Dr. Ramirez explained, these newer formulations have complemented traditional notions of HZs by considering that habitable planets may have different atmospheric compositions:

“For instance, they consider the influence of additional greenhouses gases, like CH4 and H2, both of which have been considered important for early conditions on both Earth and Mars. The addition of these gases makes the habitable zone wider than what would be predicted by the classical HZ definition. This is great, because planets thought to be outside the HZ, like TRAPPIST-1h, may now be within it. It has also been argued that planets with dense CO2-CH4 atmospheres near the outer edge of the HZ of hotter stars may be inhabited because it is hard to sustain such atmospheres without the presence of life.”

One such study was conducted by Dr. Ramirez and Lisa Kaltenegger, an associate professor with the Carl Sagan Institute at Cornell University. According to a paper they produced in 2017, which appeared in the Astrophysical Journal Letters, exoplanet-hunters could find planets that would one day become habitable based on the presence of volcanic activity – which would be discernible through the presence of hydrogen gas (H2) in their atmospheres.

Stellar temperature versus distance from the star compared to Earth for the classic habitable zone (shaded blue) and the volcanic habitable zone extension (shaded red). Credit: R. Ramirez, Carl Sagan Institute, Cornell

This theory is a natural extension of the search for “Earth-like” conditions, which considers that Earth’s atmosphere was not always as it is today. Basically, planetary scientists theorize that billions of years ago, Earth’s early atmosphere had an abundant supply of hydrogen gas (H2) due to volcanic outgassing, and that interactions between hydrogen and nitrogen molecules in this atmosphere kept the Earth warm long enough for life to develop.

In Earth’s case, this hydrogen eventually escaped into space, which is believed to be the case for all terrestrial planets. However, on a planet where there is sufficient levels of volcanic activity, the presence of hydrogen gas in the atmosphere could be maintained, thus allowing for a greenhouse effect that would keep their surfaces warm. In this respect, the presence of hydrogen gas in a planet’s atmosphere could extend a star’s HZ.

According to Ramirez, there is also the factor of time, which is not typically taken into account when assessing HZs. In short, stars evolve over time and put out varying levels of radiation based on their age. This has the effect of altering where a star’s HZ reaches, which may not encompass a planet that is currently being studied. As Ramirez explained:

“[I]t has been shown that M-dwarfs (really cool stars) are so bright and hot when they first form that they can desiccate any young planets that are later determined to be in the classical HZ. This underscores the point that just because a planet is currently located in the habitable zone, it doesn’t mean that it is actually habitable (let alone inhabited). We should be able to watch out for these cases.”

Finally, there is the issue of what kinds of star systems astronomers have been observing in the hunt for exoplanets. Whereas many surveys have examined G-type yellow dwarf stars (which is what our Sun is), much research has lately focused on M-type (red dwarf) stars because of their longevity and the fact that they are believed to be the most likely place to find rocky planets orbiting within their stars’ HZs.

“Whereas most previous studies have focused on single star systems, recent work suggests that habitable planets may be found in binary star systems or even red giant or white dwarf systems, potentially habitable planets may also take the form of desert worlds or even ocean worlds that are much wetter than the Earth,” says Ramirez. “Such formulations not only greatly expand the parameter space of potentially habitable planets to search for, but they allow us to filter out the worlds that are most (and least) likely to host life.”

In the end, this study shows that the classical HZ is not the only tool that can be used to assess the possibility of extra-terrestrial life. As such, Ramirez recommends that in the future, astronomers and exoplanet-hunters supplement the classical HZ with the additional considerations raised by these newer formulations. In so doing, they just may maximize their chances of finding life someday.

“I recommend that scientists pay real special attention to the early stages of planetary systems because that helps determine the likelihood that a planet that is currently located in the present day habitable zone is actually worth studying further for more evidence of life,” he said. “I also recommend that the various HZ definitions are used in conjunction so that we can best determine which planets are most likely to host life. That way we can rank these planets and determine which ones to spend most of our telescope time and energy on. Along the way we would also be testing how valid the HZ concept is, including determining how universal the carbonate-silicate cycle is on a cosmic scale.”

Further Reading: arXiv

Einstein Was Right… Again! Successful Test of General Relativity Near a Supermassive Black Hole

Artist’s impression of the path of the star S2 as it passes very close to the supermassive black hole at the centre of the Milky Way. Credit: ESO/M. Kornmesser

In 1915, Albert Einstein published his famous Theory of General Relativity, which provided a unified description of gravity as a geometric property of space and time. This theory gave rise to the modern theory of gravitation and revolutionized our understanding of physics. Even though a century has passed since then, scientists are still conducting experiments that confirm his theory’s predictions.

Thanks to recent observations made by a team of international astronomers (known as the GRAVITY collaboration), the effects of General Relativity have been revealed using a Supermassive Black Hole (SMBH) for the very first time. These findings were the culmination of a 26-year campaign of observations of the SMBH at the center of the Milky Way (Sagittarius A*) using the European Southern Observatory‘s (ESO) instruments.

The study that describes the team’s findings recently appeared in the journal Astronomy and Astrophysics, titled “Detection of the gravitational redshift in the orbit of the star S2 near the Galactic centre massive black hole”. The study was led by Roberto Abuter of the ESO and included members of the GRAVITY collaboration – which is led by Reinhard Genzel of the Max Planck Institute for Extraterrestrial Physics (MPE) and includes astronomers from multiple European universities and research institutes.

Annotated image of the path of the star S2 as it passes very close to the supermassive black hole at the center of the Milky Way. Credit: ESO/M. Kornmesser

For the sake of their study, the team relied on data gathered by the VLT’s extremely sensitive and high-precision instruments. These included the GRAVITY astrometric and interferometry instrument, the Spectrograph for INtegral Field Observations in the Near Infrared (SINFONI) instrument, and the Nasmyth Adaptive Optics System (NAOS) – Near-Infrared Imager and Spectrograph (CONICA) instrument, which are together known as NACO.

The new infrared observations collected by these instruments allowed the team to monitor one of the stars (S2) that orbits Sagittarius A* as it passed in front of the black hole – which took place in May of 2018. At the closest point in its orbit, the star was at a distance of less than 20 billion km (12.4 billion mi) from the black hole and was moving at a speed in excess of 25 million km/h (15 million mph) – almost three percent of the speed of light.

Whereas the SINFONI instrument was used to measure the velocity of S2 towards and away from Earth, the GRAVITY instrument in the VLT Interferometer (VLTI) made extraordinarily precise measurements of the changing position of S2 in order to define the shape of its orbit. The GRAVITY instrument then created the sharp images that revealed the motion of the star as it passed close to the black hole.

The team then compared the position and velocity measurements to previous observations of S2 using other instruments. They then compared these results with predictions made by Newton’s Law of Universal Gravitation, General Relativity, and other theories of gravity. As expected, the new results were consistent with the predictions made by Einstein over a century ago.

As Reinhard Genzel, who in addition to being the leader of the GRAVITY collaboration was a co-author on the paper, explained in a recent ESO press release:

“This is the second time that we have observed the close passage of S2 around the black hole in our galactic center. But this time, because of much improved instrumentation, we were able to observe the star with unprecedented resolution. We have been preparing intensely for this event over several years, as we wanted to make the most of this unique opportunity to observe general relativistic effects.”

When observed with the VLT’s new instruments, the team noted an effect called gravitational redshift, where the light coming from S2 changed color as it drew closer to the black hole. This was caused by the very strong gravitational field of the black hole, which stretched the wavelength of the star’s light, causing it to shift towards the red end of the spectrum.
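
The magnitude of this effect can be estimated from first principles. The sketch below uses the weak-field approximation, the commonly cited ~4 million solar masses for Sagittarius A*, and the closest-approach figures quoted above, combining the gravitational redshift with the time-dilation term from S2’s orbital speed:

```python
G = 6.674e-11      # gravitational constant (m^3 kg^-1 s^-2)
C = 2.998e8        # speed of light (m/s)
M_SUN = 1.989e30   # solar mass (kg)

m_bh = 4.1e6 * M_SUN      # mass of Sagittarius A* (commonly cited value)
r = 2.0e13                # closest approach: ~20 billion km, in meters
v = 25e6 * 1000 / 3600    # ~25 million km/h, converted to m/s

z_grav = G * m_bh / (r * C**2)  # gravitational redshift (weak-field approximation)
z_time = v**2 / (2 * C**2)      # transverse-Doppler (time-dilation) term
z = z_grav + z_time

print(f"Combined redshift z ~ {z:.1e}")                      # ~6e-4
print(f"Apparent velocity shift: ~{z * C / 1000:.0f} km/s")  # ~170 km/s
```

A shift of order 10^-4 – an apparent velocity offset of well over a hundred kilometers per second – is comfortably within reach of spectrographs like SINFONI, which is what made the measurement possible.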

The change in the wavelength of light from S2 agrees precisely with what Einstein’s field equations predicted. As Frank Eisenhauer – a researcher from the Max Planck Institute for Extraterrestrial Physics, the Principal Investigator of GRAVITY and the SINFONI spectrograph, and a co-author on the study – indicated:

“Our first observations of S2 with GRAVITY, about two years ago, already showed that we would have the ideal black hole laboratory. During the close passage, we could even detect the faint glow around the black hole on most of the images, which allowed us to precisely follow the star on its orbit, ultimately leading to the detection of the gravitational redshift in the spectrum of S2.”

Whereas other tests have been performed that have confirmed Einstein’s predictions, this is the first time that the effects of General Relativity have been observed in the motion of a star around a supermassive black hole. In this respect, Einstein has been proven right once again, this time in one of the most extreme laboratories to date! What’s more, the result confirmed that tests involving relativistic effects can provide consistent results over time and space.

“Here in the Solar System we can only test the laws of physics now and under certain circumstances,” said Françoise Delplancke, head of the System Engineering Department at ESO. “So it’s very important in astronomy to also check that those laws are still valid where the gravitational fields are very much stronger.”

In the near future, another relativistic test will be possible as S2 moves away from the black hole. This is known as the Schwarzschild precession, where the point of closest approach of the star’s orbit is expected to slowly rotate around the black hole. The GRAVITY Collaboration will be monitoring S2 to observe this effect as well, once again relying on the VLT’s very precise and sensitive instruments.
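For readers curious about the size of this effect, the expected advance of the orbit’s point of closest approach per revolution is given by the standard Schwarzschild precession formula – again a textbook result, not a value from the announcement:

$$ \Delta\phi \;=\; \frac{6\pi GM}{a\,(1 - e^{2})\,c^{2}} $$

where a is the orbit’s semi-major axis and e its eccentricity. For S2’s highly eccentric, roughly 16-year orbit around a ~4 million Solar mass black hole, this works out to an advance of roughly 0.2 degrees (about 12 arcminutes) per orbit.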

As Xavier Barcons (the ESO’s Director General) indicated, this accomplishment was made possible thanks to the spirit of international cooperation represented by the GRAVITY collaboration and the instruments they helped the ESO develop:

“ESO has worked with Reinhard Genzel and his team and collaborators in the ESO Member States for over a quarter of a century. It was a huge challenge to develop the uniquely powerful instruments needed to make these very delicate measurements and to deploy them at the VLT in Paranal. The discovery announced today is the very exciting result of a remarkable partnership.”

And be sure to check out this video of the GRAVITY Collaboration’s successful test, courtesy of the ESO:

Further Reading: ESO, Astronomy & Astrophysics

New Photos of Saturn and Mars from Hubble

This image shows the recent observations of the planets Mars and Saturn made with the NASA/ESA Hubble Space Telescope. Credit: NASA, ESA, STScI, M. Mutchler (STScI), A. Simon (GSFC) and the OPAL Team, J. DePasquale (STScI)

During the summer of 2018, the planets Mars and Saturn have (one after the other) been in opposition. In astronomical terms, opposition refers to when a planet is positioned on the opposite side of Earth from the Sun. This not only means that the planet is closer to Earth in its respective orbit, but also that it is fully lit by the Sun (as seen from Earth) and much more visible.

As a result, astronomers are able to observe these planets in greater detail. The Hubble Space Telescope took advantage of this situation to do what it has done best for the past twenty-eight years – capture some breathtaking images of both planets! Hubble made its observations of Saturn in June and of Mars in July, showing both planets close to their respective oppositions.
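To get a sense of how much difference opposition makes, consider a simplified sketch that assumes circular, coplanar orbits and rounded mean orbital radii:

```python
# Why opposition is the best time to observe an outer planet: assuming
# circular, coplanar orbits, the Earth-planet distance is smallest when the
# planet lies opposite the Sun in Earth's sky. Orbital radii are rounded.
A_EARTH, A_MARS, A_SATURN = 1.00, 1.52, 9.58   # mean orbital radii, AU

for name, a in [("Mars", A_MARS), ("Saturn", A_SATURN)]:
    d_opposition = a - A_EARTH     # planet and Earth on the same side of the Sun
    d_conjunction = a + A_EARTH    # planet on the far side of the Sun
    print(f"{name}: ~{d_opposition:.2f} AU at opposition vs "
          f"~{d_conjunction:.2f} AU at solar conjunction")
```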


Mars is 1000x Drier Than the Driest Places on Earth

Mosaic image of "Wdowiak Ridge", taken by NASA's Mars Exploration Rover Opportunity on Sept. 17th, 2014. Credit: NASA/JPL

For generations, many have dreamed about the day when it would be possible to set foot on Mars – aka “Earth’s Twin”. In the past few years, multiple orbiters, landers and rovers have revealed evidence of past water on Mars, as well as the possibility that water still exists underground. These findings have fueled the desire to send crewed missions to Mars, not to mention proposals to establish a colony there.

However, this enthusiasm may seem a little misguided when you consider all the challenges the Martian environment presents. In addition to being very cold and subject to high levels of radiation, the surface of Mars today is also extremely dry. According to a new study led by researchers from NASA’s Ames Research Center, Martian soil is roughly 1000 times drier than some of the driest regions on Earth.

The study, titled “Constraints on the Metabolic Activity of Microorganisms in Atacama Surface Soils Inferred from Refractory Biomarkers: Implications for Martian Habitability and Biomarker Detection”, recently appeared in the journal Astrobiology. It was led by members from NASA’s Ames Research Center and included researchers from the Georgia Institute of Technology, the Carl Sagan Center at the SETI Institute, the Centro de Astrobiologia (INTA-CSIC), the NASA Goddard Space Flight Center, and the Massachusetts Institute of Technology.

The Atacama Desert in northern Chile. Credit: NASA/Frank Tavares

For the sake of their study, the research team sought to determine if microorganisms can survive under the types of conditions present on Mars. To answer this question, the team traveled to the Atacama Desert in Chile, a 1000 km (620 mi) strip of land on South America’s west coast. With an average rainfall of just 1 to 3 mm (0.04 to 0.12 in) a year, the Atacama Desert is known as the driest nonpolar place in the world.

However, the Atacama desert is not uniformly dry, and experiences different levels of precipitation depending on the latitude. From the southern end to the northern end, annual precipitation shifts from a few millimeters of rain per year to only a few millimeters of rain per decade. This environment provides an opportunity to search for life at decreasing levels of precipitation, thus allowing researchers to place constraints on microorganism survivability.

It is at the northern end of the desert (in what is known as the Antofagasta region) where conditions become most Mars-like. Here, the average rainfall is just 1 mm a year, which has made it a popular destination for scientists looking to simulate a Martian environment. In addition to seeing if microbes could survive in these dry conditions, the team also sought to determine if they were capable of growth and reproduction.

As Mary Beth Wilhelm – an astrobiologist at NASA’s Ames Research Center and the Georgia Institute of Technology, and lead author of the new study – explained in a recent NASA press release:

“On Earth, we find evidence of microbial life everywhere. However, in extreme environments, it’s important to know whether a microbe is dormant and just barely surviving, or really alive and well… By learning if and how microbes stay alive in extremely dry regions on Earth, we hope to better understand if Mars once had microbial life and whether it could have survived until today.”

Researchers collect samples from the surface of the Atacama Desert in Chile, going a few centimeters into the ground. Credit: NASA Ames Research Center

After collecting soil samples from across the Atacama Desert and bringing them back to their lab at Ames, the research team performed tests to see if the microorganisms in the samples showed any indication of stress markers. These markers are a key way to show that life is actively growing, since organisms in a dormant state (i.e. those that are merely surviving) show no signs of them.

Specifically, they looked for changes in the lipid structure of the cells’ outer membranes, which typically become more rigid in response to stress. What they found was that in the less dry parts of the Atacama Desert, this stress marker was present; but strangely, the same marker was missing in the driest regions of the desert, where microbes would presumably be the most stressed.

Based on these and other results, the team concluded that there is a transition line for microorganisms in environments like the Atacama Desert. On one side of this line, the presence of minute amounts of water is enough for organisms to still be able to grow. On the other side, the environment is so dry that organisms can survive but will not grow and reproduce.

The team was also able to find evidence of microbes in the Atacama soil samples that had been dead for at least 10,000 years. They were able to determine this by examining the microbes’ amino acids – the building blocks of proteins – and the rate at which their structure changed. This find was rather surprising, seeing as it is extremely rare for the remnants of ancient life to be found preserved on the surface of Earth.
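The dating approach described here is broadly similar to amino-acid racemization dating, where the slow conversion of amino acids from one mirror-image form (L) to the other (D) acts as a clock. The sketch below illustrates the general idea under simple first-order kinetics; the rate constant is a made-up placeholder, and the team’s actual analysis may have differed in its details:

```python
import math

# Illustrative amino-acid racemization "clock": after death, the D/L ratio
# of an amino acid drifts from 0 towards equilibrium at a known rate.
# K_RATE below is a hypothetical placeholder, NOT a value from the study.
K_RATE = 5e-5   # assumed racemization rate constant, per year

def racemization_age(d_over_l, k=K_RATE):
    """Estimate age in years from a measured D/L ratio, assuming first-order
    reversible kinetics, equilibrium at D/L = 1, and D/L = 0 at death."""
    return math.log((1 + d_over_l) / (1 - d_over_l)) / (2 * k)

print(f"D/L = 0.3 -> age ~ {racemization_age(0.3):,.0f} years")
# -> roughly 6,000 years with this placeholder rate constant
```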

This artist’s concept depicts NASA’s Mars 2020 rover exploring Mars. Credit: NASA

Given that Mars is 1,000 times drier than even the driest parts of the Atacama, these results were not encouraging news for those hoping that microbial life will still be found there. However, the fact that the remnants of past microbial life were found in the driest areas of Chile’s desert – organisms which would have lived when conditions were wetter, and whose remains were well-preserved – is very good news when it comes to the search for past life on Mars.

Essentially, if microbial life did exist on Mars back when it was a warmer, wetter environment, traces of that ancient life might still exist. As Wilhelm explained:

“Before we go to Mars, we can use the Atacama like a natural laboratory and, based on our results, adjust our expectations for what we might find when we get there. Knowing the surface of Mars today might be too dry for life to grow, but that traces of microbes can last for thousands of years helps us design better instruments to not only search for life on and under the planet’s surface, but to try and unlock the secrets of its distant past.”

In the future, missions like NASA’s Mars 2020 rover will be seeking to procure samples of Martian soil. If NASA’s proposed “Journey to Mars” takes place by the 2030s as planned, these samples could then be returned to Earth for analysis. With luck, these soil samples will reveal evidence of past life and prove that Mars was once a habitable planet!

Further Reading: NASA