Juno Data Shows that Some of Jupiter’s Moons are Leaving “Footprints” in its Aurorae

The Juno Infrared Auroral Mapper (JIRAM) captured this infrared image of Jupiter's south pole. This part of Jupiter cannot be seen from Earth. Image: NASA/JPL-Caltech/SwRI/MSSS

Since it arrived in orbit around Jupiter in July of 2016, the Juno mission has been sending back vital information about the gas giant’s atmosphere, magnetic field and weather patterns. With every close pass – known as a perijove, which takes place every 53 days – the probe has revealed things about Jupiter that scientists will rely on to learn more about its formation and evolution.

Interestingly, some of the most recent information to come from the mission involves how two of its moons affect one of Jupiter’s most interesting atmospheric phenomena. As they revealed in a recent study, an international team of researchers discovered how Io and Ganymede leave “footprints” in the planet’s aurorae. These findings could help astronomers to better understand both the planet and its moons.

The study, titled “Juno observations of spot structures and a split tail in Io-induced aurorae on Jupiter”, recently appeared in the journal Science. The study was led by A. Mura of the National Institute for Astrophysics (INAF) and included members from NASA’s Goddard Space Flight Center, NASA’s Jet Propulsion Laboratory, the Italian Space Agency (ASI), the Southwest Research Institute (SwRI), the Johns Hopkins University Applied Physics Laboratory (JHUAPL), and multiple universities.

Infrared images obtained by the Juno probe, showing disturbances in Jupiter’s aurorae caused by Io and Ganymede. Credit: (c) Science (2018).

Much like aurorae here on Earth, Jupiter’s aurorae are produced in its upper atmosphere when high-energy electrons interact with the planet’s powerful magnetic field. However, as the Juno probe recently demonstrated using data gathered by its Ultraviolet Spectrograph (UVS) and Jovian Energetic Particle Detector (JEDI) instruments, Jupiter’s magnetic field is significantly more powerful than anything we see on Earth.

In addition to reaching power levels 10 to 30 times greater than what is experienced here on Earth (up to 400,000 electron volts), Jupiter’s northern and southern auroral storms also have oval-shaped disturbances that appear whenever Io and Ganymede pass close to the planet. As the researchers explain in their study:

“A northern and a southern main auroral oval are visible, surrounded by small emission features associated with the Galilean moons. We present infrared observations, obtained with the Juno spacecraft, showing that in the case of Io, this emission exhibits a swirling pattern that is similar in appearance to a von Kármán vortex street.”

A von Kármán vortex street, a concept in fluid dynamics, is a repeating pattern of swirling vortices caused by the flow of a fluid around a disturbance. In this case, the team found evidence of a vortex streaming for hundreds of kilometers when Io passed close to the planet, which then disappeared as the moon moved farther away.
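As a rough illustration of the fluid-dynamics concept (not a model of Jupiter’s aurorae), the rate at which vortices peel off behind an obstacle can be estimated with the Strouhal relation; the numbers below are generic terrestrial values chosen purely for illustration:

```python
# Vortex-shedding frequency behind a bluff body: f = St * U / d,
# where St is the Strouhal number (~0.2 for a cylinder over a wide
# range of Reynolds numbers), U the flow speed, and d the obstacle size.
# All values here are illustrative, not Jovian measurements.

def shedding_frequency(strouhal: float, flow_speed: float, diameter: float) -> float:
    """Return the vortex-shedding frequency in Hz."""
    return strouhal * flow_speed / diameter

# Example: a 1 m post in a 10 m/s wind sheds vortices at about 2 Hz.
f = shedding_frequency(0.2, 10.0, 1.0)
print(f)  # 2.0
```

The same alternating-vortex geometry appears at wildly different scales, from flag poles in the wind to cloud patterns behind islands, which is why the swirling auroral tail reminded the team of this pattern.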

Reconstructed view of Jupiter’s northern lights through the filters of the Juno Ultraviolet Imaging Spectrograph instrument on Dec. 11, 2016, as the Juno spacecraft approached Jupiter, passed over its poles, and plunged towards the equator. Credit: NASA/JPL-Caltech/Bertrand Bonfond

The team also found two spots in the auroral belt created by Ganymede, where the extended tail from the main auroral spots eventually split in two. While the team was not sure what causes this split, they venture that it could be caused by an interaction between Ganymede’s magnetic field and Jupiter’s (Ganymede being the only Jovian moon to have a magnetic field of its own).

These features, they claim, suggest that magnetic interactions between Jupiter and Ganymede are more complex than previously thought. They also note that neither footprint was where existing models predicted it would be, which suggests that models of the planet’s magnetic interactions with its moons may need revision.

Studying Jupiter’s magnetic storms is one of the primary goals of the Juno mission, as is learning more about the planet’s interior structure and how it has evolved over time. In so doing, astronomers hope to learn more about how the Solar System came to be. NASA also recently extended the mission to 2021, giving it three more years to gather data on these mysteries.

And be sure to enjoy this video of the Juno mission, courtesy of the Jet Propulsion Laboratory:

Further Reading: phys.org, Science

Cassini’s “Grande Finale” Earns an Emmy Nomination!

An artist's illustration of the Cassini probe's Grand Finale. Image: NASA/JPL/CalTech

In 1997, the NASA/ESA Cassini-Huygens mission launched from Earth and began its long journey towards the Saturn system. In 2004, the Cassini orbiter arrived around Saturn and would spend the next thirteen years studying the gas giant, its rings, and its system of moons. On September 15th, 2017, the mission ended when the probe entered Saturn’s upper atmosphere and burned up.

This was known as Cassini’s “Grand Finale”, which began with the probe plunging into the unexplored region that lies between Saturn’s atmosphere and its rings and culminated with live coverage of it entering the atmosphere. In honor of the mission and NASA’s outstanding coverage of its final months, NASA was recently nominated for an Emmy Award by The Academy of Television Arts & Sciences.

The award is in the category of Outstanding Original Interactive Program, which recognizes the JPL’s multi-month digital campaign that celebrated the mission’s science and engineering accomplishments – which included news, web, education, television and social media efforts. It is also a nod to the agency’s success in communicating why the spacecraft concluded its mission in the skies of Saturn.

Essentially, the spacecraft was intentionally destroyed in Saturn’s atmosphere to prevent the possibility of it contaminating any of Saturn’s moons. Throughout the thirteen years it spent studying the Saturn system, Cassini found compelling evidence that Titan and Enceladus’ interior ocean could harbor the ingredients for life. In addition, scientists have speculated that there may be interior oceans within Rhea and Dione.

In this respect, Cassini ended its mission the same way the Galileo probe did in 2003. After spending 8 years studying Jupiter and its system of moons, the probe crashed into the gas giant’s upper atmosphere in order to prevent any possible contamination of Europa or Ganymede, which are also thought to have interior oceans that could support life.

The “Grand Finale” campaign began on April 26th, 2017, and continued until the craft entered Saturn’s atmosphere on Sept. 15th, 2017, with the spacecraft sending back science to the very last second. The campaign utilized several different forms of media, was interactive, and was very comprehensive, providing regular updates and vital information about the mission.

As NASA indicated on their Cassini website:

“The multi-faceted campaign included regular updates on Twitter, Facebook, Snapchat, Instagram and the Cassini mission website; multiple live social, web and TV broadcasts during which reporter and public questions were answered; a dramatic short film to communicate the mission’s story and preview its endgame; multiple 360-degree videos, including NASA’s first 360-degree livestream of a mission event from inside JPL mission control; an interactive press kit; a steady drumbeat of articles to keep fans updated with news and features about the people behind the mission; state-standards aligned educational materials; a celebration of art by amateur space enthusiasts; and software to provide real-time tracking of the spacecraft, down to its final transmission to Earth.”

The short film, titled “For Your Consideration: The NASA Cassini Grand Finale”, showcases the mission’s many accomplishments and pays tribute to all those who made it happen, helped inform the public, and communicated the importance of the mission.

The Primetime Emmys will be awarded on September 17th in Los Angeles. The Creative Arts Emmys, which include the interactive awards, will be presented during a separate ceremony on Saturday, Sept. 15th, at the Microsoft Theatre in Los Angeles. Other contenders include Back to the Moon, a Google Spotlight Stories App; and Blade Runner 2049: Memory Lab, Coco VR, and Spider-Man: Homecoming, three Oculus VR experiences.

And be sure to check out the videos, FYC: NASA Cassini Grand Finale, below:

Further Reading: NASA

New Research Raises Hopes for Finding Life on Mars, Pluto and Icy Moons

Artist's impression of a water vapor plume on Europa. Credit: NASA/ESA/K. Retherford/SWRI

Since the 1970s, when the Voyager probes captured images of Europa’s icy surface, scientists have suspected that life could exist in the interior oceans of moons in the outer Solar System. Since then, other evidence has emerged to bolster this theory, ranging from icy plumes on Europa and Enceladus to interior models of hydrothermal activity, and even the groundbreaking discovery of complex organic molecules in Enceladus’ plumes.

However, in some locations in the outer Solar System, conditions are very cold, and water can only exist in liquid form because of the presence of toxic antifreeze chemicals. According to a new study by an international team of researchers, however, it is possible that bacteria could survive in these briny environments. This is good news for those hoping to find evidence of life in the extreme environments of the Solar System.

The study detailing their findings, titled “Enhanced Microbial Survivability in Subzero Brines”, recently appeared in the scientific journal Astrobiology. The study was conducted by Jacob Heinz from the Center of Astronomy and Astrophysics at the Technical University of Berlin (TUB), and included members from Tufts University, Imperial College London, and Washington State University.

Based on new evidence from Jupiter’s moon Europa, astronomers hypothesize that chloride salts bubble up from the icy moon’s global liquid ocean and reach the frozen surface. Credit: NASA/JPL-Caltech

Basically, on bodies like Ceres, Callisto, Triton, and Pluto – which are either far from the Sun or do not have interior heating mechanisms – interior oceans are believed to exist because of the presence of certain chemicals and salts (such as ammonia). These “antifreeze” compounds ensure that their oceans have lower freezing points, but create an environment that would be too cold and toxic for life as we know it.
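To illustrate how much these dissolved salts can depress a freezing point, the sketch below lists approximate eutectic temperatures (the lowest temperature at which a water-salt brine can remain liquid) for several brine-forming salts. The figures are rounded literature values included purely for illustration; they are not numbers from the study:

```python
# Approximate eutectic temperatures (°C) for common brine-forming salts.
# Rounded literature values, for illustration only.
EUTECTIC_C = {
    "NaCl": -21,        # sodium chloride
    "MgCl2": -33,       # magnesium chloride
    "CaCl2": -50,       # calcium chloride
    "Mg(ClO4)2": -67,   # magnesium perchlorate (relevant to Mars)
}

def stays_liquid(salt: str, temperature_c: float) -> bool:
    """True if a saturated brine of `salt` could remain liquid at this temperature."""
    return temperature_c >= EUTECTIC_C[salt]

print(stays_liquid("NaCl", -30))   # False: NaCl brine freezes below about -21 °C
print(stays_liquid("CaCl2", -30))  # True: CaCl2 brines stay liquid to about -50 °C
```

This is why the coldest candidate environments require the saltiest, and therefore most chemically hostile, brines.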

For the sake of their study, the team sought to determine whether microbes could indeed survive in these environments by conducting tests with Planococcus halocryophilus, a bacterium found in the Arctic permafrost. They subjected these bacteria to solutions of sodium, magnesium and calcium chloride, as well as perchlorate, a chemical compound that was found by the Phoenix lander on Mars.

They then subjected the solutions to temperatures ranging from +25°C to -30°C through multiple freeze and thaw cycles. What they found was that the bacteria’s survival rates depended on the solution and temperatures involved. For instance, bacteria suspended in chloride-containing (saline) samples had better chances of survival compared to those in perchlorate-containing samples – though survival rates increased the more the temperatures were lowered.

For instance, the team found that bacteria in a sodium chloride (NaCl) solution died within two weeks at room temperature. But when temperatures were lowered to 4 °C (39 °F), survivability began to increase and almost all the bacteria survived by the time temperatures reached -15 °C (5 °F). Meanwhile, bacteria in the magnesium and calcium-chloride solutions had high survival rates at –30 °C (-22 °F).

Artist rendering showing an interior cross-section of the crust of Enceladus, which shows how hydrothermal activity may be causing the plumes of water at the moon’s surface. Credits: NASA-GSFC/SVS, NASA/JPL-Caltech/Southwest Research Institute

The results also varied for the three saline solvents depending on the temperature. Bacteria in calcium chloride (CaCl2) had significantly lower survival rates than those in sodium chloride (NaCl) and magnesium chloride (MgCl2) between 4 and 25 °C (39 and 77 °F), but lower temperatures boosted survival in all three. The survival rates in the perchlorate solution were far lower than in the other solutions.

However, this was generally in solutions where perchlorate constituted 50% of the mass of the total solution (which was necessary for the water to remain liquid at lower temperatures), a concentration that would be significantly toxic. At concentrations of 10%, the bacteria were still able to grow. This is semi-good news for Mars, where the soil contains less than one weight percent of perchlorate.

However, Heinz also pointed out that salt concentrations in soil are different from those in a solution. Still, this could be good news where Mars is concerned, since temperatures and precipitation levels there are very similar to some parts of Earth – the Atacama Desert and parts of Antarctica. The fact that bacteria can survive such environments on Earth indicates they could survive on Mars too.

In general, the research indicated that colder temperatures boost microbial survivability, but this depends on the type of microbe and the composition of the chemical solution. As Heinz told Astrobiology Magazine:

“[A]ll reactions, including those that kill cells, are slower at lower temperatures, but bacterial survivability didn’t increase much at lower temperatures in the perchlorate solution, whereas lower temperatures in calcium chloride solutions yielded a marked increase in survivability.”

This full-circle view from the panoramic camera (Pancam) on NASA’s Mars Exploration Rover Spirit shows the terrain surrounding the location called “Troy,” where Spirit became embedded in soft soil during the spring of 2009. Credit: NASA/JPL

The team also found that bacteria did better in saltier solutions when it came to freezing and thawing cycles. In the end, the results indicate that survivability comes down to a careful balance: at lower concentrations of chemical salts, bacteria could survive and even grow, but the water would freeze at higher temperatures.

Of course, the team emphasized that just because bacteria can subsist in certain conditions doesn’t mean they will thrive there. As Theresa Fisher, a PhD student at Arizona State University’s School of Earth and Space Exploration and a co-author on the study, explained:

“Survival versus growth is a really important distinction, but life still manages to surprise us. Some bacteria can not only survive in low temperatures, but require them to metabolize and thrive. We should try to be unbiased in assuming what’s necessary for an organism to thrive, not just survive.”  

As such, Heinz and his colleagues are currently working on another study to determine how different concentrations of salts across different temperatures affect bacterial propagation. In the meantime, this study and others like it provide some unique insight into the possibilities for extraterrestrial life by placing constraints on the kinds of conditions that it can survive and grow in.

These studies also help with the search for extraterrestrial life, since knowing where life can exist allows us to focus our search efforts. In the coming years, missions to Europa, Enceladus, Titan and other locations in the Solar System will be looking for biosignatures that indicate the presence of life on or within these bodies. Knowing that life can survive in cold, briny environments opens up additional possibilities.

Further Reading: Astrobiology Magazine, Astrobiology

Kepler Mission Placed in Hibernation to Download Data Before its Last Campaign

Artist's concept of the Kepler mission with Earth in the background. Credit: NASA/JPL-Caltech

The Kepler space telescope has had a relatively brief but distinguished career of service with NASA. Having launched in 2009, the space telescope has spent the past nine years observing distant stars for signs of planetary transits (i.e. the Transit Method). In that time, it has been responsible for the detection of 2,650 confirmed exoplanets, which constitutes the majority of the more than 3,800 planets discovered so far.
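For readers unfamiliar with the Transit Method, the size of the signal Kepler hunts for can be sketched in a few lines: a planet crossing its star dims the starlight by roughly the square of the planet-to-star radius ratio. The radii below are standard rounded values, and the calculation is illustrative rather than anything from the mission pipeline:

```python
# Transit method in one line: a planet crossing its star dims the star
# by roughly (R_planet / R_star)^2. Radii are standard rounded values.
R_EARTH_KM = 6_371.0
R_SUN_KM = 696_000.0

def transit_depth(r_planet_km: float, r_star_km: float) -> float:
    """Fractional dip in stellar brightness during a central transit."""
    return (r_planet_km / r_star_km) ** 2

depth = transit_depth(R_EARTH_KM, R_SUN_KM)
print(f"{depth:.1e}")  # 8.4e-05, i.e. a dip of less than 0.01%
```

Detecting dips this small against stellar noise is precisely why Kepler needed years of uninterrupted, ultra-stable photometry.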

Earlier this week, the Kepler team was notified that the space telescope’s fuel tank is running very low. NASA responded by placing the spacecraft in hibernation in preparation for a download of its scientific data, which it collected during its latest observation campaign. Once the data is downloaded, the team expects to start its last observation campaign using whatever fuel it has left.

Since 2014, Kepler has been conducting its “Second Light” (aka K2) campaign, in which the telescope has continued making observations despite the loss of two of its four reaction wheels in 2013. Since May 12th, 2018, Kepler has been on its 18th observation campaign, studying a patch of sky in the vicinity of the constellation Cancer – which it previously studied in 2015.

NASA’s Kepler spacecraft has been on an extended mission called K2 after two of its four reaction wheels failed in 2013. Credit: NASA

In order to send the data back home, the spacecraft will point its large antenna back towards Earth and transmit the data via the Deep Space Network. However, the DSN is responsible for handling data from multiple missions, and time on it needs to be allotted in advance. Kepler is scheduled to send back the data from its 18th campaign in August, and will remain in a stable orbit and safe mode in order to conserve fuel until then.

On August 2nd, the Kepler team will command the spacecraft to awaken and will maneuver the craft to the correct orientation to transmit the data. If all goes well, they will begin Kepler’s 19th observation campaign on August 6th with what fuel the spacecraft still has. At present, NASA expects that the spacecraft will run out of fuel in the next few months.

However, even after the Kepler mission ends, scientists and engineers will continue to mine the data that has already been sent back for discoveries. According to a recent study by an international team of scientists, 24 new exoplanets were discovered using data from the 10th observation campaign, which has brought the total number of Kepler discoveries to 2,650 confirmed exoplanets.

An artist’s conception of how common exoplanets are throughout the Milky Way Galaxy. Image Credit: Wikipedia

In the coming years, many more exoplanet discoveries are anticipated as the next-generation of space telescopes begin collecting their first light or are deployed to space. These include the Transiting Exoplanet Survey Satellite (TESS), which launched this past April, and the James Webb Space Telescope (JWST) – which is currently scheduled to launch sometime in 2021.

However, it will be many years before any mission can rival the accomplishments and contributions made by Kepler! Long after she is retired, her legacy will live on in the form of her discoveries.

Further Reading: NASA

NASA is Looking for New Ways to Deal With Trash on Deep Space Missions

Garbage is offloaded from the ISS onto a commercial resupply vehicle and then removed from the station using the Canadarm 2. Credit: NASA

Life aboard the International Space Station is characterized by careful work and efficiency measures. Not only do astronauts rely on an average of 12 metric tons of supplies a year – which is shipped to the station from Earth – they also produce a few metric tons of garbage. This garbage must be carefully stored so that it doesn’t accumulate, and is then sent back to the surface on commercial supply vehicles.

This system works well for a station in orbit. But what about spacecraft conducting long-duration missions? These ships will not have the luxury of a regular cadence of commercial vehicles dropping off supplies and hauling away garbage. To address this, NASA is investigating possible solutions for handling space trash on deep space missions.

For this purpose, NASA is turning to its partners in the commercial sector to develop concepts for Trash Compaction and Processing Systems (TCPS). In a solicitation issued through the Next Space Technologies for Exploration Partnerships (NextSTEP), NASA recently issued a Broad Agency Announcement (BAA) that called for the creation of prototypes and eventually flight demonstrations that would fly to the ISS.

The International Space Station (ISS), seen here with Earth as a backdrop. Credit: NASA

The details of the proposal were outlined in Appendix F of the Broad Agency Announcement, titled “Logistics Reduction in Space by Trash Compaction and Processing System”. As they state in this section:

“NASA’s ultimate goal is to develop capabilities to enable missions that are not reliant on resupply from Earth thus making them more sustainable and affordable. NASA is implementing this by employing a capability-driven approach to its human spaceflight strategy. The approach is based on developing a suite of evolving capabilities that provide specific functions to solve exploration challenges. These investments in initial capabilities can continuously be leveraged and reused, enabling more complex operations over time and exploration of more distant solar system destinations.”

When it comes right down to it, storing trash aboard a spacecraft is a serious challenge. Not only does it consume precious volume, it can also create physical and biological hazards for the crew. Storing garbage also means that leftover resources cannot be repurposed or recycled. All told, the BAA solicitation is looking for solutions that will compact trash, remove biological and physical hazards, and recover resources for future use.

To this end, they are looking for ideas and technologies for a TCPS that could operate on future generations of spaceships. As part of the Advanced Exploration Systems (AES) Habitat’s Logistics Reduction (LR), the TCPS is part of NASA’s larger goal of identifying and developing technologies that reduce logistical mass, volume, and the amount of time the crew dedicates to logistics management.

NASA’ Heat Melt Compactor (HMC), a device that will recover residual water from astronaut’s trash and compact the trash to provide volume reduction, or perhaps some usefulness as an ionizing radiation shield. Credit: NASA

The objectives of the TCPS, as stated in the Appendix, are fourfold:

“(1) trash compaction to a suitable form for efficient long-endurance storage; (2) safe processing of trash to eliminate and/or reduce the risk of biological activity; (3) stabilize the trash physically, geometrically, and biologically; and (4) manage gaseous, aqueous, and particulate effluents. The TCPS will be the first step toward development and testing of a fully-integrated unit for further Exploration Missions and future space vehicles.”

The development will occur in two phases. In Phase A, selected companies will create a concept TCPS system, conduct design reviews with NASA, and validate it through prototype ground demonstrations. In Phase B, a system will be prepared for transport to the ISS so that a demonstration can take place aboard the station as early as 2022.

The various companies that submit proposals will not be working in the dark, as NASA has been developing waste management systems since the 1980s. These include recent developments like the Heat Melt Compactor (HMC) experiment, a device that will recover residual water from astronauts’ garbage and compact the trash to provide volume reduction (or perhaps an ionizing radiation shield).

The Kounotori2 H-II Transfer Vehicle (HTV-2), after taking on the ISS’ trash, is moved away from the space station by the Canadarm 2 to await the arrival of the Space Shuttle Discovery’s STS-133 mission. Credit: NASA

Other examples include the “trash to gas” technologies, which are currently being pursued under the Logistics Reduction and Repurposing project (LRR). Using the HMC, this process involves creating methane gas from trash to make rocket propellant. Together, these technologies would not only allow astronauts on long-duration spaceflights to conserve room, but also extract useful resources from their garbage.

NASA plans to host an industry day on July 24th in order to let potential industry partners know exactly what they are looking for, describe available NASA facilities, and answer questions from potential respondents. Official proposals from aspiring partners are due no later than August 22nd, 2018, and whichever proposals make the cut will be tested on the ISS in the coming decade!

Further Reading: NASA, FBO

New Insights Into What Might Have Smashed Uranus Over Onto its Side

A new study indicates that a massive impact may be why Uranus orbits on its side. Credit: NASA/JPL/Voyager mission

The gas/ice giant Uranus has long been a source of mystery to astronomers. In addition to presenting some thermal anomalies and a magnetic field that is off-center, the planet is also unique in that it is the only one in the Solar System to rotate on its side. With an axial tilt of 98°, the planet experiences radical seasons and a day-night cycle at the poles where a single day and night last 42 years each.
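The 42-year figure follows directly from the geometry: with its rotation axis tipped nearly into its orbital plane, each pole of Uranus faces the Sun for roughly half of the planet's approximately 84-year orbit. A one-line sketch of the arithmetic:

```python
# With an axial tilt past 90 degrees, each Uranian pole points toward
# the Sun for about half the orbit. Orbital period ~84 Earth years.
URANUS_ORBITAL_PERIOD_YR = 84.0

polar_day_years = URANUS_ORBITAL_PERIOD_YR / 2
print(polar_day_years)  # 42.0: one continuous polar "day" lasts ~42 years
```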

Thanks to a new study led by researchers from Durham University, the reason for these mysteries may finally have been found. With the help of NASA researchers and multiple scientific organizations, the team conducted simulations that indicated how Uranus may have suffered a massive impact in its past. Not only would this account for the planet’s extreme tilt and magnetic field, it would also explain why the planet’s outer atmosphere is so cold.

Continue reading “New Insights Into What Might Have Smashed Uranus Over Onto its Side”

Instead of Building Single Monster Scopes like James Webb, What About Swarms of Space Telescopes Working Together?

In the future, telescopes may consist of distributed arrays rather than single instruments - like NASA's Terrestrial Planet Finder (TPF), a system of space telescopes for detecting extrasolar terrestrial planets. Credit: NASA

In the coming decade, a number of next-generation instruments will take to space and begin observing the Universe. These will include the James Webb Space Telescope (JWST), which is likely to be followed by concepts like the Large Ultraviolet/Optical/Infrared Surveyor (LUVOIR), the Origins Space Telescope (OST), the Habitable Exoplanet Imager (HabEx) and the Lynx X-ray Surveyor.

These missions will look farther into the cosmos than ever before and help astronomers address questions like how the Universe evolved and whether there is life in other star systems. Unfortunately, all these missions have two things in common: in addition to being very large and complex, they are also very expensive. Hence, some scientists are proposing that we rely on more cost-effective ideas like swarm telescopes.

Two such scientists are Jayce Dowell and Gregory B. Taylor, a research assistant professor and professor (respectively) with the Department of Physics and Astronomy at the University of New Mexico. Together, the pair outlined their idea in a study titled “The Swarm Telescope Concept“, which recently appeared online and was accepted for publication by the Journal of Astronomical Instrumentation.

Illustration of NASA’s James Webb Space Telescope. Credits: NASA

As they state in their study, traditional astronomy has focused on the construction, maintenance and operation of single telescopes. The one exception to this is radio astronomy, where facilities have been spread over an extensive geographic area in order to obtain high angular resolution. Examples of this include the Very Long Baseline Array (VLBA), and the proposed Square Kilometer Array (SKA).

In addition, there’s also the problem of how telescopes are becoming increasingly reliant on computing and digital signal processing. As they explain in their study, telescopes commonly carry out multiple simultaneous observation campaigns, which increases the operational complexity of the facility due to conflicting configuration requirements and scheduling considerations.

A possible solution, according to Dowell and Taylor, is to rethink telescopes. Instead of a single instrument, the telescope would consist of a distributed array where many autonomous elements come together through a data transport system to function as a single facility. This approach, they claim, would be especially useful when it comes to the Next Generation Very Large Array (NGVLA) – a future interferometer that will build on the legacy of the Karl G. Jansky Very Large Array and the Atacama Large Millimeter/submillimeter Array (ALMA). As they state in their study:

“At the core of the swarm telescope is a shift away from thinking about an observatory as a monolithic entity. Rather, an observatory is viewed as many independent parts that work together to accomplish scientific observations. This shift requires moving part of the decision making about the facility away from the human schedulers and operators and transitioning it to “software defined operators” that run on each part of the facility. These software agents then communicate with each other and build dynamic arrays to accomplish the goals of multiple observers, while also adjusting for varying observing conditions and array element states across the facility.”

This idea for a distributed telescope is inspired by the concept of swarm intelligence, where large swarms of robots are programmed to interact with each other and their environment to perform complex tasks. As they explain, the facility comes down to three major components: autonomous element control, a method of inter-element communication, and data transport management.

Of these components, the most critical is the autonomous element control which governs the actions of each element of the facility. While similar to traditional monitoring and control systems used to control individual robotic telescopes, this system would be different in that it would be responsible for far more. Overall, the element control would be responsible for ensuring the safety of the telescope and maximizing the utilization of the element.

“The first, safety of the element, requires multiple monitoring points and preventative actions in order to identify and prevent problems,” they explain. “The second direction requires methods of relating the goals of an observation to the performance of an element in order to maximize the quantity and quality of the observations, and automated methods of recovering from problems when they occur.”

The second component, inter-element communication, is what allows the individual elements to come together to form the interferometer. This can take the form of a leaderless system (where there is no single point of control), or an organizer system, where all of the communication between the elements and with the observation queue is done through a single point of control (i.e. the organizer).
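The organizer pattern can be sketched in a few lines of Python. This is an illustrative toy, with class and method names invented rather than taken from Dowell and Taylor's paper, showing the key idea: all requests flow through a single point of control, which groups idle elements into ad hoc arrays for each queued observation:

```python
# Toy sketch of the "organizer" communication pattern: all
# element-to-queue communication flows through one coordinator, which
# assembles idle elements into dynamic arrays for each observation.
# Names here are illustrative, not from the paper.
from dataclasses import dataclass, field

@dataclass
class Element:
    name: str
    busy: bool = False

@dataclass
class Organizer:
    elements: list = field(default_factory=list)
    queue: list = field(default_factory=list)  # pending (target, elements_needed)

    def submit(self, target: str, elements_needed: int) -> None:
        """Observers never talk to elements directly; requests go via the organizer."""
        self.queue.append((target, elements_needed))

    def dispatch(self) -> dict:
        """Assign idle elements to queued observations, forming ad hoc arrays."""
        assignments = {}
        for target, needed in list(self.queue):
            idle = [e for e in self.elements if not e.busy]
            if len(idle) >= needed:
                array = idle[:needed]
                for e in array:
                    e.busy = True
                assignments[target] = [e.name for e in array]
                self.queue.remove((target, needed))
        return assignments

org = Organizer(elements=[Element(f"dish-{i}") for i in range(4)])
org.submit("Io footprint", elements_needed=3)
org.submit("calibration", elements_needed=2)
result = org.dispatch()
print(result)  # {'Io footprint': ['dish-0', 'dish-1', 'dish-2']}
```

Note that the second request stays queued because only one element remains idle; a leaderless variant would instead have the elements negotiate such allocations among themselves with no central coordinator.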

Long Wavelength Array, operated by the University of New Mexico. Credit: phys.unm.edu

Lastly, there is the issue of data transport management, which can take one of two forms based on existing telescopes. These range from fully off-line systems, where correlation is done post-observation – as with the Very Long Baseline Array (VLBA) – to fully-connected systems, where correlation is done in real time (as with the VLA). For the sake of their array, the team emphasized that connectivity and real-time correlation are a must.

After considering all these components and how they are used by existing arrays, Dowell and Taylor conclude that the swarm concept is a natural extension of the advances being made in robotic and thinking telescopes, as well as interferometry. The advantages of this are spelled out in their conclusions:

“It allows for more efficient operations of facilities by moving much of the daily operational work done by humans to autonomous control systems. This, in turn, frees up personnel to focus on the scientific output of the telescope. The swarm concept can also combine the unused resources of the different elements together to form an ad hoc array.”

In addition, swarm telescopes will offer new opportunities and funding since they will consist of small elements that can be owned and operated by different entities. In this way, different organizations would be able to conduct science with their own elements while also being able to benefit from large-scale interferometric observations.

Graphic depiction of Modular Active Self-Assembling Space Telescope Swarms. Credit: D. Savransky

This concept is similar to the Modular Active Self-Assembling Space Telescope Swarms proposal, which calls for a swarm of robots that would assemble in space to form a 30-meter (~100 ft) telescope. The concept was proposed by a team of American astronomers led by Dmitri Savransky, an assistant professor of mechanical and aerospace engineering at Cornell University.

This proposal was part of the 2020 Decadal Survey for Astrophysics and was recently selected for Phase I development as part of the 2018 NASA Innovative Advanced Concepts (NIAC) program. So while many large-scale telescopes will be entering service in the near future, the next-next-generation of telescopes could include a few arrays made up of swarms of robots directed by artificial intelligence.

Such arrays would be capable of achieving high-resolution astronomy and interferometry at lower costs, and could free up large, complex arrays for other observations.

Further Reading: arXiv

What Would a Camera on a Breakthrough Starshot Spacecraft See if it’s Going at High Velocity?

Project Starshot, an initiative sponsored by the Breakthrough Foundation, is intended to be humanity's first interstellar voyage. Credit: breakthroughinitiatives.org

In April of 2016, Russian billionaire Yuri Milner announced the creation of Breakthrough Starshot. As part of his non-profit scientific organization (known as Breakthrough Initiatives), the purpose of Starshot was to design a lightsail nanocraft that would be capable of achieving speeds of up to 20% the speed of light and reaching the nearest star system – Alpha Centauri (aka Rigil Kentaurus) – within our lifetimes.

At this speed – roughly 60,000 km/s (37,282 mps) – the probe would be able to reach Alpha Centauri in 20 years, where it could then capture images of the star and any planets orbiting it. But according to a recent article by Professor Bing Zhang, an astrophysicist from the University of Nevada, researchers could get all kinds of valuable data from Starshot and similar concepts long before they ever reached their destination.
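The figures above are easy to verify with back-of-the-envelope arithmetic: 20% of the speed of light, and the travel time that speed implies for Alpha Centauri, which lies about 4.37 light-years away (a standard value, assumed here rather than taken from the article):

```python
# Sanity-checking the article's numbers for Breakthrough Starshot.
C_KM_S = 299_792.458   # speed of light, km/s
DIST_LY = 4.37         # distance to Alpha Centauri, light-years

beta = 0.20
speed_km_s = beta * C_KM_S
travel_time_years = DIST_LY / beta   # light-years / (fraction of c) = years

print(f"{speed_km_s:,.0f} km/s")          # ~59,958 km/s, i.e. roughly 60,000 km/s
print(round(travel_time_years, 2), "years")  # ~21.85 years, in line with "about 20"
```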

The article appeared in The Conversation under the title “Observing the universe with a camera traveling near the speed of light“. The article was a follow-up to a study conducted by Prof. Zhang and Kunyang Li – a graduate student from the Center for Relativistic Astrophysics at the Georgia Institute of Technology – that appeared in The Astrophysical Journal (titled “Relativistic Astronomy“).

Prof. Albert Einstein at the 11th Josiah Willard Gibbs lecture at the meeting of the American Association for the Advancement of Science in 1934. Credit: AP Photo

To recap, Breakthrough Starshot seeks to leverage recent technological developments to mount an interstellar mission that will reach another star within a single generation. The spacecraft would consist of an ultra-light nanocraft attached to a lightsail, the latter of which would be accelerated by a ground-based laser array to a significant fraction of the speed of light.

Such a system would allow the tiny spacecraft to conduct a flyby mission of Alpha Centauri about 20 years after launch, at which point it could beam home images of possible planets and other scientific data (such as analysis of magnetic fields). Recently, Breakthrough Starshot held an “industry day” where it issued a Request For Proposals (RFP) to potential bidders to build the laser sail.

According to Zhang, a lightsail-driven nanocraft traveling at a fraction of the speed of light would also be a good way to test Einstein’s theory of Special Relativity. Simply put, this theory holds that the speed of light in a vacuum is constant, regardless of the inertial reference frame or the motion of the source. In short, such a spacecraft would be able to take advantage of the features of Special Relativity and provide a new way to study astronomy.

Based on Einstein’s theory, different objects in different “rest frames” would have different measures of the lengths of space and time. In this sense, an object moving at relativistic speeds would view distant astronomical objects differently as light emissions from these objects would be distorted. Whereas objects in front of the spacecraft would have the wavelength of their light shortened, objects behind it would have them lengthened.

This diagram shows the difference between unshifted, redshifted and blueshifted targets. Credit: NASA

This phenomenon, known as the “Doppler Effect”, results in light being shifted towards the blue end (“blueshift”) or the red end (“redshift”) of the spectrum for approaching and retreating objects, respectively. In 1929, astronomer Edwin Hubble used redshift measurements to determine that distant galaxies were moving away from our own, thus demonstrating that the Universe was in a state of expansion.

Because of this expansion (known as the Hubble Expansion), much of the light in the Universe is redshifted and only measurable in difficult-to-observe infrared wavelengths. But for a camera moving at relativistic speeds, according to Prof. Zhang, this redshifted light would become bluer since the motion of the camera would counteract the effects of cosmic expansion.

This effect, known as “Doppler boosting”, would cause the faint light from the early Universe to be amplified and allow distant objects to be studied in more detail. In this respect, astronomers would be able to study some of the earliest objects in the known Universe, which would offer more clues as to how it evolved over time. As Prof. Zhang explained to Universe Today via email, this would allow for some unique opportunities to test Special Relativity:
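The sketch below works through the standard Special Relativity formula for this effect. For a camera approaching a source head-on at a fraction beta of light speed, observed wavelengths shrink by sqrt((1-beta)/(1+beta)); applied to light already stretched by cosmological expansion, the two effects partially cancel. The redshift value used here is illustrative, not taken from the study:

```python
# Relativistic Doppler shift for a camera moving toward a source.
import math


def doppler_factor(beta):
    """Wavelength ratio lambda_obs / lambda_emit for head-on approach."""
    return math.sqrt((1 - beta) / (1 + beta))


beta = 0.20      # Starshot's target: 20% of light speed
z_cosmo = 2.0    # a source whose light cosmic expansion has stretched by (1+z) = 3

stretch_at_rest = 1 + z_cosmo                       # 3.0, as seen from the ground
stretch_in_flight = stretch_at_rest * doppler_factor(beta)

print(round(stretch_in_flight, 3))  # ~2.449: the light arrives noticeably bluer
```

The same factor applied to objects behind the camera is its reciprocal, which is why the sky ahead is blueshifted while the sky astern is further redshifted.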

“In the rest frame of the camera, the emission of the objects in the hemisphere of the camera motion is blue-shifted. For bright objects with detailed spectral observations from the ground, one can observe them in flight. By comparing their blue-shifted flux at a specific blue-shifted frequency with the flux of the corresponding (de-blueshifted) frequency on the ground, one can precisely test the Doppler boosting prediction in Special Relativity.”

Observed image of nearby galaxy M51 (left) and how the image would look through a camera moving at half the speed of light (right). Credit: Zhang & Li, 2018, The Astrophysical Journal, 854, 123, CC BY-ND

In addition, the frequency and intensity of light – and even the apparent size of distant objects – would change from the observer’s perspective. In this respect, the camera would act as a lens and a wide-field camera, magnifying the amount of light it collects and letting astronomers observe more objects within the same field of view. By comparing the observations collected by the camera in flight to those collected from the ground, astronomers could also test the probe’s Lorentz factor.

This factor indicates how time, length, and relativistic mass change for an object while that object is moving, which is another prediction of Special Relativity. Last, but not least, Prof. Zhang indicates that probes traveling at relativistic speeds would not need to be sent to any specific destination in order to conduct these tests. As he explained:

“The concept of “relativistic astronomy” is that one does not really need to send the cameras to specific star systems. No need to aim (e.g. to Alpha Centauri system), no need to decelerate. As long as the signal can be transferred back to earth, one can learn a lot of things. Interesting targets include high-redshift galaxies, active galactic nuclei, gamma-ray bursts, and even electromagnetic counterparts of gravitational waves.”
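The Lorentz factor mentioned above has a simple closed form, gamma = 1/sqrt(1 - beta²), so the size of the effect a relativistic probe would need to confirm is easy to tabulate:

```python
# The Lorentz factor quantifies time dilation, length contraction, and
# relativistic mass increase at a given fraction of light speed.
import math


def lorentz_factor(beta):
    return 1.0 / math.sqrt(1.0 - beta ** 2)


for beta in (0.2, 0.5, 0.9):
    print(beta, round(lorentz_factor(beta), 4))
```

At Starshot's 20% of c the factor is only about 1.02 – a 2% deviation from Newtonian expectations – which is why precise in-flight spectral comparisons would be needed to detect it.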

However, there are some drawbacks to this proposal. For starters, the technology behind Starshot is all about accomplishing the dream of countless generations – i.e. reaching another star system (in this case, Alpha Centauri) – within a single generation.

And as Professor Abraham Loeb – the Frank B. Baird Jr. Professor of Science at Harvard University and the Chair of the Breakthrough Starshot Advisory Committee – told Universe Today via email, what Prof. Zhang is proposing could be accomplished by other means:

“Indeed, there are benefits to having a camera move near the speed of light toward faint sources, such as the most distant dwarf galaxies in the early universe. But the cost of launching a camera to the required speed would be far greater than building the next generation of large telescopes which will provide us with a similar sensitivity. Similarly, the goal of testing special relativity can be accomplished at a much lower cost.”

Of course, it will be many years before a project like Starshot can be mounted, and many challenges need to be addressed in the meantime. But it is exciting to know that, in the interim, scientific applications can be found for such a mission that go beyond exploration. In a few decades, when the mission begins its journey to Alpha Centauri, perhaps it will also be able to conduct tests of Special Relativity and other physical laws while in transit.

Further Reading: The Conversation, The Astrophysical Journal

NASA Has Awarded a Contract to Study Flying Drones on Venus

Black Swift Technologies has won a NASA contract to develop a drone to study Venus' upper atmosphere. Credit: Black Swift Technologies

In the coming decades, NASA and other space agencies hope to mount some ambitious missions to other planets in our Solar System. In addition to studying Mars and the outer Solar System in greater detail, NASA intends to send a mission to Venus to learn more about the planet’s past. This will include studying Venus’ upper atmosphere to determine if the planet once had liquid water (and maybe even life) on its surface.

In order to tackle this daunting challenge, NASA recently partnered with Black Swift Technologies – a Boulder-based company specializing in unmanned aerial systems (UAS) – to build a drone that could survive in Venus’ upper atmosphere. This will be no easy task, but if the company’s design proves equal to it, NASA will award Black Swift a lucrative contract for a Venus aerial drone.

In recent years, NASA has taken a renewed interest in Venus, thanks to climate models that have indicated that it (much like Mars) may have also had liquid water on its surface at one time. This would have likely consisted of a shallow ocean that covered much of the planet’s surface roughly 2 billion years ago, before the planet suffered a runaway Greenhouse Effect that left it the hot and hellish world it is today.

Artist’s impression of the surface of Venus, showing its lightning storms and a volcano in the distance. Credit and ©: European Space Agency/J. Whatmore

In addition, a recent study – which included scientists from NASA’s Ames Research Center and Jet Propulsion Laboratory – indicated that there could be microbial life in Venus’ cloud tops. As such, there is considerable motivation to send aerial platforms to Venus that would be capable of studying its cloud tops and determining if there are any traces of organic life or indications of the planet’s past surface water there.

As Jack Elston, a co-founder of Black Swift Technologies, explained in an interview with the Daily Camera:

“They’re looking for vehicles to explore just above the cloud layer. The pressure and temperatures are similar to what you’d find on Earth, so it could be a good environment for looking for evidence of life. The winds in the upper atmosphere of Venus are incredibly strong, which creates design challenges.”

To meet this challenge, the company intends to create a drone that will use these strong winds to keep the craft aloft while reducing the amount of electricity it needs. So far, NASA has awarded an initial six-month contract to the company to design the drone and has provided specifications on what it needs. This contract included a $125,000 grant from the federal government’s Small Business Innovation Research program.

This program aims to encourage “domestic small businesses to engage in Federal Research/Research and Development (R/R&D) that has the potential for commercialization.” The company hopes to use some of this grant money to take on more staff and build a drone that NASA would be confident about sending into Venus’ upper atmosphere, where conditions are particularly challenging.

Aircraft like the Venus Atmospheric Maneuverable Platform (VAMP) could explore the cloud tops of Venus for possible signs of life. Credit: Northrop Grumman Corp.

As Elston explained to Universe Today via email, these challenges represent an opportunity for innovation:

“Our project centers around a unique aircraft and method for harvesting energy from Venus’s upper atmosphere that doesn’t require additional sources of energy for propulsion.  Our experience working on unmanned aircraft systems that interact with severe convective storms on Earth will hopefully provide a valuable contribution to the ongoing discussion for how best to explore this turbulent environment. Additionally, the work we do will help inform better designs of our own aircraft and should lead to longer observation times and more robust aircraft to observe everything from volcanic plumes to hurricanes.”

At the end of the six month period, Black Swift will present its concept to NASA for approval. “If they like what we’ve come up with, they’ll fund another two-year project to build prototypes,” said Elston. “That second-phase contract is expected to be worth $750,000.”

This is not the first time that Black Swift has partnered with NASA to create unmanned aerial vehicles to study harsh environments. Last year, the company was awarded a second-phase contract worth $875,000 to build a drone that could monitor the temperature, gas levels, winds and pressure levels inside the volcanoes of Costa Rica. After a series of test flights, the drone is expected to be deployed to Hawaii, where it will study the geothermal activity occurring there.

The Russian Academy of Sciences’ Space Research Institute (IKI) Venera-D mission concept includes a Venus orbiter that would operate for up to three years, and a lander designed to survive the incredibly harsh conditions a spacecraft would encounter on Venus’ surface for a few hours. Credit: NASA/JPL-Caltech

If Black Swift’s concept for a Venus drone makes the cut, their aerial drone will join other mission concepts like the DAVINCI spacecraft; the Venus Emissivity, Radio Science, InSAR, Topography, and Spectroscopy (VERITAS) spacecraft; the Venus Atmospheric Maneuverable Platform (VAMP); and Russia’s Venera-D mission – which is currently scheduled to explore Venus during the late 2020s.

A number of other concepts are being investigated for exploring Venus’ surface to learn more about its geological history. These include a “steampunk” (i.e. analog) rover that would rely on no electronic parts, and a vehicle that uses a Stored-Chemical Energy and Power System (SCEPS) – aka a Stirling engine – to conduct in-situ exploration.

All of these missions aim to reach Venus and brave its harsh conditions in order to determine whether or not “Earth’s Sister Planet” was once a more habitable planet, and how it evolved over time to become the hot and hellish place it is today.

Further Reading: The Drive, Daily Camera

Stunning First Ever Photograph of a Newly Forming Planet

This spectacular image from the SPHERE instrument on ESO's Very Large Telescope is the first clear image of a planet caught in the very act of formation around the dwarf star PDS 70. Credit: ESO/A. Müller et al.

For decades, the most widely-accepted view of how our Solar System formed has been the Nebular Hypothesis. According to this theory, the Sun, the planets, and all other objects in the Solar System formed from nebulous material billions of years ago. This cloud of gas and dust experienced a gravitational collapse at its center, forming our Sun, while the rest of the material formed a circumstellar debris ring that coalesced to form the planets.

Thanks to the development of modern telescopes, astronomers have been able to probe other star systems to test this hypothesis. Unfortunately, in most cases, astronomers have only been able to observe debris rings around stars with hints of planets in formation. It was only recently that a team of European astronomers were able to capture an image of a newborn planet, thus demonstrating that debris rings are indeed the birthplace of planets.

The team’s research appeared in two papers that were recently published in Astronomy & Astrophysics, titled “Discovery of a planetary-mass companion within the gap of the transition disk around PDS 70” and “Orbital and atmospheric characterization of the planet within the gap of the PDS 70 transition disk.” The team behind both studies included members from the Max Planck Institute for Astronomy (MPIA) as well as multiple observatories and universities.

Near infrared image of the PDS70 disk obtained with the SPHERE instrument. Credit: ESO/A. Müller, MPIA

For the sake of their studies, the teams focused on PDS 70b, a planet that was discovered at a distance of 22 Astronomical Units (AU) from its host star and which was believed to be a newly-formed body. In the first study – which was led by Miriam Keppler of the Max Planck Institute for Astronomy – the team indicated how they studied the protoplanetary disk around the star PDS 70.

PDS 70 is a low-mass T Tauri star located in the constellation Centaurus, approximately 370 light-years from Earth. This study was performed using archival images in the near-infrared band taken by the Spectro-Polarimetric High-contrast Exoplanet REsearch (SPHERE) instrument on the ESO’s Very Large Telescope (VLT) and by the Near-Infrared Coronagraphic Imager on the Gemini South Telescope.

Using these instruments, the team made the first robust detection of a young planet (PDS 70b) orbiting within a gap in its star’s protoplanetary disc, located roughly three billion km (1.86 billion mi) from its central star – roughly the same as the distance between Uranus and the Sun. In the second study, led by André Müller (also from the MPIA), the team describes how they used the SPHERE instrument to measure the brightness of the planet at different wavelengths.

From this, they were able to determine that PDS 70b is a gas giant that has about nine Jupiter masses and a surface temperature of about 1000 °C (1832 °F), making it a particularly “Hot Super-Jupiter”. The planet must be younger than its host star, and is probably still growing. The data also indicated that the planet is surrounded by clouds that alter the radiation emitted by the planetary core and its atmosphere.

Thanks to the advanced instruments used, the team was also able to acquire an image of the planet and its system. As you can see from the image (posted at top) and the video below, the planet is visible as a bright point to the right of the blackened center of the image. This dark region is due to a coronagraph, which blocks the light from the star so the team could detect the much-fainter companion.

As Miriam Keppler, a postdoctoral student at the MPIA, explained in a recent ESO press statement:

“These discs around young stars are the birthplaces of planets, but so far only a handful of observations have detected hints of baby planets in them. The problem is that until now, most of these planet candidates could just have been features in the disc.”

In addition to spotting the young planet, the research teams also noted that it has sculpted the protoplanetary disc orbiting the star. Essentially, the planet has carved a giant gap in the center of the disc by accumulating material from it along its orbit. This means that PDS 70b is still located in the vicinity of its birthplace, is likely still accumulating material, and will continue to grow and change.

For decades, astronomers have been aware of these gaps in the protoplanetary disc and speculated that they were produced by a planet. Now, they finally have the evidence to support this theory. As André Müller explained:

“Keppler’s results give us a new window onto the complex and poorly-understood early stages of planetary evolution. We needed to observe a planet in a young star’s disc to really understand the processes behind planet formation.”

These studies will be a boon to astronomers, especially when it comes to theoretical models of planet formation and evolution. By determining the planet’s atmospheric and physical properties, the astronomers have been able to test key aspects of the Nebular Hypothesis. The discovery of this young, dust-shrouded planet would not have been possible were it not for the capabilities of ESO’s SPHERE instrument.

This instrument studies exoplanets and discs around nearby stars using a technique known as high-contrast imaging, but also relies on advanced strategies and data processing techniques. In addition to blocking the light from a star with a coronagraph, SPHERE is able to filter out the signals of faint planetary companions around bright young stars at multiple wavelengths and epochs.

As Prof. Thomas Henning – the director at MPIA, the German co-investigator of the SPHERE instrument, and a senior author on the two studies – stated in a recent MPIA press release:

“After ten years of developing new powerful astronomical instruments such as SPHERE, this discovery shows us that we are finally able to find and study planets at the time of their formation. That is the fulfillment of a long-cherished dream.”

Future observations of this system will also allow astronomers to test other aspects of planet formation models and to learn about the early history of planetary systems. This data will also go a long way towards determining how our own Solar System formed and evolved during its early history.

Further Reading: ESO, MPIA, Astronomy & Astrophysics, Astronomy & Astrophysics (2)