Astronauts Could Rely on Algae as the Perfect Life Support Partner

When planning for long-duration crewed missions, one of the greatest challenges is ensuring that crews have enough of the bare essentials to last. This is no easy task, since a crewed spacecraft will be the crew’s entire world for months on end. That means a sufficient supply of food, water and oxygen will need to be brought along.

According to a new investigation being conducted aboard the International Space Station, a possible solution could lie with a hybrid life support system (LSS). In such a system, which could be used aboard spacecraft and space stations in the near future, microalgae would be used to clean the air and water, and possibly even manufacture food for the crew.

Continue reading “Astronauts Could Rely on Algae as the Perfect Life Support Partner”

The Starhops Have Begun!

According to Elon Musk, SpaceX’s Starship Hopper just completed its inaugural hop test at the company’s South Texas Launch Site. As the first of many, this test is intended to validate the sophisticated Raptor engines that will be used aboard the full-scale Starship spacecraft, which is central to Musk’s long-term vision of providing intercontinental flights and making commercial trips to the Moon and Mars.

Continue reading “The Starhops Have Begun!”

New Research Reveals How Galaxies Stay Hot and Bothered

It’s relatively easy for galaxies to make stars. Start out with a bunch of random blobs of gas and dust. Typically those blobs will be pretty warm. To turn them into stars, you have to cool them off. By dumping all their heat in the form of radiation, they can compress. Dump more heat, compress more. Repeat for a million years or so.

Eventually pieces of the gas cloud shrink and shrink, compressing themselves into tight little knots. If the densities inside those knots get high enough, they trigger nuclear fusion and voila: stars are born.
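The article doesn’t quote numbers, but the density threshold it alludes to is conventionally expressed as the Jeans mass: the mass above which a gas clump’s self-gravity overwhelms its thermal pressure and collapse can proceed. Here’s a rough back-of-the-envelope sketch (the cloud values are illustrative assumptions, not figures from the research):

```python
import math

# Physical constants (SI units)
k_B = 1.380649e-23      # Boltzmann constant, J/K
G = 6.674e-11           # gravitational constant, m^3 kg^-1 s^-2
m_H = 1.6735e-27        # mass of a hydrogen atom, kg
M_sun = 1.989e30        # solar mass, kg

def jeans_mass(T, n, mu=2.33):
    """Jeans mass (kg) for gas at temperature T (K) and number
    density n (m^-3); mu is the mean molecular weight per particle."""
    rho = n * mu * m_H                              # mass density, kg/m^3
    thermal = (5 * k_B * T / (G * mu * m_H)) ** 1.5
    geometric = (3 / (4 * math.pi * rho)) ** 0.5
    return thermal * geometric

# Illustrative values: a cold (10 K), dense (1e4 per cm^3) cloud core
M_J = jeans_mass(T=10, n=1e4 * 1e6)
print(f"Jeans mass ~ {M_J / M_sun:.1f} solar masses")
```

Cooling the gas lowers T, which lowers the Jeans mass, which is exactly why “dump more heat, compress more” eventually lets stellar-mass knots win against pressure.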

Continue reading “New Research Reveals How Galaxies Stay Hot and Bothered”

A New Atomic Clock has been Built that Would be off by Less than a Second Since the Big Bang

Timeline of the Big Bang and the expansion of the Universe. If the new atomic clock had been turned on at the Big Bang, it would be off by less than a single second now, almost 14 billion years later. Credit: NASA

Physicists have developed an atomic clock so accurate that it would be off by less than a single second in 14 billion years. That kind of accuracy and precision makes it more than just a timepiece. It’s a powerful scientific instrument that could measure gravitational waves, map the shape of Earth’s gravitational field, and maybe even detect dark matter.
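To put that figure in perspective, here is a back-of-the-envelope calculation (ours, not from the paper): one second of drift over the roughly 14-billion-year age of the Universe corresponds to a fractional inaccuracy of a few parts in 10^18.

```python
SECONDS_PER_YEAR = 365.25 * 86400            # Julian year, in seconds
age_of_universe_s = 14e9 * SECONDS_PER_YEAR  # ~4.4e17 seconds

# Fractional inaccuracy: 1 second of drift over the age of the Universe
fractional_error = 1.0 / age_of_universe_s
print(f"fractional error ~ {fractional_error:.1e}")   # about 2.3e-18
```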

How did they do it?

Continue reading “A New Atomic Clock has been Built that Would be off by Less than a Second Since the Big Bang”

Chinese Fusion Experiment Reaches 100 Million Degrees

Fusion power has been the fevered dream of scientists, environmentalists and futurists for almost a century. For the past few decades, scientists have been attempting to find a way to create sustainable fusion reactions that would provide human beings with clean, abundant energy, which would finally break our dependence on fossil fuels and other unclean methods.

In recent years, many positive strides have been made that are bringing the “fusion era” closer to reality. Most recently, scientists working with the Experimental Advanced Superconducting Tokamak (EAST) – aka the “Chinese artificial sun” – set a new record by super-heating clouds of hydrogen plasma to over 100 million degrees – roughly six times hotter than the core of the Sun itself!

Continue reading “Chinese Fusion Experiment Reaches 100 Million Degrees”

Next Generation Telescopes Could Use “Teleportation” to Take Better Images

Telescopes have come a long way in the past few centuries. From the comparatively modest devices built by astronomers like Galileo Galilei and Johannes Kepler, telescopes have evolved to become massive instruments that require an entire facility to house them and a full crew and network of computers to run them. And in the coming years, much larger observatories will be constructed that can do even more.

Unfortunately, this trend towards larger and larger instruments has many drawbacks. For starters, increasingly large observatories require either increasingly large mirrors or many telescopes working together – both of which are expensive prospects. Luckily, a team from MIT has proposed combining interferometry with quantum teleportation, which could significantly increase the resolution of arrays without relying on larger mirrors.

A New Solution to the Space Junk Problem. Spacecraft with Plasma Beams to Force Space Junk to Burn Up

A satellite using a bi-directional plasma thruster can direct one beam at space junk, sending it harmlessly into Earth's atmosphere. The other, opposite beam can stabilize the position of the satellite itself. Image: Takahashi et al. 2018.

Space junk is a growing problem. For decades we have been sending satellites into orbit around Earth. Some of them de-orbit and burn up in Earth’s atmosphere, or crash into the surface. But most of the stuff we send into orbit is still up there.

This is becoming an acute problem as years go by and we launch more and more hardware into orbit. Since the very first satellite—Sputnik 1—was launched into orbit in 1957, over 8000 satellites have been placed in orbit. As of 2018, an estimated 4900 are still in orbit. About 3000 of those are not operational. They’re space junk. The risk of collision is growing, and scientists are working on solutions. The problem will compound itself over time, as collisions between objects create more pieces of debris that have to be dealt with.

Continue reading “A New Solution to the Space Junk Problem. Spacecraft with Plasma Beams to Force Space Junk to Burn Up”

Technosignatures are NASA’s New Target for Detecting Other Civilizations in Space. Wait. What’s a Technosignature?

Artist's impression of a Dyson Sphere. The construction of such a massive engineering structure would create a technosignature that could be detected by humanity. Credit: SentientDevelopments.com/Eburacum45

NASA is targeting technosignatures in its renewed effort to detect alien civilizations. Congress asked NASA to reboot its search for other civilizations a few months ago. Its first step towards that goal is the NASA Technosignatures Workshop, held in Houston from September 26th to 28th, 2018.
Continue reading “Technosignatures are NASA’s New Target for Detecting Other Civilizations in Space. Wait. What’s a Technosignature?”

Instead of Building Single Monster Scopes like James Webb, What About Swarms of Space Telescopes Working Together?

In the coming decade, a number of next-generation instruments will take to space and begin observing the Universe. These will include the James Webb Space Telescope (JWST), which is likely to be followed by concepts like the Large Ultraviolet/Optical/Infrared Surveyor (LUVOIR), the Origins Space Telescope (OST), the Habitable Exoplanet Imager (HabEx) and the Lynx X-ray Surveyor.

These missions will look farther into the cosmos than ever before and help astronomers address questions like how the Universe evolved and whether there is life in other star systems. Unfortunately, all these missions have two things in common: in addition to being very large and complex, they are also very expensive. Hence, some scientists are proposing that we rely on more cost-effective ideas like swarm telescopes.

Two such scientists are Jayce Dowell and Gregory B. Taylor, a research assistant professor and professor (respectively) with the Department of Physics and Astronomy at the University of New Mexico. Together, the pair outlined their idea in a study titled “The Swarm Telescope Concept“, which recently appeared online and was accepted for publication by the Journal of Astronomical Instrumentation.

Illustration of NASA’s James Webb Space Telescope. Credits: NASA

As they state in their study, traditional astronomy has focused on the construction, maintenance and operation of single telescopes. The one exception to this is radio astronomy, where facilities have been spread over an extensive geographic area in order to obtain high angular resolution. Examples of this include the Very Long Baseline Array (VLBA), and the proposed Square Kilometer Array (SKA).

In addition, there’s also the problem of how telescopes are becoming increasingly reliant on computing and digital signal processing. As they explain in their study, telescopes commonly carry out multiple simultaneous observation campaigns, which increases the operational complexity of the facility due to conflicting configuration requirements and scheduling considerations.

A possible solution, according to Dowell and Taylor, is to rethink telescopes. Instead of a single instrument, the telescope would consist of a distributed array where many autonomous elements come together through a data transport system to function as a single facility. This approach, they claim, would be especially useful when it comes to the Next Generation Very Large Array (NGVLA) – a future interferometer that will build on the legacy of the Karl G. Jansky Very Large Array and the Atacama Large Millimeter/submillimeter Array (ALMA). As they state in their study:

“At the core of the swarm telescope is a shift away from thinking about an observatory as a monolithic entity. Rather, an observatory is viewed as many independent parts that work together to accomplish scientific observations. This shift requires moving part of the decision making about the facility away from the human schedulers and operators and transitioning it to “software defined operators” that run on each part of the facility. These software agents then communicate with each other and build dynamic arrays to accomplish the goals of multiple observers, while also adjusting for varying observing conditions and array element states across the facility.”

This idea for a distributed telescope is inspired by the concept of swarm intelligence, where large swarms of robots are programmed to interact with each other and their environment to perform complex tasks. As they explain, the facility comes down to three major components: autonomous element control, a method of inter-element communication, and data transport management.

Of these components, the most critical is the autonomous element control which governs the actions of each element of the facility. While similar to traditional monitoring and control systems used to control individual robotic telescopes, this system would be different in that it would be responsible for far more. Overall, the element control would be responsible for ensuring the safety of the telescope and maximizing the utilization of the element.

“The first, safety of the element, requires multiple monitoring points and preventative actions in order to identify and prevent problems,” they explain. “The second direction requires methods of relating the goals of an observation to the performance of an element in order to maximize the quantity and quality of the observations, and automated methods of recovering from problems when they occur.”
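The study doesn’t publish code, but the control loop described here – monitor safety points, take preventative action, recover automatically, and otherwise make progress on observations – might be sketched like this (all names, telemetry points, and limits below are hypothetical):

```python
import random

def read_monitor_points():
    """Stand-in for hardware telemetry (temperatures, wind, drive currents...)."""
    return {"drive_temp_C": random.uniform(20, 40),
            "wind_m_s": random.uniform(0, 15)}

# Hypothetical safety thresholds for one array element
SAFETY_LIMITS = {"drive_temp_C": 60.0, "wind_m_s": 20.0}

def control_step(state):
    """One iteration of an autonomous element's control loop:
    check safety first, then try to make observational progress."""
    telemetry = read_monitor_points()
    violations = [k for k, v in telemetry.items() if v > SAFETY_LIMITS[k]]
    if violations:
        state["status"] = "stowed"        # preventative action
        return state
    if state["status"] == "stowed":
        state["status"] = "idle"          # automated recovery once safe
    if state["status"] == "idle" and state["queue"]:
        state["status"] = "observing"     # maximize utilization
        state["current"] = state["queue"].pop(0)
    return state

state = {"status": "idle", "queue": ["target_A", "target_B"], "current": None}
state = control_step(state)
print(state["status"], state["current"])
```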

The second component, inter-element communication, is what allows the individual elements to come together to form the interferometer. This can take the form of a leaderless system (where there is no single point of control), or an organizer system, where all of the communication between the elements and with the observation queue is done through a single point of control (i.e. the organizer).
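As a toy illustration of the “organizer” variant (the paper doesn’t prescribe an implementation, so every name here is hypothetical), a single point of control can match queued observations to whichever elements are currently free, building the ad hoc sub-arrays the authors describe:

```python
from collections import deque

class Organizer:
    """Single point of control: all communication between elements
    and the observation queue flows through this one object."""
    def __init__(self, elements):
        self.free = set(elements)   # elements not currently observing
        self.queue = deque()        # pending (observation, elements needed)

    def submit(self, observation, n_elements):
        self.queue.append((observation, n_elements))

    def dispatch(self):
        """Assign free elements to queued observations, in order."""
        assignments = {}
        while self.queue and len(self.free) >= self.queue[0][1]:
            obs, n = self.queue.popleft()
            assignments[obs] = [self.free.pop() for _ in range(n)]
        return assignments

org = Organizer(["E1", "E2", "E3", "E4"])
org.submit("pulsar_scan", 3)
org.submit("calibration", 2)    # must wait: only one element will remain free
plan = org.dispatch()
print(plan)
```

A leaderless system would instead have the elements negotiate these assignments among themselves, removing the organizer as a single point of failure at the cost of a more complex protocol.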

Long Wavelength Array, operated by the University of New Mexico. Credit: phys.unm.edu

Lastly, there is the issue of data transport management, which can take one of two forms based on existing telescopes. These range from fully off-line systems, where correlation is done post-observation – as with the Very Long Baseline Array (VLBA) – to fully-connected systems, where correlation is done in real time (as with the VLA). For the sake of their array, the team emphasized that connectivity and real-time correlation are a must.
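Whether it happens in real time or after the fact, correlation amounts to finding the relative delay at which the signals recorded by two elements line up. A purely illustrative, brute-force version (real correlators work on enormous bandwidths in hardware or with FFTs):

```python
def best_lag(a, b, max_lag):
    """Return the lag (in samples) at which signal b best matches
    signal a, by brute-force cross-correlation."""
    def corr(lag):
        return sum(a[i] * b[i + lag]
                   for i in range(len(a)) if 0 <= i + lag < len(b))
    return max(range(-max_lag, max_lag + 1), key=corr)

# A short pulse, and the same pulse as seen by a second element 3 samples later
a = [0, 0, 1, 2, 3, 2, 1, 0, 0, 0]
b = [0, 0, 0] + a[:-3]
print(best_lag(a, b, max_lag=5))   # → 3
```

That recovered delay is what encodes the geometry of the source on the sky, which is why a connected array with real-time correlation can act as one giant telescope.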

After considering all these components and how they are used by existing arrays, Dowell and Taylor conclude that the swarm concept is a natural extension of the advances being made in robotic and thinking telescopes, as well as interferometry. The advantages of this are spelled out in their conclusions:

“It allows for more efficient operations of facilities by moving much of the daily operational work done by humans to autonomous control systems. This, in turn, frees up personnel to focus on the scientific output of the telescope. The swarm concept can also combine the unused resources of the different elements together to form an ad hoc array.”

In addition, swarm telescopes will offer new opportunities and funding since they will consist of small elements that can be owned and operated by different entities. In this way, different organizations would be able to conduct science with their own elements while also being able to benefit from large-scale interferometric observations.

Graphic depiction of Modular Active Self-Assembling Space Telescope Swarms. Credit: D. Savransky

This concept is similar to the Modular Active Self-Assembling Space Telescope Swarms, which calls for a swarm of robots that would assemble in space to form a 30 meter (~100 ft) telescope. The concept was proposed by a team of American astronomers led by Dmitri Savransky, an assistant professor of mechanical and aerospace engineering at Cornell University.

This proposal was part of the 2020 Decadal Survey for Astrophysics and was recently selected for Phase I development as part of the 2018 NASA Innovative Advanced Concepts (NIAC) program. So while many large-scale telescopes will be entering service in the near future, the next-next-generation of telescopes could include a few arrays made up of swarms of robots directed by artificial intelligence.

Such arrays would be capable of achieving high-resolution astronomy and interferometry at lower costs, and could free up large, complex arrays for other observations.

Further Reading: arXiv