Are We Entering the Era of Quantum Telescopes?

Beyond James Webb and LUVOIR, the future of astronomy could come down to telescopes that rely on quantum mechanics. Credit: Anton Pozdnyakov

For astronomers, one of the greatest challenges is capturing images of objects and phenomena that are difficult to see using optical (or visible light) telescopes. This problem has been largely addressed by interferometry, a technique where multiple telescopes gather signals, which are then combined to create a more complete picture. Examples include the Event Horizon Telescope, which relied on observatories from around the world to capture the first images of the supermassive black hole (SMBH) at the center of the M87 galaxy, and of Sagittarius A* at the center of the Milky Way.

That being said, classical interferometry requires that optical links be maintained between observatories, which imposes limitations and can lead to drastically increased costs. In a recent study, a team of astrophysicists and theoretical physicists proposed how these limitations could be overcome by relying on quantum mechanics. Rather than maintaining optical links, they propose that quantum entanglement could be used to share photons between observatories. This technique is part of a growing field of research that could lead to “quantum telescopes” someday.

Continue reading “Are We Entering the Era of Quantum Telescopes?”

Do Advanced Civilizations use Black Holes as Giant Quantum Computers?

Artist view of an active supermassive black hole. Credit: ESO/L. Calçada

If life is common in our Universe, and we have every reason to suspect it is, why do we not see evidence of it everywhere? This is the essence of the Fermi Paradox, a question that has plagued astronomers and cosmologists almost since the birth of modern astronomy. It is also the reasoning behind the Hart-Tipler Conjecture, one of the many (many!) proposed resolutions, which asserts that if advanced life had emerged in our galaxy sometime in the past, we would see signs of their activity everywhere we looked. Possible indications include self-replicating probes, megastructures, and other Type III-like activity.

On the other hand, several proposed resolutions challenge the notion that advanced life would operate on such massive scales. Others suggest that advanced extraterrestrial civilizations would be engaged in activities and locales that would make them less noticeable. In a recent study, a German-Georgian team of researchers proposed that advanced extraterrestrial civilizations (ETCs) could use black holes as quantum computers. This makes sense from a computing standpoint and offers an explanation for the apparent lack of activity we see when we look at the cosmos.

Continue reading “Do Advanced Civilizations use Black Holes as Giant Quantum Computers?”

Fermilab’s Muon g-2 Experiment Finally Gives Particle Physicists a Hint of What Lies Beyond the Standard Model

The Muon g-2 experiment at the Fermi National Accelerator Laboratory (Fermilab). Credit: Reidar Hahn/Fermilab

Since the long-awaited detection of the Higgs Boson in 2012, particle physicists have been probing deeper into the subatomic realm in the hope of investigating beyond the Standard Model of Particle Physics. In so doing, they hope to confirm the existence of previously unknown particles and exotic physics, as well as learn more about how the Universe began.

At the Fermi National Accelerator Laboratory (aka. Fermilab), researchers have been conducting the Muon g-2 experiment, which recently announced the results of its first run. Thanks to the unprecedented precision of their instruments, the Fermilab team found that muons in their experiment did not behave in a way that is consistent with the Standard Model, strengthening a discrepancy that has existed for decades.
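
For context (the excerpt does not define the quantity being measured), the “g-2” in the experiment’s name refers to the muon’s anomalous magnetic moment:

$$
a_{\mu} = \frac{g_{\mu} - 2}{2},
$$

where $g_{\mu}$ is the muon’s g-factor. A bare Dirac particle would have $g_{\mu}$ exactly equal to 2; interactions with virtual particles shift it slightly, and it is this tiny shift, measured to extraordinary precision, that appears to disagree with the Standard Model prediction.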

Continue reading “Fermilab’s Muon g-2 Experiment Finally Gives Particle Physicists a Hint of What Lies Beyond the Standard Model”

Quantum Theory Proposes That Cause and Effect Can Go In Loops

Causality is one of those difficult scientific topics that can easily stray into the realm of philosophy.  Science’s relationship with the concept started out simply enough: an event causes another event later in time.  That had been the standard understanding of the scientific community up until quantum mechanics was introduced.  Then, with the introduction of the famous “spooky action at a distance” that is a side effect of the concept of quantum entanglement, scientists began to question that simple interpretation of causality.

Now, researchers at the Université Libre de Bruxelles (ULB) and the University of Oxford have come up with a theory that further challenges the standard view of causality as a linear progression from cause to effect.  In their new theoretical structure, cause and effect can sometimes take place in cycles, with the effect actually causing the cause.

Continue reading “Quantum Theory Proposes That Cause and Effect Can Go In Loops”

LIGO Will Squeeze Light To Overcome The Quantum Noise Of Empty Space

The LIGO Hanford Observatory in Washington State. Credit: LIGO Observatory

When two black holes merge, they release a tremendous amount of energy. When LIGO detected the first black hole merger in 2015, we found that three solar masses’ worth of energy was released as gravitational waves. But gravitational waves don’t interact strongly with matter. The effects of gravitational waves are so small that you’d need to be extremely close to a merger to feel them. So how can we possibly observe the gravitational waves of merging black holes across millions of light-years?
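
For a sense of the scale involved (a back-of-the-envelope figure, not quoted in the article), converting three solar masses directly into energy via mass–energy equivalence gives

$$
E \approx 3\,M_{\odot}c^{2} \approx 3 \times (1.99 \times 10^{30}\,\mathrm{kg}) \times (3.0 \times 10^{8}\,\mathrm{m/s})^{2} \approx 5 \times 10^{47}\,\mathrm{J},
$$

all of it radiated as gravitational waves in a fraction of a second. Yet by the time those waves reach Earth, the strain they produce is almost immeasurably small.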

Continue reading “LIGO Will Squeeze Light To Overcome The Quantum Noise Of Empty Space”

French Scientists Claim to Have Created Metallic Hydrogen

Using two diamonds, scientists squeezed hydrogen to pressures above those in Earth's core. Credit: Sang-Heon Shim, Arizona State University

Scientists have long speculated that at the heart of a gas giant, the laws of material physics undergo some radical changes. In these kinds of extreme pressure environments, hydrogen gas is compressed to the point that it actually becomes a metal. For years, scientists have been looking for a way to create metallic hydrogen synthetically because of the endless applications it would offer.

At present, the only known way to do this is to compress hydrogen atoms using a diamond anvil until they change their state. And after decades of attempts (and 80 years since it was first theorized), a team of French scientists may have finally created metallic hydrogen in a laboratory setting. While there is plenty of skepticism, there are many in the scientific community who believe this latest claim could be true.

Continue reading “French Scientists Claim to Have Created Metallic Hydrogen”

Antimatter Behaves Exactly the Same as Regular Matter in Double Slit Experiments

Credit: University of Bern

In 1924, French physicist Louis de Broglie proposed that particles of matter, much like photons – the subatomic particles that constitute light – behave as both particles and waves. Known as “particle-wave duality”, this property has been tested and shown to apply to other subatomic particles (electrons and neutrons) as well as larger, more complex molecules.
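
The quantitative statement behind this (not spelled out in the article) is the de Broglie relation, which assigns a wavelength to any particle with momentum $p$:

$$
\lambda = \frac{h}{p},
$$

where $h \approx 6.63 \times 10^{-34}\,\mathrm{J\,s}$ is Planck’s constant. As an illustrative figure, an electron moving at about one percent of the speed of light has $\lambda \approx 0.24\,\mathrm{nm}$, comparable to atomic spacings, which is why electrons produce clear interference patterns in double-slit experiments.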

Recently, an experiment conducted by researchers with the QUantum Interferometry and Gravitation with Positrons and LAsers (QUPLAS) collaboration demonstrated that this same property applies to antimatter. This was done using the same kind of interference test (aka. double-slit experiment) that helped scientists to propose particle-wave duality in the first place.

Continue reading “Antimatter Behaves Exactly the Same as Regular Matter in Double Slit Experiments”

The Coldest Place in Space Has Been Created. Next Challenge, Coldest Place in the Universe

This series of graphs show the changing density of a cloud of atoms as it is cooled to lower and lower temperatures (going from left to right) approaching absolute zero. Credit: NASA/JPL-Caltech

Despite decades of ongoing research, scientists are still trying to understand how the four fundamental forces of the Universe fit together. Whereas quantum mechanics can explain how three of these forces work together on the smallest of scales (electromagnetism and the weak and strong nuclear forces), General Relativity explains how things behave on the largest of scales (i.e. gravity). In this respect, gravity remains the holdout.

To understand how gravity interacts with matter on the tiniest of scales, scientists have developed some truly cutting-edge experiments. One of these is NASA’s Cold Atom Laboratory (CAL), located aboard the ISS, which recently achieved a milestone by creating clouds of atoms known as Bose-Einstein condensates (BECs). This was the first time that BECs have been created in orbit, and offers new opportunities to probe the laws of physics.

Originally predicted by Satyendra Nath Bose and Albert Einstein in the mid-1920s, BECs are essentially ultracold atoms that reach temperatures just above absolute zero, the point at which atoms should stop moving entirely (in theory). These particles are long-lived and precisely controlled, which makes them the ideal platform for studying quantum phenomena.

The Cold Atom Laboratory (CAL), which consists of two standardized containers that will be installed on the International Space Station. Credit: NASA/JPL-Caltech/Tyler Winn

This is the purpose of the CAL facility: to study ultracold quantum gases in a microgravity environment. The laboratory was installed in the US Science Lab aboard the ISS in late May and is the first of its kind in space. It is designed to advance scientists’ ability to make precision measurements of gravity and study how it interacts with matter at the smallest of scales.

As Robert Thompson, the CAL project scientist and a physicist at NASA’s Jet Propulsion Laboratory, explained in a recent press release:

“Having a BEC experiment operating on the space station is a dream come true. It’s been a long, hard road to get here, but completely worth the struggle, because there’s so much we’re going to be able to do with this facility.”

About two weeks ago, CAL scientists confirmed that the facility had produced BECs from atoms of rubidium – a soft, silvery-white metallic element in the alkali group. According to their report, they had reached temperatures as low as 100 nanokelvin, one ten-millionth of a Kelvin above absolute zero (-273 °C; -459 °F). This is roughly 3 K colder than the average temperature of space (about -270 °C; -454 °F).
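
Putting those numbers together (using the commonly cited value of roughly 2.7 K for the background temperature of space, which the article does not state explicitly):

$$
100\,\mathrm{nK} = 100 \times 10^{-9}\,\mathrm{K} = 10^{-7}\,\mathrm{K}, \qquad 2.7\,\mathrm{K} - 10^{-7}\,\mathrm{K} \approx 2.7\,\mathrm{K},
$$

so the atoms in CAL are indeed roughly 3 K colder than the coldest natural places in the cosmos.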

Because of their unique behavior, BECs are characterized as a fifth state of matter, distinct from gases, liquids, solids and plasma. In BECs, atoms act more like waves than particles on the macroscopic scale, whereas this behavior is usually only observable on the microscopic scale. In addition, the atoms all assume their lowest energy state and take on the same wave identity, making them indistinguishable from one another.

The “physics package” inside the Cold Atom Lab, where ultracold clouds of atoms called Bose-Einstein condensates are produced. Credit: NASA/JPL-Caltech/Tyler Winn

In short, the atom clouds begin to behave like a single “super atom” rather than individual atoms, which makes them easier to study. The first BECs were produced in a lab in 1995 by Eric Cornell and Carl Wieman, and independently by Wolfgang Ketterle; the three shared the 2001 Nobel Prize in Physics for the accomplishment. Since that time, hundreds of BEC experiments have been conducted on Earth and some have even been sent into space aboard sounding rockets.

But the CAL facility is unique in that it is the first of its kind on the ISS, where scientists can conduct daily studies over long periods. The facility consists of two standardized containers: the larger “quad locker” and the smaller “single locker”. The quad locker houses CAL’s physics package, the compartment where CAL produces its clouds of ultra-cold atoms.

This is done by using magnetic fields or focused lasers to create frictionless containers known as “atom traps”. As the atom cloud decompresses inside the atom trap, its temperature naturally drops, getting colder the longer it remains in the trap. On Earth, when these traps are turned off, gravity causes the atoms to begin moving again, which means they can only be studied for fractions of a second.

Aboard the ISS, which is a microgravity environment, BECs can decompress to colder temperatures than with any instrument on Earth and scientists are able to observe individual BECs for five to ten seconds at a time and repeat these measurements for up to six hours per day. And since the facility is controlled remotely from the Earth Orbiting Missions Operation Center at JPL, day-to-day operations require no intervention from astronauts aboard the station.

JPL scientists and members of the Cold Atom Lab’s atomic physics team (left to right) David Aveline, Ethan Elliott and Jason Williams. Credit: NASA/JPL-Caltech

Robert Shotwell, the chief engineer of JPL’s astronomy and physics directorate, has overseen the project since February 2017. As he indicated in a recent NASA press release:

“CAL is an extremely complicated instrument. Typically, BEC experiments involve enough equipment to fill a room and require near-constant monitoring by scientists, whereas CAL is about the size of a small refrigerator and can be operated remotely from Earth. It was a struggle and required significant effort to overcome all the hurdles necessary to produce the sophisticated facility that’s operating on the space station today.”

Looking ahead, the CAL scientists want to go even further and achieve temperatures lower than anything achieved on Earth. In addition to rubidium, the CAL team is also working towards making BECs using two different isotopes of potassium atoms. At the moment, CAL is still in a commissioning phase, which consists of the operations team conducting a long series of tests to see how the CAL facility will operate in microgravity.

However, once it is up and running, five science groups – including groups led by Cornell and Ketterle – will conduct experiments at the facility during its first year. The science phase is expected to begin in early September and will last three years. As Kamal Oudrhiri, JPL’s mission manager for CAL, put it:

“There is a globe-spanning team of scientists ready and excited to use this facility. The diverse range of experiments they plan to perform means there are many techniques for manipulating and cooling the atoms that we need to adapt for microgravity, before we turn the instrument over to the principal investigators to begin science operations.”

Given time, the Cold Atom Lab (CAL) may help scientists to understand how gravity works on the tiniest of scales. Combined with high-energy experiments conducted by CERN and other particle physics laboratories around the world, this could eventually lead to a Theory of Everything (ToE) and a complete understanding of how the Universe works.

And be sure to check out this cool (no pun intended!) video of the CAL facility as well, courtesy of NASA:

Further Reading: NASA

Physicists Take Big Step Towards Quantum Computing and Encryption with new Experiment

Artist’s concept of the experiment in which two atoms are being entangled over a distance of 400 meters. Credit: Wenjamin Rosenfeld

Quantum entanglement remains one of the most challenging fields of study for modern physicists. Described by Einstein as “spooky action at a distance”, it is an aspect of quantum mechanics that scientists have long sought to reconcile with classical mechanics. Essentially, the fact that two particles can be connected over great distances violates the rules of locality and realism.

Formally, this is a violation of Bell’s Inequality, a bound that any theory obeying locality and realism must satisfy, and one that has been used for decades to test whether those principles can hold up against quantum mechanics. In a recent study, a team of researchers from the Ludwig-Maximilian University (LMU) and the Max Planck Institute for Quantum Optics in Munich conducted tests which once again violate Bell’s Inequality and prove the existence of entanglement.

Their study, titled “Event-Ready Bell Test Using Entangled Atoms Simultaneously Closing Detection and Locality Loopholes”, was recently published in Physical Review Letters. Led by Wenjamin Rosenfeld, a physicist at LMU and the Max Planck Institute for Quantum Optics, the team sought to test Bell’s Inequality by entangling two particles at a distance.

John Bell, the Irish physicist who devised a test to show that nature does not ‘hide variables’ as Einstein had proposed. Credit: CERN

Bell’s Inequality (named after Irish physicist John Bell, who proposed it in 1964) rests on two assumptions: that the properties of objects exist independent of their being observed (realism), and that no information or physical influence can propagate faster than the speed of light (locality). These rules perfectly describe the reality we human beings experience on a daily basis, where things are rooted in a particular space and time and exist independent of an observer.
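
For readers who want the quantitative statement (the article keeps things qualitative), the most commonly tested form is the CHSH version of Bell’s Inequality. For measurement settings $a, a'$ on one particle and $b, b'$ on its partner, with $E$ denoting the measured correlations, any theory built on locality and realism must satisfy

$$
S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2,
$$

whereas quantum mechanics allows entangled particles to reach $|S| = 2\sqrt{2} \approx 2.83$. Measuring $|S| > 2$ is what “violating Bell’s Inequality” means in practice.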

However, at the quantum level, things do not appear to follow these rules. Not only can particles be connected in non-local ways over large distances (i.e. entanglement), but the properties of these particles cannot be defined until they are measured. And while all experiments have confirmed that the predictions of quantum mechanics are correct, some scientists have continued to argue that there are loopholes that allow for local realism.

To address this, the Munich team conducted an experiment using two laboratories at LMU. While the first lab was located in the basement of the physics department, the second was located in the basement of the economics department – roughly 400 meters away. In both labs, teams captured a single rubidium atom in an optical trap and then began exciting it until it released a single photon.

As Dr. Wenjamin Rosenfeld explained in a Max Planck Institute press release:

“Our two observer stations are independently operated and are equipped with their own laser and control systems. Because of the 400 meters distance between the laboratories, communication from one to the other would take 1328 nanoseconds, which is much more than the duration of the measurement process. So, no information on the measurement in one lab can be used in the other lab. That’s how we close the locality loophole.”

The experiment was performed in two locations 398 meters apart at the Ludwig Maximilian University campus in Munich, Germany. Credit: Rosenfeld et al/American Physical Society
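
A quick check of the timing argument, using the 398-meter separation given in the caption above:

$$
t = \frac{d}{c} = \frac{398\,\mathrm{m}}{2.998 \times 10^{8}\,\mathrm{m/s}} \approx 1.33\,\mu\mathrm{s} \approx 1328\,\mathrm{ns}.
$$

No signal, even one travelling at the speed of light, could carry information from one laboratory to the other before each measurement was complete, which is precisely how the locality loophole is closed.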

Once the two rubidium atoms were excited to the point of releasing a photon, the spin states of the rubidium atoms and the polarization states of the photons were effectively entangled. The photons were then coupled into optical fibers and guided to a set-up where they were brought to interference. After conducting a measurement run for eight days, the scientists were able to collect around 10,000 events to check for signs of entanglement.

This would have been indicated by the spins of the two trapped rubidium atoms, which would be pointing in the same direction (or in the opposite direction, depending on the kind of entanglement). What the Munich team found was that for the vast majority of the events, the atoms were in the same state (or in the opposite state), and that there were only six deviations consistent with Bell’s Inequality.

These results were also statistically more significant than those obtained by a team of Dutch physicists in 2015. For the sake of that study, the Dutch team conducted experiments using electrons in diamonds at labs that were 1.3 km apart. In the end, their results (and other recent tests of Bell’s Inequality) demonstrated that quantum entanglement is real, effectively closing the local realism loophole.

As Wenjamin Rosenfeld explained, the tests conducted by his team also went beyond these other experiments by addressing another major issue. “We were able to determine the spin-state of the atoms very fast and very efficiently,” he said. “Thereby we closed a second potential loophole: the assumption, that the observed violation is caused by an incomplete sample of detected atom pairs”.

By obtaining proof of the violation of Bell’s Inequality, scientists are not only helping to resolve an enduring incongruity between classical and quantum physics. They are also opening the door to some exciting possibilities. For instance, for years, scientists have anticipated the development of quantum processors, which rely on superposition and entanglement to go beyond the simple zeros and ones of binary code.
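
As a toy illustration of the resource such processors exploit (a minimal sketch using only NumPy, and not the Munich team’s setup or any particular quantum-computing library), two simulated qubits can be driven into an entangled Bell state with just two standard gate operations:

```python
import numpy as np

# Single-qubit basis states |0> and |1> as column vectors.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Hadamard gate: puts a qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

# CNOT gate: flips the second qubit whenever the first qubit is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start with both qubits in |0>, apply H to the first, then entangle with CNOT.
state = np.kron(H @ ket0, ket0)   # (|0> + |1>)/sqrt(2) on qubit 1, |0> on qubit 2
bell = CNOT @ state               # (|00> + |11>)/sqrt(2): a maximally entangled pair

print(np.round(bell, 3))          # [0.707 0 0 0.707]: outcomes 00 and 11, never 01 or 10
```

Measuring either qubit of this state gives 0 or 1 at random, but the two results always agree, the same kind of correlation the atom–photon experiments above probe across 400 meters of Munich.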

Computers that rely on quantum mechanics would be exponentially faster than conventional microprocessors for certain tasks, and would usher in a new age of research and development. The same principles have been proposed for cybersecurity, where quantum encryption would be used to encode information, making it invulnerable to hackers who rely on conventional computers.

Last, but certainly not least, there is the proposed concept of Quantum Entanglement Communications, a method that would allow us to transmit information faster than the speed of light – though the no-communication theorem of standard quantum mechanics holds that entanglement alone cannot be used to send signals. Imagine the possibilities for space travel and exploration if we were no longer bound by the limits of relativistic communication!

Einstein wasn’t wrong when he characterized quantum entanglement as “spooky action”. Indeed, many of the implications of this phenomenon are still as frightening as they are fascinating to physicists. But the closer we come to understanding it, the closer we will be to understanding how all the known physical forces of the Universe fit together – aka. a Theory of Everything!

Further Reading: LMU, Physical Review Letters

New Explanation for Dark Energy? Tiny Fluctuations of Time and Space

A new study from researchers from the University of British Columbia offers a new explanation of Dark Energy. Credit: NASA

Since the late 1920s, astronomers have been aware of the fact that the Universe is in a state of expansion. Initially predicted by Einstein’s Theory of General Relativity, this realization has gone on to inform the most widely-accepted cosmological model – the Big Bang Theory. However, things became somewhat confusing during the 1990s, when improved observations showed that the Universe’s rate of expansion has been accelerating for billions of years.

This led to the theory of Dark Energy, a mysterious invisible force that is driving the expansion of the cosmos. Much like Dark Matter, which was invoked to explain the “missing mass”, it then became necessary to find this elusive energy, or at least provide a coherent theoretical framework for it. A new study from the University of British Columbia (UBC) seeks to do just that by postulating that the Universe is expanding due to fluctuations in space and time.

The study – which was recently published in the journal Physical Review D – was led by Qingdi Wang, a PhD student with the Department of Physics and Astronomy at UBC. Under the supervision of UBC Professor William Unruh (the man who proposed the Unruh Effect) and with assistance from Zhen Zhu (another PhD student at UBC), they provide a new take on Dark Energy.

Diagram showing the Lambda-CDM universe, from the Big Bang to the current era. Credit: Alex Mittelmann/Coldcreation

The team began by addressing the inconsistencies arising out of the two main theories that together explain all natural phenomena in the Universe. These theories are none other than General Relativity and quantum mechanics, which effectively explain how the Universe behaves on the largest of scales (i.e. stars, galaxies, clusters) and the smallest (subatomic particles).

Unfortunately, these two theories are not consistent when it comes to a little matter known as gravity, which scientists are still unable to explain in terms of quantum mechanics. The existence of Dark Energy and the expansion of the Universe are another point of disagreement. For starters, candidate theories like vacuum energy – which is one of the most popular explanations for Dark Energy – present serious incongruities.

According to quantum mechanics, vacuum energy would have an incredibly large energy density. But if this is true, then General Relativity predicts that this energy would have an incredibly strong gravitational effect, one which would be powerful enough to cause the Universe to explode in size. As Prof. Unruh shared with Universe Today via email:

“The problem is that any naive calculation of the vacuum energy gives huge values. If one assumes that there is some sort of cutoff so one cannot get energy densities much greater than the Planck energy density (or about 10⁹⁵ Joules/meter³)  then one finds that one gets a Hubble constant – the time scale on which the Universe roughly doubles in size – of the order of 10⁻⁴⁴ sec. So, the usual approach is to say that somehow something reduces that down so that one gets the actual expansion rate of about 10 billion years instead. But that ‘somehow’ is pretty mysterious and no one has come up with an even half convincing mechanism.”
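
A schematic version of the estimate Unruh describes (using the standard Friedmann equation, which the article does not write out) ties an energy density $\rho$ to the expansion rate $H$:

$$
H^{2} = \frac{8\pi G}{3c^{2}}\,\rho ,
$$

so an enormous vacuum energy density implies an enormous $H$. Plugging in a Planck-scale energy density pushes the doubling timescale $1/H$ down toward the Planck time, of order $10^{-44}$ seconds, wildly at odds with the observed expansion timescale of roughly 10 billion years.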

Timeline of the Big Bang and the expansion of the Universe. Credit: NASA

Whereas other scientists have sought to modify the theories of General Relativity and quantum mechanics in order to resolve these inconsistencies, Wang and his colleagues sought a different approach. As Wang explained to Universe Today via email:

“Previous studies are either trying to modify quantum mechanics in some way to make vacuum energy small or trying to modify General Relativity in some way to make gravity numb for vacuum energy. However, quantum mechanics and General Relativity are the two most successful theories that explain how our Universe works… Instead of trying to modify quantum mechanics or General Relativity, we believe that we should first understand them better. We take the large vacuum energy density predicted by quantum mechanics seriously and just let it gravitate according to General Relativity without modifying either of them.”

For the sake of their study, Wang and his colleagues performed new sets of calculations on vacuum energy that took its predicted high energy density into account. They then considered the possibility that on the tiniest of scales – billions of times smaller than electrons – the fabric of spacetime is subject to wild fluctuations, oscillating at every point between expansion and contraction.

Could fluctuations at the tiniest levels of space time explain Dark Energy and the expansion of the cosmos? Credit: University of Washington

As spacetime swings back and forth, the net effect of these oscillations is that the Universe expands slowly, but at an accelerating rate. After performing their calculations, they noted that such an explanation was consistent with both the existence of quantum vacuum energy density and General Relativity. On top of that, it is also consistent with what scientists have been observing in our Universe for almost a century. As Unruh described it:

“Our calculations showed that one could consistently regard [that] the Universe on the tiniest scales is actually expanding and contracting at an absurdly fast rate; but that on a large scale, because of an averaging over those tiny scales, physics would not notice that ‘quantum foam’. It has a tiny residual effect in giving an effective cosmological constant (dark energy type effect). In some ways it is like waves on the ocean which travel as if the ocean were perfectly smooth but really we know that there is this incredible dance of the atoms that make up the water, and waves average over those fluctuations, and act as if the surface was smooth.”

In contrast to theories in which the various forces governing the Universe must cancel each other out, Wang and his colleagues present a picture in which the Universe is constantly in motion. In this scenario, the effects of vacuum energy are largely self-cancelling, and the tiny residue that remains gives rise to the expansion and acceleration we have been observing all this time.

While it may be too soon to tell, this image of a Universe that is highly-dynamic (even on the tiniest scales) could revolutionize our understanding of spacetime. At the very least, these theoretical findings are sure to stimulate debate within the scientific community, as well as experiments designed to offer direct evidence. And that, as we know, is the only way we can advance our understanding of this thing known as the Universe.

Further Reading: UBC News, Physical Review D