How in the world could you possibly look inside a star? You could break out the scalpels and other tools of the surgical trade, but good luck getting within a few million kilometers of the surface before your skin melts off. The stars of our universe hide their secrets very well, but astronomers can outmatch their cleverness and have found ways to peer into their hearts using, of all things, sound waves.
Telescopes have come a long way in the past few centuries. From the comparatively modest devices built by astronomers like Galileo Galilei and Johannes Kepler, telescopes have evolved to become massive instruments that require an entire facility to house them and a full crew and network of computers to run them. And in the coming years, much larger observatories will be constructed that can do even more.
Unfortunately, this trend towards larger and larger instruments has many drawbacks. For starters, increasingly large observatories require either increasingly large mirrors or many telescopes working together – both of which are expensive prospects. Luckily, a team from MIT has proposed combining interferometry with quantum teleportation, which could significantly increase the resolution of arrays without relying on larger mirrors.
A team of researchers from the University of Nebraska–Lincoln recently conducted an experiment where they were able to accelerate plasma electrons to close to the speed of light. This “optical rocket”, which pushed electrons at a force a trillion-trillion times greater than that generated by a conventional rocket, could have serious implications for everything from space travel to computing and nanotechnology.
Despite decades of ongoing research, scientists are still trying to understand how the four fundamental forces of the Universe fit together. Whereas quantum mechanics can explain how three of these forces work together on the smallest of scales (electromagnetism and the weak and strong nuclear forces), General Relativity explains how things behave on the largest of scales (i.e. gravity). In this respect, gravity remains the holdout.
To understand how gravity interacts with matter on the tiniest of scales, scientists have developed some truly cutting-edge experiments. One of these is NASA’s Cold Atom Laboratory (CAL), located aboard the ISS, which recently achieved a milestone by creating clouds of atoms known as Bose-Einstein condensates (BECs). This was the first time that BECs have been created in orbit, and offers new opportunities to probe the laws of physics.
Originally predicted by Satyendra Nath Bose and Albert Einstein in the 1920s – some 71 years before they were first created in a lab – BECs are essentially clouds of ultracold atoms cooled to temperatures just above absolute zero, the point at which atoms should (in theory) stop moving entirely. These condensates are long-lived and can be precisely controlled, which makes them an ideal platform for studying quantum phenomena.
This is the purpose of the CAL facility: to study ultracold quantum gases in a microgravity environment. The laboratory was installed in the US Science Lab aboard the ISS in late May and is the first of its kind in space. It is designed to advance scientists’ ability to make precision measurements of gravity and study how it interacts with matter at the smallest of scales.
As Robert Thompson, the CAL project scientist and a physicist at NASA’s Jet Propulsion Laboratory, explained in a recent press release:
“Having a BEC experiment operating on the space station is a dream come true. It’s been a long, hard road to get here, but completely worth the struggle, because there’s so much we’re going to be able to do with this facility.”
About two weeks ago, CAL scientists confirmed that the facility had produced BECs from atoms of rubidium – a soft, silvery-white metallic element in the alkali group. According to their report, they had reached temperatures as low as 100 nanokelvin – one ten-millionth of a Kelvin above absolute zero (-273 °C; -459 °F). That is roughly 3 K colder than the average temperature of space, which sits at about 2.7 K (-270 °C; -454 °F) thanks to the cosmic microwave background.
Because of their unique behavior, BECs are characterized as a fifth state of matter, distinct from gases, liquids, solids and plasma. In BECs, atoms act more like waves than particles on the macroscopic scale, whereas this behavior is usually only observable on the microscopic scale. In addition, the atoms all assume their lowest energy state and take on the same wave identity, making them indistinguishable from one another.
In short, the atom clouds begin to behave like a single “super atom” rather than a collection of individual atoms, which makes them easier to study. The first BECs were produced in the lab in 1995 by Eric Cornell and Carl Wieman, with Wolfgang Ketterle independently creating one shortly thereafter; the three shared the 2001 Nobel Prize in Physics for the accomplishment. Since that time, hundreds of BEC experiments have been conducted on Earth and some have even been sent into space aboard sounding rockets.
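The temperatures involved can be illustrated with the textbook ideal-Bose-gas result, which gives the critical temperature below which a dilute atomic cloud condenses. This is a rough sketch only: the atom density used below is an assumed, typical value for dilute-gas experiments, not a figure reported for CAL.

```python
import math

hbar = 1.054571817e-34            # reduced Planck constant, J*s
k_B = 1.380649e-23                # Boltzmann constant, J/K
m_rb87 = 86.909 * 1.66053907e-27  # mass of one rubidium-87 atom, kg
zeta_3_2 = 2.612                  # Riemann zeta(3/2), from the Bose integral

def bec_critical_temperature(n):
    """Ideal-gas BEC transition temperature for number density n (atoms/m^3)."""
    return (2 * math.pi * hbar**2 / (m_rb87 * k_B)) * (n / zeta_3_2) ** (2 / 3)

# Assumed density of ~10^20 atoms/m^3, typical of dilute-gas BEC experiments
T_c = bec_critical_temperature(1e20)
print(f"T_c ~ {T_c * 1e9:.0f} nK")  # lands in the hundreds of nanokelvin
```

The point of the estimate is the scale: condensation only sets in at a few hundred nanokelvin, which is why temperatures like the 100 nanokelvin quoted for CAL are a necessity rather than a stunt.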
But the CAL facility is unique in that it is the first of its kind on the ISS, where scientists can conduct daily studies over long periods. The facility consists of two standardized containers: the larger “quad locker” and the smaller “single locker”. The quad locker contains CAL’s physics package, the compartment where CAL produces its clouds of ultra-cold atoms.
This is done by using magnetic fields or focused lasers to create frictionless containers known as “atom traps”. As the atom cloud decompresses inside the atom trap, its temperature naturally drops, getting colder the longer it remains in the trap. On Earth, when these traps are turned off, gravity causes the atoms to begin moving again, which means they can only be studied for fractions of a second.
Aboard the ISS, which provides a microgravity environment, BECs can decompress to colder temperatures than with any instrument on Earth. Scientists are able to observe individual BECs for five to ten seconds at a time and repeat these measurements for up to six hours per day. And since the facility is controlled remotely from the Earth Orbiting Missions Operation Center at JPL, day-to-day operations require no intervention from astronauts aboard the station.
Robert Shotwell, the chief engineer of JPL’s astronomy and physics directorate, has overseen the project since February 2017. As he indicated in a recent NASA press release:
“CAL is an extremely complicated instrument. Typically, BEC experiments involve enough equipment to fill a room and require near-constant monitoring by scientists, whereas CAL is about the size of a small refrigerator and can be operated remotely from Earth. It was a struggle and required significant effort to overcome all the hurdles necessary to produce the sophisticated facility that’s operating on the space station today.”
Looking ahead, the CAL scientists want to go even further and achieve temperatures lower than anything achieved on Earth. In addition to rubidium, the CAL team is also working towards making BECs using two different isotopes of potassium. At the moment, CAL is still in its commissioning phase, which consists of the operations team conducting a long series of tests to see how the facility will operate in microgravity.
However, once it is up and running, five science groups – including groups led by Cornell and Ketterle – will conduct experiments at the facility during its first year. The science phase is expected to begin in early September and will last three years. As Kamal Oudrhiri, JPL’s mission manager for CAL, put it:
“There is a globe-spanning team of scientists ready and excited to use this facility. The diverse range of experiments they plan to perform means there are many techniques for manipulating and cooling the atoms that we need to adapt for microgravity, before we turn the instrument over to the principal investigators to begin science operations.”
Given time, the Cold Atom Lab (CAL) may help scientists to understand how gravity works on the tiniest of scales. Combined with high-energy experiments conducted by CERN and other particle physics laboratories around the world, this could eventually lead to a Theory of Everything (ToE) and a complete understanding of how the Universe works.
And be sure to check out this cool video (no pun intended!) of the CAL facility as well, courtesy of NASA:
Further Reading: NASA
Neutron stars are famous for combining very high density with a very small radius. As the remnants of massive stars that have undergone gravitational collapse, their interiors are compressed to the point where pressure conditions are similar to those inside atomic nuclei. Basically, they become so dense that their internal pressure is equivalent to the weight of 2.6 to 4.1 quadrillion Suns!
In spite of that, neutron stars have nothing on protons, according to a recent study by scientists at the Department of Energy’s Thomas Jefferson National Accelerator Facility. After conducting the first measurement of the mechanical properties of subatomic particles, the scientific team determined that near the center of a proton, the pressure is about 10 times greater than the pressure in the heart of a neutron star.
The study describing the team’s findings, titled “The pressure distribution inside the proton“, recently appeared in the scientific journal Nature. It was led by Volker Burkert, a nuclear physicist at the Thomas Jefferson National Accelerator Facility (TJNAF), and co-authored by fellow Jefferson Lab scientists Latifa Elouadrhiri and Francois-Xavier Girod.
Basically, they found that the pressure at the center of a proton reaches 100 decillion pascals – about 10 times the pressure at the heart of a neutron star. However, they also found that the pressure inside the particle is not uniform, dropping off as the distance from the center increases. As Burkert, the Jefferson Lab Hall B Leader, explained:
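For a sense of scale, 100 decillion pascals is 10^35 Pa. A quick back-of-the-envelope comparison (the neutron-star and atmospheric figures below are rough, assumed orders of magnitude, not values from the study):

```python
proton_center_pressure = 1e35   # Pa, ~100 decillion pascals, as reported
neutron_star_core = 1e34        # Pa, assumed order of magnitude for a neutron star core
atmosphere = 101_325            # Pa, standard sea-level atmospheric pressure

# Ratio of the proton's central pressure to a neutron star core: about 10x
print(proton_center_pressure / neutron_star_core)

# And versus everyday conditions: roughly 10^30 atmospheres
print(f"{proton_center_pressure / atmosphere:.1e} atm")
```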
“We found an extremely high outward-directed pressure from the center of the proton, and a much lower and more extended inward-directed pressure near the proton’s periphery… Our results also shed light on the distribution of the strong force inside the proton.”
Protons are composed of three quarks that are bound together by the strong nuclear force, one of the four fundamental forces that govern the Universe – the others being electromagnetism, gravity and the weak nuclear force. Whereas electromagnetism and gravity produce the effects that govern matter on larger scales, the weak and strong nuclear forces govern matter at the subatomic level.
Previously, scientists thought it was impossible to obtain such detailed information about subatomic particles. However, the researchers were able to obtain results by pairing two theoretical frameworks with existing data, one relying on electromagnetism and the other on gravity. The first framework concerns generalized parton distributions (GPDs), while the second involves gravitational form factors.
Parton modeling refers to modeling the subatomic entities (like quarks) inside protons and neutrons, which allows scientists to create 3D images of a proton’s or neutron’s structure (as probed by the electromagnetic force). Gravitational form factors describe the scattering of subatomic particles by a classical gravitational field, which characterizes the mechanical structure of protons when probed via the gravitational force.
As noted, scientists previously thought that this was impossible due to the extreme weakness of the gravitational interaction. However, recent theoretical work has indicated that it could be possible to determine the mechanical structure of a proton using electromagnetic probes as a substitute for gravitational probes. According to Latifa Elouadrhiri – a Jefferson Lab staff scientist and co-author on the paper – that is what their team set out to prove.
“This is the beauty of it. You have this map that you think you will never get,” she said. “But here we are, filling it in with this electromagnetic probe.”
For the sake of their study, the team used the DOE’s Continuous Electron Beam Accelerator Facility at the TJNAF to create a beam of electrons. These were then directed into the nuclei of atoms where they interacted electromagnetically with the quarks inside protons via a process called deeply virtual Compton scattering (DVCS). In this process, an electron exchanges a virtual photon with a quark, transferring energy to the quark and proton.
Shortly thereafter, the proton releases this energy by emitting another photon while remaining intact. Through this process, the team was able to produce detailed information about the mechanics going on inside the protons they probed. As Francois-Xavier Girod, a Jefferson Lab staff scientist and co-author on the paper, explained:
“There’s a photon coming in and a photon coming out. And the pair of photons both are spin-1. That gives us the same information as exchanging one graviton particle with spin-2. So now, one can basically do the same thing that we have done in electromagnetic processes — but relative to the gravitational form factors, which represent the mechanical structure of the proton.”
The next step, according to the research team, will be to apply the technique to even more precise data that will soon be released. This will reduce uncertainties in the current analysis and allow the team to reveal other mechanical properties inside protons – like the internal shear forces and the proton’s mechanical radius. These results, and those the team hope to reveal in the future, are sure to be of interest to other physicists.
“We are providing a way of visualizing the magnitude and distribution of the strong force inside the proton,” said Burkert. “This opens up an entirely new direction in nuclear and particle physics that can be explored in the future.”
Perhaps, just perhaps, it will bring us closer to understanding how the four fundamental forces of the Universe interact. While scientists understand how electromagnetism and weak and strong nuclear forces interact with each other (as described by Quantum Mechanics), they are still unsure how these interact with gravity (as described by General Relativity).
If and when the four forces can be unified in a Theory of Everything (ToE), one of the last and greatest hurdles to a complete understanding of the Universe will finally be removed.
Stephen Hawking is rightly seen as one of the most influential scientists of our time. In his time on this planet, the famed physicist, science communicator, author and luminary became a household name, synonymous with the likes of Einstein, Newton and Galileo. What is even more impressive is the fact that he managed to maintain his commitment to science, education and humanitarian efforts despite suffering from a slow, degenerative disease.
Even though Hawking recently passed away, his influence is still being felt. Shortly before his death, Hawking submitted a paper offering his final theory on the origins of the Universe. The paper, which was published earlier this week (on Wednesday, May 2nd), offers a new take on the Big Bang Theory that could revolutionize the way we think of the Universe, how it was created, and how it evolved.
The paper, titled “A smooth exit from eternal inflation?“, was published in the Journal of High Energy Physics. The theory was first announced at a conference at the University of Cambridge in July of last year, where Professor Thomas Hertog (a Belgian physicist at KU Leuven) shared the paper – which he co-authored with Hawking – on the occasion of Hawking’s 75th birthday.
According to the current scientific consensus, all of the current and past matter in the Universe came into existence at the same time – roughly 13.8 billion years ago. At this time, all matter was compacted into a very small ball with infinite density and intense heat. Suddenly, this ball started to inflate at an exponential rate, and the Universe as we know it began.
However, it is widely believed that since this inflation started, quantum effects will keep it going forever in some regions of the Universe. This means that globally, the Universe’s inflation is eternal. In this respect, the observable part of our Universe (measuring 13.8 billion light-years in any direction) is just a region in which inflation has ended and stars and galaxies formed.
As Hawking explained in an interview with Cambridge University last autumn:
“The usual theory of eternal inflation predicts that globally our universe is like an infinite fractal, with a mosaic of different pocket universes, separated by an inflating ocean. The local laws of physics and chemistry can differ from one pocket universe to another, which together would form a multiverse. But I have never been a fan of the multiverse. If the scale of different universes in the multiverse is large or infinite the theory can’t be tested. ”
In their new paper, Hawking and Hertog offer a new theory that predicts that the Universe is not an infinite fractal-like multiverse, but is finite and reasonably smooth. In short, they theorize that the eternal inflation, as part of the theory of the Big Bang, is wrong. As Hertog explained:
“The problem with the usual account of eternal inflation is that it assumes an existing background universe that evolves according to Einstein’s theory of general relativity and treats the quantum effects as small fluctuations around this. However, the dynamics of eternal inflation wipes out the separation between classical and quantum physics. As a consequence, Einstein’s theory breaks down in eternal inflation.”
In contrast to this, Hawking and Hertog offer an explanation based on String Theory, a branch of theoretical physics that attempts to unify General Relativity with quantum physics. This theory was proposed to explain how gravity interacts with the three other fundamental forces of the Universe (weak and strong nuclear forces and electromagnetism), thus producing a Theory of Everything (ToE).
To put it simply, this theory describes the fundamental constituents of the Universe as tiny, one-dimensional vibrating strings. Hawking and Hertog’s approach uses the holography concept of string theory, which postulates that the Universe is a large and complex hologram. In this theory, physical reality in certain 3D spaces can be mathematically reduced to 2D projections on a surface.
Together, Hawking and Hertog developed a variation of this concept to project out the dimension of time in eternal inflation. This enabled them to describe eternal inflation without having to rely on General Relativity, thus reducing inflation to a timeless state defined on a spatial surface at the beginning of time. In this respect, the new theory represents a change from Hawking’s earlier work on “no boundary theory”.
Also known as the Hartle–Hawking No Boundary Proposal, this theory viewed the Universe like a quantum particle – assigning it a wave function that described all possible Universes. This theory also predicted that if you go back in time to the beginning of the Universe, it would shrink and close off like a sphere. Lastly, it predicted that the Universe would eventually stop expanding and collapse in on itself.
As Hertog explains, this new theory is a departure from that earlier work:
“When we trace the evolution of our universe backwards in time, at some point we arrive at the threshold of eternal inflation, where our familiar notion of time ceases to have any meaning. Now we’re saying that there is a boundary in our past.”
Using this theory, Hawking and Hertog were able to derive more reliable predictions about the global structure of the Universe. In addition, a Universe predicted to emerge from eternal inflation on the past boundary is also finite and much simpler. Last, but not least, the theory is more predictive and testable than the infinite Multiverse predicted by the old theory of eternal inflation.
“We are not down to a single, unique universe, but our findings imply a significant reduction of the multiverse, to a much smaller range of possible universes,” said Hawking. In theory, a finite and smooth Universe is one we can observe (at least locally) and will be governed by physical laws that we are already familiar with. Compared to an infinite number of Universes governed by different physical laws, it certainly simplifies the math!
Looking ahead, Hertog plans to study the implications of this theory on smaller scales using data obtained by space telescopes about the local Universe. In addition, he hopes to take advantage of recent studies concerning gravitational waves (GWs) and the many events that have been detected. Essentially, Hertog believes that primordial GWs generated at the exit from eternal inflation are the most promising means to test the model.
Due to the expansion of our Universe since the Big Bang, these GWs would have very long wavelengths, ones which are outside the normal range of the Laser Interferometer Gravitational-Wave Observatory‘s (LIGO) or Virgo‘s detectors. However, the Laser Interferometer Space Antenna (LISA) – an ESA-led plan for a space-based gravitational wave observatory – and other future experiments may be capable of measuring them.
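The mismatch in detectors comes down to frequency. A gravitational wave travels at the speed of light, so its wavelength is simply c/f, and the bands the detectors cover differ by several orders of magnitude. A minimal sketch, using round, commonly quoted band edges rather than official instrument specifications:

```python
c = 299_792_458.0  # speed of light, m/s

def gw_wavelength(freq_hz):
    """Wavelength in meters of a gravitational wave at the given frequency."""
    return c / freq_hz

# Approximate sensitivity bands (assumed round numbers)
ligo_band = (10.0, 5000.0)  # Hz, ground-based detectors (LIGO/Virgo)
lisa_band = (1e-4, 1.0)     # Hz, the planned space-based LISA

print(f"LIGO: {gw_wavelength(ligo_band[1]):.2e} m to {gw_wavelength(ligo_band[0]):.2e} m")
print(f"LISA: {gw_wavelength(lisa_band[1]):.2e} m to {gw_wavelength(lisa_band[0]):.2e} m")
```

Waves much longer than even LISA's band – such as relic waves stretched by cosmic expansion – would require still other techniques, which is why detecting the primordial signal Hertog describes remains a future prospect.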
Even though he is no longer with us, Hawking’s final theory could prove to be his most profound contribution to science. If future research should prove him correct, then Hawking will have resolved one of the most daunting problems in modern astrophysics and cosmology. Just one more achievement from a man who spent his life changing how people think about the Universe!
Further Reading: University of Cambridge
Lightning has always been a source of awe and mystery for us lowly mortals. In ancient times, people associated it with gods like Zeus and Thor, the fathers of the Greek and Norse pantheons. With the birth of modern science and meteorology, lightning is no longer considered the province of the divine. However, this does not mean that the sense of mystery it carries has diminished one bit.
For example, scientists have found that lightning occurs in the atmospheres of other planets, like the gas giant Jupiter (appropriately!) and the hellish world of Venus. And according to a recent study from Kyoto University, gamma rays caused by lightning interact with air molecules, regularly producing radioisotopes and even positrons – the antimatter version of electrons.
The study, titled “Photonuclear Reactions Triggered by Lightning Discharge“, recently appeared in the scientific journal Nature. The study was led by Teruaki Enoto, a researcher from The Hakubi Center for Advanced Research at Kyoto University, and included members from the University of Tokyo, Hokkaido University, Nagoya University, the RIKEN Nishina Center, the MAXI Team, and the Japan Atomic Energy Agency.
For some time, physicists have been aware that small bursts of high-energy gamma rays can be produced by lightning storms – what are known as “terrestrial gamma-ray flashes”. They are believed to be the result of static electrical fields accelerating electrons, which are then slowed by the atmosphere. This phenomenon was first discovered by space-based observatories, and photons with energies of up to 100 million electron volts (100 MeV) have been observed.
Given the energy levels involved, the Japanese research team sought to examine how these bursts of gamma rays interact with air molecules. As Teruaki Enoto from Kyoto University, who leads the project, explained in a Kyoto University press release:
“We already knew that thunderclouds and lightning emit gamma rays, and hypothesized that they would react in some way with the nuclei of environmental elements in the atmosphere. In winter, Japan’s western coastal area is ideal for observing powerful lightning and thunderstorms. So, in 2015 we started building a series of small gamma-ray detectors, and placed them in various locations along the coast.”
Unfortunately, the team ran into funding problems along the way. As Enoto explained, they decided to reach out to the general public and established a crowdfunding campaign to fund their work. “We set up a crowdfunding campaign through the ‘academist’ site,” he said, “in which we explained our scientific method and aims for the project. Thanks to everybody’s support, we were able to make far more than our original funding goal.”
Thanks to the success of their campaign, the team built and installed particle detectors across the northwest coast of Honshu. In February of 2017, they installed four more detectors in Kashiwazaki city, in Niigata Prefecture. Immediately after the detectors were installed, a lightning strike took place just a few hundred meters away, and the team was able to study it.
What they found was something entirely new and unexpected. After analyzing the data, the team detected three distinct gamma-ray bursts of varying duration. The first was less than a millisecond long, the second was a gamma-ray afterglow that took several milliseconds to decay, and the last was a prolonged emission lasting about one minute. As Enoto explained:
“We could tell that the first burst was from the lightning strike. Through our analysis and calculations, we eventually determined the origins of the second and third emissions as well.”
They determined that the second afterglow was caused by the lightning reacting with nitrogen in the atmosphere. Essentially, the gamma rays are capable of knocking a neutron out of nitrogen nuclei, and it was the reabsorption of these neutrons by other atmospheric particles that produced the gamma-ray afterglow. The final, prolonged emission was the result of the unstable nitrogen atoms breaking down.
It was here that things really got interesting. As the unstable nitrogen broke down, it released positrons that then collided with electrons, causing matter-antimatter annihilations that released more gamma rays. As Enoto explained, this demonstrated for the first time that antimatter is something that can occur in nature through common mechanisms.
“We have this idea that antimatter is something that only exists in science fiction,” he said. “Who knew that it could be passing right above our heads on a stormy day? And we know all this thanks to our supporters who joined us through ‘academist’. We are truly grateful to all.”
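The "extra gamma rays" from annihilation have a well-defined signature: when a positron meets an electron, the pair's rest-mass energy is converted into (typically two) photons of about 511 keV each, via E = mc². The arithmetic is a one-liner:

```python
m_e = 9.1093837015e-31  # electron rest mass, kg
c = 299_792_458.0       # speed of light, m/s
eV = 1.602176634e-19    # joules per electron volt

# Each annihilation photon carries one particle's rest-mass energy, E = m_e * c^2
photon_energy_keV = m_e * c**2 / eV / 1e3
print(f"{photon_energy_keV:.0f} keV per annihilation photon")  # ~511 keV
```

That 511 keV line is precisely the fingerprint detectors look for when hunting positrons, on the ground and in space alike.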
If these results are indeed correct, then antimatter is not the extremely rare substance that we tend to think it is. In addition, the study could present new opportunities for high-energy physics and antimatter research. All of this research could also lead to the development of new or refined techniques for creating it.
Looking ahead, Enoto and his team hope to conduct more research using the ten detectors they still have operating along the coast of Japan. They also hope to continue involving the public with their research, a process that goes far beyond crowdfunding and includes the efforts of citizen scientists to help process and interpret data.
For more than three decades, the internal structure and evolution of Uranus and Neptune has been a subject of debate among scientists. Given their distance from Earth and the fact that only a few robotic spacecraft have studied them directly, what goes on inside these ice giants is still something of a mystery. In lieu of direct evidence, scientists have relied on models and experiments to replicate the conditions in their interiors.
For instance, it has been theorized that within Uranus and Neptune, the extreme pressure conditions squeeze hydrogen and carbon into diamonds, which then sink down into the interior. Thanks to an experiment conducted by an international team of scientists, this “diamond rain” was recreated under laboratory conditions for the first time, giving us the first glimpse into what things could be like inside ice giants.
The study which details this experiment, titled “Formation of Diamonds in Laser-Compressed Hydrocarbons at Planetary Interior Conditions“, recently appeared in the journal Nature Astronomy. Led by Dr. Dominik Kraus, a physicist from the Helmholtz-Zentrum Dresden-Rossendorf Institute of Radiation Physics, the team included members from the SLAC National Accelerator Laboratory, the Lawrence Livermore National Laboratory and UC Berkeley.
For decades, scientists have held that the interiors of planets like Uranus and Neptune consist of solid cores surrounded by dense concentrations of “ices”. In this case, ice refers to hydrogen bonded with heavier elements (i.e. carbon, oxygen and/or nitrogen) to create compounds like water, ammonia and methane. Under extreme pressure conditions, these compounds become semi-solid, forming “slush”.
And at roughly 10,000 kilometers (6214 mi) beneath the surface of these planets, the compression of hydrocarbons is thought to create diamonds. To recreate these conditions, the international team subjected a sample of polystyrene plastic to two shock waves using an intense optical laser at the Matter in Extreme Conditions (MEC) instrument, which they then paired with x-ray pulses from the SLAC’s Linac Coherent Light Source (LCLS).
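The pressures at that depth can be roughed out with a simple hydrostatic estimate, P ≈ ρgh. The density and gravity below are assumed round numbers for an ice-giant interior, not values from the study, so treat the result as an order of magnitude only:

```python
rho = 2000.0    # kg/m^3, assumed mean density of the overlying "ice" layer
g = 11.0        # m/s^2, assumed gravitational acceleration inside an ice giant
depth = 1.0e7   # m, roughly 10,000 km below the cloud tops

# Hydrostatic pressure P = rho * g * h, converted to gigapascals
pressure_gpa = rho * g * depth / 1e9
print(f"~{pressure_gpa:.0f} GPa")  # on the order of a couple hundred GPa
```

Pressures of that order – hundreds of gigapascals, or millions of atmospheres – are exactly what laser-driven shock experiments like this one are designed to reproduce for fleeting instants.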
As lead author Dominik Kraus explained: “So far, no one has been able to directly observe these sparkling showers in an experimental setting. In our experiment, we exposed a special kind of plastic – polystyrene, which also consists of a mix of carbon and hydrogen – to conditions similar to those inside Neptune or Uranus.”
The plastic in this experiment simulated compounds formed from methane, a molecule that consists of one carbon atom bound to four hydrogen atoms. It is the presence of this compound that gives both Uranus and Neptune their distinct blue coloring. In the intermediate layers of these planets, it also forms hydrocarbon chains that are compressed into diamonds that could be millions of carats in weight.
The optical laser the team employed created two shock waves which accurately simulated the temperature and pressure conditions at the intermediate layers of Uranus and Neptune. The first shock was smaller and slower, and was then overtaken by the stronger second shock. When they overlapped, the pressure peaked and tiny diamonds began to form. At this point, the team probed the reactions with x-ray pulses from the LCLS.
This technique, known as x-ray diffraction, allowed the team to see the small diamonds form in real-time, which was necessary since a reaction of this kind can only last for fractions of a second. As Siegfried Glenzer, a professor of photon science at SLAC and a co-author of the paper, explained:
“For this experiment, we had LCLS, the brightest X-ray source in the world. You need these intense, fast pulses of X-rays to unambiguously see the structure of these diamonds, because they are only formed in the laboratory for such a very short time.”
In the end, the research team found that nearly every carbon atom in the original plastic sample was incorporated into small diamond structures. While they measured just a few nanometers in diameter, the team predicts that on Uranus and Neptune, the diamonds would be much larger. Over time, they speculate that these could sink into the planets’ atmospheres and form a layer of diamond around the core.
In previous studies, attempts to recreate the conditions in Uranus and Neptune’s interior met with limited success. While they showed results that indicated the formation of graphite and diamonds, the teams conducting them could not capture the measurements in real-time. As noted, the extreme temperatures and pressures that exist within gas/ice giants can only be simulated in a laboratory for very short periods of time.
However, thanks to LCLS – which creates X-ray pulses a billion times brighter than previous instruments and fires them at a rate of about 120 pulses per second (each one lasting just quadrillionths of a second) – the science team was able to directly measure the chemical reaction for the first time. In the end, these results are of particular significance to planetary scientists who specialize in the study of how planets form and evolve.
As Kraus explained, it could cause scientists to rethink the relationship between a planet’s mass and its radius, and lead to new models of planet classification:
“With planets, the relationship between mass and radius can tell scientists quite a bit about the chemistry. And the chemistry that happens in the interior can provide additional information about some of the defining features of the planet… We can’t go inside the planets and look at them, so these laboratory experiments complement satellite and telescope observations.”
This experiment also opens new possibilities for matter compression and the creation of synthetic materials. Nanodiamonds currently have many commercial applications – in medicine, electronics, scientific equipment, and more – and creating them with lasers would be far more cost-effective and safe than current methods (which involve explosives).
Fusion research, which also relies on creating extreme pressure and temperature conditions to generate abundant energy, could also benefit from this experiment. On top of that, the results of this study offer a tantalizing hint at what the cores of massive planets look like. In addition to being composed of silicate rock and metals, ice giants may also have a diamond layer at their core-mantle boundary.
Assuming we can create probes of sufficiently strong super-materials someday, wouldn’t that be worth looking into?
Neutrinos are one of the fundamental particles that make up the Universe. Compared to other types of particles, they have very little mass, no charge, and only interact with others via the weak nuclear force and gravity. As such, finding evidence of their interactions is extremely difficult, requiring massive instruments located deep underground to shield them from any interference.
However, using the Spallation Neutron Source (SNS), a research facility located at Oak Ridge National Laboratory (ORNL), an international team of researchers recently made a historic discovery about neutrinos using an entirely different method. Conducted as part of the COHERENT experiment, their results confirm a prediction made 43 years ago and offer new possibilities for neutrino research.
The study that details their findings, titled “Observation of coherent elastic neutrino-nucleus scattering”, was recently published in the journal Science. The research was conducted as part of the COHERENT experiment, a collaboration of 80 researchers from 19 institutions in 4 nations that has been searching for what is known as Coherent Elastic Neutrino-Nucleus Scattering (CEvNS) for over a year.
In finding evidence of this behavior, COHERENT has essentially made history. As Jason Newby, an ORNL physicist and the technical coordinator for COHERENT, said in an ORNL press statement:
“The one-of-a-kind particle physics experiment at Oak Ridge National Laboratory was the first to measure coherent scattering of low-energy neutrinos off nuclei.”
To break it all down, the Standard Model of particle physics classifies neutrinos as leptons – particles that interact with other matter only very weakly. They are created through radioactive decay, the nuclear reactions that power stars, and supernovae. The Big Bang model of cosmology also predicts that neutrinos are the most abundant particles in existence, since they are a byproduct of the creation of the Universe.
As such, their study has been a major focal point for theoretical physicists and cosmologists. In previous studies, neutrino interactions were detected by using literally tons of target material and then examining the particle transformations that resulted from neutrinos hitting them.
Examples include the Super-Kamiokande Observatory in Japan, an underground facility where the target material is 50,000 tons of ultrapure water. In the case of SNOLAB’s Sudbury Neutrino Observatory – which is located in a former mine complex near Sudbury, Ontario – the SNO neutrino detector relies on heavy water for neutrino detection while the SNO+ experiment will use a liquid scintillator.
And the IceCube Neutrino Observatory – the largest neutrino detector in the world, located at the Amundsen–Scott South Pole Station in Antarctica – relies on Antarctic ice to detect neutrino interactions. In all cases, the facilities are extremely isolated and rely on very expensive equipment.
The COHERENT experiment, by contrast, is immensely smaller and more economical, weighing a mere 14.5 kg (32 lbs) and occupying far less space. The experiment was created to take advantage of the existing SNS accelerator-based system, which produces the most intense pulsed neutron beams in the world by smashing mercury atoms with beams of protons.
This process creates massive amounts of neutrons, which are used for various scientific experiments. However, it also creates a significant number of neutrinos as a byproduct. To take advantage of this, the COHERENT team began developing a neutrino experiment known as “neutrino alley”. Located in a basement corridor just 20 meters (about 66 feet) from the mercury tank, it is naturally shielded by thick concrete walls and gravel.
The corridor is also fitted with large water tanks to block out additional neutrons, cosmic rays and other background particles. But unlike other experiments, the COHERENT detectors look for signs of neutrinos bumping into the nuclei of other atoms. To do this, the team outfitted the corridor with detectors based on a cesium iodide scintillator crystal, doped with sodium to increase the prominence of the light signals caused by neutrino interactions.
Juan Collar, a physicist from the University of Chicago, led the design team that created the detector used at SNS. As he explained, this was a “back-to-basics” approach that did away with more expensive and massive detectors:
“They are arguably the most pedestrian kind of radiation detector available, having been around for a century. Sodium-doped cesium iodide merges all of the properties required to work as a small, ‘handheld’ coherent neutrino detector. Very often, less is more.”
Thanks to their experiment and the sophistication of the SNS, the researchers were able to determine that neutrinos can couple to quarks through the exchange of neutral Z bosons. This process, known as Coherent Elastic Neutrino-Nucleus Scattering (CEvNS), was first predicted in 1973. But until now, no experiment or research team had been able to confirm it.
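What makes the coherent process detectable with such a small target is that the scattering amplitudes from the individual nucleons add up in phase, so the cross-section grows roughly as the square of the neutron number. A simplified sketch of the tree-level formula (ignoring the nuclear form factor; the cesium-133 target and 30 MeV neutrino energy below are illustrative values, not figures from the article):

```python
import math

# Simplified tree-level CEvNS cross-section (nuclear form factor ignored):
#   sigma ~ G_F^2 * E_nu^2 * Q_W^2 / (4*pi)
# with weak charge Q_W = N - (1 - 4*sin^2(theta_W)) * Z, dominated by N.
G_F = 1.166e-5           # Fermi constant, GeV^-2
sin2_thetaW = 0.231      # weak mixing angle
GEV2_TO_CM2 = 3.894e-28  # (hbar*c)^2 unit conversion

Z, N = 55, 78            # cesium-133 (illustrative target choice)
E_nu = 0.030             # ~30 MeV in GeV, typical of SNS neutrinos

Q_W = N - (1 - 4 * sin2_thetaW) * Z
sigma = G_F**2 * E_nu**2 * Q_W**2 / (4 * math.pi) * GEV2_TO_CM2
print(f"Q_W ~ {Q_W:.1f}, sigma ~ {sigma:.1e} cm^2")
```

The proton contribution nearly cancels (since 1 − 4 sin²θ_W ≈ 0.08), so a heavy, neutron-rich nucleus like cesium gives a coherent boost of thousands relative to scattering off a single nucleon.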
As Jason Newby indicated, the experiment succeeded in large part thanks to the sophistication of the existing facility. “The energy of the SNS neutrinos is almost perfectly tuned for this experiment—large enough to create a detectable signal, but small enough to take advantage of the coherence condition,” he said. “The only smoking gun of the interaction is a small amount of energy imparted to a single nucleus.”
The data it produced was also cleaner than with previous experiments, since the neutrinos (like the SNS neutron beam that produced them) were also pulsed. This allowed for the easy separation of the signal from background signals, which offered an advantage over steady-state neutrino sources – such as those that are produced by nuclear reactors.
The team also detected three “flavors” of neutrinos, which included muon neutrinos, muon antineutrinos, and electron neutrinos. Whereas the muon neutrinos emerged instantaneously, the others were detected a few microseconds later. From this, the COHERENT team not only validated the theory of CEvNS, but also the Standard Model of particle physics. Their findings also have implications for astrophysics and cosmology.
As Kate Scholberg, a physicist from Duke University and COHERENT’s spokesperson, explained:
“When a massive star collapses and then explodes, the neutrinos dump vast energy into the stellar envelope. Understanding the process feeds into understanding of how these dramatic events occur… COHERENT’s data will help with interpretation of measurements of neutrino properties by experiments worldwide. We may also be able to use coherent scattering to better understand the structure of the nucleus.”
While there is no need for further confirmation of their results, the COHERENT researchers plan to conduct additional measurements in order to observe coherent neutrino interactions at distinct rates (another signature of the process). From this, they hope to expand their knowledge of the nature of CEvNS, as well as other basic neutrino properties – such as their intrinsic magnetism.
This discovery was certainly impressive in its own right, given that it validates an aspect of both the Standard Model of particle physics and Big Bang cosmology. But the fact that the method offers cleaner results and relies on instruments that are significantly smaller and less expensive than other experiments – that is very impressive!
The implications of this research are sure to be far-reaching, and it will be interesting to see what other discoveries it enables in the future!
Quantum entanglement remains one of the most challenging fields of study for modern physicists. Described by Einstein as “spooky action at a distance”, entanglement has long defied attempts to reconcile this aspect of quantum mechanics with classical mechanics. Essentially, the fact that two particles can be connected over great distances violates the rules of locality and realism.
Formally, this is a violation of Bell’s Inequality, a theorem which sets limits on the correlations that any theory obeying locality and realism can produce. However, in a recent study, a team of researchers from the Ludwig-Maximilian University (LMU) and the Max Planck Institute for Quantum Optics in Munich conducted tests which once again violated Bell’s Inequality, demonstrating the existence of entanglement.
Their study, titled “Event-Ready Bell Test Using Entangled Atoms Simultaneously Closing Detection and Locality Loopholes”, was recently published in Physical Review Letters. Led by Wenjamin Rosenfeld, a physicist at LMU and the Max Planck Institute for Quantum Optics, the team sought to test Bell’s Inequality by entangling two particles at a distance.
Bell’s Inequality (named after Irish physicist John Bell, who proposed it in 1964) rests on two assumptions: that the properties of objects exist independent of being observed (realism), and that no information or physical influence can propagate faster than the speed of light (locality). These rules perfectly describe the reality we human beings experience on a daily basis, where things are rooted in a particular space and time and exist independent of an observer.
However, at the quantum level, things do not appear to follow these rules. Not only can particles be connected in non-local ways over large distances (i.e. entanglement), but the properties of these particles cannot be defined until they are measured. And while all experiments have confirmed that the predictions of quantum mechanics are correct, some scientists have continued to argue that there are loopholes that allow for local realism.
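This tension can be made quantitative with the CHSH form of Bell’s inequality: any locally realistic theory must satisfy |S| ≤ 2 for a certain combination of correlations, while quantum mechanics predicts up to 2√2 ≈ 2.83 for an entangled pair. A minimal sketch, using the standard quantum prediction E(a, b) = −cos(a − b) for a spin singlet:

```python
import math

# CHSH form of Bell's inequality: any local-realistic theory obeys |S| <= 2.
# For a singlet (maximally entangled) state, quantum mechanics predicts
# E(a, b) = -cos(a - b); the measurement angles below maximize the violation.
def E(a, b):
    return -math.cos(a - b)

a, a2 = 0.0, math.pi / 2            # Alice's two measurement settings
b, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two measurement settings

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(f"|S| = {abs(S):.4f}  (local-realism bound: 2)")
```

An experiment that measures |S| significantly above 2, as the Bell tests described here do, cannot be explained by any theory that keeps both locality and realism.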
To address this, the Munich team conducted an experiment using two laboratories at LMU. While the first lab was located in the basement of the physics department, the second was in the basement of the economics department – roughly 400 meters away. In both labs, teams captured a single rubidium atom in an optical trap and then excited it until it released a single photon.
As Dr. Wenjamin Rosenfeld explained in a Max Planck Institute press release:
“Our two observer stations are independently operated and are equipped with their own laser and control systems. Because of the 400 meters distance between the laboratories, communication from one to the other would take 1328 nanoseconds, which is much more than the duration of the measurement process. So, no information on the measurement in one lab can be used in the other lab. That’s how we close the locality loophole.”
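The 1328-nanosecond figure is simply the light travel time between the two labs; a quick check against the quoted 400 m separation (the slightly shorter published figure presumably reflects the precisely measured distance):

```python
c = 299_792_458.0  # speed of light, m/s

# Light travel time across the quoted ~400 m separation between the labs.
# Any influence respecting locality would need at least this long to cross,
# which is why measurements finishing faster close the locality loophole.
distance_m = 400.0
t_ns = distance_m / c * 1e9
print(f"{t_ns:.0f} ns")  # ~1334 ns
```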
Once the two rubidium atoms were excited to the point of releasing a photon, the spin states of the atoms and the polarization states of the photons were effectively entangled. The photons were then coupled into optical fibers and guided to a set-up where they were brought to interference. After conducting a measurement run for eight days, the scientists were able to collect around 10,000 events to check for signs of entanglement.
This would be indicated by the spins of the two trapped rubidium atoms pointing in the same direction (or in opposite directions, depending on the kind of entanglement). What the Munich team found was that for the vast majority of the events, the atoms were in the same state (or in the opposite state), with only six deviations consistent with Bell’s Inequality.
These results were also statistically more significant than those obtained by a team of Dutch physicists in 2015. For the sake of that study, the Dutch team conducted experiments using electrons in diamonds at labs that were 1.3 km apart. In the end, their results (and other recent tests of Bell’s Inequality) demonstrated that quantum entanglement is real, effectively closing the local realism loophole.
As Wenjamin Rosenfeld explained, the tests conducted by his team also went beyond these other experiments by addressing another major issue. “We were able to determine the spin-state of the atoms very fast and very efficiently,” he said. “Thereby we closed a second potential loophole: the assumption, that the observed violation is caused by an incomplete sample of detected atom pairs”.
By obtaining proof of the violation of Bell’s Inequality, scientists are not only helping to resolve an enduring incongruity between classical and quantum physics. They are also opening the door to some exciting possibilities. For instance, for years, scientists have anticipated the development of quantum processors, which rely on entanglement to process information beyond the zeros and ones of binary code.
Computers that rely on quantum mechanics would be exponentially faster than conventional microprocessors at certain tasks, and would usher in a new age of research and development. The same principles have been proposed for cybersecurity, where quantum encryption would be used to encipher information, making it invulnerable to hackers who rely on conventional computers.
Last, but certainly not least, there is the oft-discussed concept of Quantum Entanglement Communications, a method that would allow us to transmit information faster than the speed of light – though it is worth noting that the no-communication theorem of quantum mechanics indicates entanglement alone cannot be used this way. Still, imagine the possibilities for space travel and exploration if we were no longer bound by the limits of relativistic communication!
Einstein wasn’t wrong when he characterized quantum entanglement as “spooky action”. Indeed, many of the implications of this phenomenon are still as frightening as they are fascinating to physicists. But the closer we come to understanding it, the closer we will be to understanding how all the known physical forces of the Universe fit together – a.k.a. a Theory of Everything!