Since the long-awaited detection of the Higgs boson in 2012, particle physicists have been probing deeper into the subatomic realm in the hope of investigating beyond the Standard Model of Particle Physics. In so doing, they hope to confirm the existence of previously unknown particles and exotic physics, as well as learn more about how the Universe began.
At the Fermi National Accelerator Laboratory (aka. Fermilab), researchers have been conducting the Muon g-2 experiment, which recently announced the results of their first run. Thanks to the unprecedented precision of their instruments, the Fermilab team found that muons in their experiment did not behave in a way that is consistent with the Standard Model, strengthening a discrepancy that has been observed for decades.
The neutrino is a confounding little particle that is believed to have played a major role in the evolution of our Universe. Neutrinos possess very little mass, have no charge, and interact with other particles only through the weak nuclear force and gravity. As such, finding evidence of their interactions is extremely difficult and requires advanced facilities that are shielded to prevent interference.
Minute vibrating strings found in string theory are not the only ones that are of interest to physicists. The standard model of cosmology predicts a different type of string – a filament of very sparse gas stretching over very long distances. In fact, the model predicts that a large percentage of “baryonic matter” (i.e. the type that makes up everything we can see and interact with) is contained in these filaments. And now, for the first time, scientists led by a team at the University of Bonn in Germany have detected one of these super-long strings of gas.
Wormholes are a popular feature in science fiction, the means through which spacecraft can achieve faster-than-light (FTL) travel and instantaneously move from one point in spacetime to another. And while the General Theory of Relativity forbids the existence of “traversable wormholes”, recent research has shown that they are actually possible within the domain of quantum physics.
The only downsides are that they would actually take longer to traverse than normal space and/or would likely be microscopic. However, according to a new study by a pair of Ivy League scientists, physics beyond the Standard Model could mean that there are wormholes out there that are not only large enough to be traversable, but entirely safe for human travelers looking to get from point A to point B.
The Standard Model of Particle Physics is one of science’s most impressive feats. It’s a rigorous, precise effort to understand and describe three of the four fundamental forces of the Universe: the electromagnetic force, the strong nuclear force, and the weak nuclear force. Gravity is absent because so far, fitting it into the Standard Model has been extremely challenging.
But there are some holes in the Standard Model, and one of them involves the mass of the neutrino.
At the Amundsen–Scott South Pole Station in Antarctica lies the IceCube Neutrino Observatory – a facility dedicated to the study of elementary particles known as neutrinos. This array consists of 5,160 spherical optical sensors – Digital Optical Modules (DOMs) – buried within a cubic kilometer of clear ice. At present, this observatory is the largest neutrino detector in the world and has spent the past seven years studying how these particles behave and interact.
The most recent study released by the IceCube collaboration, with the assistance of physicists from Pennsylvania State University, has measured the Earth’s ability to block neutrinos for the first time. Consistent with the Standard Model of Particle Physics, they determined that while trillions of neutrinos pass through Earth (and us) on a regular basis, some are occasionally stopped by it.
Back in 2013, the first detections of high-energy neutrinos were made by the IceCube collaboration. These neutrinos – which were believed to be astrophysical in origin – were in the peta-electron volt (PeV) range, making them the highest-energy neutrinos discovered to date. IceCube searches for signs of these interactions by looking for Cherenkov radiation, the faint light emitted when the fast-moving charged particles produced by a neutrino interaction travel through the ice faster than light itself can travel in that medium.
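In rough terms, Cherenkov light is only emitted when a charged particle’s speed $v$ exceeds the phase velocity of light in a medium of refractive index $n$, and the light forms a cone at a characteristic angle (a standard textbook relation, not a figure specific to the IceCube analysis described here):

```latex
v > \frac{c}{n}, \qquad \cos\theta_c = \frac{1}{n\beta}, \qquad \beta = \frac{v}{c}
```

For ice, $n \approx 1.3$, so highly relativistic particles ($\beta \approx 1$) radiate at roughly $\theta_c \approx 40^\circ$, which is what allows the DOMs to reconstruct a particle’s direction of travel.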
By detecting neutrinos that interact with the clear ice, the IceCube instruments were able to estimate the energy and direction of travel of the neutrinos. Despite these detections, however, the mystery remained as to whether or not any kind of matter could stop a neutrino as it journeyed through space. In accordance with the Standard Model of Particle Physics, this is something that should happen on occasion.
After observing interactions at IceCube for a year, the science team found that the neutrinos that had to travel the farthest through Earth were less likely to reach the detector. As Doug Cowen, a professor of physics and astronomy/astrophysics at Penn State, explained in a Penn State press release:
“This achievement is important because it shows, for the first time, that very-high-energy neutrinos can be absorbed by something – in this case, the Earth. We knew that lower-energy neutrinos pass through just about anything, but although we had expected higher-energy neutrinos to be different, no previous experiments had been able to demonstrate convincingly that higher-energy neutrinos could be stopped by anything.”
The existence of neutrinos was first proposed in 1930 by theoretical physicist Wolfgang Pauli, who postulated their existence as a way of explaining beta decay in terms of the law of conservation of energy. They are so-named because they are electrically neutral and interact with matter only very weakly – i.e. through the weak nuclear force and gravity. Because of this, neutrinos pass through normal matter on a regular basis.
Whereas neutrinos are produced regularly by stars and nuclear reactors here on Earth, the first neutrinos were formed during the Big Bang. The study of their interaction with normal matter can therefore tell us much about how the Universe evolved over the course of billions of years. Many scientists anticipate that the study of neutrinos will point to new physics that goes beyond the Standard Model.
Because of this, the science team was somewhat surprised (and perhaps disappointed) with their results. As Francis Halzen – the principal investigator for the IceCube Neutrino Observatory and a professor of physics at the University of Wisconsin-Madison – explained:
“Understanding how neutrinos interact is key to the operation of IceCube. We were of course hoping for some new physics to appear, but we unfortunately find that the Standard Model, as usual, withstands the test.”
For the most part, the neutrinos selected for this study were more than one million times more energetic than those that are produced by our Sun or nuclear power plants. The analysis also included some that were astrophysical in nature – i.e. produced beyond Earth’s atmosphere – and may have been accelerated towards Earth by supermassive black holes (SMBHs).
Darren Grant, a professor of physics at the University of Alberta, is also the spokesperson for the IceCube Collaboration. As he indicated, this latest interaction study opens doors for future neutrino research. “Neutrinos have quite a well-earned reputation of surprising us with their behavior,” he said. “It is incredibly exciting to see this first measurement and the potential it holds for future precision tests.”
This study not only provided the first measurement of the Earth’s absorption of neutrinos, it also offers opportunities for geophysical researchers who are hoping to use neutrinos to explore Earth’s interior. Given that Earth is capable of stopping some of the billions of high-energy particles that routinely pass through it, scientists could develop a method for studying the Earth’s inner and outer core, placing more accurate constraints on their sizes and densities.
It also shows that the IceCube Observatory is capable of reaching beyond its original purpose, which was particle physics research and the study of neutrinos. As this latest study clearly shows, it is capable of contributing to planetary science research and nuclear physics as well. Physicists also hope to use the full 86-string IceCube array to conduct a multi-year analysis, examining even higher ranges of neutrino energies.
As James Whitmore – the program director in the National Science Foundation’s (NSF) physics division (which provides support for IceCube) – indicated, this could allow them to truly search for physics that goes beyond the Standard Model.
“IceCube was built to both explore the frontiers of physics and, in doing so, possibly challenge existing perceptions of the nature of universe. This new finding and others yet to come are in that spirit of scientific discovery.”
Ever since the discovery of the Higgs boson in 2012, physicists have been secure in the knowledge that the long journey to confirm the Standard Model was complete. Since then, they have set their sights farther, hoping to find new physics that could resolve some of the deeper mysteries of the Universe – e.g. supersymmetry, a Theory of Everything (ToE), etc.
This, as well as studying how physics works at the highest energy levels (similar to those that existed during the Big Bang), is the current preoccupation of physicists. If they are successful, we might just come to understand how this massive thing known as the Universe works.
Neutrinos are one of the fundamental particles that make up the Universe. Compared to other types of particles, they have very little mass, no charge, and only interact with others via the weak nuclear force and gravity. As such, finding evidence of their interactions is extremely difficult, requiring massive instruments located deep underground to shield them from any interference.
However, using the Spallation Neutron Source (SNS) – a research facility located at the Oak Ridge National Laboratory (ORNL) – an international team of researchers recently made a historic discovery about neutrinos using an entirely different method. Produced as part of the COHERENT experiment, these results confirm a prediction made 43 years ago and offer new possibilities for neutrino research.
The Standard Model of particle physics has been the predominant means of explaining what the basic building blocks of matter are and how they interact for decades. First proposed in the 1970s, the model claims that for every particle created, there is an anti-particle. As such, an enduring mystery posed by this model is why the Universe can exist if it is theoretically made up of equal parts of matter and antimatter.
This seeming disparity, which physicists hope to explain through charge-parity (CP) violation, has been the subject of experiments for many years. But so far, no definitive demonstration of this violation has been made that could account for how so much matter can exist in the Universe without its counterpart. Thanks to new findings released by the international Tokai-to-Kamioka (T2K) collaboration, however, we may be one step closer to understanding why this disparity exists.
First observed in 1964, CP violation occurs when, under certain conditions, the combined laws of charge-symmetry and parity-symmetry (aka. CP-symmetry) do not hold. These laws state that the physics governing a particle should be the same if it were interchanged with its antiparticle while its spatial coordinates were inverted. From this observation, one of the greatest cosmological mysteries emerged.
If the laws governing matter and antimatter are the same, then why is it that the Universe is so matter-dominated? Alternately, if matter and antimatter are fundamentally different, then how does this accord with our notions of symmetry? Answering these questions is not only important as far as our predominant cosmological theories go; it is also intrinsic to understanding how the weak interactions that govern particles work.
Established in June of 2011, the international T2K collaboration is the first experiment in the world dedicated to answering this mystery by studying neutrino and anti-neutrino oscillations. The experiment begins with high-intensity beams of muon neutrinos (or muon anti-neutrinos) being generated at the Japan Proton Accelerator Research Complex (J-PARC), which are then fired towards the Super-Kamiokande detector 295 km away.
This detector is currently one of the world’s largest and most sophisticated, dedicated to the detection and study of solar and atmospheric neutrinos. As neutrinos travel between the two facilities, they change “flavor” – going from muon neutrinos or anti-neutrinos to electron neutrinos or anti-neutrinos. In monitoring these neutrino and anti-neutrino beams, the experiment watches for different rates of oscillation.
This difference in oscillation would show that there is an imbalance between particles and antiparticles, providing the first definitive evidence of CP violation. It would also indicate that there is physics beyond the Standard Model that scientists have yet to probe. This past April, the first data set produced by T2K was released, and it provided some telling results.
As Mark Hartz, a T2K collaborator and the Kavli IPMU Project Assistant Professor, said in a recent press release:
“While the data sets are still too small to make a conclusive statement, we have seen a weak preference for large CP violation and we are excited to continue to collect data and make a more sensitive search for CP violation.”
These results, which were recently published in Physical Review Letters, include all data runs from January 2010 to May 2016. In total, this data comprised 7.482 × 10²⁰ protons on target (in neutrino mode), which yielded 32 electron neutrino and 135 muon neutrino events, and 7.471 × 10²⁰ protons on target (in anti-neutrino mode), which yielded 4 electron anti-neutrino and 66 muon anti-neutrino events.
In other words, the first batch of data has provided some evidence for CP violation at a confidence level of 90%. But this is just the beginning, and the experiment is expected to run for another ten years before wrapping up. “If we are lucky and the CP violation effect is large, we may expect 3 sigma evidence, or about 99.7% confidence level, for CP violation by 2026,” said Hartz.
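For readers curious how “sigma” statements map onto percentages like the 90% and 99.7% quoted above, the conversion for a two-sided Gaussian interval is a one-liner (a generic statistics sketch, not code from the T2K analysis itself):

```python
import math

def sigma_to_confidence(n_sigma: float) -> float:
    """Two-sided confidence level covered by an n-sigma Gaussian interval."""
    return math.erf(n_sigma / math.sqrt(2))

# 3 sigma corresponds to about a 99.7% confidence level,
# matching the "3 sigma evidence" figure quoted by Hartz.
print(round(sigma_to_confidence(3.0), 4))  # → 0.9973
```

Running the function at about 1.645 sigma recovers the 90% level reported for the first data batch.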
If the experiment proves successful, physicists may finally be able to answer how it is that the early Universe didn’t annihilate itself. It is also likely to help reveal aspects of the Universe that particle physicists are anxious to get into! For it is here that the answers to the deepest secrets of the Universe, like how all of its fundamental forces fit together, are likely to be found.
For some time, physicists have understood that all known phenomena in the Universe are governed by four fundamental forces: the weak nuclear force, the strong nuclear force, electromagnetism, and gravity. Whereas the first three forces are all part of the Standard Model of particle physics and can be explained through quantum mechanics, our understanding of gravity is dependent upon Einstein’s Theory of Relativity.
Understanding how these four forces fit together has been the aim of theoretical physics for decades, which in turn has led to the development of multiple theories that attempt to reconcile them (e.g. Superstring Theory, Quantum Gravity, and Grand Unified Theories). However, these efforts may be complicated (or helped) thanks to new research that suggests there might just be a fifth force at work.
In a paper that was recently published in the journal Physical Review Letters, a research team from the University of California, Irvine explains how recent particle physics experiments may have yielded evidence of a new type of boson. This boson apparently does not behave as other bosons do, and may be an indication that there is yet another force of nature out there governing fundamental interactions.
As Jonathan Feng, a professor of physics & astronomy at UCI and one of the lead authors on the paper, said:
“If true, it’s revolutionary. For decades, we’ve known of four fundamental forces: gravitation, electromagnetism, and the strong and weak nuclear forces. If confirmed by further experiments, this discovery of a possible fifth force would completely change our understanding of the universe, with consequences for the unification of forces and dark matter.”
The efforts that led to this potential discovery began back in 2015, when the UCI team came across a study from a group of experimental nuclear physicists from the Hungarian Academy of Sciences Institute for Nuclear Research. At the time, these physicists were looking into a radioactive decay anomaly that hinted at the existence of a light particle that was 30 times heavier than an electron.
In a paper describing their research, lead researcher Attila Krasznahorkay and his colleagues claimed that what they were observing might be the creation of “dark photons”. In short, they believed that they might have at last found evidence of Dark Matter, the mysterious, invisible substance that makes up about 85% of the matter in the Universe.
This report was largely overlooked at the time, but gained widespread attention earlier this year when Prof. Feng and his research team found it and began assessing its conclusions. But after studying the Hungarian team’s results and comparing them to previous experiments, they concluded that the experimental evidence did not support the existence of dark photons.
Essentially, the UCI team argue that instead of a dark photon, what the Hungarian research team might have witnessed was the creation of a previously undiscovered boson – which they have named the “protophobic X boson”. Whereas other bosons interact with electrons and protons, this hypothetical boson interacts with only electrons and neutrons, and only at an extremely limited range.
This limited interaction is believed to be the reason why the particle has remained unknown until now, and why the labels “protophobic” and “X” appear in its name. “There’s no other boson that we’ve observed that has this same characteristic,” said Timothy Tait, a professor of physics & astronomy at UCI and co-author of the paper. “Sometimes we also just call it the ‘X boson,’ where ‘X’ means unknown.”
If such a particle does exist, the possibilities for research breakthroughs could be endless. Feng hopes it could be joined with the three other forces governing particle interactions (electromagnetic, strong and weak nuclear forces) as a larger, more fundamental force. Feng also speculated that this possible discovery could point to the existence of a “dark sector” of our universe, which is governed by its own matter and forces.
“It’s possible that these two sectors talk to each other and interact with one another through somewhat veiled but fundamental interactions,” he said. “This dark sector force may manifest itself as this protophobic force we’re seeing as a result of the Hungarian experiment. In a broader sense, it fits in with our original research to understand the nature of dark matter.”
If this should prove to be the case, then physicists may be closer to figuring out the existence of dark matter (and maybe even dark energy), two of the greatest mysteries in modern astrophysics. What’s more, it could aid researchers in the search for physics beyond the Standard Model – something the researchers at CERN have been preoccupied with since the discovery of the Higgs Boson in 2012.
But as Feng notes, we need to confirm the existence of this particle through further experiments before we get all excited by its implications:
“The particle is not very heavy, and laboratories have had the energies required to make it since the ’50s and ’60s. But the reason it’s been hard to find is that its interactions are very feeble. That said, because the new particle is so light, there are many experimental groups working in small labs around the world that can follow up the initial claims, now that they know where to look.”
As the recent case involving CERN – where LHC teams were forced to announce that they had not discovered two new particles – demonstrates, it is important not to count our chickens before they hatch. As always, cautious optimism is the best approach to potential new findings.
The early 20th century was a very auspicious time for the sciences. In addition to Ernest Rutherford and Niels Bohr giving birth to the modern model of the atom, it was also a period of breakthroughs in the field of quantum mechanics. Thanks to ongoing studies of the behavior of electrons, scientists began to propose theories whereby these elementary particles behaved in ways that defied classical, Newtonian physics.
One such example is the Electron Cloud Model proposed by Erwin Schrodinger. Thanks to this model, electrons were no longer depicted as particles moving around a central nucleus in a fixed orbit. Instead, Schrodinger proposed a model whereby scientists could only make educated guesses as to the positions of electrons. Hence, their locations could only be described as being part of a ‘cloud’ around the nucleus where the electrons are likely to be found.
Atomic Physics To The 20th Century:
The earliest known examples of atomic theory come from ancient Greece and India, where philosophers such as Democritus postulated that all matter was composed of tiny, indivisible and indestructible units. The term “atom” was coined in ancient Greece and gave rise to the school of thought known as “atomism”. However, this theory was more of a philosophical concept than a scientific one.
It was not until the 19th century that the theory of atoms became articulated as a scientific matter, with the first evidence-based experiments being conducted. For example, in the early 1800’s, English scientist John Dalton used the concept of the atom to explain why chemical elements reacted in certain observable and predictable ways. Through a series of experiments involving gases, Dalton went on to develop what is known as Dalton’s Atomic Theory.
This theory expanded on the laws of conservation of mass and definite proportions and came down to five premises: elements, in their purest state, consist of particles called atoms; atoms of a specific element are all the same, down to the very last atom; atoms of different elements can be told apart by their atomic weights; atoms of elements unite to form chemical compounds; and atoms can neither be created nor destroyed in chemical reactions, as only their grouping ever changes.
Discovery Of The Electron:
By the late 19th century, scientists had also begun to theorize that the atom was made up of more than one fundamental unit. However, most scientists ventured that this unit would be the size of the smallest known atom – hydrogen. By the end of the 19th century, this would change drastically, thanks to research conducted by scientists like Sir Joseph John Thomson.
Through a series of experiments using cathode ray tubes (known as Crookes tubes), Thomson observed that cathode rays could be deflected by electric and magnetic fields. He concluded that rather than being composed of light, they were made up of negatively charged particles far smaller than atoms – by his estimate, more than 1,000 times lighter than hydrogen.
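For reference, the modern measured value (not a figure from Thomson’s own papers) puts the proton-to-electron mass ratio at:

```latex
\frac{m_p}{m_e} \approx 1836
```

In other words, Thomson’s rough estimate was of the right order of magnitude, even if it understated just how light the electron really is.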
This effectively disproved the notion that the hydrogen atom was the smallest unit of matter, and Thomson went further to suggest that atoms were divisible. To explain the overall charge of the atom, which consisted of both positive and negative charges, Thomson proposed a model whereby the negatively charged “corpuscles” were distributed in a uniform sea of positive charge – known as the Plum Pudding Model.
These corpuscles would later be named “electrons”, based on the theoretical particle predicted by Anglo-Irish physicist George Johnstone Stoney in 1874. The model was so named because it closely resembled the English dessert consisting of plum cake and raisins. The concept was introduced to the world in the March 1904 edition of the UK’s Philosophical Magazine, to wide acclaim.
Development Of The Standard Model:
Subsequent experiments revealed a number of scientific problems with the Plum Pudding model. For starters, there was the problem of demonstrating that the atom possessed a uniform positive background charge, which came to be known as the “Thomson Problem”. Five years later, the model would be disproved by Hans Geiger and Ernest Marsden, who conducted a series of experiments using alpha particles and gold foil – aka. the “gold foil experiment.”
In this experiment, Geiger and Marsden measured the scattering pattern of the alpha particles with a fluorescent screen. If Thomson’s model were correct, the alpha particles would pass through the atomic structure of the foil unimpeded. However, they noted instead that while most shot straight through, some of them were scattered in various directions, with some going back in the direction of the source.
Geiger and Marsden concluded that the particles had encountered an electrostatic force far greater than that allowed for by Thomson’s model. Since alpha particles are just helium nuclei (which are positively charged), this implied that the positive charge in the atom was not widely dispersed, but concentrated in a tiny volume. In addition, the fact that those particles that were not deflected passed through unimpeded meant that these concentrations of positive charge were separated by vast gulfs of empty space.
By 1911, physicist Ernest Rutherford interpreted the Geiger-Marsden experiments and rejected Thomson’s model of the atom. Instead, he proposed a model where the atom consisted of mostly empty space, with all its positive charge concentrated in its center in a very tiny volume, that was surrounded by a cloud of electrons. This came to be known as the Rutherford Model of the atom.
Subsequent work by Antonius van den Broek and Niels Bohr refined the model further. While van den Broek suggested that the atomic number of an element corresponds to its nuclear charge, the latter proposed a Solar-System-like model of the atom, where a nucleus contains the atomic number of positive charge and is surrounded by an equal number of electrons in orbital shells (aka. the Bohr Model).
The Electron Cloud Model:
During the 1920s, Austrian physicist Erwin Schrodinger became fascinated by the theories of Max Planck, Albert Einstein, Niels Bohr, Arnold Sommerfeld, and other physicists. During this time, he also became involved in the fields of atomic theory and spectra, researching at the University of Zurich and then the Friedrich Wilhelm University in Berlin (where he succeeded Planck in 1927).
In 1926, Schrödinger tackled the issue of wave functions and electrons in a series of papers. In addition to describing what would come to be known as the Schrodinger equation – a partial differential equation that describes how the quantum state of a quantum system changes with time – he also used mathematical equations to describe the likelihood of finding an electron in a certain position.
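In its time-dependent form, the equation described above is usually written as:

```latex
i\hbar \frac{\partial}{\partial t} \Psi(\mathbf{r}, t) = \hat{H}\, \Psi(\mathbf{r}, t)
```

where $\Psi$ is the wave function of the system, $\hbar$ is the reduced Planck constant, and $\hat{H}$ is the Hamiltonian operator representing the system’s total energy.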
This became the basis of what would come to be known as the Electron Cloud (or quantum mechanical) Model, as well as the Schrodinger equation. Based on quantum theory, which states that all matter has properties associated with a wave function, the Electron Cloud Model differs from the Bohr Model in that it does not define the exact path of an electron.
Instead, it predicts the likely location of the electron based on a function of probabilities. This probability function describes a cloud-like region where the electron is likely to be found, hence the name. Where the cloud is most dense, the probability of finding the electron is greatest; where the cloud is least dense, the electron is least likely to be.
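This “cloud” is quantified by what is known as the Born rule: the probability density of finding the electron at a position $\mathbf{r}$ is the squared magnitude of the wave function, and the total probability over all space is one:

```latex
P(\mathbf{r}) = |\psi(\mathbf{r})|^2, \qquad \int |\psi(\mathbf{r})|^2 \, d^3r = 1
```

The dense parts of the cloud are simply the regions where $|\psi|^2$ is large.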
These dense regions are known as “electron orbitals”, since they are the most likely locations where an orbiting electron will be found. Extending this “cloud” model to three-dimensional space, we see a barbell- or flower-shaped atom (as in the image at the top). Here, the branching-out regions are the ones where we are most likely to find the electrons.
Thanks to Schrodinger’s work, scientists began to understand that in the realm of quantum mechanics, it was impossible to know the exact position and momentum of an electron at the same time. Regardless of what the observer knows initially about a particle, they can only predict its succeeding location or momentum in terms of probabilities.
At no time can both be ascertained exactly. In fact, the more they know about the momentum of a particle, the less they will know about its location, and vice versa. This is what is known today as the “Uncertainty Principle”.
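Formally, the Uncertainty Principle places a hard lower bound on the product of the two uncertainties:

```latex
\Delta x \, \Delta p \geq \frac{\hbar}{2}
```

where $\Delta x$ and $\Delta p$ are the standard deviations of position and momentum, and $\hbar$ is the reduced Planck constant. No measurement scheme, however clever, can beat this bound.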
Note that the orbitals mentioned in the previous paragraph are formed by a hydrogen atom (i.e. with just one electron). When dealing with atoms that have more electrons, the electron orbital regions spread out evenly into a spherical fuzzy ball. This is where the term ‘electron cloud’ is most appropriate.
This contribution was universally recognized as being one of the most important of the 20th century, one which triggered a revolution in physics, quantum mechanics, and indeed all the sciences. Thenceforth, scientists were no longer working in a universe characterized by absolutes of time and space, but in one of quantum uncertainties and space-time relativity!