Physicists Maybe, Just Maybe, Confirm the Possible Discovery of 5th Force of Nature

The discovery of a possible fifth fundamental force could change our understanding of the universe. Credit: ESA/Hubble/NASA/Judy Schmidt

For some time, physicists have understood that all known phenomena in the Universe are governed by four fundamental forces: the weak nuclear force, the strong nuclear force, electromagnetism, and gravity. Whereas the first three are all part of the Standard Model of particle physics and can be explained through quantum mechanics, our understanding of gravity depends on Einstein’s General Theory of Relativity.

Understanding how these four forces fit together has been the aim of theoretical physics for decades, which in turn has led to the development of multiple theories that attempt to reconcile them (i.e. Superstring Theory, Quantum Gravity, Grand Unified Theory, etc.). However, those efforts may be complicated (or helped) by new research suggesting there might just be a fifth force at work.

In a paper recently published in the journal Physical Review Letters, a research team from the University of California, Irvine explains how recent particle physics experiments may have yielded evidence of a new type of boson. This boson apparently does not behave as other bosons do, and may be an indication that there is yet another force of nature governing fundamental interactions.

Image from Dark Universe, showing the distribution of dark matter in the universe. Credit: AMNH

As Jonathan Feng, a professor of physics & astronomy at UCI and one of the lead authors on the paper, said:

“If true, it’s revolutionary. For decades, we’ve known of four fundamental forces: gravitation, electromagnetism, and the strong and weak nuclear forces. If confirmed by further experiments, this discovery of a possible fifth force would completely change our understanding of the universe, with consequences for the unification of forces and dark matter.”

The efforts that led to this potential discovery began back in 2015, when the UCI team came across a study from a group of experimental nuclear physicists at the Hungarian Academy of Sciences Institute for Nuclear Research. At the time, these physicists were investigating a radioactive decay anomaly that hinted at the existence of a light particle roughly 30 times heavier than an electron.

In a paper describing their research, lead researcher Attila Krasznahorkay and his colleagues claimed that what they were observing might be the creation of “dark photons”. In short, they believed they might have at last found evidence of Dark Matter, the mysterious, invisible substance that makes up about 85% of the matter in the Universe.

This report was largely overlooked at the time, but gained widespread attention earlier this year when Prof. Feng and his research team found it and began assessing its conclusions. After studying the Hungarian team’s results and comparing them to previous experiments, they concluded that the experimental evidence did not support the existence of dark photons.

Signature of one of 100s of trillions of particle collisions detected by CERN’s Large Hadron Collider. Credit: CERN

Instead, they proposed that the discovery could indicate the possible presence of a fifth fundamental force of nature. These findings were posted to arXiv in April, and were followed up by a paper titled “Particle Physics Models for the 17 MeV Anomaly in Beryllium Nuclear Decays”, which was published in PRL this past Friday.
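
As a quick sanity check (plain arithmetic using the standard electron rest mass, nothing taken from the paper itself), the 17 MeV figure squares with the “roughly 30 times heavier than an electron” description above:

```python
# Does a 17 MeV boson weigh "roughly 30 electron masses"?
# 0.511 MeV/c^2 is the standard electron rest mass; 17 MeV is the anomaly's value.
ELECTRON_MASS_MEV = 0.511
X_BOSON_MASS_MEV = 17.0

ratio = X_BOSON_MASS_MEV / ELECTRON_MASS_MEV
print(f"X boson mass / electron mass ≈ {ratio:.0f}")  # prints ≈ 33
```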

Essentially, the UCI team argues that instead of a dark photon, what the Hungarian research team might have witnessed was the creation of a previously undiscovered boson – which they have named the “protophobic X boson”. Whereas other bosons interact with electrons and protons, this hypothetical boson interacts only with electrons and neutrons, and only at an extremely limited range.

This limited interaction is believed to be the reason why the particle has remained unknown until now, and why the descriptors “protophobic” and “X” appear in its name. “There’s no other boson that we’ve observed that has this same characteristic,” said Timothy Tait, a professor of physics & astronomy at UCI and co-author of the paper. “Sometimes we also just call it the ‘X boson,’ where ‘X’ means unknown.”

If such a particle does exist, the possibilities for research breakthroughs could be endless. Feng hopes the force it carries could be joined with the three other forces governing particle interactions (electromagnetism and the strong and weak nuclear forces) as manifestations of a larger, more fundamental force. Feng also speculated that this possible discovery could point to the existence of a “dark sector” of our universe, which is governed by its own matter and forces.

The existence of a fifth fundamental force could mean big things for the experiments being conducted with the Large Hadron Collider at CERN. Credit: CERN/LHC

“It’s possible that these two sectors talk to each other and interact with one another through somewhat veiled but fundamental interactions,” he said. “This dark sector force may manifest itself as this protophobic force we’re seeing as a result of the Hungarian experiment. In a broader sense, it fits in with our original research to understand the nature of dark matter.”

If this should prove to be the case, then physicists may be closer to understanding dark matter (and maybe even dark energy), two of the greatest mysteries in modern astrophysics. What’s more, it could aid researchers in the search for physics beyond the Standard Model – something the researchers at CERN have been preoccupied with since the discovery of the Higgs boson in 2012.

But as Feng notes, the existence of this particle will need to be confirmed through further experiments before we get too excited about its implications:

“The particle is not very heavy, and laboratories have had the energies required to make it since the ’50s and ’60s. But the reason it’s been hard to find is that its interactions are very feeble. That said, because the new particle is so light, there are many experimental groups working in small labs around the world that can follow up the initial claims, now that they know where to look.”

As the recent case involving CERN – where LHC teams were forced to announce that they had not, in fact, discovered two new particles – demonstrates, it is important not to count our chickens before they hatch. As always, cautious optimism is the best approach to potential new findings.

Further Reading: University of California, Irvine

What Is The Electron Cloud Model?

3d model of electron orbitals, based on the electron cloud model. Credit: Wikipedia Commons/Particia.fidi

The early 20th century was a very auspicious time for the sciences. In addition to Ernest Rutherford and Niels Bohr giving birth to the modern model of the atom, it was also a period of breakthroughs in the field of quantum mechanics. Thanks to ongoing studies of the behavior of electrons, scientists began to propose theories whereby these elementary particles behaved in ways that defied classical, Newtonian physics.

One such example is the Electron Cloud Model proposed by Erwin Schrödinger. Thanks to this model, electrons were no longer depicted as particles moving around a central nucleus in fixed orbits. Instead, Schrödinger proposed a model whereby scientists could only make educated guesses as to the positions of electrons. Hence, their locations could only be described as part of a ‘cloud’ around the nucleus where the electrons are likely to be found.

Atomic Physics To The 20th Century:

The earliest known examples of atomic theory come from ancient Greece and India, where philosophers such as Democritus postulated that all matter was composed of tiny, indivisible and indestructible units. The term “atom” was coined in ancient Greece and gave rise to the school of thought known as “atomism”. However, this theory was more of a philosophical concept than a scientific one.

Various atoms and molecules as depicted in John Dalton’s A New System of Chemical Philosophy (1808). Credit: Public Domain

It was not until the 19th century that atomic theory became articulated as a scientific matter, with the first evidence-based experiments being conducted. For example, in the early 1800s, English scientist John Dalton used the concept of the atom to explain why chemical elements reacted in certain observable and predictable ways. Through a series of experiments involving gases, Dalton went on to develop what is known as Dalton’s Atomic Theory.

This theory expanded on the laws of conservation of mass and definite proportions, and came down to five premises: elements, in their purest state, consist of particles called atoms; atoms of a specific element are all the same, down to the very last atom; atoms of different elements can be told apart by their atomic weights; atoms of elements unite to form chemical compounds; and atoms can neither be created nor destroyed in chemical reactions – only the grouping ever changes.

Discovery Of The Electron:

By the late 19th century, scientists also began to theorize that the atom was made up of more than one fundamental unit. However, most scientists ventured that this unit would be the size of the smallest known atom – hydrogen. By the end of the 19th century, this would change drastically, thanks to research conducted by scientists like Sir Joseph John Thomson.

Through a series of experiments using cathode ray tubes (known as Crookes tubes), Thomson observed that cathode rays could be deflected by electric and magnetic fields. He concluded that rather than being composed of light, they were made up of negatively charged particles roughly 1,800 times lighter than hydrogen, the lightest known atom.
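
Using modern reference values (not the rougher figures available to Thomson), the mass comparison works out as follows; the constants below are standard published numbers, not anything from the original experiments:

```python
# Modern mass values, in kilograms (standard reference figures).
ELECTRON_MASS_KG = 9.109e-31
HYDROGEN_ATOM_MASS_KG = 1.674e-27  # one proton plus one electron

ratio = HYDROGEN_ATOM_MASS_KG / ELECTRON_MASS_KG
print(f"Hydrogen atom / electron mass ≈ {ratio:.0f}")  # prints ≈ 1838
```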

The Plum Pudding model of the atom proposed by J.J. Thomson. Credit: britannica.com

This effectively disproved the notion that the hydrogen atom was the smallest unit of matter, and Thomson went further to suggest that atoms were divisible. To explain the overall charge of the atom, which consisted of both positive and negative charges, Thomson proposed a model whereby the negatively charged “corpuscles” were distributed in a uniform sea of positive charge – known as the Plum Pudding Model.

These corpuscles would later be named “electrons”, after the theoretical particle predicted by Anglo-Irish physicist George Johnstone Stoney in 1874. The model took its name from the English dessert consisting of plum cake and raisins, and was introduced to the world in the March 1904 edition of the UK’s Philosophical Magazine, to wide acclaim.

Development Of The Standard Model:

Subsequent experiments revealed a number of scientific problems with the Plum Pudding model. For starters, there was the problem of demonstrating that the atom possessed a uniform positive background charge, which came to be known as the “Thomson Problem”. Five years later, the model would be disproved by Hans Geiger and Ernest Marsden, who conducted a series of experiments using alpha particles and gold foil – aka. the “gold foil experiment.”

In this experiment, Geiger and Marsden measured the scattering pattern of the alpha particles with a fluorescent screen. If Thomson’s model were correct, the alpha particles would pass through the atomic structure of the foil unimpeded. However, they noted instead that while most shot straight through, some of them were scattered in various directions, with some going back in the direction of the source.

A depiction of the atomic structure of the helium atom. Credit: Creative Commons

Geiger and Marsden concluded that the particles had encountered an electrostatic force far greater than Thomson’s model allowed for. Since alpha particles are just helium nuclei (which are positively charged), this implied that the positive charge in the atom was not widely dispersed, but concentrated in a tiny volume. In addition, the fact that most particles passed through unimpeded meant that these concentrations of positive charge were separated by vast gulfs of empty space.

By 1911, physicist Ernest Rutherford had interpreted the Geiger-Marsden experiments and rejected Thomson’s model of the atom. Instead, he proposed a model in which the atom consisted mostly of empty space, with all its positive charge concentrated at its center in a very tiny volume, surrounded by a cloud of electrons. This came to be known as the Rutherford Model of the atom.

Subsequent experiments by Antonius Van den Broek and Niels Bohr refined the model further. While Van den Broek suggested that the atomic number of an element is equal to its nuclear charge, the latter proposed a Solar-System-like model of the atom, in which the nucleus contains an atomic number’s worth of positive charge and is surrounded by an equal number of electrons in orbital shells (aka. the Bohr Model).
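
For hydrogen, the Bohr Model’s orbital shells correspond to discrete electron energies – a standard textbook result, quoted here for illustration:

```latex
E_n = -\frac{13.6\ \text{eV}}{n^2}, \qquad n = 1, 2, 3, \dots
```

Jumps between these levels absorb or emit photons of fixed energies, which is why the model was so successful at explaining hydrogen’s spectral lines.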

The Electron Cloud Model:

During the 1920s, Austrian physicist Erwin Schrödinger became fascinated by the theories of Max Planck, Albert Einstein, Niels Bohr, Arnold Sommerfeld, and other physicists. During this time, he also became involved in the fields of atomic theory and spectra, researching at the University of Zurich and then the Friedrich Wilhelm University in Berlin (where he succeeded Planck in 1927).

Artist’s concept of the Electron Cloud model, which described the likely location of electron orbitals over time. Credit: Pearson Prentice Hall

In 1926, Schrödinger tackled the issue of wave functions and electrons in a series of papers. In addition to describing what would come to be known as the Schrödinger equation – a partial differential equation that describes how the quantum state of a quantum system changes with time – he also used mathematical equations to describe the likelihood of finding an electron in a certain position.
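
In its general time-dependent form, the equation reads:

```latex
i\hbar \frac{\partial}{\partial t} \Psi(\mathbf{r}, t) = \hat{H} \, \Psi(\mathbf{r}, t)
```

Here Ψ(r, t) is the wave function and Ĥ is the Hamiltonian (total energy) operator; the squared magnitude |Ψ|² gives the probability of finding the electron at a given place and time – exactly the “likelihood” described below.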

This became the basis of what would come to be known as the Electron Cloud (or quantum mechanical) Model. Based on quantum theory, which states that all matter has properties associated with a wave function, the Electron Cloud Model differs from the Bohr Model in that it does not define the exact path of an electron.

Instead, it predicts the likely location of the electron using a probability function. This function describes a cloud-like region where the electron is likely to be found, hence the name. Where the cloud is most dense, the probability of finding the electron is greatest; where the electron is less likely to be, the cloud is less dense.

These dense regions are known as “electron orbitals”, since they are the most likely locations where an orbiting electron will be found. Extending this “cloud” model to three-dimensional space, we see a barbell or flower-shaped atom (as in the image at the top). Here, the branching-out regions are the ones where we are most likely to find the electrons.
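
To make the “densest region” idea concrete, here is a minimal sketch in Python using the textbook hydrogen ground-state (1s) wave function – a standard result, not something from this article – to locate where the radial probability of the cloud peaks:

```python
import numpy as np

A0 = 5.29e-11  # Bohr radius, in meters

def psi_1s(r):
    """Normalized hydrogen 1s wave function (spherically symmetric)."""
    return np.exp(-r / A0) / np.sqrt(np.pi * A0**3)

# Radial probability density: the chance per unit radius of finding the
# electron in a thin spherical shell at radius r, i.e. 4*pi*r^2 * |psi|^2.
r = np.linspace(1e-13, 5 * A0, 10_000)
radial_density = 4 * np.pi * r**2 * psi_1s(r) ** 2

most_probable_r = r[np.argmax(radial_density)]
print(f"Most probable radius: {most_probable_r:.3e} m")  # ≈ A0, the Bohr radius
```

The peak lands at the Bohr radius: the “orbit” of the old model survives only as the single most likely distance within a smeared-out cloud.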

Thanks to Schrodinger’s work, scientists began to understand that in the realm of quantum mechanics, it was impossible to know the exact position and momentum of an electron at the same time. Regardless of what the observer knows initially about a particle, they can only predict its succeeding location or momentum in terms of probabilities.

At no time can both be known exactly. In fact, the more precisely they know the momentum of a particle, the less precisely they can know its location, and vice versa. This is what is known today as the “Uncertainty Principle”.
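
Formally, for position x and momentum p, the principle puts a hard floor under the product of the two uncertainties:

```latex
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
```

Because ħ is so small (about 1.05 × 10⁻³⁴ J·s), this trade-off is invisible for everyday objects but dominant at the scale of an electron.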

Note that the orbitals mentioned in the previous paragraph are formed by a hydrogen atom (i.e. with just one electron). When dealing with atoms that have more electrons, the electron orbital regions spread out evenly into a spherical fuzzy ball. This is where the term ‘electron cloud’ is most appropriate.

This contribution was universally recognized as being one of the most important of the 20th century, and one which triggered a revolution in the fields of physics, quantum mechanics, and indeed all the sciences. Thenceforth, scientists were no longer working in a universe characterized by absolutes of time and space, but with quantum uncertainties and space-time relativity!

We have written many interesting articles about atoms and atomic models here at Universe Today. Here’s What Is John Dalton’s Atomic Model?, What Is The Plum Pudding Model?, What Is Bohr’s Atomic Model?, Who Was Democritus?, and What Are The Parts Of An Atom?

For more information, be sure to check What Is Quantum Mechanics? from Live Science.

Astronomy Cast also has episodes on the topic, like Episode 130: Radio Astronomy, Episode 138: Quantum Mechanics, and Episode 252: Heisenberg Uncertainty Principle.

Beyond WIMPs: Exploring Alternative Theories Of Dark Matter

Image from Dark Universe, showing the distribution of dark matter in the universe. Credit: AMNH

The standard model of cosmology tells us that only 4.9% of the Universe is composed of ordinary matter (i.e. that which we can see), while the remainder consists of 26.8% dark matter and 68.3% dark energy. As the names would suggest, we cannot see them, so their existence has had to be inferred from theoretical models, observations of the large-scale structure of the Universe, and their apparent gravitational effects on visible matter.
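
As a quick consistency check (plain arithmetic on the quoted percentages, nothing more), these fractions add up to 100% and reproduce the “about 85% of the matter in the Universe” figure quoted in the first article:

```python
# Quoted cosmological fractions, as percentages of the total energy budget.
ORDINARY = 4.9
DARK_MATTER = 26.8
DARK_ENERGY = 68.3

print(f"Total: {ORDINARY + DARK_MATTER + DARK_ENERGY:.1f}%")  # 100.0%

# Dark matter as a share of all *matter* (dark + ordinary):
share = DARK_MATTER / (DARK_MATTER + ORDINARY)
print(f"Dark matter / all matter ≈ {share:.0%}")  # ≈ 85%
```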

Since it was first proposed, there has been no shortage of suggestions as to what Dark Matter particles might look like. Not long ago, many scientists proposed that Dark Matter consists of Weakly-Interacting Massive Particles (WIMPs), which are about 100 times the mass of a proton but interact like neutrinos. However, all attempts to find WIMPs in collider experiments have come up empty. As such, scientists have lately been exploring the idea that dark matter may be composed of something else entirely.

What’s Next for the Large Hadron Collider?

A section of the LHC. Image Credit: CERN

The world’s most powerful particle collider is waking up from a well-earned rest. After roughly two years of heavy maintenance, scientists have nearly doubled the collision energy of the Large Hadron Collider (LHC) in preparation for its next run. Now, it’s being cooled to just 1.9 degrees above absolute zero.

“We have unfinished business with understanding the universe,” said Tara Shears from the University of Liverpool in a news release. Shears and other LHC physicists will work to better understand the Higgs Boson and hopefully unravel some of the secrets of supersymmetry and dark matter.

On February 11, 2013 the LHC shut down for roughly two years. The break, known as LS1 for “Long Shutdown 1,” was needed to correct several flaws in the original design of the collider.

The LHC’s first run got off to a rough start in 2008. Shortly after it was fired up, a faulty electrical connection triggered an explosion, damaging an entire sector (one-eighth) of the accelerator. To protect the accelerator from further disaster, scientists decided to run it at half power until all 10,000 copper connections could be repaired.

So over the last two years, scientists have worked around the clock to rework every single connection in the accelerator.

Now that this step (along with many others) is complete, the collider will operate at almost double its previous power. This was tested early last week, when scientists powered up the magnets of one sector to the level needed to reach the high energy expected in its second run.
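
For scale, a back-of-the-envelope comparison of the widely published collision energies (8 TeV at the end of Run 1 versus the 13 TeV planned for Run 2 – figures not stated in this article, but standard CERN numbers) shows what “almost double” means:

```python
# Proton-proton collision energies, in TeV (standard published CERN figures).
RUN1_ENERGY_TEV = 8.0    # maximum reached before the 2013 shutdown
RUN2_ENERGY_TEV = 13.0   # planned for the restart

print(f"Run 2 / Run 1 ≈ {RUN2_ENERGY_TEV / RUN1_ENERGY_TEV:.2f}x")  # ≈ 1.6x
```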

“The machine that’s now being started up is almost a new LHC,” said John Womersley, the Chief Executive Officer of the Science and Technology Facilities Council.

With such a powerful new tool, scientists will look for deviations from their initial detection of the Higgs boson, potentially revealing a deeper level of physics that goes well beyond the Standard Model of particle physics.

Many theorists have turned to supersymmetry – the idea that for every known fundamental particle there exists a “supersymmetric” partner particle. If true, the enhanced LHC could be powerful enough to create supersymmetric particles outright, or to reveal their existence in subtler ways.

“The higher energy and more frequent proton collisions in Run 2 will allow us to investigate the Higgs particle in much more detail,” said Victoria Martin from Edinburgh University. “Higher energy may also allow the mysterious ‘dark matter’ observed in galaxies to be made and studied in the lab for the first time.”

It’s possible that the Higgs could interact with – or even decay into – dark matter particles. If the latter occurs, the dark matter particles would fly out of the LHC without ever being detected. But their absence would be evident as missing energy and momentum in the collision record.

So stay tuned, because these issues might be resolved in the spring of 2015, when the particle accelerator roars back to life.