The Earth Does Stop the Occasional Neutrino

At the Amundsen–Scott South Pole Station in Antarctica lies the IceCube Neutrino Observatory – a facility dedicated to the study of elementary particles known as neutrinos. The array consists of 5,160 spherical optical sensors – Digital Optical Modules (DOMs) – buried within a cubic kilometer of clear ice. At present, this observatory is the largest neutrino detector in the world, and it has spent the past seven years studying how these particles behave and interact.

The most recent study released by the IceCube collaboration, with the assistance of physicists from Pennsylvania State University, has measured the Earth’s ability to block neutrinos for the first time. Consistent with the Standard Model of particle physics, they determined that while trillions of neutrinos pass through Earth (and us) on a regular basis, some are occasionally stopped by it.

The study, titled “Measurement of the Multi-TeV Neutrino Interaction Cross-Section with IceCube Using Earth Absorption”, recently appeared in the scientific journal Nature. The team’s results were based on the observation of 10,784 interactions of high-energy, upward-moving neutrinos, recorded over the course of a year at the observatory.

The IceCube Neutrino Observatory at the South Pole. Credit: Emanuel Jacobi/NSF

Back in 2013, the first detections of high-energy neutrinos were made by the IceCube collaboration. These neutrinos – believed to be astrophysical in origin – were in the peta-electronvolt (PeV) range, making them the highest-energy neutrinos discovered to date. IceCube searches for signs of these interactions by looking for Cherenkov radiation – light emitted when the fast-moving charged particles produced in a neutrino interaction travel through the ice faster than light itself propagates in that medium.

By detecting neutrinos that interact with the clear ice, the IceCube instruments were able to estimate the energy and direction of travel of the neutrinos. Despite these detections, however, the mystery remained as to whether any kind of matter could stop a neutrino on its journey through space. According to the Standard Model of particle physics, this is something that should happen on occasion.

After observing interactions at IceCube for a year, the science team found that the neutrinos that had to travel the farthest through Earth were less likely to reach the detector. As Doug Cowen, a professor of physics and astronomy/astrophysics at Penn State, explained in a Penn State press release:

“This achievement is important because it shows, for the first time, that very-high-energy neutrinos can be absorbed by something – in this case, the Earth. We knew that lower-energy neutrinos pass through just about anything, but although we had expected higher-energy neutrinos to be different, no previous experiments had been able to demonstrate convincingly that higher-energy neutrinos could be stopped by anything.”
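To see roughly why energy matters here, consider a back-of-the-envelope attenuation estimate: a neutrino’s chance of crossing the planet falls off exponentially with the product of its interaction cross-section and the number of nucleons along its path. The Python sketch below is purely illustrative – it assumes a uniform-density Earth and order-of-magnitude cross-sections, not the values measured in the study:

```python
import math

# Back-of-the-envelope: survival probability of a neutrino crossing Earth
# along its diameter, assuming a uniform-density planet. (The real Earth has
# a much denser core, so this underestimates absorption through the center.)

AVOGADRO = 6.022e23          # ~nucleons per gram (one nucleon ~ 1/N_A grams)
EARTH_DIAMETER_CM = 1.274e9  # ~12,742 km
MEAN_DENSITY_G_CM3 = 5.51    # Earth's mean density

column_g_cm2 = MEAN_DENSITY_G_CM3 * EARTH_DIAMETER_CM  # ~7.0e9 g/cm^2
nucleons_per_cm2 = column_g_cm2 * AVOGADRO             # ~4.2e33 per cm^2

# Assumed, order-of-magnitude neutrino-nucleon cross-sections (cm^2);
# the key physics is that the cross-section grows with energy.
for label, sigma_cm2 in [("~GeV neutrino", 1e-38), ("multi-TeV neutrino", 1e-33)]:
    optical_depth = sigma_cm2 * nucleons_per_cm2
    survival = math.exp(-optical_depth)
    print(f"{label}: survival probability ~ {survival:.4%}")
```

Even this crude model reproduces the qualitative result: GeV-scale neutrinos sail straight through, while multi-TeV neutrinos have only a few-percent chance of surviving a trip through the Earth’s full diameter.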

An IceTop tank, one of the surface detector components of the IceCube Neutrino Observatory. Credit: Dan Hubert

The existence of neutrinos was first proposed in 1930 by theoretical physicist Wolfgang Pauli, who postulated them as a way of explaining beta decay without violating the conservation of energy. They are so named because they are electrically neutral, and they interact with matter only very weakly – i.e. through the weak nuclear force and gravity. Because of this, neutrinos pass through normal matter on a regular basis.

While neutrinos are produced today by stars and by nuclear reactors here on Earth, the first neutrinos were formed during the Big Bang. The study of their interactions with normal matter can therefore tell us much about how the Universe evolved over billions of years. Many scientists anticipate that the study of neutrinos will reveal new physics, going beyond the Standard Model.

Because of this, the science team was somewhat surprised (and perhaps disappointed) by their results. As Francis Halzen – the principal investigator for the IceCube Neutrino Observatory and a professor of physics at the University of Wisconsin–Madison – explained:

“Understanding how neutrinos interact is key to the operation of IceCube. We were of course hoping for some new physics to appear, but we unfortunately find that the Standard Model, as usual, withstands the test.”

Looking down one of IceCube’s detector bore holes. Credit: IceCube Collaboration/NSF

For the most part, the neutrinos selected for this study were more than one million times more energetic than those that are produced by our Sun or nuclear power plants. The analysis also included some that were astrophysical in nature – i.e. produced beyond Earth’s atmosphere – and may have been accelerated towards Earth by supermassive black holes (SMBHs).

Darren Grant, a professor of physics at the University of Alberta, is also the spokesperson for the IceCube Collaboration. As he indicated, this latest interaction study opens doors for future neutrino research. “Neutrinos have quite a well-earned reputation of surprising us with their behavior,” he said. “It is incredibly exciting to see this first measurement and the potential it holds for future precision tests.”

This study not only provided the first measurement of the Earth’s absorption of neutrinos; it also opens opportunities for geophysical researchers hoping to use neutrinos to explore Earth’s interior. Given that Earth stops some of the billions of high-energy particles that routinely pass through it, scientists could develop a method for studying the planet’s inner and outer core, placing more accurate constraints on their sizes and densities.

It also shows that the IceCube Observatory is capable of reaching beyond its original purpose, which was particle physics research and the study of neutrinos. As this latest study clearly shows, it is capable of contributing to planetary science research and nuclear physics as well. Physicists also hope to use the full 86-string IceCube array to conduct a multi-year analysis, examining even higher ranges of neutrino energies.

This event display shows “Bert,” one of two neutrino events discovered at IceCube whose energies exceeded one petaelectronvolt (PeV). Credit: Berkeley Labs.

As James Whitmore – the program director in the National Science Foundation’s (NSF) physics division, which provides support for IceCube – indicated, this could allow them to truly search for physics that goes beyond the Standard Model.

“IceCube was built to both explore the frontiers of physics and, in doing so, possibly challenge existing perceptions of the nature of the universe. This new finding and others yet to come are in that spirit of scientific discovery.”

Ever since the discovery of the Higgs boson in 2012, physicists have been secure in the knowledge that the long journey to confirm the Standard Model is complete. Since then, they have set their sights farther, hoping to find new physics that could resolve some of the deeper mysteries of the Universe – e.g. supersymmetry, a Theory of Everything (ToE), etc.

This, along with studying how physics works at the highest energy levels (similar to those that existed during the Big Bang), is the current preoccupation of physicists. If they are successful, we might just come to understand how this massive thing known as the Universe works.

Further Reading: Penn State, Nature

Experiment Detects Mysterious Neutrino-Nucleus Scattering For the First Time

Neutrinos are one of the fundamental particles that make up the Universe. Compared to other types of particles, they have very little mass, no charge, and interact with other particles only via the weak nuclear force and gravity. As such, finding evidence of their interactions is extremely difficult, requiring massive instruments located deep underground to shield them from interference.

However, using the Spallation Neutron Source (SNS), a research facility located at Oak Ridge National Laboratory (ORNL), an international team of researchers recently made a historic discovery about neutrinos using an entirely different method. Obtained as part of the COHERENT experiment, their results confirm a prediction made 43 years ago and offer new possibilities for neutrino research.

The study that details their findings, titled “Observation of coherent elastic neutrino-nucleus scattering”, was recently published in the journal Science. COHERENT is a collaboration of 80 researchers from 19 institutions in 4 nations that has been searching for what is known as Coherent Elastic Neutrino-Nucleus Scattering (CEvNS) for over a year.

The COHERENT collaboration is the first experiment to observe coherent elastic neutrino–nucleus scattering. Their results confirm a prediction of the Standard Model and establish constraints on alternative theoretical models. Credit: COHERENT Collaboration

In finding evidence of this behavior, COHERENT has essentially made history. As Jason Newby, an ORNL physicist and the technical coordinator for COHERENT, said in an ORNL press statement:

“The one-of-a-kind particle physics experiment at Oak Ridge National Laboratory was the first to measure coherent scattering of low-energy neutrinos off nuclei.”

To break it all down, the Standard Model of particle physics indicates that neutrinos are leptons – particles that interact with other matter only very weakly. They are created through radioactive decay, by the nuclear reactions that power stars, and by supernovae. The Big Bang model of cosmology also predicts that neutrinos are the most abundant particles in existence, since they are a byproduct of the creation of the Universe.

As such, their study has been a major focal point for theoretical physicists and cosmologists. In previous studies, neutrino interactions were detected by using literally tons of target material and then examining the particle transformations that resulted from neutrinos hitting them.

Examples include the Super-Kamiokande Observatory in Japan, an underground facility where the target material is 50,000 tons of ultrapure water. In the case of SNOLAB’s Sudbury Neutrino Observatory – which is located in a former mine complex near Sudbury, Ontario – the SNO neutrino detector relies on heavy water for neutrino detection while the SNO+ experiment will use a liquid scintillator.

Super-Kamiokande, a neutrino detector in Japan, holds 50,000 tons of ultrapure water surrounded by light tubes. Credit: Super-Kamiokande Observatory

And the IceCube Neutrino Observatory – the largest neutrino detector in the world, located at the Amundsen–Scott South Pole Station in Antarctica – relies on Antarctic ice to detect neutrino interactions. In all cases, the facilities are extremely isolated and rely on very expensive equipment.

The COHERENT experiment, by contrast, is immensely smaller and more economical, weighing a mere 14.5 kg (32 lbs) and occupying far less space. The experiment was created to take advantage of the existing SNS accelerator-based system, which produces the world’s most intense pulsed neutron beams by slamming beams of protons into mercury atoms.

This process creates massive numbers of neutrons, which are used for various scientific experiments. However, it also creates a significant number of neutrinos as a byproduct. To take advantage of this, the COHERENT team set up shop in a spot known as “neutrino alley” – a basement corridor just 20 meters (about 66 feet) from the mercury tank, where thick concrete walls and gravel provide natural shielding.

The corridor is also fitted with large water tanks to block out additional neutrons, cosmic rays and other particles. But unlike other experiments, the COHERENT detectors look for signs of neutrinos bumping into the nuclei of other atoms. To do this, the team outfitted the corridor with detectors built around a cesium iodide scintillator crystal doped with sodium, which increases the prominence of the light signals caused by neutrino interactions.

SNS Beamline 13, where protons are slammed into atoms of mercury to release a slew of energetic particles. Credit: ORNL/U.S. Dept. of Energy/Genevieve Martin

Juan Collar, a physicist from the University of Chicago, led the design team that created the detector used at SNS. As he explained, this was a “back-to-basics” approach that did away with more expensive and massive detectors:

“They are arguably the most pedestrian kind of radiation detector available, having been around for a century. Sodium-doped cesium iodide merges all of the properties required to work as a small, ‘handheld’ coherent neutrino detector. Very often, less is more.”

Thanks to their experiment and the sophistication of the SNS, the researchers were able to determine that neutrinos are capable of coupling to quarks through the exchange of neutral Z bosons. This process, which is known as Coherent Elastic Neutrino-Nucleus Scattering (CEvNS), was first predicted in 1973. But until now, no experiment or research team has been able to confirm it.

As Jason Newby indicated, the experiment succeeded in large part thanks to the sophistication of the existing facility. “The energy of the SNS neutrinos is almost perfectly tuned for this experiment—large enough to create a detectable signal, but small enough to take advantage of the coherence condition,” he said. “The only smoking gun of the interaction is a small amount of energy imparted to a single nucleus.”
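The scale of that “small amount of energy” follows from simple two-body kinematics: a neutrino of energy E scattering elastically off a nucleus of rest energy Mc² (with Mc² ≫ E) can transfer at most roughly 2E²/Mc² to the nucleus. The numbers below are illustrative assumptions – SNS neutrinos carry a few tens of MeV – rather than figures quoted from the paper:

```python
# Two-body kinematics sketch: maximum nuclear recoil energy in coherent
# elastic neutrino-nucleus scattering, E_max ~ 2 * E_nu**2 / (M * c**2).
# Both input values are assumptions chosen for illustration.

E_NU_MEV = 30.0            # typical SNS neutrino energy, a few tens of MeV
M_CS_MEV = 132.9 * 931.5   # cesium-133 nuclear rest energy, ~1.24e5 MeV

e_max_kev = 1000.0 * 2.0 * E_NU_MEV**2 / M_CS_MEV
print(f"Maximum cesium recoil energy: ~{e_max_kev:.0f} keV")  # ~15 keV
```

A recoil of roughly ten keV deposited in a 14.5 kg crystal is a whisper of a signal, which is why the heavy shielding and the low-threshold scintillator matter so much.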

The data produced were also cleaner than in previous experiments, since the neutrinos (like the SNS neutron beam that produced them) were pulsed. This allowed the signal to be easily separated from the background – an advantage over steady-state neutrino sources, such as those produced by nuclear reactors.

Professor Yuri Efremenko of the University of Tennessee–Knoxville and Jason Newby of ORNL – two members of COHERENT. Credit: ORNL/U.S. Dept. of Energy/Genevieve Martin

The team also detected three “flavors” of neutrinos, which included muon neutrinos, muon antineutrinos, and electron neutrinos. Whereas the muon neutrinos emerged instantaneously, the others were detected a few microseconds later. From this, the COHERENT team not only validated the theory of CEvNS, but also the Standard Model of particle physics. Their findings also have implications for astrophysics and cosmology.

As Kate Scholberg, a physicist from Duke University and COHERENT’s spokesperson, explained:

“When a massive star collapses and then explodes, the neutrinos dump vast energy into the stellar envelope. Understanding the process feeds into understanding of how these dramatic events occur… COHERENT’s data will help with interpretation of measurements of neutrino properties by experiments worldwide. We may also be able to use coherent scattering to better understand the structure of the nucleus.”

While there is no need for further confirmation of their results, the COHERENT researchers plan to conduct additional measurements in order to observe coherent neutrino interactions at distinct rates (another signature of the process). From this, they hope to expand their knowledge of the nature of CEvNS, as well as other basic neutrino properties – such as their intrinsic magnetism.

This discovery was certainly impressive in its own right, given that it validates an aspect of both the Standard Model of particle physics and Big Bang cosmology. But the fact that the method offers cleaner results and relies on instruments that are significantly smaller and less expensive than other experiments – that is very impressive!

The implications of this research are sure to be far-reaching, and it will be interesting to see what other discoveries it enables in the future!

Further Reading: Science, Oak Ridge National Laboratory

We’re One Step Closer to Knowing Why There’s More Matter Than Antimatter in the Universe

The Standard Model of particle physics has been the predominant means of explaining what the basic building blocks of matter are and how they interact for decades. First proposed in the 1970s, the model claims that for every particle created, there is an anti-particle. As such, an enduring mystery posed by this model is why the Universe can exist if it is theoretically made up of equal parts of matter and antimatter.

This seeming disparity – tied to what is known as charge-parity (CP) violation – has been the subject of experiments for many years. But so far, no definitive demonstration has been made of this violation, or of how so much matter can exist in the Universe without its counterpart. Thanks to new findings released by the international Tokai-to-Kamioka (T2K) collaboration, however, we may be one step closer to understanding why this disparity exists.

First observed in 1964, CP violation occurs when, under certain conditions, the laws of charge-symmetry and parity-symmetry (aka CP-symmetry) fail to apply. These laws state that the physics governing a particle should be the same if it were interchanged with its antiparticle and its spatial coordinates inverted. From this observation, one of the greatest cosmological mysteries emerged.

If the laws governing matter and antimatter are the same, then why is the Universe so matter-dominated? Alternately, if matter and antimatter are fundamentally different, how does this accord with our notions of symmetry? Answering these questions is not only important for our predominant cosmological theories; it is also intrinsic to understanding how the weak interactions that govern particles work.

Established in June of 2011, the international T2K collaboration is the first experiment in the world dedicated to answering this mystery by studying neutrino and anti-neutrino oscillations. The experiment begins with high-intensity beams of muon neutrinos (or muon anti-neutrinos) being generated at the Japan Proton Accelerator Research Complex (J-PARC), which are then fired towards the Super-Kamiokande detector 295 km away.

This detector is currently one of the world’s largest and most sophisticated, dedicated to the detection and study of solar and atmospheric neutrinos. As neutrinos travel between the two facilities, they change “flavor” – going from muon neutrinos or anti-neutrinos to electron neutrinos or anti-neutrinos. In monitoring these neutrino and anti-neutrino beams, the experiment watches for different rates of oscillation.

This difference in oscillation would show that there is an imbalance between particles and antiparticles, and thus provide the first definitive evidence of CP violation. It would also indicate that there is physics beyond the Standard Model that scientists have yet to probe. This past April, the first data set produced by T2K was released, and it provided some telling results.

The detected pattern of an electron neutrino candidate event observed by Super-Kamiokande. Credit: Kavli IPMU

As Mark Hartz, a T2K collaborator and the Kavli IPMU Project Assistant Professor, said in a recent press release:

“While the data sets are still too small to make a conclusive statement, we have seen a weak preference for large CP violation and we are excited to continue to collect data and make a more sensitive search for CP violation.”

These results, which were recently published in Physical Review Letters, include all data runs from January 2010 to May 2016. In total, this data comprised 7.482 × 10²⁰ protons on target in neutrino mode, which yielded 32 electron neutrino and 135 muon neutrino events, and 7.471 × 10²⁰ protons in anti-neutrino mode, which yielded 4 electron anti-neutrino and 66 muon anti-neutrino events.

In other words, the first batch of data has provided some evidence for CP violation, at a confidence level of 90%. But this is just the beginning, and the experiment is expected to run for another ten years before wrapping up. “If we are lucky and the CP violation effect is large, we may expect 3 sigma evidence, or about 99.7% confidence level, for CP violation by 2026,” said Hartz.
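A quick way to see why the data sets are “still too small” is raw counting statistics: the uncertainty on a count of N events scales as √N. A minimal sketch using the event counts above (this is only an illustration – the collaboration’s actual analysis is a full likelihood fit over oscillation parameters):

```python
import math

# Naive Poisson counting errors on the T2K appearance samples quoted above.
# Illustration only: the real analysis is a full likelihood fit, not this.

samples = {
    "electron neutrino events (neutrino mode)": 32,
    "electron anti-neutrino events (anti-neutrino mode)": 4,
}

for name, n in samples.items():
    relative_error = math.sqrt(n) / n   # = 1 / sqrt(N)
    print(f"{name}: N = {n}, statistical uncertainty ~ {relative_error:.0%}")
```

With 32 events the statistical uncertainty is about 18%, and with only 4 events it is about 50% – large enough that a “weak preference” is the most that can be claimed, and a big part of why the experiment plans to keep collecting data.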

If the experiment proves successful, physicists may finally be able to answer how it is that the early Universe didn’t annihilate itself. It is also likely to help reveal aspects of the Universe that particle physicists are anxious to get into! For it is here that the answers to the deepest secrets of the Universe, like how all of its fundamental forces fit together, are likely to be found.

Further Reading: Kavli IPMU, Physical Review Letters

Physicists Maybe, Just Maybe, Confirm the Possible Discovery of 5th Force of Nature

For some time, physicists have understood that all known phenomena in the Universe are governed by four fundamental forces: the weak nuclear force, the strong nuclear force, electromagnetism and gravity. Whereas the first three forces are all part of the Standard Model of particle physics, and can be explained through quantum mechanics, our understanding of gravity depends on Einstein’s Theory of Relativity.

Understanding how these four forces fit together has been the aim of theoretical physics for decades, which in turn has led to the development of multiple theories that attempt to reconcile them (e.g. superstring theory, quantum gravity, Grand Unified Theories, etc.). However, these efforts may be complicated (or helped) by new research suggesting there might just be a fifth force at work.

In a paper recently published in the journal Physical Review Letters, a research team from the University of California, Irvine explains how recent particle physics experiments may have yielded evidence of a new type of boson. This boson apparently does not behave as other bosons do, and may be an indication that there is yet another force of nature governing fundamental interactions.

Image from Dark Universe, showing the distribution of dark matter in the universe. Credit: AMNH

As Jonathan Feng, a professor of physics & astronomy at UCI and one of the lead authors on the paper, said:

“If true, it’s revolutionary. For decades, we’ve known of four fundamental forces: gravitation, electromagnetism, and the strong and weak nuclear forces. If confirmed by further experiments, this discovery of a possible fifth force would completely change our understanding of the universe, with consequences for the unification of forces and dark matter.”

The efforts that led to this potential discovery began back in 2015, when the UCI team came across a study by a group of experimental nuclear physicists at the Hungarian Academy of Sciences’ Institute for Nuclear Research. At the time, these physicists were looking into a radioactive decay anomaly that hinted at the existence of a light particle roughly 30 times heavier than an electron.

In a paper describing their research, lead researcher Attila Krasznahorkay and his colleagues claimed that what they were observing might be the creation of “dark photons”. In short, they believed that they might at last have found evidence of Dark Matter – the mysterious, invisible mass that makes up about 85% of the matter in the Universe.

This report was largely overlooked at the time, but gained widespread attention earlier this year when Prof. Feng and his research team found it and began assessing its conclusions. After studying the Hungarian team’s results and comparing them to previous experiments, they concluded that the experimental evidence did not support the existence of dark photons.

Signature of one of 100s of trillions of particle collisions detected by CERN’s Large Hadron Collider. Credit: CERN

Instead, they proposed that the discovery could indicate the possible presence of a fifth fundamental force of nature. These findings were posted to arXiv in April and followed up by a paper titled “Particle Physics Models for the 17 MeV Anomaly in Beryllium Nuclear Decays”, which was published in PRL this past Friday.

Essentially, the UCI team argue that instead of a dark photon, what the Hungarian research team might have witnessed was the creation of a previously undiscovered boson – which they have named the “protophobic X boson”. Whereas other bosons interact with electrons and protons, this hypothetical boson interacts with only electrons and neutrons, and only at an extremely limited range.

This limited interaction is believed to be the reason why the particle has remained unknown until now, and it is why the adjectives “protophobic” and “X” appear in the name. “There’s no other boson that we’ve observed that has this same characteristic,” said Timothy Tait, a professor of physics & astronomy at UCI and co-author of the paper. “Sometimes we also just call it the ‘X boson,’ where ‘X’ means unknown.”

If such a particle does exist, the possibilities for research breakthroughs could be endless. Feng hopes it could be joined with the three other forces governing particle interactions (the electromagnetic, strong and weak nuclear forces) as manifestations of one grander, more fundamental force. Feng also speculated that this possible discovery could point to the existence of a “dark sector” of our universe, governed by its own matter and forces.

The existence of a fifth fundamental force could mean big things for the experiments being conducted with the Large Hadron Collider at CERN. Credit: CERN/LHC

“It’s possible that these two sectors talk to each other and interact with one another through somewhat veiled but fundamental interactions,” he said. “This dark sector force may manifest itself as this protophobic force we’re seeing as a result of the Hungarian experiment. In a broader sense, it fits in with our original research to understand the nature of dark matter.”

If this should prove to be the case, then physicists may be closer to understanding dark matter (and maybe even dark energy), two of the greatest mysteries in modern astrophysics. What’s more, it could aid researchers in the search for physics beyond the Standard Model – something the researchers at CERN have been preoccupied with since the discovery of the Higgs boson in 2012.

But as Feng notes, we need to confirm the existence of this particle through further experiments before we get all excited by its implications:

“The particle is not very heavy, and laboratories have had the energies required to make it since the ’50s and ’60s. But the reason it’s been hard to find is that its interactions are very feeble. That said, because the new particle is so light, there are many experimental groups working in small labs around the world that can follow up the initial claims, now that they know where to look.”

As the recent case involving CERN – where LHC teams were forced to announce that they had not discovered two new particles – demonstrates, it is important not to count our chickens before they hatch. As always, cautious optimism is the best approach to potential new findings.

Further Reading: University of California, Irvine

What Is The Electron Cloud Model?

The early 20th century was a very auspicious time for the sciences. In addition to Ernest Rutherford and Niels Bohr giving birth to the modern model of the atom, it was also a period of breakthroughs in the field of quantum mechanics. Thanks to ongoing studies of the behavior of electrons, scientists began to propose theories whereby these elementary particles behaved in ways that defied classical, Newtonian physics.

One such example is the Electron Cloud Model proposed by Erwin Schrödinger. Thanks to this model, electrons were no longer depicted as particles moving around a central nucleus in fixed orbits. Instead, Schrödinger proposed a model whereby scientists could only make educated guesses about the positions of electrons. Hence, their locations could only be described as part of a ‘cloud’ around the nucleus where the electrons are likely to be found.

Atomic Physics To The 20th Century:

The earliest known examples of atomic theory come from ancient Greece and India, where philosophers such as Democritus postulated that all matter was composed of tiny, indivisible and indestructible units. The term “atom” was coined in ancient Greece and gave rise to the school of thought known as “atomism”. However, this theory was more of a philosophical concept than a scientific one.

Various atoms and molecules as depicted in John Dalton’s A New System of Chemical Philosophy (1808). Credit: Public Domain

It was not until the 19th century that the theory of atoms became articulated as a scientific matter, with the first evidence-based experiments being conducted. For example, in the early 1800s, English scientist John Dalton used the concept of the atom to explain why chemical elements reacted in certain observable and predictable ways. Through a series of experiments involving gases, Dalton went on to develop what is known as Dalton’s Atomic Theory.

This theory expanded on the laws of conservation of mass and definite proportions, and came down to five premises: elements, in their purest state, consist of particles called atoms; atoms of a specific element are all the same, down to the very last atom; atoms of different elements can be told apart by their atomic weights; atoms of elements unite to form chemical compounds; and atoms can neither be created nor destroyed in chemical reactions – only the grouping ever changes.

Discovery Of The Electron:

By the late 19th century, scientists also began to theorize that the atom was made up of more than one fundamental unit. However, most scientists ventured that this unit would be the size of the smallest known atom – hydrogen. By the end of the 19th century, this would change drastically, thanks to research conducted by scientists like Sir Joseph John Thomson.

Through a series of experiments using cathode ray tubes (known as Crookes tubes), Thomson observed that cathode rays could be deflected by electric and magnetic fields. He concluded that rather than being composed of light, they were made up of negatively charged particles that were 1,000 times smaller and 1,800 times lighter than hydrogen.

The Plum Pudding model of the atom proposed by J.J. Thomson. Credit: britannica.com

This effectively disproved the notion that the hydrogen atom was the smallest unit of matter, and Thomson went further to suggest that atoms were divisible. To explain the overall charge of the atom, which consisted of both positive and negative charges, Thomson proposed a model whereby the negatively charged “corpuscles” were distributed in a uniform sea of positive charge – known as the Plum Pudding Model.

These corpuscles would later be named “electrons”, based on the theoretical particle predicted by Anglo-Irish physicist George Johnstone Stoney in 1874. And from this, the Plum Pudding Model was born, so named because it closely resembled the English dessert that consists of plum cake and raisins. The concept was introduced to the world in the March 1904 edition of the UK’s Philosophical Magazine, to wide acclaim.

Development Of The Standard Model:

Subsequent experiments revealed a number of scientific problems with the Plum Pudding model. For starters, there was the problem of demonstrating that the atom possessed a uniform positive background charge, which came to be known as the “Thomson Problem”. Five years later, the model would be disproved by Hans Geiger and Ernest Marsden, who conducted a series of experiments using alpha particles and gold foil – aka. the “gold foil experiment.”

In this experiment, Geiger and Marsden measured the scattering pattern of the alpha particles with a fluorescent screen. If Thomson’s model were correct, the alpha particles would pass through the atomic structure of the foil unimpeded. However, they noted instead that while most shot straight through, some of them were scattered in various directions, with some going back in the direction of the source.

A depiction of the atomic structure of the helium atom. Credit: Creative Commons

Geiger and Marsden concluded that the particles had encountered an electrostatic force far greater than that allowed for by Thomson’s model. Since alpha particles are just helium nuclei (which are positively charged) this implied that the positive charge in the atom was not widely dispersed, but concentrated in a tiny volume. In addition, the fact that those particles that were not deflected passed through unimpeded meant that these positive spaces were separated by vast gulfs of empty space.

By 1911, physicist Ernest Rutherford interpreted the Geiger–Marsden experiments and rejected Thomson’s model of the atom. Instead, he proposed a model in which the atom consisted of mostly empty space, with all its positive charge concentrated in a very tiny volume at its center, surrounded by a cloud of electrons. This came to be known as the Rutherford Model of the atom.

Subsequent experiments by Antonius Van den Broek and Niels Bohr refined the model further. While Van den Broek suggested that the atomic number of an element is very similar to its nuclear charge, the latter proposed a Solar-System-like model of the atom, where a nucleus contains the atomic number of positive charge and is surrounded by an equal number of electrons in orbital shells (aka. the Bohr Model).

The Electron Cloud Model:

During the 1920s, Austrian physicist Erwin Schrödinger became fascinated by the theories of Max Planck, Albert Einstein, Niels Bohr, Arnold Sommerfeld, and other physicists. During this time, he also became involved in the fields of atomic theory and spectra, researching at the University of Zurich and then the Friedrich Wilhelm University in Berlin (where he succeeded Planck in 1927).

Artist’s concept of the Electron Cloud model, which described the likely location of electron orbitals over time. Credit: Pearson Prentice Hall

In 1926, Schrödinger tackled the issue of wave functions and electrons in a series of papers. In addition to describing what would come to be known as the Schrödinger equation – a partial differential equation that describes how the quantum state of a quantum system changes with time – he also used mathematical equations to describe the likelihood of finding an electron in a certain position.

This became the basis of what would come to be known as the Electron Cloud (or quantum mechanical) Model, as well as the Schrödinger equation. Based on quantum theory, which states that all matter has properties associated with a wave function, the Electron Cloud Model differs from the Bohr Model in that it does not define the exact path of an electron.

Instead, it predicts the likely position of the electron based on a function of probabilities. The probability function basically describes a cloud-like region where the electron is likely to be found, hence the name. Where the cloud is most dense, the probability of finding the electron is greatest; where the cloud is less dense, the electron is less likely to be found.

These dense regions are known as “electron orbitals”, since they are the most likely locations where an orbiting electron will be found. Extending this “cloud” model to three-dimensional space, we see a barbell- or flower-shaped atom (as in the image at the top). Here, the branching-out regions are the ones where we are most likely to find the electrons.

Thanks to Schrodinger’s work, scientists began to understand that in the realm of quantum mechanics, it was impossible to know the exact position and momentum of an electron at the same time. Regardless of what the observer knows initially about a particle, they can only predict its succeeding location or momentum in terms of probabilities.

At no given time will they be able to ascertain both. In fact, the more they know about the momentum of a particle, the less they will know about its location, and vice versa. This is what is known today as the “Uncertainty Principle”.
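Stated compactly (the standard textbook formulation, with Δx the uncertainty in position, Δp the uncertainty in momentum, and ℏ the reduced Planck constant):

\[
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
\]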

Note that the orbitals described above are those of a hydrogen atom (i.e. with just one electron). When dealing with atoms that have more electrons, the electron orbital regions spread out evenly into a spherical fuzzy ball. This is where the term ‘electron cloud’ is most appropriate.
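As a concrete example of such a cloud, hydrogen’s ground-state (1s) orbital has a radial probability distribution that can be written down exactly, and it peaks at the Bohr radius, about 0.053 nm. The short sketch below reproduces this standard textbook result numerically; it is an illustration, not anything specific to the sources discussed here:

```python
import numpy as np

# Radial probability density of hydrogen's 1s orbital:
#   P(r) = (4 * r**2 / a0**3) * exp(-2 * r / a0)
# i.e. |psi_1s|^2 weighted by the area of the spherical shell at radius r.

A0_NM = 0.0529  # Bohr radius, in nanometers

r = np.linspace(1e-4, 0.5, 100_000)                    # radii to scan (nm)
p = (4.0 * r**2 / A0_NM**3) * np.exp(-2.0 * r / A0_NM)

most_probable_r = r[np.argmax(p)]
print(f"Most probable radius: {most_probable_r:.4f} nm "
      f"(Bohr radius: {A0_NM} nm)")
```

The density is nonzero at every radius – the electron could in principle turn up anywhere – but it peaks exactly at the Bohr radius, which is why the Bohr Model’s fixed orbit survives in the cloud picture as a most-probable distance.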

This contribution was universally recognized as being one of the most important contributions of the 20th century, and one which triggered a revolution in the fields of physics, quantum mechanics and indeed all the sciences. Thenceforth, scientists were no longer working in a universe characterized by absolutes of time and space, but in one of quantum uncertainties and space-time relativity!

We have written many interesting articles about atoms and atomic models here at Universe Today. Here’s What Is John Dalton’s Atomic Model?, What Is The Plum Pudding Model?, What Is Bohr’s Atomic Model?, Who Was Democritus?, and What Are The Parts Of An Atom?

For more information, be sure to check What Is Quantum Mechanics? from Live Science.

Astronomy Cast also has episodes on the topic, like Episode 130: Radio Astronomy, Episode 138: Quantum Mechanics, and Episode 252: Heisenberg Uncertainty Principle.

Beyond WIMPs: Exploring Alternative Theories Of Dark Matter

The standard model of cosmology tells us that only 4.9% of the Universe is composed of ordinary matter (i.e. that which we can see), while the remainder consists of 26.8% dark matter and 68.3% dark energy. As the names would suggest, we cannot see them, so their existence has had to be inferred based on theoretical models, observations of the large-scale structure of the Universe, and its apparent gravitational effects on visible matter.

Since it was first proposed, there has been no shortage of suggestions as to what Dark Matter particles might look like. Not long ago, many scientists proposed that Dark Matter consists of Weakly-Interacting Massive Particles (WIMPs), which are about 100 times the mass of a proton but interact like neutrinos. However, all attempts to find WIMPs using collider experiments have come up empty. As such, scientists have lately been exploring the idea that dark matter may be composed of something else entirely.

What’s Next for the Large Hadron Collider?

The world’s most powerful particle collider is waking up from a well-earned rest. After roughly two years of heavy maintenance, scientists have nearly doubled the power of the Large Hadron Collider (LHC) in preparation for its next run. Now, it’s being cooled to just 1.9 degrees above absolute zero.

“We have unfinished business with understanding the universe,” said Tara Shears from the University of Liverpool in a news release. Shears and other LHC physicists will work to better understand the Higgs Boson and hopefully unravel some of the secrets of supersymmetry and dark matter.

On February 11, 2013, the LHC shut down for roughly two years. The break, known as LS1 (“Long Shutdown 1”), was needed to correct several flaws in the original design of the collider.

The LHC’s first run got off to a rough start in 2008. Shortly after it was fired up, a faulty electrical connection triggered an explosion, damaging an entire sector (one-eighth) of the accelerator. To protect the accelerator from further disaster, scientists decided to run it at half power until all 10,000 copper connections could be repaired.

So over the last two years, scientists have worked around the clock to rework every single connection in the accelerator.

Now that this step (along with many others) is complete, the collider will operate at almost double its previous power. This was tested early last week, when scientists powered up the magnets of one sector to the level needed to reach the high energies expected in its second run.

“The machine that’s now being started up is almost a new LHC,” said John Womersley, the Chief Executive Officer of the Science and Technology Facilities Council.

With such a powerful new tool, scientists will look for deviations from their initial detection of the Higgs boson, potentially revealing a deeper level of physics that goes well beyond the Standard Model of particle physics.

Many theorists have turned to supersymmetry — the idea that for every known fundamental particle there exists a “supersymmetric” partner particle. If true, the enhanced LHC could be powerful enough to create supersymmetric particles themselves or prove their existence in subtler ways.

“The higher energy and more frequent proton collisions in Run 2 will allow us to investigate the Higgs particle in much more detail,” said Victoria Martin from Edinburgh University. “Higher energy may also allow the mysterious ‘dark matter’ observed in galaxies to be made and studied in the lab for the first time.”

It’s possible that the Higgs could interact with — or even decay into — dark matter particles. If the latter occurs, then the dark matter particles would fly out of the LHC without ever being detected. But their absence would be evident.

So stay tuned, because these issues might be resolved in the spring of 2015, when the particle accelerator roars back to life.