There are some strange results being announced in the physics world lately. A fluid with negative effective mass and the discovery of five new particles are challenging our understanding of the universe.
New results from ALICE (A Large Ion Collider Experiment) are adding to the strangeness.
ALICE is a detector on the Large Hadron Collider (LHC). It’s one of seven detectors, and ALICE’s role is to “study the physics of strongly interacting matter at extreme energy densities, where a phase of matter called quark-gluon plasma forms,” according to the CERN website. Quark-gluon plasma is a state of matter that existed only a few millionths of a second after the Big Bang.
In what we might call normal matter—that is, the familiar atoms that we all learn about in high school—protons and neutrons are made up of quarks. Those quarks are held together by other particles called gluons. (“Glue-ons,” get it?) In a state known as confinement, these quarks and gluons are permanently bound together. In fact, quarks have never been observed in isolation.
The LHC is used to collide particles together at extremely high speeds, creating temperatures that can be 100,000 times hotter than the center of our Sun. When heavy lead ions are collided, the resulting extreme conditions come close to replicating the state of the Universe in those few millionths of a second after the Big Bang.
At those extreme temperatures, the state of confinement is broken: the quarks and gluons are freed, and they form a quark-gluon plasma.
So far, this is pretty well understood. But in these new results, something additional happened. There was increased production of what are called “strange hadrons.” Strange hadrons themselves are well-known particles. They have names like Kaon, Lambda, Xi and Omega. They’re called strange hadrons because each of them contains at least one “strange quark.”
If all of this seems a little murky, here’s the kicker: strange hadrons may be well-known particles – their enhanced production has long been observed in collisions between heavy nuclei – but it had never before been observed in collisions between protons.
“We are very excited about this discovery,” said Federico Antinori, Spokesperson of the ALICE collaboration. “We are again learning a lot about this primordial state of matter. Being able to isolate the quark-gluon-plasma-like phenomena in a smaller and simpler system, such as the collision between two protons, opens up an entirely new dimension for the study of the properties of the fundamental state that our universe emerged from.”
The creation of quark-gluon plasma at CERN provides physicists an opportunity to study the strong interaction. The strong interaction is also known as the strong force, one of the four fundamental forces in the Universe, and the one that binds quarks into protons and neutrons. It’s also an opportunity to study something else: the increased production of strange hadrons.
In a delicious turn of phrase, CERN calls this phenomenon “enhanced strangeness production.” (Somebody at CERN has a flair for language.)
Enhanced strangeness production from quark-gluon plasma was predicted in the 1980s, and was observed in the 1990s at CERN’s Super Proton Synchrotron. The ALICE experiment at the LHC is giving physicists their best opportunity yet to study how proton-proton collisions can have enhanced strangeness production in the same way that heavy ion collisions can.
According to the press release announcing these results, “Studying these processes more precisely will be key to better understand the microscopic mechanisms of the quark-gluon plasma and the collective behaviour of particles in small systems.”
During the 19th and 20th centuries, physicists began to probe deep into the nature of matter and energy. In so doing, they quickly realized that the rules which govern them become increasingly blurry the deeper one goes. Whereas the predominant theory used to be that all matter was made up of indivisible atoms, scientists began to realize that atoms are themselves composed of even smaller particles.
From these investigations, the Standard Model of Particle Physics was born. According to this model, all matter in the Universe is composed of two kinds of particles: hadrons – from which the Large Hadron Collider (LHC) gets its name – and leptons. Where hadrons are composed of other elementary particles (quarks, anti-quarks, etc.), leptons are elementary particles that exist on their own.
The word lepton comes from the Greek leptos, which means “small”, “fine”, or “thin”. The first recorded use of the word was by physicist Leon Rosenfeld in his book Nuclear Forces (1948). In the book, he attributed the use of the word to a suggestion made by the Danish physicist Christian Møller.
The term was chosen to refer to particles of small mass, since the only known leptons in Rosenfeld’s time were electrons and muons. Muons are over 200 times more massive than electrons, but have only about one-ninth the mass of a proton. Along with quarks, leptons are the basic building blocks of matter, and are therefore seen as “elementary particles”.
Types of Leptons:
According to the Standard Model, there are six different types of leptons. These include the Electron, the Muon, and Tau particles, as well as their associated neutrinos (i.e. electron neutrino, muon neutrino, and tau neutrino). The charged leptons have a negative charge and a distinct mass, whereas their neutrinos have a neutral charge and very little mass.
Electrons are the lightest, with a mass of 0.000511 gigaelectronvolts (GeV), while Muons have a mass of 0.1057 GeV and Tau particles (the heaviest) have a mass of 1.777 GeV. The different varieties of the elementary particles are commonly called “flavors”. While each of the three lepton flavors is different and distinct (in terms of its interactions with other particles), they are not immutable.
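As a quick check on those figures, here is a minimal Python sketch using the masses quoted above (the proton mass of 0.9383 GeV is added only for comparison; it is not part of the lepton table):

```python
# Lepton masses in GeV, as quoted above; the proton is included for comparison.
masses_gev = {
    "electron": 0.000511,
    "muon": 0.1057,
    "tau": 1.777,
    "proton": 0.9383,  # not a lepton; here only for the mass comparison
}

# Muons are over 200 times the electron's mass...
print(masses_gev["muon"] / masses_gev["electron"])  # ~206.8
# ...but only about one-ninth the mass of a proton.
print(masses_gev["muon"] / masses_gev["proton"])    # ~0.113, i.e. roughly 1/9
```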
A neutrino can change its flavor, a process which is known as “neutrino flavor oscillation”. This has been observed in a number of settings, including solar, atmospheric, nuclear reactor, and particle-beam neutrinos. In all observed cases, the oscillations were confirmed by what appeared to be a deficit in the number of neutrinos being detected.
One observed example involves muon neutrinos, which can oscillate into electron neutrinos or tau neutrinos, depending on the circumstances. In addition, all three leptons and their neutrinos have an associated antiparticle (antilepton).
For each, the antileptons have an identical mass, but all of the other properties are reversed. These pairings consist of the electron/positron, muon/antimuon, tau/antitau, electron neutrino/electron antineutrino, muon neutrino/muon antineutrino, and tau neutrino/tau antineutrino.
The present Standard Model assumes that there are no more than three types (aka. “generations”) of leptons with their associated neutrinos in existence. This accords with experimental evidence: models of nucleosynthesis after the Big Bang show that the existence of more than three neutrino types would have affected the abundance of helium in the early Universe.
All charged leptons possess a negative charge. They also possess an intrinsic rotation in the form of their spin, which means that leptons with an electric charge – i.e. “charged leptons” – will generate magnetic fields. They are able to interact with other matter only through the weak and electromagnetic forces. Ultimately, their charge determines the strength of these interactions, as well as the strength of their electric field and how they react to external electrical or magnetic fields.
None are capable of interacting with matter via the strong force, however. In the Standard Model, each lepton starts out with no intrinsic mass. Charged leptons obtain an effective mass through interactions with the Higgs field, while neutrinos either remain massless or have only very small masses.
History of Study:
The first lepton to be identified was the electron, which was discovered by British physicist J.J. Thomson and his colleagues in 1897 using a series of cathode ray tube experiments. The next discoveries came during the 1930s, which would lead to the creation of a new classification for weakly-interacting particles that were similar to electrons.
The first step was taken by Austrian-Swiss physicist Wolfgang Pauli in 1930, who proposed the existence of the electron neutrino in order to resolve the ways in which beta decay appeared to violate the conservation of energy, momentum, and angular momentum.
The positron and muon were discovered by Carl D. Anderson in 1932 and 1936, respectively. Due to the mass of the muon, it was initially mistaken for a meson. But due to its behavior (which resembled that of an electron) and the fact that it did not undergo strong interaction, the muon was reclassified. Along with the electron and the electron neutrino, it became part of a new group of particles known as “leptons”.
In 1962, a team of American physicists – consisting of Leon M. Lederman, Melvin Schwartz, and Jack Steinberger – was able to detect interactions of the muon neutrino, thus showing that more than one type of neutrino existed. At the same time, theoretical physicists postulated the existence of many other flavors of neutrinos, which would eventually be confirmed experimentally.
The tau particle followed in the 1970s, thanks to experiments conducted by Nobel Prize-winning physicist Martin Lewis Perl and his colleagues at the SLAC National Accelerator Laboratory. Evidence of its associated neutrino followed from the study of tau decay, which showed missing energy and momentum analogous to the missing energy and momentum in beta decay that had led to the proposal of the electron neutrino.
In 2000, the tau neutrino was directly observed thanks to the Direct Observation of the Nu Tau (DONUT) experiment at Fermilab. It was the last particle of the Standard Model to be observed until 2012, when CERN announced that it had detected a particle that was likely the long-sought-after Higgs Boson.
Today, there are some particle physicists who believe that there are leptons still waiting to be found. These “fourth generation” particles, if they are indeed real, would exist beyond the Standard Model of particle physics, and would likely interact with matter in even more exotic ways.
Since ancient times, philosophers and scholars have sought to understand light. In addition to trying to discern its basic properties (i.e. what it is made of – particle or wave, etc.) they have also sought to measure precisely how fast it travels. Since the late 17th century, scientists have been doing just that, and with increasing accuracy.
In so doing, they have gained a better understanding of light’s mechanics and the important role it plays in physics, astronomy and cosmology. Put simply, light moves at incredible speeds and is the fastest moving thing in the Universe. Its speed is considered a constant and an unbreakable barrier, and is used as a means of measuring distance. But just how fast does it travel?
Speed of Light (c):
Light travels at a constant speed of 1,079,252,848.8 (1.07 billion) km per hour. That works out to 299,792,458 m/s, or about 670,616,629 mph (miles per hour). To put that in perspective, if you could travel at the speed of light, you would be able to circumnavigate the globe approximately seven and a half times in one second. Meanwhile, a person flying at an average speed of about 800 km/h (500 mph) would take over 50 hours to circle the planet just once.
To put that into an astronomical perspective, the average distance from the Earth to the Moon is 384,398.25 km (238,854 miles). So light crosses that distance in about 1.3 seconds. Meanwhile, the average distance from the Sun to the Earth is ~149,597,886 km (92,955,817 miles), which means that light takes only about 8 minutes and 20 seconds to make that journey.
Little wonder, then, that the speed of light is the metric used to determine astronomical distances. When we say a star like Proxima Centauri is 4.25 light years away, we are saying that it would take – traveling at a constant speed of 1.07 billion km per hour (670,616,629 mph) – about 4 years and 3 months to get there. But just how did we arrive at this highly specific measurement for “light-speed”?
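Those travel times are easy to verify. Here is a minimal Python sketch using the speed and average distances quoted above:

```python
# Speed of light and the average distances quoted above.
C = 299_792_458          # m/s
MOON_M = 384_398.25e3    # Earth-Moon distance, m
SUN_M = 149_597_886e3    # Earth-Sun distance, m

print(MOON_M / C)        # ~1.28 seconds to the Moon
print(SUN_M / C / 60)    # ~8.3 minutes to the Sun

# One light year of travel, and Proxima Centauri's distance at 4.25 light years:
LIGHT_YEAR_M = C * 365.25 * 24 * 3600
print(4.25 * LIGHT_YEAR_M / 1e3)  # ~4.0e13 km
```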
History of Study:
Until the 17th century, scholars were unsure whether light traveled at a finite speed or instantaneously. From the days of the ancient Greeks to medieval Islamic scholars and scientists of the early modern period, the debate went back and forth. It was not until the work of Danish astronomer Ole Rømer (1644-1710) that the first quantitative measurement was made.
In 1676, Rømer observed that the periods of Jupiter’s innermost moon Io appeared to be shorter when the Earth was approaching Jupiter than when it was receding from it. From this, he concluded that light travels at a finite speed, and estimated that it takes about 22 minutes to cross the diameter of Earth’s orbit.
Christiaan Huygens combined this figure with an estimate of the diameter of the Earth’s orbit to arrive at a speed of 220,000 km/s. Isaac Newton also discussed Rømer’s calculations in his seminal work Opticks (1704). Adjusting for the distance between the Earth and the Sun, he calculated that it would take light seven or eight minutes to travel from one to the other. In both cases, they were off by a relatively small margin.
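It is worth seeing how close Rømer’s 22 minutes gets you. Here is a rough sketch using the modern value for the diameter of Earth’s orbit (Huygens worked from a smaller orbit estimate, which is why his figure came out lower):

```python
# Rømer's figure: light takes ~22 minutes to cross the diameter of Earth's orbit.
ORBIT_DIAMETER_KM = 2 * 149_597_886  # modern value for 2 AU, in km
CROSSING_TIME_S = 22 * 60

print(ORBIT_DIAMETER_KM / CROSSING_TIME_S)  # ~226,700 km/s
```

Not bad for a 17th-century estimate based on eclipse timings of a Jovian moon.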
Later measurements by French physicists Hippolyte Fizeau (1819 – 1896) and Léon Foucault (1819 – 1868) refined the value further: Fizeau’s toothed-wheel experiment gave 315,000 km/s, and Foucault’s rotating-mirror method brought it down to about 298,000 km/s. And by the latter half of the 19th century, scientists became aware of the connection between light and electromagnetism.
This was accomplished by physicists measuring the ratio of electromagnetic and electrostatic units of charge, who found that the numerical value was very close to the speed of light (as measured by Fizeau). Based on his own work, which showed that electromagnetic waves propagate through empty space at this same speed, Scottish physicist James Clerk Maxwell proposed that light was an electromagnetic wave.
The next great breakthrough came during the early 20th century. In his 1905 paper, titled “On the Electrodynamics of Moving Bodies”, Albert Einstein asserted that the speed of light in a vacuum, measured by a non-accelerating observer, is the same in all inertial reference frames and independent of the motion of the source or observer.
Using this and Galileo’s principle of relativity as a basis, Einstein derived the Theory of Special Relativity, in which the speed of light in vacuum (c) was a fundamental constant. Prior to this, the working consensus among scientists held that space was filled with a “luminiferous aether” that was responsible for its propagation – i.e. that light traveling through a moving medium would be dragged along by the medium.
This in turn meant that the measured speed of the light would be a simple sum of its speed through the medium plus the speed of that medium. However, Einstein’s theory effectively made the concept of the stationary aether useless and revolutionized the concepts of space and time.
Not only did it advance the idea that the speed of light is the same in all inertial reference frames, it also introduced the idea that major changes occur when things move close to the speed of light. These include the clocks of a moving body appearing to slow down, and its length appearing to contract in the direction of motion, when measured in the frame of the observer (i.e. time dilation and length contraction, which grow extreme as one approaches the speed of light).
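The size of those effects is set by the Lorentz factor, gamma = 1/sqrt(1 - (v/c)^2). A short illustrative Python sketch:

```python
import math

def lorentz_factor(v_over_c: float) -> float:
    """Gamma = 1 / sqrt(1 - (v/c)^2): the factor by which moving clocks
    slow down and lengths contract along the direction of motion."""
    return 1.0 / math.sqrt(1.0 - v_over_c ** 2)

for v in (0.1, 0.5, 0.9, 0.99, 0.999):
    print(f"v = {v:5.3f}c  ->  gamma = {lorentz_factor(v):7.2f}")
# At 10% of light speed the effect is about 0.5%; at 99.9% of c,
# clocks run ~22 times slow and lengths contract ~22-fold.
```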
His observations also reconciled Maxwell’s equations for electricity and magnetism with the laws of mechanics, simplified the mathematical calculations by doing away with extraneous explanations used by other scientists, and accorded with the directly observed speed of light.
During the second half of the 20th century, increasingly accurate measurements using laser interferometers and cavity resonance techniques would further refine estimates of the speed of light. By 1972, a group at the US National Bureau of Standards in Boulder, Colorado, used the laser interferometer technique to arrive at the currently recognized value of 299,792,458 m/s.
Role in Modern Astrophysics:
Einstein’s theory that the speed of light in vacuum is independent of the motion of the source and the inertial reference frame of the observer has since been consistently confirmed by many experiments. It also sets an upper limit on the speeds at which all massless particles and waves (which includes light) can travel in a vacuum.
One of the outgrowths of this is that cosmologists now treat space and time as a single, unified structure known as spacetime – in which the speed of light can be used to define values for both (i.e. “lightyears”, “light minutes”, and “light seconds”). The measurement of the speed of light has also become a major factor when determining the rate of cosmic expansion.
Beginning in the 1920s, with the observations of Georges Lemaître and Edwin Hubble, scientists and astronomers became aware that the Universe is expanding from a point of origin. Hubble also observed that the farther away a galaxy is, the faster it appears to be moving. In what is now referred to as the Hubble Parameter, the speed at which the Universe is expanding is calculated to be about 68 km/s per megaparsec.
This phenomenon, which implies that some galaxies could effectively be receding faster than the speed of light, may place a limit on what is observable in our Universe. Essentially, galaxies receding faster than the speed of light would cross a “cosmological event horizon”, beyond which they are no longer visible to us.
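You can get a rough sense of where that horizon lies with a one-line calculation: dividing the speed of light by the Hubble Parameter gives the distance at which the recession speed reaches c. This is a simplification – the true cosmological event horizon also depends on how the expansion evolves over time – but it sets the scale:

```python
C_KM_S = 299_792.458  # speed of light, km/s
H0 = 68.0             # Hubble Parameter quoted above, km/s per megaparsec

d_hubble_mpc = C_KM_S / H0            # distance at which recession speed reaches c
print(d_hubble_mpc)                   # ~4,409 megaparsecs
print(d_hubble_mpc * 3.2616e6 / 1e9)  # ~14.4 billion light years
```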
Also, by the 1990s, redshift measurements of distant galaxies showed that the expansion of the Universe has been accelerating for the past few billion years. This has led to theories like “Dark Energy”, in which an unseen force drives the expansion of space itself rather than objects moving through it (thus not placing constraints on the speed of light or violating relativity).
Along with special and general relativity, the modern value of the speed of light in a vacuum has gone on to inform cosmology, quantum physics, and the Standard Model of particle physics. It remains a constant when talking about the upper limit at which massless particles can travel, and remains an unachievable barrier for particles that have mass.
Perhaps, someday, we will find a way to exceed the speed of light. While we have no practical ideas for how this might happen, the smart money seems to be on technologies that will allow us to circumvent the laws of spacetime, either by creating warp bubbles (aka. the Alcubierre Warp Drive), or tunneling through it (aka. wormholes).
Until that time, we will just have to be satisfied with the Universe we can see, and to stick to exploring the part of it that is reachable using conventional methods.
Particle physicists are an inquisitive bunch. Their goal is a working, complete model of the particles and forces that make up the Universe, and they pursue that goal with a vigour matched by few other professions.
The Standard Model of Physics is the result of their efforts, and for 25 years or so it has guided our thinking and understanding of particle physics. The best tool we have for probing further is the Large Hadron Collider (LHC), near Geneva, Switzerland. And some recent, intriguing results from the LHC point to the possible existence of a new, never-before-seen particle.
The LHC has several detectors positioned around its ring. Two of them – ATLAS and CMS – are “general purpose” detectors. Last year, separate experiments in both the ATLAS and CMS detectors produced what is best called a “bump” in their data. Initially, the two teams conducting the experiments were puzzled by the data. But when they compared notes, they found that the bumps in their data were the same in both experiments, and they hinted at what could be a new type of particle, never before detected.
The two experiments involved smashing protons into each other at nearly the speed of light. The collisions produced more high-energy photons than theory predicts. Not a lot more, but physics is a detailed endeavour, so even a slight increase in the number of photons produced is a big deal. In physics, everything happens for a reason.
To be more specific, ATLAS and CMS recorded increased activity at an energy level around 750 gigaelectronvolts (GeV). What that means, for all you non-particle physicists, is that both detectors saw an excess of photon pairs whose combined energy sat around 750 GeV – just what you would expect if a new particle of that mass were being produced in the proton-proton collisions and decaying into two photons. If the new particle exists, that is.
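To unpack that a little: the detectors never see the particle itself, only the two photons it would decay into. From each photon’s energy and the opening angle between them, physicists reconstruct the “invariant mass” of whatever produced the pair. Here is a minimal sketch of that reconstruction – the energies and angle are invented, illustrative values, not real event data:

```python
import math

def diphoton_invariant_mass(e1_gev: float, e2_gev: float, angle_rad: float) -> float:
    """Invariant mass of a photon pair: m = sqrt(2 * E1 * E2 * (1 - cos(theta)))."""
    return math.sqrt(2.0 * e1_gev * e2_gev * (1.0 - math.cos(angle_rad)))

# Hypothetical event: two ~433 GeV photons with a 120-degree opening angle.
print(diphoton_invariant_mass(433.0, 433.0, math.radians(120)))  # ~750 GeV
```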
A new particle would be a huge discovery. The Standard Model has described all the particles present in nature pretty well. It even predicted the existence of one type of particle, the Higgs Boson, long before the LHC actually verified its existence. The discovery of a new type of particle would be very exciting news indeed, and could break the Standard Model.
Since this data from the experiments at the LHC was released last year, the physics world has been buzzing. Over 100 papers have been written to try to explain what the results might mean. But some caution is required.
The first thing scientists do when faced with results like this is to try to quantify the likelihood that it could be chance. If only one experiment had this bump in its data, then the likelihood that it was just a chance occurrence is pretty high. There are many reasons why an experiment can have a result like this, which is why repeatability is such a big deal in science. But when two independent, separate, experiments have the same result, people’s ears perk up.
A few months have passed since the experiments were run, and in that time, the experimenters have tried to determine exactly what the likelihood is of these results occurring by chance. After working with the data, a funny thing has happened: the significance of the extra photons detected by CMS has risen, while the significance of the extra photons detected by ATLAS has fallen. This has definitely left physicists scratching their heads.
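For intuition about what “significance” means here: if you expect a certain number of background photon pairs in a mass bin and observe more, a crude first-pass estimate of the significance is the excess divided by the square root of the expected background. Real LHC analyses use far more careful statistics, and the counts below are invented purely for illustration:

```python
import math

def naive_significance(observed: int, expected_background: float) -> float:
    """Excess over background, in standard deviations: (obs - bkg) / sqrt(bkg)."""
    return (observed - expected_background) / math.sqrt(expected_background)

# Invented numbers: expect 100 background pairs near 750 GeV, observe 130.
print(naive_significance(130, 100.0))  # 3.0 sigma: interesting, not yet a discovery
```

Particle physics traditionally reserves the word “discovery” for a 5-sigma excess, which is part of why everyone urged caution about the bump.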
Also in that time, four main explanations for the experimental results have percolated to the surface. One states that the new particle, if it exists, is made up of smaller particles, similar to how a proton is made up of quarks. These smaller particles could be held together by an unknown force. Some theoretical physicists think this is the best fit with the data.
Another possibility is that the new particle is a heavier version of the Higgs Boson – at 750 GeV, about six times heavier. Or it could be that the Higgs Boson itself is made up of smaller particles, and that’s what the experiment detected.
Or, it could be the much-hypothesized graviton, the theoretical particle that carries the gravitational force. The four fundamental forces in the Universe are electromagnetism, the strong nuclear force, the weak nuclear force, and gravity. So far, we have discovered the particles that transmit all of those forces, except for gravity. If there was a new particle detected, and if it proved to be the graviton, that would be enormous, earth-shattering news. At least for those who are passionate about understanding nature.
That’s a lot of “ifs” though.
There are a lot of holes in our knowledge of the Universe, and physicists are eager to fill those gaps. The discovery of a new particle might very well answer some basic questions about dark matter, dark energy, or even gravity itself. But there’s a lot more experimentation to be done before the existence of a new particle can be announced.
It’s no secret that black holes are objects to be avoided, were you to plot yourself a trip across the galaxy. Get too close to one and you’d find your ship hopelessly caught sliding down a gravitational slippery slope toward an inky black event horizon, beyond which there’s no escape. The closer you got the more gravity would yank at your vessel, increasingly more on the end closest to the black hole than on the farther side until eventually the extreme tidal forces would shear both you and your ship apart. Whatever remained would continue to fall, accelerating and stretching into “spaghettified” strands of ship and crew toward—and across—the event horizon. It’d be the end of the cosmic road, with nothing left of you except perhaps some slowly-dissipating “information” leaking back out into the Universe over the course of millennia in the form of Hawking radiation. Nice knowin’ ya.
That is, of course, if you were foolish enough to approach a non-spinning black hole. Were it to have a healthy rotation, there’s a possibility, based on new research, that you and your ship could survive the trip intact.
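To get a feel for why those tidal forces are so lethal – and how strongly they depend on the black hole’s mass – here is a rough Newtonian sketch of the stretching acceleration across a 2-meter-long body at the event horizon. It is only a back-of-the-envelope approximation, nothing like the strong-field simulations described below:

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 299_792_458    # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

def tidal_accel_at_horizon(mass_kg: float, body_length_m: float = 2.0) -> float:
    """Newtonian tidal (stretching) acceleration, ~2*G*M*L / r^3,
    evaluated at the Schwarzschild radius r_s = 2*G*M / c^2."""
    r_s = 2 * G * mass_kg / C ** 2
    return 2 * G * mass_kg * body_length_m / r_s ** 3

print(tidal_accel_at_horizon(10 * M_SUN))   # ~2e8 m/s^2: instant spaghettification
print(tidal_accel_at_horizon(1e9 * M_SUN))  # ~2e-8 m/s^2: you wouldn't even notice
```

In this approximation the tidal stress at the horizon scales as one over the mass squared, so the most massive black holes are, counterintuitively, the gentlest places to cross an event horizon.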
A team of researchers from Georgia Gwinnett College, UMass Dartmouth, and the University of Maryland have designed new supercomputer models to study the exotic physics of quickly-rotating black holes, a.k.a. Kerr black holes, and what might be found in the mysterious realm beyond the event horizon. What they found was that the dynamics of these black holes’ rapid rotation create a scenario in which a hypothetical spacecraft and crew might avoid gravitational disintegration during approach.
“We developed a first-of-its-kind computer simulation of how physical fields evolve on the approach to the center of a rotating black hole,” said Dr. Lior Burko, associate professor of physics at Georgia Gwinnett College and lead researcher on the study. “It has often been assumed that objects approaching a black hole are crushed by the increasing gravity. However, we found that while gravitational forces increase and become infinite, they do so fast enough that their interaction allows physical objects to stay intact as they move toward the center of the black hole.”
Because the environment around black holes is so intense (and physics inside them doesn’t play by the rules), creating accurate models requires the latest high-tech computing power.
“This has never been done before, although there has been lots of speculation for decades on what actually happens inside a black hole,” said Gaurav Khanna, Associate Physics Professor at UMass Dartmouth, whose Center for Scientific Computing & Visualization Research developed the precision computer modeling necessary for the project.
As science fiction movies have imagined for decades—from Disney’s The Black Hole to Nolan’s Interstellar—it just might be possible to survive a trip into a black hole, if conditions are right (i.e., you probably still don’t want to find yourself anywhere near one of these).
Electromagnetism is one of the fundamental forces of the universe, responsible for everything from electric and magnetic fields to light. Originally, scientists believed that magnetism and electricity were separate forces. But by the late 19th century, this view changed, as research demonstrated conclusively that electricity and magnetism were aspects of a single force (i.e. electromagnetism).
Since that time, scientists have sought to test and measure electromagnetic fields, and to recreate them. Towards this end, they created electromagnets, devices that use electrical current to induce a magnetic field. And since their initial invention as a scientific instrument, electromagnets have gone on to become a regular feature of electronic devices and industrial processes.