Could Antimatter Be Powering Super-Luminous Supernovae?


Explosions are almost always cool, and supernovae are some of the most spectacular and violent explosions in the Universe. In 2006, the supernova SN 2006gy wowed scientists with a light show that was 10 times as luminous as the average supernova, challenging the traditional model of exactly how an exploding star creates a supernova. Astronomers suspect that the cause is the repeated production of antimatter in the core of the star.

Supernovae occur when a star nears the end of its life and the outward push of its nuclear processes overwhelms gravity's hold on the star; the type of supernova created depends on the mass of the star. In stars with masses between 95 and 130 times that of the Sun, this process can occur more than once, creating a “pulsational” supernova that can repeat as many as seven times.

The cause for the multiple explosions may have to do with the production of antimatter particles in the core, which then recombine and release large amounts of energy.

“The pair instability is encountered when, late in the star’s life, a large amount of thermal energy goes into making the masses of an increasing abundance of electron-positron pairs rather than providing pressure,” wrote Dr. Stan Woosley of the Department of Astronomy and Astrophysics at UC Santa Cruz.
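As a rough back-of-envelope check (our own sketch, not a figure from Woosley's paper), the pair instability sets in when typical thermal photon energies approach the electron's rest-mass energy of 0.511 MeV:

```python
# Rough estimate of the temperature at which a star's radiation field
# starts making electron-positron pairs in bulk: thermal photon energies
# (~k_B T) must approach the electron rest-mass energy, m_e c^2.

K_B = 8.617e-5        # Boltzmann constant, eV per kelvin
M_E_C2 = 0.511e6      # electron rest-mass energy, eV

def pair_onset_temperature():
    """Temperature (K) at which k_B T equals the electron rest-mass energy."""
    return M_E_C2 / K_B

T = pair_onset_temperature()
print(f"k_B T reaches m_e c^2 at roughly {T:.1e} K")  # ~6e9 K
```

Only the cores of very massive stars, very late in their lives, get anywhere near billions of kelvin, which is why the instability strikes at the end.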

What happens is this: the first supernova occurs, powered by the antimatter explosions in the core, and ejects a large amount of the star’s material out into space; however, there still remains enough matter near the core for the star to reignite and begin nuclear processes once again. After between a few hundred days and a few years, another supernova occurs by the same mechanism, and when the ejected material collides with the previous shell of ejected material, the interaction gives off enormous amounts of light.

This process only occurs with stars in the 95-130 solar mass range. Stars with solar masses under 95 undergo typical, non-repeating supernovae, while those over 130 solar masses are subject to the pair instability but explode with such force as to leave nothing near the core to recombine and start the process again.

The production of antimatter in the core, together with the large amount of light given off when successive shells of ejected matter collide, explains very well the otherwise puzzling luminosity of SN 2006gy.

“The model existed before 2006gy happened as well as the prediction of a possible bright supernova of this sort. When we learned of the supernova, we carried out much more detailed calculations specific to 2006gy and found, to our satisfaction, that many of the observed facts were in the model results,” Dr. Woosley said.

There are other possible candidates for this type of repeating supernova, including Eta Carinae, though they unfortunately may not all be as spectacular as SN 2006gy.

Source: arXiv paper

Podcast: The Important Numbers in the Universe


This week we wanted to give you a basic physics lesson. This isn’t easy physics; it’s a lesson on the basic numbers of the Universe. Each of these numbers defines a key aspect of our Universe. If they had different values, the Universe would be a very different place, and life here on Earth would never have arisen.

Click here to download the episode

The Important Numbers in the Universe – Show notes and transcript

Or subscribe with your podcatching software.

Creating the Conditions Inside Supergiant Planets


We won’t be visiting a supergiant planet any time soon. But physicists are about to do the next best thing, and create the conditions that exist inside the densest planets right here on Earth. What used to require a nuclear explosion should now be possible with diamond anvils and powerful lasers.

Researchers from the Lawrence Livermore National Laboratory (LLNL), New Mexico State University and France’s Atomic Energy Commission announced this week that they have achieved pressures of 10 million atmospheres using a 30 kilojoule ultraviolet laser. The next step will be to use a 2 megajoule laser to achieve more than a billion atmospheres of pressure. Just for comparison, the centre of the Earth squeezes with a little less than 4 million atmospheres, and the centre of Jupiter with about 70 million.
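To put those figures on a more familiar scale, here is a quick conversion to pascals (a sketch of ours, using the standard value 1 atm = 101,325 Pa):

```python
# Convert the quoted pressures from atmospheres to pascals.
ATM_PA = 101_325   # pascals per standard atmosphere

def atm_to_pascal(atm):
    """Pressure in pascals for a given pressure in atmospheres."""
    return atm * ATM_PA

for label, atm in [("laser experiment", 1e7),
                   ("next-step goal", 1e9),
                   ("centre of Jupiter", 7e7)]:
    print(f"{label}: {atm:.0e} atm = {atm_to_pascal(atm):.2e} Pa")
```

So the 10 million atmospheres already achieved is about a terapascal, and the billion-atmosphere goal is roughly a hundred times that.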

Half of the apparatus uses diamond anvils, which can squeeze liquids and solids to high pressures. The researchers then blast the material with a laser-induced shock wave, compressing it even more. Of course, you need a laser the size of a building, and half the diamond anvil is vapourized.

At pressures this high, scientists expect to discover entirely new realms of chemistry. They just need to work quickly: the high pressure is maintained for only 1 or 2 nanoseconds.

Original Source: UC Berkeley News Release

Here’s a Way to Look for Extra Dimensions

Possible 6-dimension geometry. Image credit: Andrew J. Hanson
One of the great outstanding quests in science is the search for a “theory of everything”. What underlying laws of physics explain the forces we see in nature? Are gravity and electromagnetism the same force? One popular candidate, string theory, proposes that everything in the Universe is made up of tiny, vibrating strings.

Finding a Fourth Dimension

Braneworld challenges Einstein’s general relativity. Image credit: NASA. Click to enlarge
Scientists have been intrigued for years by the possibility that there are additional dimensions beyond the three we humans can perceive. Now researchers from Duke and Rutgers universities think there’s a way to test a five-dimensional theory (four spatial dimensions plus time) of gravity that competes with Einstein’s General Theory of Relativity. This extra dimension should have effects in the cosmos that are detectable by satellites scheduled to launch in the next few years.

Scientists at Duke and Rutgers universities have developed a mathematical framework they say will enable astronomers to test a new five-dimensional theory of gravity that competes with Einstein’s General Theory of Relativity.

Charles R. Keeton of Rutgers and Arlie O. Petters of Duke base their work on a recent theory called the type II Randall-Sundrum braneworld gravity model. The theory holds that the visible universe is a membrane (hence “braneworld”) embedded within a larger universe, much like a strand of filmy seaweed floating in the ocean. The “braneworld universe” has five dimensions — four spatial dimensions plus time — compared with the four dimensions — three spatial, plus time — laid out in the General Theory of Relativity.

The framework Keeton and Petters developed predicts certain cosmological effects that, if observed, should help scientists validate the braneworld theory. The observations, they said, should be possible with satellites scheduled to launch in the next few years.
If the braneworld theory proves to be true, “this would upset the applecart,” Petters said. “It would confirm that there is a 4th dimension to space, which would create a philosophical shift in our understanding of the natural world.”

The scientists’ findings appeared May 24, 2006, in the online edition of the journal Physical Review D. Keeton is an astronomy and physics professor at Rutgers, and Petters is a mathematics and physics professor at Duke. Their research is funded by the National Science Foundation.

The Randall-Sundrum braneworld model — named for its originators, physicists Lisa Randall of Harvard University and Raman Sundrum of Johns Hopkins University — provides a mathematical description of how gravity shapes the universe that differs from the description offered by the General Theory of Relativity.

Keeton and Petters focused on one particular gravitational consequence of the braneworld theory that distinguishes it from Einstein’s theory.

The braneworld theory predicts that relatively small “black holes” created in the early universe have survived to the present. The black holes, with mass similar to a tiny asteroid, would be part of the “dark matter” in the universe. As the name suggests, dark matter does not emit or reflect light, but does exert a gravitational force.

The General Theory of Relativity, on the other hand, predicts that such primordial black holes no longer exist, as they would have evaporated by now.

“When we estimated how far braneworld black holes might be from Earth, we were surprised to find that the nearest ones would lie well inside Pluto’s orbit,” Keeton said.

Petters added, “If braneworld black holes form even 1 percent of the dark matter in our part of the galaxy — a cautious assumption — there should be several thousand braneworld black holes in our solar system.”

But do braneworld black holes really exist — and therefore stand as evidence for the 5-D braneworld theory?

The scientists showed that it should be possible to answer this question by observing the effects that braneworld black holes would exert on electromagnetic radiation traveling to Earth from other galaxies. Any such radiation passing near a black hole will be acted upon by the object’s tremendous gravitational forces — an effect called “gravitational lensing.”

“A good place to look for gravitational lensing by braneworld black holes is in bursts of gamma rays coming to Earth,” Keeton said. These gamma-ray bursts are thought to be produced by enormous explosions throughout the universe. Such bursts from outer space were discovered inadvertently by the U.S. Air Force in the 1960s.

Keeton and Petters calculated that braneworld black holes would impede the gamma rays in the same way a rock in a pond obstructs passing ripples. The rock produces an “interference pattern” in its wake in which some ripple peaks are higher, some troughs are deeper, and some peaks and troughs cancel each other out. The interference pattern bears the signature of the characteristics of both the rock and the water.

Similarly, a braneworld black hole would produce an interference pattern in a passing burst of gamma rays as they travel to Earth, said Keeton and Petters. The scientists predicted the resulting bright and dark “fringes” in the interference pattern, which they said provides a means of inferring characteristics of braneworld black holes and, in turn, of space and time.

“We discovered that the signature of a fourth dimension of space appears in the interference patterns,” Petters said. “This extra spatial dimension creates a contraction between the fringes compared to what you’d get in General Relativity.”

Petters and Keeton said it should be possible to measure the predicted gamma-ray fringe patterns using the Gamma-ray Large Area Space Telescope, which is scheduled to be launched on a spacecraft in August 2007. The telescope is a joint effort between NASA, the U.S. Department of Energy, and institutions in France, Germany, Japan, Italy and Sweden.

The scientists said their prediction would apply to all braneworld black holes, whether in our solar system or beyond.

“If the braneworld theory is correct,” they said, “there should be many, many more braneworld black holes throughout the universe, each carrying the signature of a fourth dimension of space.”

Original Source: Duke University

Podcast: Unlikely Wormholes

Wormholes are a mainstay in science fiction, providing our heroes with a quick and easy way to instantly travel around the Universe. Enter a wormhole near the Earth and you come out on the other side of the galaxy. Even though science fiction made them popular, wormholes had their origins in science – distorting spacetime like this was theoretically possible. But according to Dr. Stephen Hsu from the University of Oregon, building a wormhole is probably impossible.

Podcast: Alpha, Still Constant After All These Years

There’s a number in the Universe which we humans call alpha – or the fine structure constant. It shows up in almost every mathematical formula dealing with magnetism and electricity. The very speed of light depends on it. If the value for alpha was even a little bit different, the Universe as we know it wouldn’t exist – you, me and everyone on Earth wouldn’t be here. Some physicists have recently reported that the value for alpha has been slowly changing since the Big Bang. Others, including Jeffrey Newman from the Lawrence Berkeley National Laboratory have good evidence that alpha has remained unchanged for at least 7 billion years.

Experiment Will Help Probe “Theory of Everything”

Image credit: NASA/JPL
Sooner or later, the reign of Einstein, like the reign of Newton before him, will come to an end. An upheaval in the world of physics that will overthrow our notions of basic reality is inevitable, most scientists believe, and currently a horse race is underway between a handful of theories competing to be the successor to the throne.

In the running are such mind-bending ideas as an 11-dimensional universe, universal “constants” (such as the strength of gravity) that vary over space and time and only remain truly fixed in an unseen 5th dimension, infinitesimal vibrating strings as the fundamental constituents of reality, and a fabric of space and time that’s not smooth and continuous, as Einstein believed, but divided into discrete, indivisible chunks of vanishingly small size. Experiment will ultimately determine which triumphs.

A new concept for an experiment to test the predictions of Einstein’s relativity more precisely than ever before is being developed by scientists at NASA’s Jet Propulsion Laboratory (JPL). Their mission, which effectively uses our solar system as a giant laboratory, would help narrow the field of vying theories and bring us one step closer to the next revolution in physics.

A House Divided
It may not weigh heavily on most people’s minds, but a great schism has long plagued our fundamental understanding of the universe. Two ways of explaining the nature and behavior of space, time, matter, and energy currently exist: Einstein’s relativity and the “standard model” of quantum mechanics. Both are extremely successful. The Global Positioning System (GPS), for instance, wouldn’t be possible without the theory of relativity. Computers, telecommunications, and the Internet, meanwhile, are spin-offs of quantum mechanics.
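The GPS example is quantitative: a quick estimate of our own, using standard textbook figures rather than anything from this article, shows relativity shifting satellite clocks by tens of microseconds a day, far more than the nanosecond-level timing GPS needs:

```python
# Back-of-envelope check that GPS needs relativity: combine the
# gravitational blueshift (satellite clocks run fast, higher in Earth's
# potential well) with special-relativistic time dilation (they run slow
# because of orbital speed) for a GPS-like circular orbit.

GM = 3.986e14        # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6    # Earth radius, m
R_ORBIT = 2.657e7    # GPS orbital radius (~20,200 km altitude), m
C = 2.998e8          # speed of light, m/s
DAY = 86_400         # seconds per day

def gps_clock_offset_per_day():
    """Net drift of a GPS satellite clock relative to the ground, s/day."""
    grav = GM * (1 / R_EARTH - 1 / R_ORBIT) / C**2   # runs fast
    v_squared = GM / R_ORBIT                         # circular orbit speed^2
    kinematic = v_squared / (2 * C**2)               # runs slow
    return (grav - kinematic) * DAY

offset = gps_clock_offset_per_day()
print(f"GPS clocks gain about {offset * 1e6:.0f} microseconds per day")  # ~38
```

Left uncorrected, a 38-microsecond daily drift would translate into kilometres of position error, which is why the satellite clocks are deliberately tuned to compensate.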

But the two theories are like different languages, and no one is yet sure how to translate between them. Relativity explains gravity and motion by uniting space and time into a 4-dimensional, dynamic, elastic fabric of reality called space-time, which is bent and warped by the energy it contains. (Mass is one form of energy, so it creates gravity by warping space-time.) Quantum mechanics, on the other hand, assumes that space and time form a flat, immutable “stage” on which the drama of several families of particles unfolds. These particles can move both forward and backward in time (something relativity doesn’t allow), and the interactions between these particles explain the basic forces of nature — with the glaring exception of gravity.

The stalemate between these two theories has gone on for decades. Most scientists assume that somehow, eventually, a unifying theory will be developed that subsumes the two, showing how the truths they each contain can fit neatly within a single, all-encompassing framework of reality. Such a “Theory of Everything” would profoundly affect our knowledge of the birth, evolution, and eventual fate of the universe.

Slava Turyshev, a scientist at JPL, and his colleagues have thought of a way to use the International Space Station (ISS) and two mini-satellites orbiting on the far side of the sun to test the theory of relativity with unprecedented accuracy. Their concept, developed in part through funding from NASA’s Office of Biological and Physical Research, would be so sensitive that it could reveal flaws in Einstein’s theory, thus providing the first hard data needed to distinguish which of the competing Theories of Everything agree with reality and which are merely fancy chalk-work.

The experiment, called Laser Astrometric Test Of Relativity (LATOR), would look at how the sun’s gravity deflects beams of laser light emitted by the two mini-satellites. Gravity bends the path of light because it warps the space through which the light is passing. The standard analogy for this warping of space-time by gravity is to imagine space as a flat sheet of rubber that stretches under the weight of objects like the sun. The depression in the sheet would cause an object (even a massless particle of light) passing near the sun to turn slightly as it went by.

In fact, it was by measuring the bending of starlight by the sun during a solar eclipse in 1919 that Sir Arthur Eddington first tested Einstein’s theory of general relativity. In cosmic terms, the sun’s gravity is fairly weak; the path of a beam of light skimming the edge of the sun would only be bent by about 1.75 arcseconds (an arcsecond is 1/3600 of a degree). Within the limits of accuracy of his measuring equipment, Eddington showed that starlight did indeed bend by this amount — and in doing so effectively impeached Newton.
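That 1.75-arcsecond figure is easy to reproduce from general relativity's deflection formula for light grazing the solar limb; the constants below are standard values, not numbers taken from this article:

```python
# General relativity's deflection angle for light passing a mass M
# at impact parameter R:  theta = 4 G M / (c^2 R).

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30         # solar mass, kg
R_SUN = 6.957e8          # solar radius, m
C = 2.998e8              # speed of light, m/s
RAD_TO_ARCSEC = 206_265  # arcseconds per radian

def grazing_deflection_arcsec():
    """Deflection (arcsec) of a light ray skimming the edge of the sun."""
    return 4 * G * M_SUN / (C**2 * R_SUN) * RAD_TO_ARCSEC

print(f"deflection at the solar limb: {grazing_deflection_arcsec():.2f} arcsec")
```

Newtonian gravity, treating light as a stream of fast particles, predicts exactly half this value, which is what Eddington's measurement ruled out.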

LATOR would measure this deflection with a billion (10^9) times the precision of Eddington’s experiment and 30,000 times the precision of the current record-holder: a serendipitous measurement using signals from the Cassini spacecraft on its way to explore Saturn.

“I think [LATOR] would be quite an important advance for fundamental physics,” says Clifford Will, a professor of physics at Washington University who has made major contributions to post-Newtonian physics and is not directly involved with LATOR. “We should continue to try to press for more accuracy in testing general relativity, simply because any kind of deviation would mean that there’s new physics that we were not aware of before.”

Solar laboratory
The experiment would work like this: Two small satellites, each about one meter wide, would be launched into an orbit circling the sun at roughly the same distance as Earth. This pair of mini-satellites would orbit more slowly than Earth does, so about 17 months after launch, the mini-satellites and Earth would be on opposite sides of the sun. Even though the two satellites would be about 5 million km apart, the angle between them as viewed from Earth would be tiny, only about 1 degree. Together, the two satellites and Earth would form a skinny triangle, with laser beams along its sides, and one of those beams passing close to the sun.
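The quoted numbers hang together: a quick small-angle check of our own confirms that 5 million km, seen from roughly 2 AU away, subtends about a degree:

```python
import math

# Sanity check on the LATOR geometry: two mini-satellites 5 million km
# apart, viewed from Earth with the pair on the far side of the sun
# (so roughly 2 AU away), should subtend about 1 degree.

AU_KM = 1.496e8                  # astronomical unit, km
SEPARATION_KM = 5e6              # distance between the two mini-satellites
EARTH_TO_SATS_KM = 2 * AU_KM     # Earth and satellites on opposite sides of the sun

# Small-angle approximation: angle (radians) = separation / distance.
angle_deg = math.degrees(SEPARATION_KM / EARTH_TO_SATS_KM)
print(f"angle between satellites as seen from Earth: {angle_deg:.2f} degrees")
```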

Turyshev plans to measure the angle between the two satellites using an interferometer mounted on the ISS. An interferometer is a device that catches and combines beams of light. By measuring how waves of light from the two mini-satellites “interfere” with each other, the interferometer can measure the angle between the satellites with extraordinary precision: about 10 billionths of an arcsecond, or 0.01 µas (micro-arcseconds). When the precision of the other parts of the LATOR design are considered, this gives an overall accuracy for measuring how much gravity bends the laser beam of about 0.02 µas for a single measurement.

“Using the ISS gives us a few advantages,” Turyshev explains. “For one, it’s above the distortions of Earth’s atmosphere, and it’s also large enough to let us place the two lenses of the interferometer far apart (one lens on each end of the solar panel truss), which improves the resolution and accuracy of the results.”

The 0.02 µas accuracy of LATOR is good enough to reveal deviations from Einstein’s relativity predicted by the aspiring Theories of Everything, which range from roughly 0.5 to 35 µas. Agreement with LATOR’s measurements would be a major boost for any of these theories. But if no deviation from Einstein is found even by LATOR, most of the current contenders–along with their 11 dimensions, pixellated space, and inconstant constants–will suffer a fatal blow and “pass on” to that great dusty library stack in the sky.

Because the mission requires only existing technologies, Turyshev says LATOR could be ready to fly as soon as 2009 or 2010. So it may not be too long before the stalemate in physics is broken and a new theory of gravity, space, and time takes the throne.

Original Source: NASA/Science Story

New Research Confirms Einstein

Image credit: NASA

Einstein’s General Theory of Relativity got another confirmation this week thanks to research by a NASA astronomer. Some theorists believed that particles popping into and out of existence in space would slow light down, as if it were moving through air or water. Scientists measured the total energy of gamma rays emitted by distant cosmic sources and found that they interacted with particles on their way to Earth in a way that precisely matched Einstein’s predictions.

Scientists say that Albert Einstein’s principle of the constancy of the speed of light holds up under extremely tight scrutiny, a finding that rules out certain theories predicting extra dimensions and a “frothy” fabric of space.

The finding also demonstrates that basic ground- and space-based observations of the highest-energy gamma-rays, a form of electromagnetic energy like light, can provide insight into the very nature of time, matter, energy and space at scales extremely far below the subatomic level — something that few scientists thought possible.

Dr. Floyd Stecker of NASA’s Goddard Space Flight Center in Greenbelt, Md., discusses the implications of these findings in a recent issue of Astroparticle Physics. His work is based partly on an earlier collaboration with Nobel laureate Sheldon Glashow of Boston University.

“What Einstein worked out with pencil and paper nearly a century ago continues to hold up to scientific scrutiny,” said Stecker. “High-energy observations of cosmic gamma rays don’t rule out the possibility of extra dimensions and the concept of quantum gravity, but they do place some strict constraints on how scientists can go about finding such phenomena.”

Einstein stated that space and time were actually two aspects of a single entity called spacetime, a four-dimensional concept. This is the foundation to his theories of special and general relativity. For example, general relativity posits that the force of gravity is the result of mass distorting spacetime, like a bowling ball on a mattress.

General relativity is the theory of gravity on a large scale, while quantum mechanics, developed independently in the early 20th century, is the theory of the atom and subatomic particles on a very small scale. Theories based on quantum mechanics do not describe gravity, but rather the other three fundamental forces: electromagnetism (light), strong forces (binding atomic nuclei), and weak forces (seen in radioactivity).

Scientists have long hoped to meld these theories into one “theory of everything” to describe all aspects of nature. These unifying theories — such as quantum gravity or string theory — may involve the invocation of extra dimensions of space and also violations of Einstein’s special theory of relativity, such as a violation of the principle that the speed of light is the maximum attainable velocity for all objects.

Stecker’s work involves concepts called the uncertainty principle and Lorentz invariance. The uncertainty principle, derived from quantum mechanics, implies that at the subatomic level virtual particles, also called quantum fluctuations, pop in and out of existence. Many scientists say that spacetime itself is made up of quantum fluctuations which, when viewed up close, resemble a froth or “quantum foam.” Some scientists think a quantum foam of spacetime can slow the passage of light — much as light travels at a maximum speed in a vacuum but at slower speeds through air or water.

The foam would slow higher-energy electromagnetic particles, or photons — such as X rays and gamma rays — more than lower energy photons of visible light or radio waves. Such a fundamental variation in the speed of light, different for photons of different energies, would violate Lorentz invariance, the basic principle of the special theory of relativity. Such a violation could be a clue that would help point us on the road to unification theories.

Scientists have hoped to find such Lorentz invariance violations by studying gamma rays coming from far outside the Galaxy. A gamma-ray burst, for example, is at such a great distance that the differences in the speeds of photons in the burst, depending on their energy, might be measurable — as the quantum foam of space may act to slow light which has been traveling to us for billions of years.
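As an illustration of why distance matters, here is a rough estimate using one commonly assumed toy model of the effect, in which the speed deficit scales linearly with photon energy divided by the Planck energy. The linear scaling and the example numbers are assumptions of this sketch, not claims from the article:

```python
# Toy estimate of the arrival delay a "quantum foam" could impose on a
# high-energy photon, assuming (hypothetically) a linear model where the
# fractional speed deficit is dv/c ~ E_photon / E_Planck.

E_PLANCK_EV = 1.22e28            # Planck energy, eV
E_GAMMA_EV = 1e12                # an example 1 TeV gamma-ray photon
SECONDS_PER_YEAR = 3.156e7
TRAVEL_TIME_S = 1e9 * SECONDS_PER_YEAR   # a source ~1 billion light-years away

def linear_liv_delay():
    """Delay (s) relative to a low-energy photon under the linear model."""
    return (E_GAMMA_EV / E_PLANCK_EV) * TRAVEL_TIME_S

print(f"accumulated delay over a billion light-years: {linear_liv_delay():.1f} s")
```

Even a speed deficit of one part in 10^16 accumulates into seconds over a billion years of travel, which is why distant bursts make such sensitive probes.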

Stecker looked much closer to home to find that Lorentz invariance is not being violated. He analyzed gamma rays from two relatively nearby galaxies about half a billion light years away with supermassive black holes at their centers, named Markarian (Mkn) 421 and Mkn 501. These black holes generate intense beams of gamma-ray photons that are aimed directly at the Earth. Such galaxies are called blazars. (Refer to Image 4 for a picture of Mkn 421. Images 1 – 3 are artist’s concepts of supermassive black holes powering quasars which, when pointed directly at Earth, are called blazars. Image 5 is a Hubble Space Telescope photo of a blazar.)

Some of the gamma rays from Mkn 421 and Mkn 501 collide with infrared photons in the Universe. These collisions result in the destruction of the gamma rays and infrared photons as their energy is converted into mass in the form of electrons and positively charged antimatter-electrons (called positrons), according to Einstein’s famous formula E=mc^2. Stecker and Glashow have pointed out that evidence of the annihilation of the highest-energy gamma rays from Mkn 421 and Mkn 501, obtained from direct observations of these objects, demonstrates clearly that Lorentz invariance is alive and well and not being violated. If Lorentz invariance were violated, the gamma rays would pass right through the extragalactic infrared fog without being annihilated.

This is because annihilation requires a certain amount of energy in order to create the electrons and positrons. This energy budget is satisfied for the highest-energy gamma rays from Mkn 501 and Mkn 421 in interacting with infrared photons if both are moving at the well-known speed of light according to the special theory of relativity. However, if the gamma rays in particular were moving at a slower velocity because of Lorentz invariance violation, the total energy available would be inadequate and the annihilation reaction would be a “no go.”
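That energy budget works out as follows: for a head-on collision, pair production requires the product of the two photon energies to exceed the square of the electron rest-mass energy. Here is a quick sketch with illustrative numbers of our own, not figures from Stecker's paper:

```python
# Threshold for gamma-ray + infrared-photon pair production (head-on):
#   E_gamma * E_partner >= (m_e c^2)^2
# i.e. the two photons must carry enough energy to make an electron
# and a positron, per E = mc^2.

M_E_C2_EV = 0.511e6   # electron rest-mass energy, eV
EV_NM = 1240.0        # photon energy (eV) times wavelength (nm), roughly

def threshold_ir_energy_ev(e_gamma_ev):
    """Minimum head-on partner-photon energy (eV) for e+/e- pair production."""
    return M_E_C2_EV**2 / e_gamma_ev

e_ir = threshold_ir_energy_ev(1e13)   # against an example 10 TeV gamma ray
print(f"threshold partner photon: {e_ir:.3f} eV "
      f"(~{EV_NM / e_ir / 1000:.0f} micron wavelength, i.e. far-infrared)")
```

A 10 TeV gamma ray can thus annihilate on photons of a few hundredths of an electron-volt, squarely in the infrared band, which is why the extragalactic infrared fog absorbs the highest-energy gamma rays from these blazars.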

“The implications of these results,” Stecker said, “is that if Lorentz invariance is violated, it is at such a small level — less than one part in a thousand trillion — that it is beyond the ability of our present technology to find. These results may also be telling us that the correct form of string theory or quantum gravity must obey the principle of Lorentz invariance.”

For more information, refer to “Constraints on Lorentz Invariance Violating Quantum Gravity and Large Extra Dimensions Models using High Energy Gamma Ray Observations” online at:

Original Source: NASA News Release