
Computer to Simulate Exploding Star

Image credit: University of Chicago
University scientists are preparing to run the most advanced supercomputer simulation of an exploding star ever attempted.

Tomasz Plewa, Senior Research Associate in the Center for Astrophysical Thermonuclear Flashes and Astronomy & Astrophysics, expects the simulation to reveal the mechanics of exploding stars, called supernovae, in unprecedented detail.

The simulation is made possible by the U.S. Department of Energy's special allocation of an extraordinary 2.7 million hours of supercomputing time to the Flash Center, which typically uses less than 500,000 hours of supercomputer time annually.

"This is beyond imagination," said Plewa, who submitted the Flash Center proposal on behalf of a research team at the University and Argonne National Laboratory.

The Flash Center project was one of three selected to receive supercomputer time allocations under a new competitive program announced last July by Secretary of Energy Spencer Abraham.

The other two winning proposals came from the Georgia Institute of Technology, which received 1.2 million processor hours, and the DOE's Lawrence Berkeley National Laboratory, which received one million processor hours.

The supercomputer time will help the Flash Center more accurately simulate the explosion of a white dwarf star, one that has burned most or all of its nuclear fuel. These supernovae shine so brightly that astronomers use them to measure distance in the universe. Nevertheless, many details about what happens during a supernova remain unknown.

Simulating a supernova is computationally intensive because it involves vast scales of time and space. White dwarf stars gravitationally accumulate material from a companion star for millions of years, but ignite in less than a second. Simulations must also account for physical processes that occur on a scale that ranges from a few hundredths of an inch to the entire surface of the star, which is comparable in size to Earth.
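For a rough sense of those scales, here is a back-of-the-envelope sketch of the dynamic range involved; the millimeter-scale flame thickness and Earth-like diameter below are illustrative assumptions, not Flash Center parameters.

    # Illustrative arithmetic only: the rough spatial dynamic range of a
    # whole-star simulation, using the scales quoted above. The flame
    # thickness and star size are assumed values, not FLASH inputs.
    FLAME_SCALE_M = 5e-4        # "a few hundredths of an inch" ~ 0.5 mm
    STAR_DIAMETER_M = 1.2742e7  # an Earth-sized white dwarf, ~12,742 km

    dynamic_range = STAR_DIAMETER_M / FLAME_SCALE_M
    print(f"spatial dynamic range: ~{dynamic_range:.1e}")  # ~2.5e10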

Similar computational problems vex the DOE's nuclear weapons Stockpile Stewardship and Management Program. In the wake of the Comprehensive Test Ban Treaty, which President Clinton signed in 1996, the reliability of the nation's nuclear arsenal must now be tested via computer simulations rather than in the field.

"The questions ultimately are: how is the nuclear arsenal aging with time, and is your code predicting that aging process correctly?" Plewa said.

Flash Center scientists verify the accuracy of their supernova code by comparing the results of their simulations both to laboratory experiments and to telescopic observations. Spectral observations of supernovae, for example, provide a sort of bar code that reveals which chemical elements are produced in the explosions. Those observations currently conflict with simulations.

"You want to reconcile current simulations with observations regarding chemical composition and the production of elements," Plewa said.

Scientists also wish to see more clearly the sequence of events that occurs immediately before a star goes supernova. It appears that a supernova begins in the core of a white dwarf star and expands toward the surface like an inflating balloon.

According to one theory, the flame front initially expands at a relatively "slow" subsonic speed of 60 miles per second. Then, at some unknown point, the flame front detonates and accelerates to supersonic speeds. In the ultra-dense material of a white dwarf, supersonic speeds exceed 3,100 miles per second.
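To put those two regimes in perspective, the sketch below estimates how long each front would take to cross an Earth-sized star; the stellar radius is an assumed value, not a figure from the article.

    # Illustrative arithmetic only: crossing times for the two flame
    # regimes quoted above. The stellar radius is an assumed
    # Earth-like value, not a figure from the article.
    MILE_M = 1609.34    # meters per mile
    RADIUS_M = 6.371e6  # assumed white dwarf radius (~Earth's)

    subsonic_mps = 60 * MILE_M      # ~97 km/s deflagration
    supersonic_mps = 3100 * MILE_M  # ~5,000 km/s detonation

    print(f"subsonic crossing:   {RADIUS_M / subsonic_mps:.0f} s")   # ~66 s
    print(f"supersonic crossing: {RADIUS_M / supersonic_mps:.1f} s") # ~1.3 s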

Another possibility: the initial subsonic wave fizzles when it reaches the outer part of the star, leading to a collapse of the white dwarf, the mixing of unburned nuclear fuel and then detonation.

"It will be very nice if in the simulations we could observe this transition to detonation," Plewa said.

Flash Center scientists are already on the verge of recreating this moment in their simulations. The extra computer time from the DOE should push them across the threshold.

The center will increase the resolution of its simulations to one kilometer (six-tenths of a mile) for a whole-star simulation. Previously, the center could achieve a resolution of five kilometers (3.1 miles) for a whole-star simulation, or 2.5 kilometers (1.5 miles) for a simulation encompassing only one-eighth of a star.
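A generic scaling argument hints at why that jump in resolution requires such a large allocation; the sketch below assumes a 3-D explicit hydrodynamics code with a CFL-limited time step, rather than FLASH's actual adaptive-mesh cost accounting.

    # Illustrative scaling only: in a 3-D explicit hydro code, refining
    # the grid multiplies the cell count by the refinement factor cubed
    # and, via the CFL time-step limit, multiplies the step count by the
    # same factor, so cost grows roughly as (dx_old / dx_new) ** 4.
    def relative_cost(dx_old_km, dx_new_km):
        """Rough cost multiplier for refining from dx_old to dx_new."""
        return (dx_old_km / dx_new_km) ** 4

    print(relative_cost(5.0, 1.0))      # whole star, 5 km -> 1 km: 625x
    print(relative_cost(2.5, 1.0) * 8)  # eighth-star at 2.5 km -> whole star at 1 km: ~313x

By this rough measure, the new whole-star run costs hundreds of times more than its predecessors, in line with the scale of the DOE allocation.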

Those one-eighth-star simulations fail to capture perturbations that may take place in other sections of the star, Plewa said. But they may soon become scientific relics.

"I hope by summer we'll have all the simulations done and we'll move on to analyze the data," he said.

Original Source: University of Chicago News Release

Fraser Cain

Fraser Cain is the publisher of Universe Today. He's also the co-host of Astronomy Cast with Dr. Pamela Gay.
