We all know that we are “made of star-stuff.” All of the elements necessary for the formation of planets, and even life itself, originated inside generations of massive stars, which over billions of years have blasted their creations out into the galaxy at the explosive ends of their lives. Supernovas are among the most powerful and energetic events in the known Universe, and you wouldn’t want to be anywhere near a dying star when it finally explodes: fresh elements are nice and all, but the energy and radiation from a supernova would roast any planets within tens, if not hundreds, of light-years in every direction. Luckily for us, no supernova will go off within that unsafe range in the foreseeable future. But geologically not very long ago, these stellar explosions are thought to have occurred in nearby space… and scientists have recently found the “smoking gun” evidence at the bottom of the ocean.
Two independent teams of “deep-sea astronomers”—one led by Dieter Breitschwerdt from the Berlin Institute of Technology and the other by Anton Wallner from the Australian National University—have investigated sediment samples taken from the floors of the Pacific, Atlantic, and Indian oceans. The sediments were found to contain relatively high levels of iron-60, an unstable isotope specifically created during supernovas.
The teams found that the ages of the iron-60 deposits (a determination Wallner recently refined) clustered around two time periods: 1.7 to 3.2 million years ago and 6.5 to 8.7 million years ago. Based on this, and on the fact that our Solar System currently sits inside the Local Bubble, a peanut-shaped region virtually empty of interstellar gas, the researchers are confident the isotope is further evidence that supernovas exploded within a mere 330 light-years of Earth, sending their elemental fallout our way.
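The dating here rests on simple radioactive-decay arithmetic. Iron-60 has a half-life of roughly 2.6 million years (the value whose measurement Wallner helped refine), so a deposit's age and its surviving fraction of iron-60 are tied together by the exponential decay law. The sketch below illustrates the idea; the half-life constant and function names are my own, not from the research papers.

```python
import math

# Approximate iron-60 half-life in millions of years (assumed value, ~2.6 Myr)
FE60_HALF_LIFE_MYR = 2.6

def surviving_fraction(age_myr: float, half_life_myr: float = FE60_HALF_LIFE_MYR) -> float:
    """Fraction of an original iron-60 deposit still undecayed after age_myr."""
    return 0.5 ** (age_myr / half_life_myr)

def age_from_fraction(fraction: float, half_life_myr: float = FE60_HALF_LIFE_MYR) -> float:
    """Invert the decay law: infer a deposit's age from its measured surviving fraction."""
    return -half_life_myr * math.log2(fraction)

# A layer laid down ~2.3 million years ago still holds just over half its iron-60,
# while material from the older ~8.7-million-year event retains only about a tenth.
print(f"{surviving_fraction(2.3):.2f}")  # ~0.54
print(f"{surviving_fraction(8.7):.2f}")  # ~0.10
```

Because roughly half the signal survives even from the younger event, sediment layers from a few million years ago can still carry a measurable iron-60 excess, which is what makes the ocean-floor search feasible at all.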
“This research essentially proves that certain events happened in the not-too-distant past,” said Adrian Melott, an astrophysicist and professor at the University of Kansas who was not directly involved with the research but published his take on the findings in a letter in Nature.
The researchers think two supernova events in particular were responsible for nearly half of the iron-60 now observed. These are thought to have taken place in a nearby group of stars known as the Scorpius–Centaurus Association, roughly 2.3 and 1.5 million years ago. Around those same times Earth was entering a phase of repeated global glaciation, the end of the last of which led to the rise of modern human civilization.
While supernovas of those sizes and distances wouldn’t have been a direct danger to life here on Earth, could they have played a part in changing the climate?
“Our local research group is working on figuring out what the effects were likely to have been,” Melott said. “We really don’t know. The events weren’t close enough to cause a big mass extinction or severe effects, but not so far away that we can ignore them either. We’re trying to decide if we should expect to have seen any effects on the ground on the Earth.”
Regardless of the correlation, if any, between ice ages and supernovas, it’s important to learn how these events do affect Earth and realize that they may have played an important and perhaps overlooked role in the history of life on our planet.
“Over the past 500 million years there must have been supernovae very nearby with disastrous consequences,” said Melott. “There have been a lot of mass extinctions, but at this point we don’t have enough information to tease out the role of supernovae in them.”
UPDATE 4/14/16: Iron-60 from the same time periods described above has also been found on the Moon by research teams in Germany and the U.S.