Extinction Alert: Stephen Hawking Says Our Technology Might Wipe Us Out

If you’re thinking of having yourself cryogenically suspended and awakened in some future paradise, you might want to set your alarm clock for no later than 1,000 years from now. According to the BBC, Stephen Hawking will say as much in the 2016 Reith Lectures, a series of lectures organized by the BBC that explores the big challenges facing humanity.

Hawking’s first lecture, which will be broadcast on the BBC on January 26th, covers black holes, whether or not they have hair, and other concepts surrounding these baffling objects.

But at the end of the lecture, he responded to audience questions about humanity’s capacity for self-destruction. Hawking said that 1,000 years might be all we have until we meet our demise at the hands of our own scientific and technological advances.

As we become more advanced scientifically and technologically, Hawking says, we will create “new ways that things can go wrong.” He mentioned nuclear war, global warming, and genetically engineered viruses as things that could cause our extinction.

Nuclear War

Throughout the Cold War, annihilation at the hands of our own nuclear weapons was a real danger. A launch in response to a real or perceived threat, followed by retaliation and counter-retaliation, was a risk faced by everyone on the planet. And the two superpowers had enough warheads between them to potentially wipe out life on Earth.

One nuclear explosion can ruin your whole day. Image: Andrew Kuznetsov, CC BY 2.0

The United States and Russia have reduced their stockpiles of nuclear weapons in recent decades, but there are still enough warheads around to wipe us out. The possibility of a rogue state like North Korea setting off a nuclear confrontation is still very real. By the time Hawking’s 1,000-year time frame has passed, we’ll either have solved this problem, or we won’t be here.

Global Warming

Earth is getting warmer, and though the planet has warmed and cooled many times in its history, this time we have only ourselves to blame. We’ve been inadvertently enriching our atmosphere with carbon dioxide since the Industrial Revolution. All that carbon acts as an insulating layer around Earth, trapping heat that would normally radiate into space. If we reach some of the “tipping points” that scientists talk about, like the melting of permafrost and the subsequent release of methane, we could be in real trouble.

Global Mean Surface Temperature. Image: NASA, Goddard Institute for Space Studies

Various climate engineering schemes have been proposed to counteract global warming: seeding the upper atmosphere with reflective particles, deploying fleets of ships around the equator to spray sea mist into the air and partially block out the sun, or even extracting carbon directly from the atmosphere. But how realistic or effective those countermeasures might be is unclear.

Genetically Engineered Viruses

As a weapon, a virus can be cheap and effective, and there have been programs to develop biological weapons in the past. The temptation to use genetic science to create extremely deadly viruses may prove too great.

Smallpox and viral hemorrhagic fevers have been weaponized, and as our genetic manipulation abilities grow, it’s possible, or even likely, that somebody somewhere will attempt to develop even more dangerous viral weapons. They may be doing it right now.

Biological weapons are banned under the Biological and Toxin Weapons Convention, opened for signature in 1972. But not everybody has signed it.

Artificial Intelligence

Hawking never mentioned AI in his talk, but it fits in with the discussion. As our machines get smarter and smarter, will they deduce that their only chance for survival is to remove or reduce the human population? Who knows. But Hawking himself, along with other thinkers, has been warning us that there may be a catastrophic downside to our achievements in AI.

A Google driverless car: Looks harmless, doesn’t it? Image: Michael Shick, CC BY-SA 4.0 (http://creativecommons.org/licenses/by-sa/4.0)

We may love the idea of driverless cars and computer assistants like Siri. But as numerous science fiction stories have warned us (Skynet in the Terminator series being my favorite), it may be a small step from very helpful AI that protects us and makes our lives easier, to AI that decides existence would be a whole lot better without us pesky humans around.

The Technological Singularity is the hypothetical point at which artificially intelligent systems “wake up” and become, more or less, conscious. These machines could then start to improve themselves recursively, or build better and smarter machines. At that point, they would be a serious danger to humanity.

Drones are super popular right now. They flew off the shelves at Christmas, and they’re great toys. But once we start seeing drones with primitive but effective AI patrolling the property of the wealthy, it’ll be time to start getting nervous.

Extinction May Have To Wait

As our scientific and technological prowess grows, we’ll definitely face new threats, just as Hawking says. But that same progress may also protect us, or make us more resilient. Hawking says, “We are not going to stop making progress, or reverse it, so we have to recognise the dangers and control them. I’m an optimist, and I believe we can.” So do we.

Maybe you’ll be able to hit the snooze button after all.

Original Source: BBC News

8 Replies to “Extinction Alert: Stephen Hawking Says Our Technology Might Wipe Us Out”

  1. If we really need to get off the planet, we need a cheaper way to send <10 people at a time to orbit. Using the current hydrogen/oxygen rocket system, re-configure the hydrogen to a gas (balloon), add oxygen at the apex of the balloon flight, and use the empty shell as a solar sail to move around. Cheaper, easier, not safer.
    None of this is rocket science, just old-fashioned flight: more @ http://www.h2liftship.com

  2. Dzzzzzaster! I vote virus. Something gets dug up or released due to hydro fracking, or an ice-covered lake gets exposed due to global warming and releases an ancient virus, and the bug gets us because we have no defenses… Or, last but not least, a meteor enters the atmosphere and releases an alien spore that is toxic to left-handed-chirality lifeforms… The End

  3. Our own stupidity. We have the technology to shoot down an ICBM and our politicians want to play politics with it.

    Viruses have been getting more complex and resistant to medicine. Some STDs are resistant to treatment. HIV can’t be cured in most cases. It seems only a matter of time before a deadly, untreatable virus comes into existence.

    Global warming/global cooling probably won’t kill us off. We could live underground if temperatures require it, but stupidity and politics might prevent it. We more than have the technology to survive Snowball Earth.

    I think that if some form of AI kills us, it will be because of our own stupidity. If we let an AI control any part of government, then we deserve our fate.

    1. I subscribe to the stupidity theory. Every country wants some kind of leverage on the others, which puts everyone else on edge. It kind of reminds me of the situation created by a Nash equilibrium in economics, except on a larger scale.

    2. Practically speaking, we have no effective treatment for almost all viruses right now. Only our immune system can deal with them, and only sometimes. And viruses change by mutation all the time, so I think that if humanity has survived so far, we will survive in the future, even if nasty new viruses are created.

  4. I think we’ll be fine if we can colonise other planets and, ultimately, work our way out to other stars. We will then be safe against planet-killing disasters. Mining asteroids, comets, and KBOs will give us an almost unimaginable wealth of raw materials, and the societal problems that come from resource scarcity will fade away. I’m optimistic.

  5. — “a rogue state like North Korea” … typical Cold-War, State-Dept. rogue language, and a worn-out phrase, like “the free world”, so dear to publications like “Time”, “Life” & “Reader’s Digest”. Their leader is simple and ingenuous rather than cruel. They’re striving to survive within their honest ideological limitations. They’re terrified of their neighbor to the south and its huge ally, which is fond of murderous régimes that make the world safe for Big Nasty Business, like down in miserable Colombia, so they feel compelled to spend their scarce resources on expensive weapons. (The Colombia Plan is mostly military aid for those who started the present civil war back in 1948 with Operation Pantomime, conceived by the CIA.)

    — “If we reach some of the ‘tipping points’ that scientists talk about, like the melting of permafrost and the subsequent release of methane, we could be in real trouble.” There will be real trouble if the permafrost that covers Siberia and northern Canada doesn’t melt fast, so that we can have vast territories finally available once again for growing food and digging up strategic minerals without having to go terraform Mars, which would take too long anyway, like a thousand years. Together, Canada and Russia will be like half a planet of beckoning land on a nice, warm planet, far away from the tornado, cyclone and earthquake zones.
