Stephen Hawking's dire warnings

Stephen Hawking has made some very dire predictions regarding the future of humanity and its chances of survival, due to:

Climate Change
Artificial Intelligence
Nuclear Weapons
Disease

But on a long enough timeline, the survival rate for everything drops to zero (to paraphrase Fight Club), and given a long enough list of doomsday/apocalyptic scenarios, one or more of them becomes ever more likely. This leads to a dead pool of sorts for humanity. What are your thoughts on what will cause humanity’s demise, and which one are you betting on?

Climate Change
Artificial Intelligence
Nuclear Weapons
Disease
Asteroid Collision
Overpopulation
Return or Initial visit of Divine Being
WWIII
Other <Please list here>
Combination of some of the above

I think we can reasonably cross “overpopulation” and “divine being” off the list; neither is truly credible at this point. WWIII is pretty much just Nuclear Weapons, so that’s kinda mixed together. Personally, I’d like to have a high enough opinion of Humanity to say that it’s going to be AI, but I don’t - I think climate change is substantially more likely to be a factor.

Under other:

Supervolcanoes
Nanotechnology gone wrong
Bioengineering gone wrong
Loss of biodiversity
Supernova or Gamma-ray burst
Aliens

Are we talking about elimination of every living human, or the general destruction of civilization so life for the survivors is nasty, brutish, and short?

Because I think humans will survive climate change; we won’t be wiped out. It will just be shitty for everyone who lives through it. But we’ll adapt over time.

Climate Change - Not currently predicted to result in a runaway catastrophe, and as technology improves it becomes cleaner, so while we might not yet be at peak CO[sub]2[/sub], the general trend should be downward within about 30 years. It’s also worth noting that, most likely, we could build giant scrubbers, seed the air with large quantities of sulfur dioxide, etc., to alleviate the issue, should it start looking like we were headed into catastrophe territory.

Artificial Intelligence - Most AI will not be true AI. There are ethical implications to creating a real life form (virtual or not), and that should cause most people working on it to tread carefully. And, to create true AI, you would need to give it an environment of sufficient complexity for it to develop. Most likely, this would be a virtual realm; think World of Warcraft or Minecraft. Despite Hollywood’s presentation, this basically sandboxes the AI in a place where it cannot access the code. You can’t hack a computer by punching monsters in an instanced dungeon. That’s just not how it works. And, due to the ethical implications, it might be considered unethical to reveal to the AI that its universe was a simulation. You would either let it play out or simply hit the stop button and then analyze the data.

Nuclear Weapons - Might take out large chunks of humanity; unlikely to end the species. Due to the physical realities of mining and processing ore, it’s unlikely that a non-state actor would be able to develop nukes, which leaves them in the hands of bureaucracies that are going to be slow to deploy them (if at all) and selective about their targets. There’s no value to a nation in destroying humanity, so they would not deploy them in such a way.

Disease - Not any arbitrary natural disease, but lab equipment and microengineering / DNA manipulation could become cheap and easy enough for a lone madman to hand-develop something completely catastrophic. In the worst case, he would be able to create something with a timed deployment, so it could spread naturally, without symptoms, for several years before triggering and killing everyone. It’s likely that a few humans on islands would still survive, but they would probably have sufficiently backwards cultures that (potentially) you would be resetting the history of human civilization. Though, as said, this would be the worst case. A less carefully created disease would take out large chunks of humanity, but the survivors would come right back at the same technological level as before.

Asteroid Collision - As technology improves, our ability to spot and redirect an asteroid gets much better. It would basically need to hit us in the next 40 years or, most likely, we’ll have advanced to a point where it’s no longer a reasonable worry. The nice thing about space collisions is that you’ve (potentially) got such long forewarning that, even with minimal explosives, you still might be able to divert the mass from its target just by adjusting its course by 0.00001 degrees.
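The "tiny nudge, long lead time" point is easy to sanity-check with back-of-the-envelope arithmetic. This is a minimal sketch with illustrative numbers (the 1 cm/s nudge and 10-year lead time are assumptions, not figures from this thread), ignoring orbital mechanics, which in reality tend to amplify the effect of an early nudge even further:

```python
# Rough illustration: lateral miss distance from a small velocity change
# applied long before impact. Displacement ~= delta_v * lead_time.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def miss_distance_km(delta_v_m_s: float, lead_time_years: float) -> float:
    """Approximate sideways displacement (km) at impact time."""
    return delta_v_m_s * lead_time_years * SECONDS_PER_YEAR / 1000.0

# A 1 cm/s nudge applied 10 years out:
shift = miss_distance_km(0.01, 10)
print(f"{shift:.0f} km")  # a few thousand km, versus Earth's ~6371 km radius
```

So a change far too small to notice on human scales, given a decade of warning, moves the impact point by a distance comparable to Earth's radius, which is why early detection matters more than big explosives.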

Overpopulation - Self-correcting.

Return or Initial visit of Divine Being - Based on the current lack of indication of any such event in the last 13.82 billion years, the odds are against any such fear.

WWIII - As someone else indicated, this is effectively the same as the nuclear threat.

Other - Nanobots, but this is just the same as disease, using an alternate mechanism.

I think that overpopulation is already here but it is self-correcting as noted. It is the root cause of many other problems but it is a chronic rather than an acute problem.

The most likely acute problem, IMO, is a nuclear attack on key world cities like NYC, London, or Tokyo. There are plenty of people who would love to do that but, thankfully, don’t currently have access to nuclear weapons; that could change at any time. A single truck-based nuke detonated near Wall Street would take down the whole world’s financial markets. It may not kill a significant fraction of the population, but the effects would be devastating and would make 9/11 look like playground taunting. All of Manhattan could become uninhabitable if the nuke is large enough.

I was thinking every human; eventually we would all have to go. It could be a one-two punch combination as well: for example, an asteroid to kill off a large percentage of humanity, and then the blocking of the sun (due to dust and particles), the die-off of vegetation, a cooler climate, etc., to finish off the rest, or war among the survivors, etc.

How many (if any) of these categories is he considered to be an expert in?

No votes for the best-case scenario of surviving until the sun turns into a red giant? I think that’s likely the upper limit for our species. If we can somehow survive that by mastering interstellar travel by then, the next doomsday would be when our galaxy runs out of new stars to travel to. I’m not sure how far into the future that will be, but it will happen eventually.

The instant interstellar travel becomes feasible, we’ll be as hard to kill off as the common cold.

I’ll go with “cascading ecosystem collapse” for $1000, Alex.

Me, too.

Basically this is like saying “all of the above” (actually, “most of the above”).

You could lay it at the door of overpopulation, or maybe overpopulation AND climate change.

Basically, we are over the world’s sustainable carrying capacity, for all kinds of things. Food, fresh water, many material resources.

Climate change exacerbates these problems. Many of the solutions to correct (or often only temporarily deal with or mitigate) these problems exacerbate global warming.

As these problems come to a head, the Four Horsemen will ride with increasing vigor. Overpopulation combined with resource scarcity and loss of coastal lands will call for ever more War. Famine will both follow and precede War. Pestilence will romp through overcrowded cities, enhanced by the changing geographic distributions of various pests. Death will be too busy to keep up. Basically, “The Postman” squared (or any number of other dystopian sci-fi futures). We may be grateful if the AIs take over, à la “The Matrix” - at least we won’t have to eat goop.

Of course, as others have noted upthread, these problems may be self correcting, unless climate change goes too far…

I’m hoping for a Robotopia, where my hard decisions will be whether to skip Karate class to learn to play an instrument, or should I accompany my wife to Yoga class?

Extinction is often caused by a confluence of factors acting upon a stressed population. Major events can chain together, such as super volcanoes causing an anoxic ocean, a firing of the clathrate gun, or hydrogen sulfide emissions.

The sun’s output is steadily increasing, and IIRC the Earth’s biosphere will be destroyed in about a billion years, and anything that requires oxygen will be gone awhile before that.

Maybe humanity will destroy itself in a geo-engineering accident while trying to deal with one of the above. Maybe we’ll tow an asteroid into Earth orbit to mine it, except it will accidentally collide with the Earth. Maybe someone will try to genetically engineer cute puppies that glow in the dark but will somehow also create a prion that destroys all eukaryotes.

Mass Hysteria.

Not comfortable writing off overpopulation as “not a problem.” It’s leading to severe ecological degradation, habitat loss, and waves of extinctions. It’s also a big part of climate change, as the new billions want electricity, transport, communications, and drinkable water, all of which require energy, and energy is (today) a major carbon emitter.

If the human population had never grown above one billion, all the other problems on the list would be much less in intensity.

(Also, the “self-correcting” idea is pretty macabre, given that it will come in the form of hellish epidemics and mass starvation.)

I’ll bet on societal collapse brought on by compulsive gambling.

That is a limited effect because it stays in Las Vegas.

Or the Kardashians.

Nine’ll get ya eleven it doesn’t!

That’s why I think he is hedging his bets, so he can’t be proven wrong (after the fact, when there is no humanity left). If memory serves, it was last year when he made his first doomsday prediction for Earth, and in all the hysteria I started digging a hole in my backyard to build a bunker. Then a few months later there was Stephen Hawking on the webernet postulating another doomsday scenario, this time involving AI, so I went back outside and, beside my original hole (I had gotten down about half a foot), started digging a new hole. Then a few months later Trump made a decision on the Paris climate deal, and lo and behold, Stephen Hawking was all over social media proclaiming that this is it, the Earth will become like Venus with sulfur raining down on the masses - which is quite prophetic. So I went outside, looked at the two holes in my backyard, one half a foot deep and the other two inches deep, and covered over the two-inch one but left the half-foot one, just in case.

It is always a safe bet to wager on what will kill off humanity.