Existential Threats

Which of the existential threats humans are currently facing is the most likely to cause our extinction: pandemics, climate change, drinking water contamination, AI takeover, Earth asteroid/comet impact, or a supervolcano?

Climate change.

As bad as climate change gets, I don’t see how it could reach an extinction level. The worse things get, the more solutions will be engineered - not necessarily to solve the underlying problem, but enough to cope, stave off the worst of it, and hold it at bay.

Bear in mind that all of the Earth’s population could probably live and work and survive in a country the size of Australia, if need be, if technology were heavily, heavily engineered to solve problems pertaining to food and whatnot. It wouldn’t be fun, but it would be doable.

Now as for a suddenly massive asteroid, that might be harder.

There’s no way we wouldn’t see it coming, and all we’d need to do is nudge it slightly, far enough in advance that the nudge makes it miss us.
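To put rough numbers on that: a sideways nudge just has to accumulate enough drift to move the arrival point by about one Earth radius before the rock gets here. Here’s a back-of-envelope sketch in Python; the one-Earth-radius miss distance and the warning times are illustrative assumptions, and real deflection missions exploit orbital mechanics to do considerably better than this simple linear-drift model.

```python
# Back-of-envelope: how big a nudge is needed, given the lead time?
# Toy model: a sideways delta-v applied now shifts the arrival point by
# roughly delta_v * lead_time; we want that shift to exceed ~1 Earth radius.

EARTH_RADIUS_M = 6.371e6        # illustrative target miss distance
SECONDS_PER_YEAR = 3.156e7

def required_delta_v(lead_time_years: float) -> float:
    """Delta-v in m/s needed for the drift to exceed one Earth radius."""
    return EARTH_RADIUS_M / (lead_time_years * SECONDS_PER_YEAR)

for years in (1, 10, 50):
    print(f"{years:>2} yr warning -> ~{required_delta_v(years) * 1000:.0f} mm/s")
# roughly: 1 yr -> ~202 mm/s, 10 yr -> ~20 mm/s, 50 yr -> ~4 mm/s
```

Even in this crude model, a decade of warning drops the required nudge to a couple of centimeters per second, which is why early detection is the whole game.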

Now, a planet-sized extrasolar object we wouldn’t see coming, and it would be moving so fast and be so massive that we would be hard pressed to do anything about it. But the odds of something like that striking us are (pardon the pun) astronomically tiny.

Even if 99% of humanity dies, that still leaves about 80 million people. That’s roughly the world population in 500 BC.
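The arithmetic checks out if you assume a present-day population of roughly 8 billion (a round-number assumption):

```python
population = 8_000_000_000   # assumed present-day population, round number

for fatality_rate in (0.99, 0.9999):
    survivors = population * (1 - fatality_rate)
    print(f"{fatality_rate:.2%} dead -> about {survivors:,.0f} survivors")
# 99.00% dead -> about 80,000,000 survivors
# 99.99% dead -> about 800,000 survivors
```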

Killing all humans is difficult. Climate change definitely can’t do it. I don’t even know if a meteor impact could, because in theory some humans could live in underground lairs powered by nuclear or geothermal energy, and live off stored food or food grown under LED lights until the surface became hospitable again.

When the Toba supervolcano erupted around 70,000 years ago, the human population may have dropped to a few thousand people, and we survived. I don’t know what the minimum number of people needed to repopulate the Earth is; it may be as low as 2 humans (assuming you’re OK with letting the ones deformed by inbreeding die). But even 100 survivors is probably enough genetic diversity to repopulate the Earth.

I don’t think AI would kill us all. It might abandon us, but I don’t see it intentionally targeting all people for extinction. If it did, I’m sure it would succeed, but why would it want to? That’s like saying humans want to exterminate every ant on planet Earth. It makes no sense.

Drinking water contamination, I don’t think that’s a risk. We know enough to remove pathogens, toxins, and metals from water if we really want to.

I don’t think any of them would lead to extinction. The meteor is closest, but even that one can be survived by at least some people. Underground bunkers stuffed with food, maybe powered by nuclear or geothermal energy, with the ability to grow crops under LED lights, would let at least some humans survive. Even if 99.99% of people die in a meteor impact, that still leaves 800,000 humans, more than enough to repopulate.

Some of the stuff about theoretical nuclear war is pretty scary. The most likely scenario is Pakistan and India, and it wouldn’t even have to be that bad to create a huge domino effect of changes down the line.

This is almost an IMHO poll type of thread. While the OP didn’t offer a debate, the underlying discussion can be considered debatable. I’ll leave this open for now.

It’s fascinating that nuclear war seems to have completely dropped off the radar, so to speak.

I’d say economic foolishness leading to economic/financial collapse, leading to a major war, or wars. It won’t wipe out humanity, but it’d be a real kick in the teeth.

Extinction is a high bar as there are a lot of humans and getting every single one of us is difficult. Of the things listed:

Pandemics. Possible, but a disease would need the right combination of no immunity (natural or artificial), rapid (but not too rapid) incubation, high transmission, and high (but not too high) lethality. Can such a disease exist? Maybe; the first criterion is a true unknown. Certainly the world response to SARS-CoV-2 has shown us that if such a disease did exist, we’re pretty forked.
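To illustrate the “high (but not too high) lethality” trade-off, here’s a toy SIR-style simulation in Python. Every parameter value (transmission rate, recovery rate, the daily death rates being compared) is invented purely for illustration and isn’t an estimate for any real pathogen; the point is just that killing hosts faster removes spreaders faster.

```python
# Toy SIR-style outbreak (daily Euler steps) illustrating the trade-off:
# a higher daily death rate (mu) removes infectious people sooner, which
# lowers the effective reproduction number beta / (gamma + mu) and can
# make the outbreak burn out before it reaches much of the population.

def run_outbreak(beta=0.5, gamma=0.1, mu=0.01, n=1_000_000, days=365):
    s, i, dead = n - 1.0, 1.0, 0.0
    for _ in range(days):
        new_infections = beta * s * i / n
        new_recoveries = gamma * i
        new_deaths = mu * i
        s -= new_infections
        i += new_infections - new_recoveries - new_deaths
        dead += new_deaths
    return dead, n - s                 # total deaths, total ever infected

for mu in (0.01, 0.1, 0.4):            # mild, nasty, "too lethal to spread"
    deaths, infected = run_outbreak(mu=mu)
    print(f"mu={mu:.2f}: {infected / 1e6:5.1%} ever infected, "
          f"{deaths / 1e6:5.1%} of the population dead")
```

With these made-up numbers, the middle death rate kills far more people overall than either the mild one or the extremely lethal one, which is exactly the “not too high” part of the criterion.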

Climate change. Under even the worst scenarios, this is not an extinction event. There would need to be a currently unknown process that would accelerate climate change beyond control to human extinction levels. Possible? Sure, but not very likely. However, of the things on the list this is the one most likely to happen as it is already happening and will cause an enormous toll of human misery and death. It also happens to be the one we could do the most about, but are refusing to do so for strictly political reasons. Much like those who refuse to wear masks in a pandemic.

Drinking water contamination. Unlikely that water could ever become so contaminated that it could not be treated. Treatment is simply expensive, not extinction-causing.

AI takeover. This is the one for which I have the most expertise, and I can say with a high degree of certainty that it won’t be this anytime soon. There are three main reasons: 1) algorithmic, 2) computational power, and 3) the digital/physical divide.

Algorithmic. There are no known algorithms that can provide general artificial super intelligence. There isn’t even a plausible path to one (no, neural networks aren’t it). There has been some recent scholarly speculation that it might be impossible (I don’t agree). AI is simply a computational tool with a cool-sounding name.

Computational Power. A hypothetical general artificial super intelligence would require massive amounts of computational power due to the curse of dimensionality (the more things to consider, the more computational power required), and that power simply doesn’t exist. Even were it to find such power, say by spreading across every supercomputer in the world (and it’s possible even that might not be enough) …
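To give a toy illustration of that curse of dimensionality, assume (arbitrarily) that each variable the system must reason about can take 10 values; the number of possible states then grows exponentially with the number of variables.

```python
# Toy illustration of the curse of dimensionality: with 10 possible values
# per variable (an arbitrary assumption), the state space grows as 10^N.

VALUES_PER_VARIABLE = 10

for n_variables in (3, 10, 20, 50, 100):
    states = VALUES_PER_VARIABLE ** n_variables
    print(f"{n_variables:>3} variables -> {states:.1e} possible states")
```

For scale, an exascale supercomputer running flat out performs on the order of 10^25 operations per year, which doesn’t begin to dent a 10^100-state problem; that’s the “might not even be enough” caveat above.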

Digital/physical divide. While our world has a lot of automation, it isn’t so automated that a rogue AI could declare war on humanity very easily. Without agents (human or robotic) working for it in the physical world, an AI is pretty limited in what it can do. The digital world is simply too vulnerable and too reliant on the physical world.

So while there may come a day when a hostile AI could pose an extinction-level risk to humanity, we are simply not there yet. Of course, a malicious or even badly-coded AI can still cause a lot of damage as we’ve seen for example in the financial markets.

Earth asteroid/comet impact. With a big enough rock, this could certainly kill all humans. Fortunately, such events are very rare, and our ability to detect and deflect an asteroid is improving.

Super volcano. With a big enough volcano, you could hypothetically kill all humans; however, even a Yellowstone supereruption is not considered likely to kill all humans.

I can think of at least one known process that could do it - a global-scale clathrate gun/methane expulsion event (sub-century 6°C warming is an existential threat in my book)

I’m not sure even a 6 degree increase would kill all humans. Certainly it would render significant tracts of the world (relatively) uninhabitable. Humanity would become more densely packed into areas around the poles. The death toll would be catastrophic, but some humans would likely survive.

Regarding climate change, there are two worst-case scenarios which have the highest probability of killing all humans:

  1. Continued acidification of the oceans as a result of absorbing more CO2 might cause the phytoplankton population to collapse, and since phytoplankton produce around 50 to 80 percent of the world’s oxygen [1], this collapse would bring oxygen levels below the threshold humans need to survive. However, I think that even if this happens, rich people (with military support) will enslave some scientists and force them to conduct electrolysis of water on an industrial scale, generating enough oxygen for a small population of humans to hold out (rough numbers on this follow after the list).
  1. The continued addition of CO2 to the atmosphere might lead to a runaway greenhouse effect like the one that happened to Venus. This will certainly kill all surface-dwelling humans, but like Wesley_Clark said, some people can still survive in underground bunkers. Also, this process is slow on human timescales.
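On the electrolysis point in item 1, here is a rough feasibility check. All the figures are ballpark assumptions: a resting human consumes about 0.84 kg of O2 per day, splitting water takes about 286 kJ per mole of H2O at the thermodynamic minimum, and a real electrolyzer runs at roughly 70% efficiency.

```python
# Rough feasibility check on industrial-scale electrolysis for breathable
# oxygen. All figures below are ballpark assumptions, not precise values.

KJ_PER_MOL_H2O = 286            # approx. enthalpy to split one mole of water
O2_KG_PER_PERSON_DAY = 0.84     # approx. resting human consumption
ELECTROLYZER_EFFICIENCY = 0.7   # assumed real-world efficiency
MOL_O2_PER_KG = 1000 / 32       # moles of O2 in one kilogram

# Each mole of O2 requires electrolyzing two moles of H2O (2 H2O -> 2 H2 + O2).
kj_per_kg_o2 = 2 * KJ_PER_MOL_H2O * MOL_O2_PER_KG / ELECTROLYZER_EFFICIENCY
kwh_per_person_per_day = kj_per_kg_o2 * O2_KG_PER_PERSON_DAY / 3600

print(f"~{kwh_per_person_per_day:.1f} kWh per person per day")
print(f"~{kwh_per_person_per_day * 10_000 / 24:,.0f} kW continuous "
      f"for a 10,000-person refuge")
```

On those assumptions the energy cost works out to a few hundred watts per person, roughly a couple of megawatts for a 10,000-person refuge, so the oxygen itself probably isn’t the hard part; keeping the power plants and food supply running would be.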

Personally, I feel that humans will never go extinct until the sun’s slowly increasing power output boils off the oceans in ~1 billion years, but it is quite possible that after 200-300 years, a combination of wars and unsustainable use of natural resources (along with not building out enough renewable-energy infrastructure) will cause society to revert to a pre-industrial state for the foreseeable future.

The other posters have pretty much mirrored my views on the other threats listed in the OP, so no need for me to expound on those.

Here’s a darn good primer on the topic. If the OP wants to learn, start here & follow the references and related articles.

While not likely, the only thing I feel would lead to total destruction of the human population is a direct hit by a nearby gamma-ray burst.

I think it’s more than unlikely - there aren’t any potentially GRB producing objects in deadly range in the first place, are there?

And unlike a rogue planet, which could be hurtling towards us through extrasolar space at a significant fraction of the speed of light with no way for us to detect it until its gravity starts tugging on the outer planets (assuming it comes in on a path that passes near those planets at all), a GRB-producing star about to go supernova isn’t exactly stealthy.

Would the mass of the Earth protect life on the side facing away from a hypothetical GRB? Of course, there could be second-order effects that would be devastating to any survivors.

Here are @Chronos’s thoughts on this point from a few years ago:

Aren’t there climate change events that could collapse the food chain? It’s all well and good to claim that humanity could repopulate if reduced to 1000 people or that rich people could all move to a space colony or something, but if they can’t reliably feed themselves on a long-term basis couldn’t they end up offed by starvation or dietary deficiency?

And regarding humanity getting wiped out by an asteroid impact, all it would have to do is hit hard enough.

I’d say unleashing multiple extremely deadly and contagious human-engineered diseases at the same time might do the trick. I don’t know why anybody with the resources would do that, but I think it could work.

Agreed. It could displace cities and cause food shortages resulting in the deaths of many, but certainly not the death of humanity.

I consider Elon Musk a modern-day Tesla/Edison and find it very interesting that he considers AI to be the greatest threat to humanity. It’s possible this is because he believes we’re about to solve the single-planet problem.