Anagenesis is missing too (along with ye olde AI Apocalypse): evolution into a new species. It depends on whether you consider that an “end” or not, per se. Did dinosaurs end as well, or did they evolve into birds?
Ahem…
Death of the sun and heat death of the universe are so far out on the horizon that humans will likely have evolved into something that wouldn’t be recognizable as “human” to us anymore.
Possibly “Other,” in that we may evolve ourselves into something different by integrating technology, altering our genetics, or using biotech.
But really it seems to me the most likely cause of human extinction is nuclear war. To me that is the only thing that could kill enough humans and destroy the environment such that we couldn’t recover.
Even in the worst scenarios of resource depletion or climate change, I assume SOME humans will survive SOMEWHERE.
In the post just previous to yours I reminded John_DiFool that in my OP I said that, for the purposes of the poll, human evolution into something else should still be considered human, or at least, not the end of humanity. This would include evolving ourselves through any form of transhumanism.
Many details can’t be known, but I’m sure it will end with a whimper and not a bang.
There will be fires everywhere but people will be afraid to pull the fire alarm.
Without a bang, humanity will end for sure.
It most certainly isn’t, because we’ve sucked the earth dry of the easy and plentiful resources that enabled hunter-gatherers to turn to agri-business and sedentarism, which was the prerequisite for everything else. We need high and heavy tech to wring out what’s left of most resources, and a developing society left to its own devices will not have that.
I addressed that in my post. So long as they don’t completely forget technology (and even if they do, there should be tons of ruins to reverse-engineer from), they could generate renewable wind or hydro power using only medieval-style wind and water mills of their own making, plus copper wire and a magnet “mined” from a ruined city.
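For what it’s worth, here’s a rough back-of-envelope sketch of that claim. Every number in it (coil turns, magnet strength, coil size, shaft speed) is my own illustrative assumption, not anything measured:

```python
# Back-of-envelope: peak EMF from a crude hand-built alternator
# (a coil of salvaged copper wire rotating in the field of a scavenged magnet).
# All numbers below are illustrative assumptions.
import math

N = 200      # turns of salvaged copper wire in the coil (assumed)
B = 0.2      # tesla, field through the coil from a scavenged magnet (assumed)
A = 0.01     # m^2 coil area, roughly 10 cm x 10 cm (assumed)
rpm = 300    # shaft speed of a geared-up waterwheel (assumed)

omega = 2 * math.pi * rpm / 60   # angular speed in rad/s
emf_peak = N * B * A * omega     # peak EMF for a rotating coil: N * B * A * omega

print(f"Peak EMF: {emf_peak:.1f} V")  # about 12.6 V with these assumptions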
The only resources we actually use up in this context are things that go through an irrevocable chemical or nuclear reaction to release energy - so coal, oil, natural gas, and fissile materials. Iron, copper, aluminum, and other metals we are bringing to the surface and refining from ore. If our civilization collapsed tomorrow, we’d be leaving all that stuff lying on the surface for our successors to use.
Yes, I could see this happening. Kind of a ‘reset’ for civilization: the remnants of humanity, after whatever apocalyptic event they survive, rebuild a civilization that is much more ecologically friendly, with renewable sources of energy. Partly out of necessity, because they no longer have the tools to extract the remnants of oil and coal out of the ground, and partly out of learning the lessons of the past.
I imagine there’s been science fiction that addressed this. Anybody remember ‘John Titor’, from the early days of the internet? That was the pseudonym of somebody on early newsgroups or chat forums who professed to be a time traveler, and he had a lot of people going for a while. If I remember correctly, he claimed that, after a future nuclear war, survivors had re-formed society in a more ecologically friendly, back-to-nature way, living in small city-state groups.
Oh, man! Heat Death is down to 5%! Y’all are pessimistic.
I think they’d have a more immediate problem to solve than figuring out how to make electricity - food. How many of us today know how to grow or find enough food for today, for this week, for this month? We are a civilization accustomed to abundant and safe food being readily available whenever we need it, packed in neat cellophane, and I would guess most of the world’s population currently lives well beyond the carrying capacity of their neighborhood, city, or county. People will starve long before having to worry about mining resources from former cities.
Right. But if 99.99% of the world’s population died off, and the remainder were reduced to hunter-gatherer tech level, within a generation they’d be no worse off than our ancestors, and they’d be able to redevelop civilization within, oh, 10,000 years or so - which was the original claim I made.
I vote for Cosmic Crapshoot - something eventually happens to the Van Allen belt and one too many solar flares wipes everybody, and pretty much everything, out.
I went with Cosmic Crapshoot as, IMO, that has the highest likelihood of happening before we’re capable of sustaining ourselves off-planet.
Rendering the planet unlivable? Really not feasible to expect 100% fatality. We might break things to the point that the current population level is unsustainable, but depopulation will only go to whatever the new equilibrium point is. Even an ultraplague won’t cut it. There are tribes in Africa that never have contact with the rest of the world.
Genetic alteration, either natural or deliberate, would never propagate across the entire species before it was detected. And if it was deliberately introduced, we should have the knowledge to reverse it.
So yeah. Cosmic wrecking ball. We’d see it coming, but there wouldn’t be anything we could do about it.
The novel The Windup Girl imagines a post-apocalyptic world which had been struck by two disasters - runaway climate change and the spread of various genetically modified blights and diseases that were engineered by competing biotech firms to wipe out each other’s crops. The end result is that almost all wild vegetation is gone, most plant species are extinct, and the remaining genetically modified crops have to compete with repeated waves of these artificial blights and illnesses, which have started mutating in the wild nearly as fast as the geneticists can modify their crops to protect them. Humanity is left in an arms race, always balanced on the knife’s edge, with one bad mutation liable to wipe out all remaining food crops. (It doesn’t help that some of these illnesses manage to leap over into humans, forming plagues, or that the remaining biotech companies are rumored to be nudging the blights to keep the pressure up on the competition.)
It’s an extremely pessimistic novel, one of the bleakest SciFi stories I have ever read - by the end of the story Thailand, which had kept out most of the blights through strict border controls and its own set of master geneticists, is ripped apart by internal divisions exploited by the biotech companies.
The ending of the novel, incidentally, ties into this thread:
The titular “Windup Girl” is a genetically modified human created as a secretary/servant by the Japanese, who are the most advanced creators of genetically modified animals in the novel. She was brought into Thailand by a Japanese businessman during a trip, but she proved too troublesome to bring back out through customs, so he abandoned her there when he left. Plus, it was an excuse to purchase a newer model.
By the end of the story she overcomes some of the limitations that were bred or trained into her, such as obedience. Her own desperate attempts at freedom are the reason the biotech companies’ plot to loot Thailand fails, although that doesn’t really help the Thais. She escapes the destruction of Thailand along with a mad genius Western geneticist who was one of the architects of the world’s destruction, but was now the secret behind much of the Thais’ success. He is old and dying, and it is implied that his last act will be the creation of a new race of genetically engineered humans, copying many concepts from the Japanese “wind-ups” (which include not just servants but also military models, etc.) but improving on them and removing any limitations on behavior, obedience, or reproduction.
One of the “wind-up” creatures that features in the novel is called a Cheshire. The idea is that a few decades earlier some geneticist threw his daughter a birthday party to which he brought experimental cats who could change the color of their fur for camouflage. A handful escaped. Within a couple of decades Felis domesticus was extinct, entirely replaced by this new creature, with devastating results for songbird populations around the world.
There are repeated scenes where Emiko (the genetically modified character) compares herself to a Cheshire or expresses admiration for them. The implication at the end of the novel is that once people like Emiko exist, free from obedience protocols or intentionally placed limitations (Emiko, for example, cannot sweat, which means that even though she could be as strong or as fast as a military windup, moving like one for more than a few seconds would kill her), Homo sapiens would share the fate of Felis domesticus.
But for the purposes of this question, that wouldn’t count as extinction.
I listened to this really great podcast a while back where the guest postulated that the existential threat to humanity will be biological. That one day, micro drones will be able to target individuals via some sort of DNA signature and introduce a biological agent. Or maybe bring down planes. From there, the next step will be a recipe for that agent online. Or something like that, I can’t remember exactly, but…
Then yesterday I saw this:
Other:
Text messaging while driving, or
Supply chain running out of “Likes”, or
Instatwit social media turns into a Tulpa and devours everything in sight
Wait, I have to pick just one?
So I picked “humans render earth unlivable,” but I think the reality is that the escalating stresses of climate change, such as large-scale crop failures and food and water shortages, will increase the probability of global conflicts that may turn nuclear, while at the same time creating an environment conducive to the mutation of new viruses and bacteria.
I think Alien Invasion and Introduction of Genetic Quirk are the most unlikely to cause human extinction.
Death of Sun and Heat Death of the Universe will occur, but it’s wishful thinking that they will cause human extinction (we’ll be extinct long before either occurs).
Any of the remaining events could be our extinction event, but as things stand now, I’ll place my bet on Humans Render Earth Unlivable. A close second may be Other: AI Kills Its Creators.