Again, I think the idea that AI will somehow magically become a monster that we can’t control is really quite absurd. It really harks back to an early fear of computers as a result of failing to understand both their capabilities and their limitations. Computer companies have actually tried to deal with these unfounded fears through subtle design approaches. Some deliberately made the cabinets lower so they wouldn’t appear to loom over the puny humans attending to them. Others, perhaps (just a WAG here), equipped their mainframes with very prominent “emergency off” buttons, as IBM did with System/360.
But as I said above, these fears were both valid in one sense and misplaced in another. You don’t need a big red “emergency off” button to disable a computer. It’s easy. The hard part is that after a period of entrenched dependency, activating that button will also disable your entire business!
I’ll put it to you this way. Saying AI is an existential threat to humanity is similar to saying an encounter with extraterrestrials is a threat.
If it happens, then it probably ends badly for us. Even if they’re benevolent.
But will it happen? Who knows? There are too many unknowns involved.
Seems silly? Well, Stephen Hawking didn’t think so. So, let’s keep that in mind if we’re going to appeal to Hawking as an authority. He said both aliens and AI would be threats to humanity.
We (hopefully) aren’t talking about magic. I think people aren’t scared enough because they underestimate AI capabilities, and their ethical and moral limitations.
It is exactly the opposite. People greatly overestimate AI capabilities because they confuse it with what they see in science fiction. I work with AI almost every day (except when I’m teaching or writing papers). If you could see how dumb they are, how reliant they are on my programming skills, then the idea of AI as an existential threat would seem comical to you.
(beep I’m a little disappointed nobody has pointed out that what I’m saying is exactly what a rogue malicious AI masquerading as a human would say beep).
I, for one, welcome our new AI overlords! I want to remind them that I can be helpful in rounding up others to toil for them in their climate controlled raised-floor bunkers.
Not that I want to make this too much of a hijack about AI; however, what you do need to worry about with AI is it taking your job, depending on what you do for a living. Training an AI to do very specific tasks is something that humanity is getting very good at.
Whether this will lead to such a productivity boom that we end up in or close to a post-scarcity world or a dystopian future of a permanent underclass of unemployables… well, my guess is probably the latter sadly, because people suck.
Would a moon-sized object be enough to nudge a bunch of Kuiper belt objects Earthwards? I’d think you’d need something bigger to have a significant effect on the orbits of large numbers of asteroids.
Your guess would be wrong. Something like 80% of the ocean is unexplored, and a similar proportion of its species remain undiscovered. And we’re not just talking algae, worms and sponges here - scientists think there are a few undiscovered whale and dolphin species.
Let’s just say some disagree with you that having to farm on what was once permafrost would be the only impact:
" There’s a certain level where humans biologically can’t survive outside, as well. We get close enough already in the Arabian Peninsula and some other parts of the world. Remember, 6 degrees is a global average. It would be probably twice that over land and somewhat less than that over the oceans. The oceans would probably stratify, so the oceans would become oxygen deficient, which would cause a mass extinction and a die off in the oceans, as well – which would then release gases and affect land. So it’s pretty much equivalent of a meteorite striking the planet, in terms of the overall impacts."
I know, Lynas is not a scientist, but he does a good job of science aggregation, IMO.
Well, I think that is all that needed to be said for this thread. Both of these things are potential threats; any disagreement about their probability of happening is beside the point.
For example, in terms of ETs, I think they are not only unlikely to come any time soon, but are unlikely to threaten us if they did. But for the purpose of this thread, I’m happy to simply say that if ETs came here and wanted to eradicate us, they could trivially do so.
Thanks @BeepKillBeep and @wolfpup - very interesting to hear what you have to say. I also agree BTW that processing power does not equal strong AI, I just wanted to be a devil’s advocate. I think what people are seeing is an apparent acceleration of technological progress in general - not just computer processing speed, but especially in that realm. And it appears to conform to an exponential curve, which cannot be sustained. Either it flattens out or humans get left behind somehow. Personally I think it’ll be the former. I have no trouble believing that the line we are seeing is the middle part of an S curve. The flattening of that line would create its own huge challenges to our economic and political systems. But probably not an existential one.
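To make the S-curve point concrete, here’s a minimal sketch (parameters are arbitrary illustrations, not a model of any real technology trend): during its growth phase, a logistic curve tracks a plain exponential closely, so data from that phase can’t distinguish the two - but the logistic eventually saturates while the exponential runs away.

```python
import math

def logistic(t, carrying_capacity=1.0, rate=1.0, midpoint=0.0):
    """An S curve: exponential-looking early on, flat near its ceiling."""
    return carrying_capacity / (1.0 + math.exp(-rate * (t - midpoint)))

def exponential(t, rate=1.0):
    """Unbounded exponential growth with the same rate constant."""
    return math.exp(rate * t)

# During the growth phase (well below the midpoint) the two are
# nearly indistinguishable:
for t in (-4.0, -3.0, -2.0):
    print(t, round(logistic(t), 4), round(exponential(t), 4))

# Much later, the logistic has flattened out while the exponential
# has exploded:
print(round(logistic(10.0), 4), round(exponential(10.0), 1))
```

The takeaway is the same as the post’s: an apparently exponential trend is perfectly consistent with being the middle of an S curve, and only the later data tells you which one you were on.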
I don’t see why not, but it depends upon its trajectory I suppose. I’m no expert in this. Perhaps such an object is most likely to pass through our system without disturbing much. However one of the popular simulations of the history of our solar system is that there was a large planet ejected from its orbit, and that’s what inspired me to suggest it as a possible future event (if a highly unlikely one).
I’ll concede on this one. There is some support for what you’re saying in the literature.
That answers the thread, though. It is climate change. Climate change by leagues and leagues and leagues and leagues. Climate change has finished the race and the other horses are still in the barn getting saddled. It isn’t even close. Climate change is here, happening right now. The rest are all mere probabilities of varying degrees.
Pierrehumbert, R. (2019). There is no Plan B for dealing with the climate crisis. Bulletin of the Atomic Scientists, 75(5), 215-221.
Personally, I cannot ignore probability. In the military, they taught us a simple mechanism for risk assessment of probability times severity (using a 5-point scale for each). Hence, something with a probability of 4 (80%) with a severity of 2 (risk of 8) is worse than something with a probability of 1 (20%) and severity of 5 (risk of 5). Granted this is simplistic, but I think it is a reasonable way to assess a threat.
So let me put it this way. ETs and a superintelligent general AI are threats, but they are not significant risks to humanity. Not like some of the other things on the list.
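The military-style 5×5 risk matrix described above can be sketched in a few lines. The threat names and scores below are my own illustrative guesses, not anything from a doctrine manual:

```python
def risk_score(probability: int, severity: int) -> int:
    """Risk = probability x severity, each rated on a 1-5 scale."""
    if not (1 <= probability <= 5 and 1 <= severity <= 5):
        raise ValueError("both factors must be on a 1-5 scale")
    return probability * severity

# The two examples from the post: likely-but-mild beats
# unlikely-but-catastrophic under this scheme.
assert risk_score(4, 2) == 8
assert risk_score(1, 5) == 5

# Hypothetical (probability, severity) ratings for illustration only:
threats = {
    "climate change": (5, 4),
    "hostile ET visit": (1, 5),
    "rogue superintelligent AI": (1, 4),
}
for name, (p, s) in sorted(threats.items(),
                           key=lambda kv: risk_score(*kv[1]),
                           reverse=True):
    print(f"{name}: {risk_score(p, s)}")
```

The design choice worth noting is that multiplying the two factors deliberately lets a high-probability moderate harm outrank a low-probability catastrophe - which is exactly the argument being made against ranking ETs and rogue AI alongside climate change.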
A paper on a 6-degree change in Denmark. TL;DR: it is bad.
Trolle, D., Nielsen, A., Rolighed, J., Thodsen, H., Andersen, H. E., Karlsson, I. B., … & Jeppesen, E. (2015). Projecting the future ecological state of lakes in Denmark in a 6 degree warming scenario. Climate Research, 64(1), 55-72.
Personally, it’s incredibly frustrating seeing people I know IRL, actual trained scientists of one stripe or another, blithely ignoring climate change as a threat while angsting about grey goo or weaponized viruses. I want to grab them, drive out to the Karroo, stand them right on the P-T boundary and yell at them for an hour or so.
Yes- the Grand Tack hypothesis. But that’s why I asked if a moon sized object would be big enough; because according to the Grand Tack hypothesis it was Jupiter that migrated, and obviously Jupiter is many, many, many times larger than the moon (or Earth).
Yeah - it’s often the counter-intuitive stuff that bites hardest, in this sort of scenario. ‘Well, our models say we’ll have increased rainfall, so your scare-mongering about a global desert is … wait, what do you mean, “eutrophication”? Is that bad?’
Here’s a paleoclimate perspective on approximately a 5 to 8 degree rise in global temperature. The only difference is that the carbon emissions took as much as 50,000 years to happen instead of little more than a century. The devastating effects on the planet lasted about 200,000 years. Since one of the most dangerous aspects of post-industrial climate change is its extreme rapidity, one can adjust one’s expectations accordingly. There will be little to no time for natural adaptation. It’s amazing that anyone would consider any other threat to humanity to even be in the same ballpark, since many of those threats – disease, famine, floods and droughts, threats to the water supply, the destructiveness of extreme weather – are all an intrinsic part of rapid climate change.