Assuming our technology remains close to its current level, yes. However, there is a non-zero, perhaps even very high, chance that technology will keep advancing to the point where average life expectancy increases by about a year per year… and there is a non-zero chance that it will advance that far by 2050, depending on what assumptions are made about recursive intelligence-increasing technology.
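To illustrate what “a year per year” means in practice, here’s a toy sketch in Python; the starting forty years and the one-year-per-year gain are both made-up numbers, not predictions:

    # Toy model: if each calendar year of progress adds at least one year of
    # remaining life expectancy, the projected end of life keeps receding.
    remaining = 40        # made-up remaining life expectancy, in years
    gain_per_year = 1.0   # hypothetical annual gain from new technology
    for year in range(2011, 2051):
        remaining = remaining - 1 + gain_per_year
    print(remaining)      # still 40.0 in 2050 under these made-up numbers

Any gain at or above one year per year and the countdown effectively stops; anything below it is just a slower clock.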
Well, if you just lock yourself in a vault, then /yeah/, it’s pretty likely you’ll end up dying sooner or later.
(See my previous post on ‘micromorts’.)
QFT. This is, in fact, much the point I was /trying/ to make in my initial post, though I seem to have done so rather poorly: increase science by strengthening its prerequisites, such as getting governments to respect rights and encouraging economic development. What I’m trying to figure out, at present, is which actions will, in fact, have the greatest effect in this area.
Actually, I don’t see an ELE as that distant. The supervolcano under Yellowstone is past due for a blow, and if it matches past events, an ELE or something close to it is very likely. We may have one year. We may have 1,000 years. But it’s gonna happen, and if we don’t figure out a way to stop it or survive it, most or all of us are going bye-bye.
Yellowstone isn’t “overdue”. The last supervolcanic eruption there was over half a million years ago. That eruption isn’t associated with a mass extinction. North America would be hit pretty hard, but human extinction would be pretty much impossible.
You’re conveniently overlooking the longevity and durability of Soviet space technology. After the final shuttle mission this year, it will be Soyuz capsules that sustain our human presence in space. Considering that technology was the product of one of those pesky “totalitarian regimes which do not respect rights and attempt centralized economic control”, I think there might be a flaw in your chain of logic.
Yeah, that one isn’t going to pop any time soon. Since it only blows every several hundred thousand years or so, I think we can assume that we would detect the rumbling and tumbling long before it blows up. There would probably be hundreds of years of “burps” before the ultimate vomit.
I question most of what you say right off the bat.
Define “better”. Sapient life is “better” for the sapients, probably, but not necessarily for other life or for the ecosystems which sustain all life. It’s conceivable that sapience can lead to actions which cause mass extinctions, including of sapient life in particular.
Is your goal actually to protect sapient life? It’s very likely there IS intelligence elsewhere in the universe. Would you heave a sigh of relief if we found incontrovertible evidence of this? What if we came into conflict with some alien race?
Is your goal to protect the human race in particular?
Is your goal to protect your descendants?
Is your goal to protect yourself?
What time span concerns you? Ultimately the stars will burn out, entropy will take its toll, etc.
I suppose what I’m getting at here is that you are pretending that this exercise is based entirely on a ridiculously altruistic (one might even suggest god-like) motive of preserving intelligence. In the whole fucking UNIVERSE! Possibly for eternity.
You need to take it down a few notches, like to, “I want us to live! It would be embarrassing for the entire species to be wiped out by one lousy rock falling out of the sky.”
The thing is, even if you agree with all the premises, the actions don’t follow.
Doing what we can to improve our chance of survival as a species is very important of course. It doesn’t follow that nothing else we can do has value.
I mean, why is it so great to have sapient life in the universe, if long-term survival is our only ambition or activity?
There is no such thing as an objective standard of value, any more than there is one for beauty. However, the vast majority of people have desires which can be fulfilled without impinging upon other people’s desires, e.g. having the freedom to explore the world around them without getting stabbed. So there is a general consensus that it’s worth protecting these desires from the few people whose desires can only be fulfilled by blocking others’, i.e. from people who stab at random.
At the moment, yes.
Debatable - especially if we limit ‘universe’ to ‘our past light-cone, plus maybe a few thousand years/light-years’. Ever hear of the Fermi Paradox?
I would be over/joyed/.
See my second paragraph, above.
If humans were to create a species of sentient anthropomorphic rodents, à la http://www.datapacrat.com/Rat%20Lass.gif , and humanity died out but the rodents continued living, then my goal would still be fulfilled.
I don’t expect to have any.
“I’m going to live forever or die trying.”
I figure to take it in stages - after all, we’ve only known about relativity for a scant century, and it’s possible there’s all sorts of weird and wacky physics remaining to be discovered. I figure I’m not going to worry about how the universe will be a billion years from now until our understanding of physics has stabilized for, oh, at least a thousand years or so.
You say that like it’s a bad thing.
So let’s start with that. What do you think an individual can actually /do/ to help reduce the risk of humanity’s extinction from that sort of event?
I would argue that that ‘if’ is contrafactual. Survival /isn’t/ our only ambition or activity - it’s simply a necessary /prerequisite/ for whatever /else/ it is we want to do.
A fellow named William Fortier drew it in 1992 - the original image-file is at http://www.datapacrat.com/Furry/CD%20-%20Fur/W%20FORTIER/RATLASS.PNG . I’m a data pack-rat, so when I re-noticed that image a few years ago, I un-dithered the colours and ran it through a vectorizer, so it would look a lot less pixellated. I probably set the background too dark, but other than that, I think it turned out reasonably improved, no?
Sounds to me like the best way to do that is to establish a kind of grad/postgrad student welfare system. And/or make receiving any kind of government aid contingent on contributing to some research program, if only as a janitor or experimental subject.
You’ll get a lot of crap, padded science, of course. But then any system that has just More Science as its goal will do that. (And people will complain about human rights. Piffle.)
This has promise, if they’re rat-sized. They’d be cheaper per sentient to send.
Now you’ve got me thinking about the feasibility of sending out auto-creches, with stores of frozen embryos. And I don’t have time to flesh that solution out, but it’s given me an idea. Maybe the best way to encourage ELE problem-solving is to fund the publication of science fiction. Either encourage the government or businesses to sponsor contests for SF stories with ELE survival themes or just buy those sorts of books, magazines, and movies ourselves.
The writing will reach budding scientists at an impressionable age and possibly steer their field of interest toward something that may save us all. The stories will not be constrained by the current level of technology and may point the way for needed innovations. And it’s relatively cheap. There are thousands of writers and wannabes who would jump all over that. Just offer honorable-mention prizes for stories whose plots and characters are crap but which have a new wrinkle on survival tech that might be useful. Someone will have to sift through a lot of bad writing, but sacrifices must be made.
If you start a non-profit to fund the contest, I’ll kick in twenty bucks. You can do this.
The fact that survival is not our only ambition or activity is the status quo.
You seem to be suggesting something else however: a situation where everything we do is geared, directly or indirectly, to perpetuating our species. I think this is wrong. What are we perpetuating ourselves for?
As for it being a prerequisite, obviously being alive is a prerequisite for doing things. But the human race preparing itself for an ELE is not a prerequisite for anything, as for almost all of human history we have been ignorant of global threats. Yet we’ve obviously achieved a great deal in that time.
True - but I’d suggest you try a couple of mental exercises.
An average person has about a 33/1,000,000 probability of dying on any given day. There are certain behaviours that increase those chances, like standing on a hill in a thunderstorm, and certain behaviours that decrease them, like always being sure to wear a seatbelt.
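For anyone who wants to check where that 33-in-a-million figure comes from, here’s a rough back-of-the-envelope version in Python, assuming a flat 80-year life expectancy spread evenly over every day (real actuarial risk varies a lot with age):

    # Rough daily death probability from a flat 80-year life expectancy.
    life_expectancy_years = 80
    days = life_expectancy_years * 365
    p_daily = 1 / days
    print(round(p_daily * 1_000_000))  # ~34 micromorts per day, close to the figure above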
What would you estimate is the probability of humanity experiencing an ELE on any given day?
Let’s say that an ELE will arrive on Dec 23, 2048 AD. How much money would it be worth spending to prevent that event? That is, if spending X dollars would allow humanity to survive, and spending X-1 dollars would lead to humanity dying off, then how high could X be and still be worth paying?
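Or, taking the probabilistic version rather than the certain-date one: here’s a toy expected-value calculation. Both numbers in it are placeholders I’ve picked purely for illustration, and the whole argument turns on what you’d actually put there.

    # If an ELE arrives with probability p per year and would destroy something
    # we value at V dollars, then spending up to roughly p * V per year on
    # prevention breaks even.  Both numbers below are made up.
    p_per_year = 1 / 1_000_000   # placeholder: "roughly once per million years"
    V = 1e18                     # placeholder dollar value on humanity's future
    print(p_per_year * V)        # ~1e12 per year under these particular guesses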
Absolutely. And for very unlikely events which could kill me, I take no precautions. For instance, I could wear a hard hat in case a meteorite were to hit me; however, I consider the risk to be so low that it isn’t worth disrupting the rest of my life over.
I think “sometime in the next million years” is a phrase I hear a lot. So something like 1 / 365,000,000?
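(That works out as follows, treating “once in a million years” as a uniform daily rate:)

    # "Once per million years", spread evenly over days:
    p_daily = 1 / (1_000_000 * 365)
    print(p_daily)  # ~2.7e-9, i.e. roughly 1 in 365,000,000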
Obviously in such a scenario we should spend all of our money. But in reality, we don’t know if such a scenario is approaching, and the probability is very low.
I actually believe that we should be spending more on things like monitoring near earth objects, but nothing like what you’re proposing.
As others have pointed out, our ability to defend ourselves against such threats is probably going to grow rapidly in the next few centuries in any case, without having to focus our whole society on this one event.
I should also point out that this thread has mentioned extinction a lot but the kind of event which wiped out the dinosaurs won’t make us extinct. There are approximately 7 billion of us, and we’re the most adaptable species the planet has ever seen.
I’m not saying that we shouldn’t care about such a horrific and devastating event happening, merely that it sits on the same spectrum as other natural disasters. It’s much more devastating but also much more rare.