APOCALYPSE. Now – or later? 50-50 odds says British scientist....

Um … how have the recent advances in science and technology increased the chances of an erupting supervolcano, again?

I think it is safe to blame the journalists. Obviously (?) this guy’s ideas are poorly presented, possibly purposely so, to excite the dear reader.

It may turn out that he’s talking about the chances of things going down the shitter this millennium. Who knows?

As is, though, it seems more like a Weekly World News story. I wonder if they have picked it up yet. And if not, I wonder why not?

tracer-“Um … how have the recent advances in science and technology increased the chances of an erupting supervolcano, again?”

Well, I’ve got a Bombadier Trigemina Auto-Deep-Earth-Bore in the back yard. But that only makes minivolcanos.
SimonX has a good point. From the Amazon review it looks like he means the chance of long-term survival, i.e. riding out the Sun or something. A quote from the book in the review:

Though the first amateur review seems to think he meant 50/50 surviving to the end of the century.

Good thing you got that link in there otherwise people might not have taken this seriously.

GOM my friend, you’re conflating two uses of “Apocalypse” in the OP – and the folks making fun of the article have not paid attention to it.

In sum, paying attention to the odds for every form of manmade and natural catastrophe of the degree that might reasonably be termed apocalyptic in scope (a Tambora-size eruption causing another “year without a summer” is obviously something more than a typical volcanic eruption, with effects generally local in scope), he comes out with even odds that we will experience a catastrophe sooner or later. He gives no particular time scale, though I’d assume it would be less than the five billion years until the Sun enters red-giant phase and incinerates the planet, a virtual certainty if nothing else intervenes.

However, this is a far cry from saying that The End of the World Is Imminent; Repent Or You’ll Be Left Behind!

When a noted scientist is able to produce substantiated scientific evidence that the Rapture is imminent, I’ll start paying more attention to this sort of thing.

I think a few people have got the wrong idea about Rees, that he is some kind of apocalyptic nutter who might be proselytising the End Times.
In fact, this WorldNetDaily article is badly written and sensationalist…
Here is Amazon on Sir Martin’s book Our Final Hour:

Editorial Reviews

Just when you’ve stopped worrying and learned to love the bomb, along comes Sir Martin Rees, Britain’s Astronomer Royal, with teeming armies of deadly viruses, nanobots, and armed fanatics. Beyond the hazards most of us know about–smallpox, terrorists, global warming–Rees introduces the new threats of the 21st century and the unholy political and scientific alliances that have made them possible. Our Final Hour spells out doomsday scenarios for cosmic collisions, high-energy experiments gone wrong, and self-replicating machines that steadily devour the biosphere. If we can avoid driving ourselves to extinction, he writes, a glorious future awaits; if not, our devices may very well destroy the universe.


and about Martin Rees


Sir Martin Rees is Royal Society Professor at Cambridge University, a Fellow of Kings College, and the U.K.'s Astronomer Royal. The winner of the 2001 Cosmology Prize of the Peter Gruber Foundation, he has published numerous academic papers and books and is the author of four titles for a general readership: Our Cosmic Habitat, Gravity’s Fatal Attraction, Before the Beginning, and Just Six Numbers. He lives in Cambridge, England.

Sir Martin’s category of Bioerror includes any disaster with widespread effects caused by the action of humanity. This includes the creation of small black holes or strangelets by high-energy physics experiments; the physics of these entities is supposed to be well understood, and they are not expected to pose a threat, but none have yet been observed, so there may be unexpected risks involved in their manufacture.
Similarly, the creation of GM viral and bacterial diseases is a danger that many people are aware of, but the possibility of self-replicating nanotechnology swarms is one that is often dismissed or sensationalised.
Rees recognises that there is nothing implausible about a slow conversion of the Earth’s biosphere by a deliberately designed Drexlerian swarm, (a possibility explored in our own collaborative future scenario)…
If the resources of the world can be exploited by biological organisms that have been the result of billions of years of selection and adaptation, the same resources could be sequestered by an artificially designed ecosystem of replicators which need not have any regard for its own continued existence.
Of course, it would be absurd and idiotic to design such a set of machines; that is not a reason to rule the possibility out.

As far as I can figure, Sir Martin does not include the supervolcano in his list of possible ‘bioerrors’; it is a long-term certainty that the larger volcanic eruptions will introduce vast amounts of dust into the atmosphere, as they have done hundreds of times in the past billion years-
normally, this causes a mild species extinction or dieback, barely noticeable in the geological record- but such an event would cause economic chaos in a civilised world such as our own.

So, basically, the problem many people are having with this news article is probably caused by sloppy journalism- Sir Martin is in fact optimistic about the long-term survival of the human race and its descendants; but we must be aware of the possible dangers, even the off-the-wall ones, in order to ensure our prolonged existence…
which is a long way from predicting the onset of the Book of Revelation, as some people seem to imply.


SF worldbuilding at
http://www.orionsarm.com/main.html

Yeah, that’s right… laugh now…

Done and done. Ha-ha-ha, haha, hahahaha, etc. :wink:

Haw haw haw!

There’s rather clearer coverage of what Rees is saying in his recent interview with the Guardian.
Or you can read the piece he wrote for the Independent - provided you fork out some dosh.

And his claim is that it’s a 50:50 chance of us surviving the 21st century, hence the original UK title of the book: Our Final Century.

Didn’t he die in the first one?

Bruce Willis is still alive and well. His character may have died, though.

There’s another dream crushed, thanks a lot.

[ul]:stuck_out_tongue: [sup]Oh, shit HPL this isn’t the time to get technical[/sup][/ul]

Well, let’s see. The only ones that have any scientific backing are the supervolcanoes and the viruses, but it’s still a load of crap.

Supervolcanoes: The only one I’ve heard of is the one around Yellowstone. If it were to blow, it would (supposedly) kill a large number of the people in the surrounding states, cast debris over the western half of North America (causing some more deaths but mostly just damage), and cast enough ash into the air to lower temperatures a few degrees and alter weather patterns. Yes, it would bring along another mass extinction, though with all the infrastructure humans have built up over the years, the more industrialized countries would survive (we have, after all, survived several mass extinctions already). However, this is entirely unaffected by the increase in scientific knowledge, and unlikely to happen any time soon. I should also note that, as we develop science more and more, we may figure out how better to avoid an event like this. Supervolcanoes are caused by massive pressure under the surface, IIRC. In a few years, we might figure out exactly how to release this pressure without the full explosive effects.

Bioerror: Bad sci-fi plot, off the starboard bow! (Any scientific evidence that supports this idea?)

Nuclear terrorism: A full strike by all nuclear-armed nations, carefully coordinated, in a full-scale international effort, could not kill off the human species. Even at the height of the cold war, when there were many more nukes in Russian and US inventories, the use of nukes was anticipated to only slow down the opposing nation’s war machine long enough for their own conventional forces to win the war; they couldn’t even wipe out the opposing nation (In fact, some Russian studies found it unlikely that it would even slow the US down enough to win the war!). Terrorists with SADMs are not going to do what the two major superpowers of the time could not.

Deadly manufactured viruses: Viruses are always a problem, but I think the risk of this is often overdramatized. Making a disease that is virulent enough, has a long enough dormant period to be spread over a wide area before detection, and is impossible to counteract, is not an easy feat. It’d be more likely that nature would come up with one before we do.

Rogue machines: Bad sci-fi plot, part 2. The machines aren’t likely to start rioting for equal rights, and Skynet is probably not going to be installed in the next few years. Movies aside, we don’t have working sentient AI, much less one that would be better off without humans caring for it (seriously, I would think it more likely that an AI would be quite satisfied. I’d love to be able to browse around online and play games all day without having to worry about the electric bill or food). And even if we start going into AI-in-the-military scenarios, current policy is that no machine may achieve weapons release without explicit human consent. I don’t see that changing any time soon, and I would predict that it will not change any time prior to AI, nor likely for a good time after.

Character-altering genetic engineering: I thought we were talking near-future? We’re probably decades away from human genetic engineering being done, if not a century or more (it will likely be illegal for a good while even after it’s feasible). And then we wait a couple of decades before that small portion of that generation grows up. Might reach a few percent of the population in a century. You’d think if there were negative side effects, they would be taken care of by then, no?

Yeah. And does he have anything to support this, or is it as reasonable as “believing” some particular sports team is going to win next year’s finals/Super Bowl/whatever?

Seeing as I seem to be the Astronomer Royal’s apologist, I have been sticking up for him a bit-

but you are mostly right;
Supervolcanoes- risk trivial in the next hundred years;
Asteroids- ditto;
Character-altering genetic engineering- trivial in the next *hundred* years, but may turn out to be the most important thing in the more distant future;
Rogue Machines- although I am an enthusiast for the development of these things, I don’t see that they will be advanced enough to cause problems to civilisation for several centuries yet;

High Energy physics experiments creating exotic phenomena-
now this is the area Sir Martin knows most about, and yet there doesn’t seem to be much real danger from these experiments;

Nuclear weapons- they certainly wouldn’t wipe out the population of the Earth, but they might well cause the economic system to collapse- if they do this after the oil runs out, we may never bootstrap ourselves back to today’s standard.
But once again the oil won’t run out for a hundred years or so, and the coal will last even longer, so the real danger period is later than the next hundred years, by which time there should be other technologies available to replace fossil fuels;

But I do see viruses as a threat in this 100 year timescale, and this really is something that must be addressed.
If SARS, a relatively mild plague, caused havoc in a city like Toronto, imagine the panic if an artificial plague showed up at several locations simultaneously.

Yes, on analysis Martin Rees is wrong, and as I said in my first post, he is speaking outside his speciality area, but what he calls Bioerror, that is the possibility of Mankind wiping itself out, will increase as technology increases, and this is the best argument for getting off the planet-

and ultimately this is the reason for the whole exercise, I should imagine- Rees, like me, thinks we should get out of the cradle of the Earth and spread out, to avoid the eggs-in-basket syndrome.



There’s a pretty good article about this subject in this month’s issue of Wired (“We’re All Gonna Die!”). The article mentions Martin Rees’s and Stephen Hawking’s estimates on the expiration date of humans (Hawking predicts less than a thousand years).

The article also discusses J. Richard Gott III’s statistical formula for predicting the end of human society. His formula was once used to predict when the Berlin Wall would fall (he said it would last between 2.6 and 24 more years… not a bad guess).

They don’t really discuss all of the variables that go into the formula, so I’m not sure how much faith anyone should put in the results, but Gott says that, with 95 percent certainty, humanity will last from 205,000 to 8 million more years.
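For what it’s worth, the arithmetic behind Gott’s “delta t” interval is simple enough to sketch. The idea is that, absent other information, the moment you observe something falls at a random fraction of its total lifespan, so a confidence interval on that fraction translates directly into an interval on the remaining lifetime. This is my own illustration, not from the Wired article (the function name and the 200,000-year figure for humanity’s age are my assumptions):

```python
def gott_interval(age, confidence):
    # Gott's delta-t argument: assume the moment of observation falls at
    # a uniformly random fraction r of the total lifespan.  With the given
    # confidence, r lies in the central interval, so the remaining lifetime
    # lies between age*(1-c)/(1+c) and age*(1+c)/(1-c).
    c = confidence
    return age * (1 - c) / (1 + c), age * (1 + c) / (1 - c)

# Berlin Wall in 1969, eight years after it went up, at 50% confidence:
wall_lo, wall_hi = gott_interval(8, 0.50)          # ~2.7 to 24 more years

# Humanity at roughly 200,000 years old, at 95% confidence:
human_lo, human_hi = gott_interval(200_000, 0.95)  # ~5,100 to ~7.8 million more years
# Adding the 200,000-year past back on gives a total lifespan of roughly
# 205,000 to 8 million years, which matches the figures quoted above.
```

So the “205,000 to 8 million years” numbers appear to be total lifespans, i.e. past age plus the predicted remaining time.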

at least that is better than Brandon Carter’s prognosis-
thank god!

the other side of this coin is, once humanity has escaped from Earth and the solar system, it is very difficult to imagine a catastrophe or any other process that will cause our extinction-
until the heat death or the big rip, that is…



I’m sure the Black Mesa Research Facility will play a big role in a bioerror. :wink:

Hopefully they’ll perform this experiment VERY close to this thread (or Chick’s)…