Could science go too far and destroy the world?

Someone unconcerned with the long-term effects on his credit rating, I assume, because - hey - mega bomb.

So ask them to score you an 8-ball and a hooker.

Works with AIs, too.

The OP discusses a subset of global catastrophic risks. These include AI, nanotech, global warming, global pandemics, geomagnetic reversals, asteroid impacts, and the old favorite, extraterrestrial invasion. (The last one seems unlikely.)

Fear not, the philosophy department at the University of Oxford is on top of it: http://www.global-catastrophic-risks.com/

Kidding aside, they did some decent work: Global Catastrophic Risks by Nick Bostrom | Goodreads

There is a Global Catastrophic Risk Institute: http://gcrinstitute.org/

…though frankly I place more trust in these folks, since they attempt to benchmark such concerns against other worthy endeavors: Open Philanthropy Project update: Global catastrophic risks - The GiveWell Blog

I note that none of these seem to be looking at peak oil as a global catastrophic risk.