Climate Change - Not currently predicted to result in a runaway catastrophe, and as technology improves it gets cleaner, so while we may not yet be at peak CO₂, the general trend should turn downward within about 30 years. It's also worth noting that we could most likely build giant scrubbers, seed the atmosphere with large quantities of sulfur dioxide, etc. to mitigate the problem if it started to look like we were headed into catastrophe territory.
Artificial Intelligence - Most AI will not be true AI. There are ethical implications to creating a real life form (virtual or not), and that should cause most people working on it to tread carefully. To create true AI, you would need to give it an environment of sufficient complexity for it to develop. Most likely, this would be a virtual realm; think World of Warcraft or Minecraft. Despite Hollywood's presentation, this basically sandboxes the AI in a place where it cannot access the code. You can't hack a computer by punching monsters in an instanced dungeon; that's just not how it works. And, due to the ethical implications, it might well be considered unethical to reveal to the AI that its universe was a simulation. You would either let it play out or simply hit the stop button and then analyze the data.
Nuclear Weapons - Might take out large chunks of humanity, but unlikely to end the species. Due to the physical realities of mining and processing ore, it's unlikely that a non-state actor would be able to develop nukes, which leaves them in the hands of bureaucracies that are going to be slow to deploy them (if at all) and selective about their targets. There's no value to a nation in destroying humanity, so they would not deploy them in such a way.
Disease - Not any arbitrary or naturally occurring disease, but lab equipment and microengineering / DNA manipulation could become cheap and easy enough for a lone madman to hand-develop something completely catastrophic. In the worst case, he would be able to create something with a timed deployment, so it could spread naturally, without symptoms, for several years before triggering and killing everyone. A few humans on isolated islands would likely still survive, but they would probably have sufficiently backwards cultures that (potentially) you would be resetting the history of human civilization. Though, as said, this would be the worst case. A less carefully created disease would take out large chunks of humanity, but the survivors would come right back at the same technological level as before.
Asteroid Collision - As technology improves, our ability to spot and redirect an asteroid gets much better. It would basically need to hit us in the next 40 years, or else we'll likely have advanced to a point where it's no longer a reasonable worry. The nice thing about space collisions is that you've (potentially) got such long forewarning that even with minimal explosives you might still be able to divert the mass from its target just by adjusting its course by something like 0.00001 degrees.
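To put that last figure in perspective, here's a rough back-of-envelope sketch of how far a small course change moves an asteroid by the time it reaches Earth. The closing speed (~20 km/s) and the choice of angles and lead times are my own illustrative assumptions, not figures from anyone's deflection study, and it ignores gravity, so it's only good for order-of-magnitude intuition:

```python
import math

EARTH_RADIUS_KM = 6371.0        # mean radius of Earth
CLOSING_SPEED_KM_S = 20.0       # assumed typical approach speed (illustrative)
SECONDS_PER_YEAR = 3.156e7

def miss_distance_km(deflection_deg: float, lead_time_years: float) -> float:
    """Lateral displacement at Earth for a small course change applied
    lead_time_years before impact (straight-line approximation)."""
    path_length_km = CLOSING_SPEED_KM_S * lead_time_years * SECONDS_PER_YEAR
    return path_length_km * math.tan(math.radians(deflection_deg))

for deg in (0.00001, 0.0001):
    for years in (10, 40):
        d = miss_distance_km(deg, years)
        verdict = "clears Earth" if d > EARTH_RADIUS_KM else "still hits"
        print(f"{deg} deg with {years:>2} yr lead: ~{d:,.0f} km ({verdict})")
```

Under these assumptions, 0.00001 degrees with a decade of warning shifts the rock by roughly a thousand kilometers, and a nudge an order of magnitude larger (or proportionally more lead time) clears Earth's radius entirely. So the exact angle quoted above is approximate, but the underlying point holds: the required deflection shrinks roughly in proportion to how early you apply it.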
Overpopulation - Self-correcting.
Return or Initial visit of Divine Being - Based on the lack of any indication of such an event in the last 13.82 billion years, the odds are heavily against this one.
WWIII - As someone else indicated, this is effectively the same as the nuclear threat.
Other - Nanobots, but this is essentially the same as disease, just via a different mechanism.