What is the most likely cause of the end of the world?
This is an incredibly important question to ask. Maybe even the most important question in the world. Here is a list of options.
Option A: The heat death of the universe. We're not going to make it past that. The less plausible the rest of this list is, the more plausible this option is.
Option B: An artificial intelligence that can do any cognitive task a human being can do. For arguments that this is a real risk, I'm going to link you elsewhere, since there is a lot of ground to cover. But here's the three-sentence version. We owe our dominant position to our intelligence. If something smarter than us did not share our values, it would be in conflict with us, and it would win. And it is very hard to figure out how to encode values into a computer program in a form we would actually want optimized. Here is an article I wrote that goes into more detail, but is still only an introduction. Here is an article that describes what eminent AI researchers think of all this. This is the book on the topic out of Oxford. This is a paper on why we can't just put a human-level AI in a sealed box and be fine. Here is an outline of some current research directions aimed at mitigating this risk.
The rest of these are much less likely than either A or B. I put Option C at 1% and the rest at less than 10^-5.
Option C: A pandemic. If a disease had the contagiousness of influenza, the incubation period of smallpox, and the lethality of rabies, that might just finish us. I'm using examples of real diseases to make this seem as plausible as I can. That said, at least one person has survived rabies, so maybe this imagined pandemic would have to be more lethal than any disease we have ever encountered. That, of course, makes it less plausible.
Option D: The sun expands and gets hot enough to make Mars uninhabitable. This will happen in a few billion years. That is our clock for developing interstellar travel. Given how much we've accomplished in the last couple hundred years, and given that we have tens of millions of such intervals ahead of us, I think we have plenty of time. It's never too early to start, though, so Elon Musk is doing the right thing here.
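The "tens of millions of such intervals" figure checks out with a quick back-of-the-envelope calculation (the specific numbers below, 5 billion years and a 200-year interval, are my own illustrative assumptions, not the author's):

```python
# Rough sanity check of the "tens of millions of intervals" claim.
# Assumed: the sun makes Mars uninhabitable in ~5 billion years,
# and "a couple hundred years" of progress is a 200-year interval.
years_until_mars_uninhabitable = 5_000_000_000
progress_interval_years = 200

intervals_remaining = years_until_mars_uninhabitable // progress_interval_years
print(intervals_remaining)  # prints 25000000, i.e. tens of millions
```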
Option E: The sun expands and gets hot enough to make Earth uninhabitable. This will happen in about a billion years. This seems even less likely than Option D to me, since it doesn't seem that hard to get to Mars and make its atmosphere livable, and a billion years is quite a while. This option requires complete and permanent civilizational collapse, not just a collapse lasting a few thousand years.
Option F: Nuclear war. Most models of all-out nuclear war have Australia and New Zealand surviving, so the risk here is really just the risk that those models are wrong. I trust the experts, but it would be silly not to allow at least a tiny probability that they are mistaken.
Option G: HUUUGE asteroid. Our ancestors survived the last one of these, and our technology has gotten a lot better since then. Not too worried about this one.
Here are some non-options that people are often concerned about.
Non-option A: Global warming. Global warming will suck, and it will probably be a bigger ethical catastrophe than anything that has ever happened. But for global warming to wipe us out entirely, we would need to burn about 10 times as much coal, oil, and natural gas as currently exists in the Earth's crust. I also offer a more complete picture here in a very short post (much shorter than this one).
Non-option B: Normal-sized asteroid. As discussed under Option G above.
I think this answer is close to what people at the Centre for the Study of Existential Risk would say, though I suspect there are minor differences. That's the next place I'd head if you're interested in more opinions.