Before we get into the meat of the matter, some things to know upfront. From the introduction to 2018's "Tips And Tricks For Investing In 'End of the World Funds'":
As unauthorized representatives for Long or Short Capital's End of the World Puts, this is an area of profound interest from which we have gleaned some insight:
1) Should the world end, collecting on your bet can be a challenge. Know your counterparty!
And possibly more important, demand collateral!
2) The swings in end of the world product prices can be dramatic.
3) Prognosticators have predicted 100,000 of the last 0 termination events....
And from arXiv (astrophysics) at Cornell:
How unlikely is a doomsday catastrophe?
Numerous Earth-destroying doomsday scenarios have recently been analyzed, including breakdown of a metastable vacuum state and planetary destruction triggered by a "strangelet" or microscopic black hole. We point out that many previous bounds on their frequency give a false sense of security: one cannot infer that such events are rare from the fact that Earth has survived for so long, because observers are by definition in places lucky enough to have avoided destruction. We derive a new upper bound of one per 10^9 years (99.9% c.l.) on the exogenous terminal catastrophe rate that is free of such selection bias, using planetary age distributions and the relatively late formation time of Earth....
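For the curious, a rough back-of-the-envelope on where a figure like "one per 10^9 years at 99.9% c.l." can come from: under a constant-rate Poisson model, observing zero events over a window of T years caps the rate at -ln(1 - confidence)/T. The sketch below is purely our own illustration with made-up names and an arbitrary window; the paper's actual derivation works from planetary age distributions and Earth's late formation time, precisely because the naive "Earth has survived this long" window suffers from the selection effect the abstract warns about.

```python
import math

def poisson_rate_upper_bound(window_years, confidence):
    """One-sided upper bound on a Poisson event rate given zero observed
    events over `window_years`, at the given confidence level.
    P(0 events) = exp(-rate * T) <= 1 - confidence
    implies rate <= -ln(1 - confidence) / T.
    """
    return -math.log(1.0 - confidence) / window_years

# Illustrative only: a ~9-billion-year window at 99.9% confidence caps the
# rate at roughly one event per 1.3 billion years -- the same order of
# magnitude as the paper's bound, though the paper reaches it via planet
# formation times, NOT via Earth's own survival (the biased inference the
# abstract cautions against).
cap = poisson_rate_upper_bound(9.0e9, 0.999)
print(f"rate cap: {cap:.2e} per year (~1 per {1/cap:.2e} years)")
```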
If interested, we mentioned Bostrom in "The Roubini Cascade: Are we heading for a Greater Depression?".
His Future of Humanity Institute at Oxford seems to have moved on from mundane endings like climate cataclysm or cosmic fireball to a very real, very serious examination of whether Artificial Intelligence may be the nail in humanity's coffin, so to speak.