The Global Priorities Project and the Future of Humanity Institute, both based at Oxford University, recently produced a Global Catastrophic Risk 2016 report. It’s less gripping than the Left Behind novels about the Second Coming of Christ (with titles like The Rapture: In the Twinkling of an Eye/Countdown to the Earth’s Last Days), but, in its own dry, detached way, no less scary.
According to the Oxford experts’ calculations, extinction of the whole human race is reasonably likely. Scientists have suggested that the risk is 0.1% per year, and perhaps as much as 0.2%. While this may not seem worth worrying about, these figures actually imply, says the report, that “an individual would be more than five times as likely to die in an extinction event than a car crash”.
What sort of calamities are we talking about? Collision with an asteroid, the eruption of a super-volcano, extreme climate change, a bio-engineered pandemic, or even a super-intelligent computer declaring war on wetware humanity.
Tiny probabilities add up: compounded over a hundred years, that 0.1% annual risk implies a 9.5% chance of extinction in the next century, which is worth worrying about. And of course, a mere global catastrophe, involving the death of a tenth of the population, is far more likely. That is a very startling statistic.
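The 9.5% figure is consistent with simply compounding the 0.1% annual risk over a century, assuming each year's risk is independent and constant (an assumption of this sketch, not a claim the report makes):

```python
# Sketch: how a small constant annual risk compounds over time.
# Assumes independent, identically distributed yearly risk.
def cumulative_risk(annual_risk: float, years: int) -> float:
    """Probability of at least one extinction event within `years` years."""
    return 1 - (1 - annual_risk) ** years

print(f"{cumulative_risk(0.001, 100):.1%}")  # 0.1%/year over a century: about 9.5%
```

Survival each year is 99.9%, so surviving all 100 years is 0.999^100 ≈ 90.5%, leaving roughly a 9.5% chance of at least one extinction event.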
However, even at Oxford they make mistakes. Within days of issuing the Global Catastrophic Risk 2016 report, the experts were eating humble pie. A mathematician reviewed its calculations and concluded that “the Future of Humanity Institute seems very confused re: the future of humanity”.
The views, opinions and positions expressed by these authors and blogs are theirs and do not necessarily represent those of the Bioethics Research Library and Kennedy Institute of Ethics or Georgetown University.