OXFORD, England, Friday February 20, 2015 – Researchers from Oxford University’s Future of Humanity Institute and the Global Challenges Foundation have produced the first serious scientific assessment of the apocalyptic risks that could bring about the end of the world.
Some of the 12 scenarios envisioned by the scientists arise from events that are beyond human control, including such disasters as a global pandemic, an asteroid hitting the planet, or the eruption of a supervolcano.
Most, however, emerge from man-made developments which, while having the potential to benefit mankind, could also lead to our demise. This category includes technological advances that sound like the stuff of science fiction, such as intelligent computers that take over the world and synthetic biology pressed into service for bio-warfare, but the experts consider them genuine threats.
“This is a scientific assessment about the possibility of oblivion, certainly, but even more it is a call for action based on the assumption that humanity is able to rise to challenges and turn them into opportunities,” the report states.
The 12 most likely causes of the Apocalypse:
Global pandemic
An apocalyptic disease would be highly infectious, incurable, almost always fatal and have a long incubation period.
Should all these features occur in a single pathogen, the death toll would be catastrophic.
Supervolcano
Apart from the devastation in and near the eruption zone, the danger of a supervolcano lies in the amount of debris released into the atmosphere.
Dust could block the sun’s rays and cause a global “volcanic winter” with effects similar to those of a nuclear war or asteroid impact.
There is little that could be done to alleviate the destruction with currently available technology.
Artificial intelligence
This refers to the development of machines and software with human-level intelligence.
Theoretically, such intelligence could not be easily controlled and would probably be able to boost its own intelligence, making it superior to human intellect.
Since it is not known whether there is a real risk of extreme machine intelligence, the researchers gave it a wide estimate of probability.
Extreme climate change
Scientists predict that climate change caused by human activity could raise average global temperatures by 4°C.
There is nevertheless a risk that the warming could be much more extreme, rising to 6°C.
The consequences of such an increase could leave many countries uninhabitable, leading to famines, mass deaths and migration.
Synthetic biology
Genetic engineering of super-organisms could be beneficial, but the release of a super-organism that targets humans or a vital part of the ecosystem could be disastrous.
Whether leaked accidentally, or deliberately in instances of bio-warfare or bio-terrorism, the impact could be worse than any natural pandemic.
Major asteroid impact
Major asteroid collisions, with objects 5 km or larger, occur about once every 20 million years and could pack a punch thousands of times greater than the largest bomb ever detonated.
A land impact could destroy an area the size of the Netherlands.
Far more widespread destruction would be caused by the clouds of dust released into the atmosphere, affecting climate and food supplies and creating political instability.
Ecological catastrophe
The likelihood of a complete breakdown of the global ecosystem, leading to mass extinction, depends on how heavily mankind relies on that ecosystem.
Some lifestyles could be sustained independently of the global ecosystem, but achieving this on a large scale, especially during a collapse, would be challenging.
Nanotechnology
Super-precise manufacturing on an atomic level could create materials with special properties, such as being highly resilient or “smart,” that would benefit mankind.
But these manufacturing technologies could also create major problems, including the depletion of natural resources, pollution and climate change.
Atomically precise manufacturing could also enable the creation of large arsenals of weapons.
Nuclear war
Some estimates put the risk of deliberate or accidental nuclear conflict in the next century at around 10 percent.
Devastation would be inevitable if a nuclear event triggered a “nuclear winter” – the creation of a cloud of dust and smoke high in the atmosphere that would block the sun’s rays, plunging temperatures below freezing, and possibly destroying the ozone layer.
This could lead to the destruction of the global food supply, making widespread starvation and the collapse of states likely.
Bad global governance
This refers to two main categories of governance disasters – failing to solve major solvable problems and actively causing worse outcomes.
Global system collapse
This broad term refers to an economic or societal collapse on a global scale, involving civil unrest and a breakdown of law and order that would make the continuation of human life on Earth impossible.
There are too many unknown factors to predict how likely this outcome would be, but such effects have been observed in intricately connected systems such as ecologies and financial markets.
The possibility of collapse is more acute when several networks depend on each other.
Unknown consequences
This is an umbrella category that represents all the unknown risks that have not been considered, or that appear extremely unlikely in isolation but together could represent a significant threat.