Global catastrophe scenarios


Scenarios in which a global catastrophic risk creates harm have been widely discussed. Some sources of catastrophic risk are anthropogenic, such as global warming, environmental degradation, and nuclear war. Others are non-anthropogenic or natural, such as meteor impacts or supervolcanoes. The impact of these scenarios can vary widely, depending on the cause and the severity of the event, ranging from temporary economic disruption to human extinction. Many societal collapses have already happened throughout human history.

Anthropogenic

Experts at the Future of Humanity Institute at the University of Oxford and the Centre for the Study of Existential Risk at the University of Cambridge prioritize anthropogenic over natural risks due to their much greater estimated likelihood. They are especially concerned by, and consequently focus on, risks posed by advanced technology, such as artificial intelligence and biotechnology.

Artificial intelligence

The creators of a superintelligent entity could inadvertently give it goals that lead it to annihilate the human race. It has been suggested that if AI systems rapidly become superintelligent, they may take unforeseen actions or out-compete humanity. According to philosopher Nick Bostrom, it is possible that the first superintelligence to emerge would be able to bring about almost any possible outcome it valued, as well as to foil virtually any attempt to prevent it from achieving its objectives. Thus, even a superintelligence indifferent to humanity could be dangerous if it perceived humans as an obstacle to unrelated goals. In his book Superintelligence, Bostrom defines this as the control problem. Physicist Stephen Hawking, Microsoft co-founder Bill Gates, and SpaceX founder Elon Musk have echoed these concerns, with Hawking theorizing that such an AI could "spell the end of the human race".
In 2009, the Association for the Advancement of Artificial Intelligence hosted a conference to discuss whether computers and robots might be able to acquire any sort of autonomy, and how much these abilities might pose a threat or hazard. They noted that some robots have acquired various forms of semi-autonomy, including being able to find power sources on their own and being able to independently choose targets to attack with weapons. They also noted that some computer viruses can evade elimination and have achieved "cockroach intelligence". They noted that self-awareness, as depicted in science fiction, is unlikely, but that there are other potential hazards and pitfalls. Various media sources and scientific groups have noted separate trends in differing areas which might together result in greater robotic functionalities and autonomy, and which pose some inherent concerns.
A survey of AI experts estimated that the chance of human-level machine intelligence having an "extremely bad" long-term effect on humanity is 5%. A 2008 survey by the Future of Humanity Institute estimated a 5% probability of extinction caused by superintelligence by 2100. Eliezer Yudkowsky believes risks from artificial intelligence are harder to predict than any other known risks due to bias from anthropomorphism. Since people base their judgments of artificial intelligence on their own experience, he claims they underestimate the potential power of AI.

Biotechnology

Biotechnology can pose a global catastrophic risk in the form of bioengineered organisms. In many cases the organism will be a pathogen of humans, livestock, crops, or other organisms humans depend upon. However, any organism able to catastrophically disrupt ecosystem functions, e.g. highly competitive weeds outcompeting essential crops, poses a biotechnology risk.
A biotechnology catastrophe may be caused by accidentally releasing a genetically engineered organism from controlled environments, by the planned release of such an organism which then turns out to have unforeseen and catastrophic interactions with essential natural or agro-ecosystems, or by intentional usage of biological agents in biological warfare or bioterrorism attacks. Pathogens may be intentionally or unintentionally genetically modified to change virulence and other characteristics. For example, a group of Australian researchers unintentionally changed characteristics of the mousepox virus while trying to develop a virus to sterilize rodents. The modified virus became highly lethal even in vaccinated and naturally resistant mice. The technological means to genetically modify virus characteristics are likely to become more widely available in the future if not properly regulated. In December 2024, a broad coalition of scientists warned that mirror life, organisms that use the mirror images of naturally occurring chiral biomolecules, should not be created. Mirror life, a technology "likely at least a decade away" and requiring "major technical advances", may escape into the environment, evading predation by natural organisms and competing against them for non-chiral nutrients.
Biological weapons, whether used in war or terrorism, could result in human extinction. Terrorist applications of biotechnology have historically been infrequent. To what extent this is due to a lack of capabilities or motivation is not resolved. However, given current developments, more risk from novel, engineered pathogens is to be expected in the future. Exponential growth has been observed in the biotechnology sector, and Nouri and Chyba predict that this will lead to major increases in biotechnological capabilities in the coming decades. They argue that risks from biological warfare and bioterrorism are distinct from nuclear and chemical threats because biological pathogens are easier to mass-produce and their production is hard to control. In 2008, a survey by the Future of Humanity Institute estimated a 2% probability of extinction from engineered pandemics by 2100.
Nouri and Chyba propose three categories of measures to reduce risks from biotechnology and natural pandemics: regulation or prevention of potentially dangerous research, improved recognition of outbreaks, and developing facilities to mitigate disease outbreaks.

Chemical weapons

By contrast with nuclear and biological weapons, chemical warfare, while able to create multiple local catastrophes, is unlikely to create a global one.

Choice to have fewer children

Human extinction could occur through a voluntary preference for fewer children. If developing world demographics are assumed to become developed world demographics, and if the latter are extrapolated, some projections suggest an extinction before the year 3000. John A. Leslie estimates that if the reproduction rate drops to the German or Japanese level, the extinction date will be 2400. However, some models suggest the demographic transition may reverse itself due to evolutionary biology.
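Projections of this kind extrapolate sustained below-replacement fertility. A toy version can be sketched as follows; the fertility rate, generation length, and threshold are illustrative assumptions, not Leslie's actual parameters or any published model:

```python
# Toy projection of population decline under constant below-replacement
# fertility. All parameters are illustrative assumptions.
REPLACEMENT_TFR = 2.1    # children per woman to sustain a population (approx.)
LOW_TFR = 1.4            # roughly German/Japanese-level fertility (assumption)
GENERATION_YEARS = 30    # assumed generation length

def years_until_below(pop, threshold, tfr):
    """Years until the population falls below `threshold`, scaling each
    generation by tfr / REPLACEMENT_TFR."""
    ratio = tfr / REPLACEMENT_TFR
    years = 0
    while pop >= threshold:
        pop *= ratio
        years += GENERATION_YEARS
    return years

# Starting from ~8 billion people, how long until fewer than 1,000 remain?
print(years_until_below(8e9, 1e3, LOW_TFR))  # 1200 (years) under these assumptions
```

Under these toy assumptions the decline takes on the order of a millennium; published estimates such as Leslie's differ because they rest on age-structured demographic data rather than a single constant per-generation ratio.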

Climate change

Human-caused climate change has been driven by technology since the 19th century or earlier. Projections of future climate change suggest further global warming, sea level rise, and an increase in the frequency and severity of some extreme weather events and weather-related disasters. Effects of global warming include loss of biodiversity, stresses on existing food-producing systems, increased spread of known infectious diseases such as malaria, and rapid mutation of microorganisms. These effects could amplify human migration and conflicts, including cross-border immigration disputes and internal and external wars, as well as the breakdown of social order and reductions in GDP and economic welfare.

In November 2017, a non-peer-reviewed statement titled "World Scientists' Warning to Humanity", signed by 15,364 scientists from 184 countries, indicated that increasing levels of greenhouse gases from the use of fossil fuels, human population growth, deforestation, and overuse of land for agricultural production, particularly the farming of ruminants for meat consumption, are trending in ways that forecast an increase in human misery over the coming decades. An October 2017 report published in The Lancet stated that toxic air, water, soils, and workplaces were collectively responsible for nine million deaths worldwide in 2015, particularly from air pollution, which was linked to deaths by increasing susceptibility to non-infectious diseases such as heart disease, stroke, and lung cancer. The report warned that the pollution crisis was exceeding "the envelope on the amount of pollution the Earth can carry" and "threatens the continuing survival of human societies".
Climate change may weaken the Atlantic meridional overturning circulation (AMOC) through increases in ocean heat content and elevated flows of freshwater from melting ice sheets. The collapse of the AMOC would be a severe climate catastrophe, resulting in a cooling of the Northern Hemisphere. It would have devastating and irreversible impacts especially for Nordic countries, but also for other parts of the world. In late 2025, Iceland classified the potential collapse of the AMOC as a national security and existential risk.
Some have proposed that climate change could cause total human extinction. Carl Sagan and others have raised the prospect of extreme runaway global warming turning Earth into an uninhabitable Venus-like planet. Some scholars argue that much of the world would become uninhabitable under severe global warming, but even these scholars do not tend to argue that it would lead to complete human extinction, according to Kelsey Piper of Vox. All the IPCC scenarios, including the most pessimistic ones, predict temperatures compatible with human survival in at least some locations. The question of human extinction under "unlikely" outlier models is not generally addressed by the scientific literature. FactCheck.org judged that climate change does not pose an "existential risk", stating: "Scientists agree climate change does pose a threat to humans and ecosystems, but they do not envision that climate change will obliterate all people from the planet."