Risk of astronomical suffering
Risks of astronomical suffering, also called suffering risks or s-risks, are risks involving much more suffering than all that has occurred on Earth so far.
According to some scholars, s-risks warrant serious consideration because they are not extremely unlikely and can arise from unforeseen scenarios. Although they may appear speculative, factors such as technological advancement, power dynamics, and historical precedents suggest that advanced technology could inadvertently produce suffering on an enormous scale.
Sources of possible s-risks include advanced artificial intelligence, space colonization and the spread of wild animal suffering to other planets.
S-risks can be intentional, driven by factors such as tribalism, sadism, or a desire for retribution, or incidental, arising for example as a byproduct of economic incentives.
Artificial intelligence
Artificial intelligence is central to s-risk discussions because it may eventually enable powerful actors to control vast technological systems. In a worst-case scenario, AI could be used to create systems of perpetual suffering, such as a totalitarian regime expanding across space. S-risks might also arise incidentally, for example through AI-driven simulations of conscious beings that experience suffering, or from economic activities that disregard the well-being of nonhuman or digital minds. Steven Umbrello, an AI ethics researcher, has warned that biological computing may make system design more prone to s-risks.
Space colonization
Space colonization could increase suffering by introducing wild animals to new environments, where they may struggle to survive, facing hunger, disease, and predation on a vast scale. Phil Torres argues that space colonization poses significant "suffering risks": expansion into space would produce diverse civilizations and post-human species with conflicting interests. These differences, combined with advanced weaponry and the vast distances between civilizations slowing communication, could result in catastrophic and unresolvable conflicts. Strategies such as a "cosmic Leviathan" to impose order, or deterrence policies, are constrained by the physics of space and the destructive power of future technologies. Torres therefore believes that space colonization should be delayed or avoided altogether.
Magnus Vinding's "astronomical atrocity problem" questions whether vast amounts of happiness can justify the extreme suffering that space colonization might produce. He highlights moral concerns such as diminishing returns on positive goods, the potentially incomparable weight of severe suffering, and the priority of preventing misery. He argues that if colonization is inevitable, it should be led by agents deeply committed to minimizing harm.