PauseAI
PauseAI is a global political movement founded in the Netherlands with the stated aim of achieving global coordination to stop the development of more powerful general artificial intelligence systems, at least until it is known how to build them safely and keep them under democratic control. The movement was established in Utrecht in May 2023 by software entrepreneur Joep Meindertsma.
Proposal
PauseAI's stated goal is to "implement a temporary pause on the training of the most powerful general AI systems". Its website lists proposed steps to achieve this goal:
- Set up an international AI safety agency, similar to the International Atomic Energy Agency (IAEA).
- Only allow training of general AI systems if their safety can be guaranteed.
- Only allow deployment of models after it has been verified that no dangerous capabilities are present.
Background
History
Founder Joep Meindertsma first became worried about the existential risk from artificial general intelligence after reading philosopher Nick Bostrom's 2014 book Superintelligence: Paths, Dangers, Strategies. He founded PauseAI in May 2023, putting his job as CEO of a software firm on hold. Meindertsma argued that progress in AI alignment research is lagging behind progress in AI capabilities, saying "there is a chance that we are facing extinction in a short frame of time". He therefore felt an urge to organise people to act.

PauseAI's first public action was a protest in front of Microsoft's Brussels lobbying office in May 2023, during an event on artificial intelligence. In November of the same year, the group protested outside the inaugural AI Safety Summit at Bletchley Park. Meindertsma regarded the Bletchley Declaration signed at the summit, which acknowledged the potential for catastrophic risks stemming from AI, as a small first step, but argued that "binding international treaties" are needed. He cited the Montreal Protocol and the treaties banning blinding laser weapons as examples of previous successful global agreements.
In February 2024, members of PauseAI gathered outside OpenAI's headquarters in San Francisco, in part due to OpenAI changing its usage policy that prohibited the use of its models for military purposes.
On 13 May 2024, protests were held across thirteen countries ahead of the AI Seoul Summit, including the United States, the United Kingdom, Brazil, Germany, Australia, and Norway. Meindertsma said that those attending the summit "need to realize that they are the only ones who have the power to stop this race". Protesters in San Francisco held signs reading "When in doubt, pause" and "Quit your job at OpenAI. Trust your conscience". Jan Leike, head of the "superalignment" team at OpenAI, resigned two days later, saying that "safety culture and processes have taken a backseat to shiny products".