Rationalist community


The rationalist community is a 21st-century movement that formed around a group of internet blogs, primarily LessWrong and Astral Codex Ten.
The movement initially gained prominence in the San Francisco Bay Area.
Its members seek to use rationality to avoid cognitive biases.
Common interests include probability, effective altruism, transhumanism, and mitigating existential risk from artificial general intelligence.
The borders of the rationalist community are blurry and subject to debate among the community and adjacent groups. Members who diverge from typical rationalist beliefs often self-describe as "rationalist-adjacent", "post-rationalist" or "EA-adjacent".

Description

Rationality

Rationalists define rationality to include epistemic rationality and instrumental rationality.
The rationalists are concerned with applying science and probability to various topics, with special attention to Bayesian inference. According to Ellen Huet, members of the rationalist community "aim to keep their thinking unbiased, even when the conclusions are scary".
The early rationalist blogs LessWrong and Slate Star Codex attracted an audience interested in STEM and self-improvement, one suspicious of the academic humanities and of the way emotions can inhibit rational thinking. However, rationalists rejected the Spock-style archetype of emotionlessness; LessWrong founder Eliezer Yudkowsky argued that emotions can themselves be instrumentally rational responses to situations. The movement attracted the attention of Silicon Valley's founder culture, leading to many shared cultural shibboleths and obsessions, especially optimism about the ability of intelligent capitalists and technocrats to create widespread prosperity.
Writing for The New Atlantis, Tara Isabella Burton describes rationalist culture as having a "technocratic focus on ameliorating the human condition through hyper-utilitarian goals", with the "distinctly liberal optimism... that defines so much of Silicon Valley ideology — that intelligent people, using the right epistemic tools, can think better, and save the world by doing so". Burton writes that "Central to the rationalist worldview was the idea that nothing — not social niceties, not fear of political incorrectness, certainly not unwarranted emotion — could, or should, get between human beings and their ability to apprehend the world as it really is".

AI safety

One of the main interests of the rationalist community is combating the existential risk posed by the emergence of an artificial superintelligence. Many members believe the community is one of the only groups with a chance of saving humanity from extinction. The stress associated with this perceived responsibility has contributed to mental health crises among several rationalists.

Extreme values

Bloomberg Businessweek journalist Ellen Huet adds that the rationalist movement "valorizes extremes: seeking rational truth above all else, donating the most money and doing the utmost good for the most important reason. This way of thinking can lend an attractive clarity, but it can also provide cover for destructive or despicable behavior".
Writing in The New Yorker, Gideon Lewis-Kraus argues that rationalists "have given safe harbor to some genuinely egregious ideas," such as scientific racism and neoreactionary views, and that "the rationalists' general willingness to pursue orderly exchanges on objectionable topics, often with monstrous people, remains not only a point of pride but a constitutive part of the subculture's self-understanding." Though this attitude is based on "the view that vile ideas should be countenanced and refuted rather than left to accrue the status of forbidden knowledge", rationalists also hold the view that other ideas, referred to as information hazards, are dangerous and should be suppressed. Roko's basilisk and the writings of Ziz LaSota are commonly cited information hazards among rationalists.
Some members and former members of the community have said that aspects of the community can be cult-like.
In The New York Times, religious scholar Greg Epstein stated: "When you think about the billions at stake and the radical transformation of lives across the world because of the eccentric vision of this group, how much more cult-y does it have to be for this to be a cult? Not much."

Lifestyle

While the movement has online origins, the community is also active and close-knit offline. The community is especially active in the San Francisco Bay Area, where many rationalists live in intentional communities and some engage in polyamorous relationships with other rationalists.

History

LessWrong was founded in 2009, although the community had previously existed on various blogs on the Internet, including Overcoming Bias. Slate Star Codex was launched in 2013, and its successor blog Astral Codex Ten was launched on January 21, 2021.
Eliezer Yudkowsky created LessWrong and is regarded as a major figure within the movement. From 2010 to 2015 he also published the Harry Potter fanfiction Harry Potter and the Methods of Rationality, which drew readers to LessWrong and the rationalist community; the work was highly popular and remains well regarded within the community. Yudkowsky has used it to solicit donations for the Center for Applied Rationality, which teaches courses based on it, and a 2013 LessWrong survey found that a quarter of the site's users had discovered it through the fanfiction.
In the 2010s, the rationalist community emerged as a major force in Silicon Valley. Silicon Valley founders such as Elon Musk, Peter Thiel, Vitalik Buterin, Dustin Moskovitz, and Jaan Tallinn have donated to rationalist-associated institutions or otherwise supported rationalist figures. The movement has directed hundreds of millions of dollars towards companies, research labs, and think tanks aligned with its objectives, and was influential in the brief 2023 removal of Sam Altman from OpenAI.
Bay Area organizations associated with the rationalist community include the Center for Applied Rationality, which teaches the techniques of rationality espoused by rationalists, and the Machine Intelligence Research Institute, which conducts research on AI safety.

Overlapping movements and offshoots

The borders of the rationalist community are blurry and subject to debate among the community and adjacent groups. The rationalist community has a large overlap with effective altruism and transhumanism. Critics such as computer scientist Timnit Gebru and philosopher Émile P. Torres link rationalists with other philosophies they collectively name TESCREAL: transhumanism, extropianism, singularitarianism, cosmism, rationalism, effective altruism, and longtermism.
Members who diverge from typical rationalist beliefs often self-describe as "rationalist-adjacent", "post-rationalist" or "EA-adjacent".

Effective altruism

The rationalist community overlaps heavily with the effective altruism movement, which seeks to use evidence and reasoning to benefit others as effectively as possible; many rationalists identify as effective altruists or as "EA-adjacent".

Postrationalists

The postrationalists are a loose group of one-time rationalists who became disillusioned with the rationalist community, which they came to perceive as dogmatic and "a little culty", and as having lost focus on the less quantifiable elements of a well-lived human life. The community also goes by the acronym TPOT, standing for "This Part of Twitter". The term postrationalist is also used as a hedge by people associated with the rationalist community who have drifted from its orthodoxy.

Zizians

The Zizians are a splinter group with an ideological emphasis on veganism and anarchism, which became widely known in 2025 when its members were suspected of involvement in four murders. The Zizians originally formed around the Bay Area rationalist community but became disillusioned with other rationalist organizations and leaders, accusing them of anti-transgender discrimination, of misusing donor funds to pay off a sexual misconduct accuser, and of failing to value animal welfare in plans for human-friendly AI.
The group has been called a cult or cult-like by publications such as The Independent, the Associated Press, SFGate, and Reason. The Boston Globe and The New York Times have compared the Zizians to the Manson Family. Similarly, Anna Salamon, the director of the Center for Applied Rationality, compared the Zizian belief system to that of a doomsday cult.