A Study of Rationality and Its Community

For anyone interested in understanding the intersection of rationality, philosophy, and technology, LessWrong represents one of the most notable intellectual communities of the past two decades. This report provides a comprehensive exploration of what LessWrong is, its philosophical foundations, and the terms used to refer to its community members.

The Philosophy and Purpose of LessWrong

LessWrong (sometimes written as Less Wrong) is a community blog and forum established in 2009, focused on the discussion of cognitive biases, philosophy, psychology, economics, rationality, and artificial intelligence (source). The site’s stated purpose is refining the art of human rationality.

LessWrong describes itself as “an online forum and community aimed at improving human reasoning, rationality, and decision-making, with the goal of helping its users hold more accurate beliefs and achieve their personal objectives” (source). This definition underscores a key distinction that forms the basis of the LessWrong philosophy: while truthfulness is a property of beliefs, rationality is a property of reasoning processes. In the community’s view, a more rational reasoning process is one that arrives at true beliefs and good decisions more reliably than a less rational one (New User’s Guide).

The community places significant emphasis on applying rationality lessons to important topics, particularly those with high stakes and uncertainty. Currently, artificial intelligence represents one of the most prominent areas of focus, with many community members working to ensure that increasingly powerful AI systems develop safely and beneficially for humanity (New User’s Guide).

Historical Development

LessWrong developed from an earlier group blog called Overcoming Bias, which began in November 2006 with artificial intelligence researcher Eliezer Yudkowsky and economist Robin Hanson as principal contributors (source). In February 2009, Yudkowsky’s posts were used as the foundation to create LessWrong, while Overcoming Bias became Hanson’s personal blog (EA Forum).

The philosophical backbone of LessWrong consists of “The Sequences,” a series of essays written by Yudkowsky between 2006 and 2009 that explore rationality concepts in depth (New User’s Guide). These essays aim to describe how to avoid typical failure modes of human reasoning with the goal of improving decision-making and evidence evaluation (source).

In 2013, a significant portion of the rationalist community shifted focus to Scott Alexander’s Slate Star Codex blog (EA Forum). By 2015, activity on LessWrong had declined considerably. However, the site experienced a revival when it was relaunched as “LessWrong 2.0” in late 2017 with a new codebase and dedicated team (EA Forum). Since then, activity has recovered and maintained steady levels.

References to Community Members

Across various platforms and discussions, members of the LessWrong community are referenced using several different terms:

  1. Less Wronger – A short form used to refer to community members, though as one user notes, “you have to explain it” (GreaterWrong).
  2. Aspiring rationalist – A term reflecting the community’s emphasis on rationality as an ongoing pursuit rather than a fixed state (New User’s Guide, GreaterWrong).
  3. Rationalist – The most commonly used term, though the community acknowledges this may oversimplify their position (New User’s Guide).

The Community’s Cultural Distinctiveness

LessWrong cultivates a unique culture that emphasizes specific norms and values around reasoning and discourse. These include:

  • A focus on collaborative truth-seeking over winning arguments
  • Using Bayesian reasoning to update beliefs based on evidence
  • Avoiding biases in decision-making and evidence evaluation
  • Applying rationality concepts to practical matters and real-world problems
  • Discussing potentially controversial or unusual ideas if they follow from rational inquiry (New User’s Guide)
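The Bayesian updating mentioned above can be made concrete with a short, self-contained sketch. The function and the numbers below are purely illustrative, not drawn from any LessWrong material: they show how a prior belief is revised once evidence arrives.

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return P(H | E), the posterior probability of hypothesis H
    after observing evidence E, via Bayes' theorem."""
    # Total probability of seeing the evidence at all
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Illustrative numbers: start 30% confident a claim is true,
# then observe evidence that is 4x as likely if the claim is true.
posterior = bayes_update(prior=0.3, p_e_given_h=0.8, p_e_given_not_h=0.2)
print(round(posterior, 3))  # 0.632
```

The point of the exercise, in the community's framing, is that beliefs should move by exactly the amount the evidence warrants: strong evidence moves the posterior a lot, weak evidence only a little.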

The community’s culture is described as “uncommon for web forums,” with the site’s administrators and moderators actively working to maintain its distinctive character amid growing attention and an influx of new users (New User’s Guide).

LessWrong has played a significant role in the development of other intellectual movements, most notably effective altruism. The two communities remain closely intertwined, with surveys indicating substantial overlap in membership (source).

Conclusion

LessWrong represents a unique intellectual project centered on improving human rationality through systematic examination of reasoning processes. While terms like “Less Wronger” and “aspiring rationalist” are used to reference community members, these reflect a deeper philosophical commitment to viewing rationality as an ongoing pursuit rather than a fixed achievement. As the community continues to evolve, particularly in its focus on artificial intelligence safety, its distinctive approach to rational inquiry remains its defining characteristic.