
Less Wrong

A compendium of Less Wrong.

Less Wrong is a community resource devoted to refining the art of human rationality, sometimes known as rationalism.

Less Wrong promotes lifestyle changes that lead to increased rationality and self-improvement. Its posts often focus on avoiding biases in decision-making, the evaluation of evidence, and the psychological barriers that prevent good decision-making.

In the early 2000s, American artificial intelligence researcher Eliezer Yudkowsky was frequently annoyed, frustrated, and disappointed by people's inability to think in ways he considered obviously rational. He eventually decided to focus on teaching the rationality skills necessary for AI safety research until a sustainable culture existed that would let him concentrate on AI safety research while also finding and training new AI safety researchers.

In November 2006, Eliezer Yudkowsky and Robin Hanson launched the blog Overcoming Bias, focused on human rationality.

In February 2009, after the blog's topics had drifted more widely, Yudkowsky moved his Overcoming Bias contributions to the new community blog Less Wrong, and Overcoming Bias became Hanson's personal blog.

LessWrong started with a series of daily blog posts written by Yudkowsky, originally called The Sequences, which were later compiled into the edited volume Rationality: A-Z. These writings attracted a large community of readers and writers interested in the art of human rationality.

Diaspora

Starting in 2012, many core members of the community stopped posting on Less Wrong, owing to the growth of the in-person community in the Bay Area and to increased demands and opportunities from other projects. Yudkowsky received enough support to focus on AI research instead of community-building. Prominent writers on Less Wrong left for their own blogs, where they could develop their voices without asking whether a topic fell within the bounds of Less Wrong; collectively, these blogs formed the 'rationalist movement.'

Less Wrong acknowledges this departure of top writers and lists blogs from the diaspora and rationalist movement.

Some prominent ideas that grew out of the Less Wrong community include:

In 2015-2016, the site's activity continued a steady decline, leading some to declare the site dead.

LessWrong 2.0


In 2016-2017, discussion of a revival took place, and Oliver Habryka formed a team to relaunch LessWrong on a new codebase as LessWrong 2.0, beginning in 2017.

LessWrong 2.0 was the first time LessWrong had a dedicated full-time development team behind it, rather than relying only on volunteer hours.

After the site was nearly put into read-only archive mode, LessWrong 2.0 brought it back to life, fully launching in 2018 with a new codebase and a full-time team.

LessWrong is a place to 1) develop and train rationality, and 2) apply one’s rationality to real-world problems.

The LessWrong Books

For the first time, you can buy the best new ideas on LessWrong in a physical book set.

In 2019, LessWrong started an annual review process to determine the best content on the site. The community reviewed all posts published on LessWrong in 2018 and voted to rank the best of them.

Of the more than 2,000 LessWrong posts reviewed, the LessWrong 2018 book contains 41 of the top-voted essays, along with selected comment sections, reviews, a few extra essays to give context, and some prefatory material.

Rationality: From AI to Zombies

Rationality: A-Z, also referred to as The Sequences, is an extensive exploration of how human minds come to understand the world they exist in, and of the reasons they commonly fail to do so.

These sequences are a series of essays written by Eliezer Yudkowsky between 2006 and 2009 on the blogs Overcoming Bias and Less Wrong. About half of these essays were organized into a number of thematically linked “sequences” of blog posts.

You can read the original sequences on Less Wrong or on this microsite, or buy the books.

The comprehensive work:

  • Lays out foundational conceptions of belief, evidence, and understanding.
  • Reviews the systematic biases and common excuses which cause us to believe false things.
  • Offers guidance on how to change our minds and how to use language effectively.
  • Depicts the nature of human psychology with reference to evolution.
  • Clarifies the kind of morality we can have in a reducible, physical world.
  • Repeatedly reminds us that confusion and mystery exist only in our minds.

Abridged indexes of the sequences:

The Books

R:AZ 1 Map and Territory

R:AZ 2 How to Actually Change Your Mind

R:AZ 3 The Machine in the Ghost

R:AZ 4 Mere Reality

R:AZ 5 Mere Goodness

R:AZ 6 Becoming Stronger

The Codex

Harry Potter and the Methods of Rationality