A Whirlwind Tour of LW Rationality: 6 Books in 32 Pages - Map And Territory

(Back to Introduction)

Map and Territory A: Predictably Wrong

Epistemic rationality is using new evidence to improve the correspondence between your mental map and the world. Instrumental rationality is effectively accomplishing your goals. (What Do We Mean By Rationality?)

Rationality does not conflict with having strong feelings about true aspects of the world. (Feeling Rational)

Epistemic rationality is useful if you are curious, if you want to be effective, or if you regard it as a moral duty, the last of which can be problematic. (Why Truth? And…) A bias is an obstacle to epistemic rationality produced by the ‘shape’ of our mental machinery. We should be concerned about any obstacle. (…What’s A Bias, Again?)

We use an availability heuristic, judging the probability of something by how easily examples of it come to mind. This is imperfect, producing the availability bias; selective reporting of vivid events is a major source of the distortion. (Availability)

We use a representativeness heuristic to judge the probability of something by how typical it sounds. This suffers from the conjunction fallacy, in which adding details to a claim makes it sound more probable, even though each added detail can only make it less probable. (Burdensome Details)
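
In probability terms (a standard identity, not spelled out in the summary), a conjunction can never be more probable than either of its parts:

```latex
% Conjunction rule: adding a detail B can only shrink the probability.
P(A \land B) = P(A)\,P(B \mid A) \le P(A), \qquad \text{since } 0 \le P(B \mid A) \le 1
```

So each added detail lowers the true probability of a story even as it raises how representative the story sounds.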

We tend to examine only the scenario where things go according to plan. This suffers from the planning fallacy, in which difficulties and delays are underestimated. (Planning Fallacy)

We use our own understanding of words to evaluate how others will understand them. This underestimates differences in interpretation, leading to the illusion of transparency. (Illusion Of Transparency: Why No One Understands You)

Inferential distance is the amount of explanation needed to communicate one person’s reasoning to another. We routinely underestimate it, because our intuitions formed in an ancestral environment where background knowledge varied far less between people than it does today. (Expecting Short Inferential Distances)

A metaphor for the human brain is a flawed lens that can see its own flaws. (The Lens That Sees Its Flaws)

Map and Territory B: Fake Beliefs

A belief should be something that tells you what you expect to see; it should be an anticipation-controller. (Making Beliefs Pay Rent (In Anticipated Experiences))

Adopting a belief can carry social implications, and this results in a variety of compromises to truth-seeking. (A Fable Of Science And Politics) It is possible to believe that you hold a belief while genuinely expecting to observe the opposite; this is belief-in-belief. (Belief In Belief, Bayesian Judo) Holding a neutral position on a question is a position on it like any other. (Pretending To Be Wise)

Religion’s claim to be non-disprovable metaphor is a socially motivated retreat from what were originally factual claims about the world; claims to ethical authority remain only because they have not yet become socially disadvantageous. (Religion’s Claim To Be Non-Disprovable) At other times, people profess extreme beliefs for social reasons, as a way of cheering for something. (Professing And Cheering)

Belief as attire is belief that is professed in order to show membership of a group. (Belief As Attire)

Some statements exist simply to tell the audience to applaud and do not actually express any belief; we call these applause lights. (Applause Lights)

Map and Territory C: Noticing Confusion

When uncertain, we want to concentrate our anticipation, as much as possible, on the outcome that will actually happen. (Focus Your Uncertainty)

For a statement to be true means exactly what you think it means. Evidence is an event entangled, by links of cause and effect, with what you want to know about; things that react to that event become entangled with what you want to know about in turn. Beliefs should be determined in a way that makes them entangled with the world, as this is what makes them accurate; for this to hold, it must be conceivable that different observations would have led you to believe otherwise. (What Is Evidence?)
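
One standard Bayesian way to state the entanglement condition (my formalisation; the summary leaves it informal): an event E is evidence about a hypothesis H exactly when E is more or less likely depending on whether H holds, which is exactly when observing E would move your belief.

```latex
% Assuming 0 < P(H) < 1 and P(E) > 0:
P(E \mid H) \ne P(E \mid \lnot H) \iff P(H \mid E) \ne P(H)
```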

Scientific evidence and legal evidence are subsets of rational evidence. (Scientific Evidence, Legal Evidence, Rational Evidence)

The amount of entanglement needed to justify a strong belief depends on how improbable the hypothesis was to begin with, which is related to the number of possible hypotheses. (How Much Evidence Does It Take?, Einstein’s Arrogance)
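
As a worked example of that bookkeeping (standard Bayesian arithmetic, consistent with the cited posts): in odds form, each piece of evidence multiplies your prior odds by its likelihood ratio, and a 2:1 likelihood ratio is one ‘bit’ of evidence.

```latex
% Bayes' rule in odds form:
\frac{P(H \mid E)}{P(\lnot H \mid E)}
  = \frac{P(H)}{P(\lnot H)} \cdot \frac{P(E \mid H)}{P(E \mid \lnot H)}
% To lift a hypothesis from prior odds of 1 : 2^{20} (about one in a
% million) to even odds, the likelihood ratios must multiply to 2^{20}:
% twenty independent one-bit observations suffice.
```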

Occam’s Razor is the principle that the correct explanation is the simplest that fits the facts. ‘Simplest’ must be measured as the length of the shortest program that fully specifies the explanation (a universe that runs it), not as English sentence length. Solomonoff Induction is a formalisation of this; one variant predicts sequences by assigning each program a prior probability of 2^(-bit length) and then weighting programs by how well their predictions fit the observations. Under this definition, an explanation that merely embeds a copy of the observations pays a probability penalty for every bit embedded, so only explanations that compress the observations are rewarded. (Occam’s Razor)
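
A compact statement of the variant described (notation mine): a program p of bit length ℓ(p) gets prior weight 2^(-ℓ(p)), and keeps that weight only insofar as its output matches the observations.

```latex
% Solomonoff-style prior over programs in a prefix-free language:
P(p) \propto 2^{-\ell(p)}
% A program that simply embeds n bits of the observations is roughly
% n bits longer, costing a factor of 2^{-n}; only genuine compression
% of the data gains net probability.
```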

Your strength as a rationalist is your ability to notice confusion: your sense that your explanation feels forced. (Your Strength As A Rationalist)

Absence of evidence is evidence of absence: if observing something would increase your probability that a claim is true, then failing to observe it must decrease that probability, by an amount depending on how likely the observation was in either case. (Absence Of Evidence Is Evidence Of Absence) More generally, expected evidence is conserved: you cannot expect, on average, to update your belief in any particular direction. (Conservation Of Expected Evidence)
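
The conservation law as an identity (standard probability, matching the post): your current belief is already the expectation of your future belief, so any anticipated update in one direction must be balanced by a possible update the other way.

```latex
% Conservation of expected evidence:
P(H) = P(H \mid E)\,P(E) + P(H \mid \lnot E)\,P(\lnot E)
% If observing E would raise P(H), then observing \lnot E must lower it,
% weighted by how likely each observation was.
```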

We have a hindsight bias that makes us feel, on learning something, that we believed it all along; this leads us to undervalue scientific findings as obvious. (Hindsight Devalues Science)

Map and Territory D: Mysterious Answers

A fake explanation is an explanation that can explain any observation. (Fake Explanations) Using scientific-sounding words in one is using science as attire, not actually adhering to science. (Science As Attire) After seeing a thing happen, we tend to explain it as caused by some phenomenon, even when we could not have predicted it ahead of time from our knowledge of that phenomenon. This is fake causality, and the hindsight bias makes it hard to notice. The hindsight bias arises because, when evaluating how likely we would have judged a claim beforehand, we fail to exclude the evidence of having now seen the claim. (Fake Causality)

Positive bias is attempting to confirm rather than disconfirm theories, which fails to properly test them. (Positive Bias: Look Into The Dark)

When asked to proffer an explanation, humans will commonly pull out plausible-sounding phrases and offer them without a coherent model behind them. We call this guessing the teacher’s password. (Guessing The Teacher’s Password) A good test of whether you truly understand a fact, rather than having memorised it as a password answer, is to ask whether you could regenerate it if you forgot it. (Truly Part Of You)

Facing an irrational or random environment does not require irrational or random behaviour in response, despite the strong human intuition that it does; a lawful, deterministic strategy remains best, as the simulation below illustrates. (Lawful Uncertainty)
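
The post’s card experiment makes this concrete: cards come up blue 70% of the time and red 30%, and subjects ‘match’ by guessing blue only 70% of the time. A minimal simulation (an illustrative sketch, not code from the post) shows why the lawful strategy wins:

```python
import random

P_BLUE = 0.7        # the deck shows blue 70% of the time
ROUNDS = 100_000

def accuracy(strategy):
    """Fraction of correct guesses for a guessing strategy over many rounds."""
    correct = 0
    for _ in range(ROUNDS):
        card = 'blue' if random.random() < P_BLUE else 'red'
        if strategy() == card:
            correct += 1
    return correct / ROUNDS

# Probability matching: guess blue with the same frequency it appears.
matching = lambda: 'blue' if random.random() < P_BLUE else 'red'
# Lawful strategy: always guess the more probable colour.
maximizing = lambda: 'blue'

random.seed(0)
print(f"probability matching: {accuracy(matching):.3f}")   # ~0.58 = 0.7^2 + 0.3^2
print(f"always guess blue:    {accuracy(maximizing):.3f}") # ~0.70
```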

A fake explanation often serves as a sign to end further examination despite containing no real understanding, in which case it is a semantic stopsign or curiosity-stopper. (Semantic Stopsigns) We should not expect answers to be ‘mysterious’, even for ‘mysterious questions’, such as the cause of fire or life. (Mysterious Answers To Mysterious Questions) Any time humans encounter a phenomenon, they can choose to try to explain it, worship it, or ignore it. (Explain/Worship/Ignore?)

The term ‘emergence’ is a contemporary fake explanation and semantic stopsign. (The Futility Of Emergence) The word ‘complexity’, treated as a desirable ingredient that can be added to a design, can be another. When trying to understand something, it is tempting to assign fake explanations to its mysterious parts; this must be resisted. (Say Not Complexity)

Eliezer failed at this in his earlier days, despite knowing to reject the standard ‘fake explanations’; it takes a lot of improvement not to simply make new, interesting mistakes in place of the old ones. (My Wild And Reckless Youth) Solving a mystery should make it feel less confusing, but because it is hard to recover what believing an old fake explanation felt like from the inside, it is hard to learn to recognise new ones. (Failing To Learn From History) Trying to visualise believing in ideas like “elan vital”, without being able to immediately see your error, may help. (Making History Available)

Explanations like ‘Science’ can serve as curiosity-stoppers, by telling us that someone else knows the answer. (“Science” As Curiosity-Stopper)

(Continue with “How To Actually Change Your Mind”)