0. Key points
- Having correct beliefs about the world is crucial for making good decisions
- Having correct beliefs is not that easy, as we run on corrupted hardware that is full of biases and other unhelpful cognitive moves, e.g. Motivated Reasoning, Resulting, Self-Serving Bias, Hindsight Bias, etc.
- Think of decisions as bets about the future: you are saying that this action is the one most likely to lead to a desired future state
- Work with other people as a group, helping each other build better beliefs, by building a self-correcting epistemic community. Use Robert Merton’s principles of Communism, Universalism, Disinterestedness, and Organized Skepticism (CUDOS)
- Some other tools you might want to use: precommitment, decision swear jars, backcasting, premortems, putting numbers on possible outcomes
1. Why care about correct beliefs?
You probably have some things in life you want to achieve, things you prefer to happen over other things. You probably want to stay healthy, have good relationships, and succeed in whatever is important to you, depending on your values. To achieve those things, you are making decisions all the time: small decisions in your everyday life, up to bigger decisions that change your life trajectory all at once. Depending on the decision you take, you might end up in a preferable world state, or in one that is less preferable. Obviously, making the better decision is better for you, if you care about reaching your goals and living out your values. Making good decisions means shifting the probability of ending up in a desired world state upwards.
For that, we need beliefs about the world that are as correct as possible. Our decisions are based on our beliefs about the world. If I believe that sugar is healthy and that there is nothing wrong with smoking cigarettes, I might choose to base my diet on sweets and end up smoking a pack a day, which would make me more likely to develop health issues. Fair, these two beliefs are obviously false. But sometimes even subtle differences in our beliefs about the world can have quite noticeable impacts on where we end up. Whether I believe in the potential for development (growth mindset) or in fixed traits (fixed mindset) affects how I approach new challenges, and how I learn and grow. In the long run, this difference in belief about oneself can have quite a different impact on my decisions and, subsequently, on how my life plays out.
Beliefs are hard
Sadly, having correct beliefs about the world is not that straightforward.
We humans run on faulty and corrupted hardware and are far from ideal Bayesian agents that update by just the right amount according to new evidence. No, we use heuristics and cognitive moves that are far from ideal. We tend to believe things we hear without thinking about them first, especially when they are embedded in a narrative (conspiracy theories, health advice from Instagram, prejudices, the Great Wall of China being visible from space, …). We use Motivated Reasoning, a style of thinking that starts with the conclusion and then looks for arguments that support it. The conclusion is chosen not by virtue of being true, but mostly for other reasons, e.g. a conclusion that signals our affiliation with a tribe (ahem ahem all of political discussion ahem).
The world is an uncertain place. Before it actually happens, an event has some probability of happening, alongside other events that could happen instead. Our beliefs about the world should reflect this by being probabilistic - we cannot be sure about what is the case. But we tend to think in black and white: either a thing happens / is true, or it is not. This all-or-nothing thinking, in only 100% or 0%, makes it hard for us to update on new evidence, because we can only go from completely right to completely wrong. And being wrong feels bad [CITATION NEEDED], so we tend not to update at all and stick to our initial guess. Probabilistic thinking, on the other hand, would let us update a little bit on each new piece of evidence, reflecting the world more accurately.
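To make "updating by the right amount" concrete, here is a minimal sketch of a single Bayesian update in Python. The scenario and all numbers are made up for illustration:

```python
def bayes_update(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Posterior probability of a hypothesis after seeing one piece of evidence."""
    numerator = p_evidence_if_true * prior
    return numerator / (numerator + p_evidence_if_false * (1 - prior))

# Hypothetical example: I am 60% confident in a claim. A friend who is
# right about such things ~80% of the time tells me the claim is false.
belief = 0.6
belief = bayes_update(belief, p_evidence_if_true=0.2, p_evidence_if_false=0.8)
print(f"Updated belief: {belief:.2f}")  # ~0.27: shifted down, but not all the way to 0
```

The point is the shape of the move: the belief drops from 0.60 to roughly 0.27 instead of snapping from “true” to “false”.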
Decisions and Updating are hard
But wait, there is more!
Evolution gave us quite a few more heuristics that were probably useful for survival back then, but are not helpful if we want to find truth today. We can identify many phenomena that distort our beliefs and our decision making.
Shit we humans do:
- Resulting. Treating a decision as good or bad depending on the outcome. Just because you won a hand at poker doesn’t mean it was a smart decision to play it. And vice versa.
- Probability Matching. If you have a 70% chance of drawing a red ball from a bowl and a 30% chance of drawing a blue one, the optimal strategy is to always guess red. This is not what humans tend to do: we tend to guess red about 70% of the time and blue about 30%, which wins less often (see the simulation after this list).
- Self-Serving Bias. In ourselves, we categorize bad outcomes as due to luck (“I was just unlucky”) and good outcomes as due to skill (“it worked because I am good”), because it serves our self-image. This leads to us not learning from bad outcomes (if it was due to luck, I cannot improve) and learning the wrong lessons from good outcomes (it worked, so I should do this more often). The reverse is true when we categorize the outcomes of other people.
- Temporal Discounting. We favor our present self at the expense of our future self. See the Marshmallow Test for an illustration, and our everyday behaviour around instant gratification for the application.
- Hindsight Bias. We tend to misremember how clear it was that something would happen, after it has happened. The enemy of probabilistic thinking. “Of course it was going to happen, it was inevitable, you must have known” - this shit. Did you write down your prediction beforehand? No? Then this is too easy to say.
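To see what probability matching costs, here is a small simulation of the red/blue bowl from above. The code and strategy names are mine, just to make the arithmetic visible:

```python
import random

P_RED = 0.7  # the bowl: 70% red balls, 30% blue
TRIALS = 100_000

def accuracy(strategy) -> float:
    """Fraction of correct guesses over many independent draws."""
    hits = 0
    for _ in range(TRIALS):
        ball = "red" if random.random() < P_RED else "blue"
        hits += strategy() == ball
    return hits / TRIALS

def always_red():
    return "red"

def probability_matching():
    # Guess red 70% of the time, blue 30% of the time.
    return "red" if random.random() < P_RED else "blue"

print(f"always red: {accuracy(always_red):.2f}")            # ~0.70
print(f"matching:   {accuracy(probability_matching):.2f}")  # ~0.58 = 0.7*0.7 + 0.3*0.3
```

Matching “feels” responsive to the odds, but the all-red strategy is right about 12 percentage points more often.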
All this is of course only a selection of everything that makes us less-than-optimal decision makers. But even knowing them all does not seem to help that much in itself. Being smarter could even be problematic if unchecked: as per the bias blind spot, we default to recognizing these faults in others, but not in ourselves.
So, is there something we can do about that?
2. Are we screwed?
Short answer: Probably, but we can try.
This is not a good predicament to find oneself in - we want to make good decisions to achieve our goals, but our brains are working against us, thanks to evolutionarily trained-in behavioral patterns that stopped working when we changed our environment into one that does not resemble our ancestral one in the slightest. So we have to use some tools to help us work against these default patterns.
The first tool is a perspective, a framework to view decisions through: viewing them as bets. It is what you do all the time anyway: basically, a decision is a bet on a future outcome. You are betting that the decision you take is the one most likely to lead to an outcome you like. It helps to make this explicit and really ask yourself: “(what) would you bet on it?” - this prompts you to think about the beliefs your decision is founded on and helps you think in probabilities.
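One way to turn “would you bet on it?” into an actual number: the odds you would accept imply a probability. A minimal sketch, where the helper function and the example bet are hypothetical:

```python
def implied_probability(stake: float, payout: float) -> float:
    """If you risk `stake` to win `payout`, the bet breaks even exactly when
    your belief in the outcome is stake / (stake + payout)."""
    return stake / (stake + payout)

# "I'd put 20 on this project shipping on time, against your 10."
# Offering that bet only makes sense if you believe p >= 2/3.
print(f"Implied belief: {implied_probability(20, 10):.2f}")  # 0.67
```

If saying the number out loud makes you flinch, that is useful information about your actual belief.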
This works best with other people. If you do it by yourself, it is easy to weasel out and skip the “actually thinking about it” part, because there is no one to hold you accountable, and you are pretty good at convincing yourself that you did your part while not actually having done it. With other people around, this is not so easy. Your social brain is a really powerful motivator (this is why Focusmate works), so it would be wise to use it to your advantage. Having other people to reflect on your decisions and beliefs with is probably the best you can do, conditional on you all getting the social dynamics right.
Building a self correcting epistemic community
Ideally, we’d want a community, a group, a culture with norms that reward being open and intellectually honest. We humans crave approval. We usually end up in echo chambers, where your opinions and views are the same as the group’s and everyone pats everyone else on the back for having such good opinions and views. If we don’t want to end up there, we have to base our approval on accuracy and intellectual honesty. Reinforcing each other for sharing our true viewpoints in all possible detail leads to a diversity of viewpoints and better arguments that you can then run your model against to make it stronger (steelmanning).
We don’t have to invent this Alcoholics Anonymous for truth-seeking from scratch; we can look to existing work. Robert Merton formulated some principles for his ideal-type model of a self-correcting epistemic community. His CUDOS norms are:
- Communism. Data belongs to the group. You should share all the data you have, especially if there is something that makes you uncomfortable. You might not realize the relevance of the data you provide, or you might tell it from a new perspective that gives new insight (Rashomon Effect).
- Universalism. “Don’t shoot the messenger” - don’t dismiss an idea just because you don’t like who or where it came from (and vice versa). Apply uniform standards to claims and evidence, regardless of where they come from.
- Disinterestedness. Vigilance against potential conflicts of interest that can influence the group’s evaluation. The obvious one is financial. Other forms: interpreting the world so it confirms your beliefs, comparing well with peers, living in a world where stuff makes sense. Don’t let someone know how the story ends, to discourage resulting.
- Organized Skepticism. Asking why things might not be true, rather than why they are true. This is consistent with good manners and friendly communication. Not “you’re wrong”, but rather “I’m not sure about that / are you sure about that?” - we come from a place of not being sure, of probabilities.
Finding a group where you can establish these rules seems like one of the most useful things to do if you value having true beliefs and making good decisions. Some patterns and other behavioral tools I would like to see in such a group:
- Talk about your wins and identify the mistakes in them
- Challenge each other to a bet. This reduces motivated reasoning and holds everyone accountable
- Point out when someone is tilting. To tilt means letting bad outcomes impact your emotions, which in turn compromises your decision making, so you make irrational decisions that lead to bad outcomes, and so on.
It seems that building a group/community like this has to be deliberate. Especially in the beginning, you would have to make a conscious effort to stick to these rules and act according to these principles instead of going with your autopilot. Building a collective epistemic immune system probably takes some time and effort.
More tools to use
Some other useful tools and frames you might want to use to work around your biases:
- Using Ulysses contracts, precommitting to something.
- Decision swear jars. When we catch ourselves using certain words or succumbing to the thinking patterns we are trying to avoid, a stop-and-think moment can be created. Commit to vigilance around a list of words, phrases, and thoughts, such as signs of certainty (“I’m sure”, “100%”, “always happens this way”), overconfidence, complaining about bad luck, shooting the messenger, the word “wrong”, and so on. Maybe this could also be implemented in your group? It is best if you notice by yourself, but maybe it takes someone else to point it out to you.
- Mapping the future. Figure out the possible outcomes of a decision, then try to put probabilities on them. You are already guessing; making it explicit helps you move away from black-and-white thinking (see the sketch after this list). This also works best in a group.
- Backcasting. Imagining the successful outcome in the future and working backwards from there: what did you do to get there?
- Premortem. Imagining a negative future in which we failed, and working backwards from there: what went wrong? This seems to produce better results than working backwards from a positive future [source]
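Here is what “mapping the future” could look like on paper, as a minimal Python sketch. The decision, the outcomes, their probabilities, and the value scale are all made up for illustration:

```python
# Hypothetical map of the future for one decision: accepting a job offer.
# Each outcome gets a probability (they must sum to 1) and a rough value.
outcomes = {
    "great fit, I thrive":        (0.4, +2),
    "okay, roughly as before":    (0.4,  0),
    "bad fit, I leave in a year": (0.2, -1),
}

assert abs(sum(p for p, _ in outcomes.values()) - 1.0) < 1e-9, "probabilities must sum to 1"

expected_value = sum(p * v for p, v in outcomes.values())
print(f"Expected value: {expected_value:+.1f}")  # +0.6 on this made-up scale
```

The numbers will be rough guesses, but the exercise forces you to admit that several futures are possible, and to say how likely you consider each one.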
Possible Problems
We don’t want to end up in a community that is only about criticizing each other, tearing each other down before we have built anything. Giving only destructive criticism is not useful either; we might need to kill Socrates. Learn to recognize these patterns of unhelpful criticism and to differentiate between them and actually helpful criticism.