Logical fallacies are errors in human reasoning that exist for a reason - in many cases they lead to good outcomes.
Yet these shortcuts in reasoning can also get us in trouble.
Awareness of common logical fallacies can help you to identify these situations and avoid mistakes. It’s especially important to be aware of the logical fallacies that can make us vulnerable to others who are willing to exploit us.
The brain operates in a challenging environment - constantly filtering out information, drawing conclusions from incomplete information and deciding what to remember and what to forget.
A logical fallacy is a mental shortcut that speeds up decision making. The cost of these shortcuts is biased decision making - we consistently make the same errors using the same faulty logic.
Automated shortcuts that are correct 90% of the time offer an advantage over slowly thinking your way through each situation. It’s the 10% of situations where awareness of common fallacies can help.
Where all think alike, no one thinks very much.
Walter Lippmann
Social proof is copying the behaviour of others. It is useful in social situations where correct behaviour is ambiguous or unclear.
The problem with social proof arises when we copy in a mindless or reflexive fashion - imitating automatically without questioning what we are copying, even if the outcome is bad.
In 1964 Kitty Genovese was murdered in New York with 38 witnesses - none of whom helped. Each one looked to the others for guidance on how to act - and concluded that not helping was appropriate. If you are wondering how Kitty could have made one of the bystanders help - the answer is to address a specific person, rather than making general cries for help.
Group members will also go along with a norm they privately reject in order to fit in with the group (known as pluralistic ignorance). Many Germans in the 1930s likely either became Nazis or appeared to be Nazis because of social proof.
This leads nicely into our second group of fallacies - consistency and commitment.
Wherever you turn, this consistency and commitment tendency is affecting you. In other words, what you think may change what you do, but perhaps even more important, what you do will change what you think.
Charlie Munger
Commitment bias is the desire to be (and appear to be) consistent with our past declarations and commitments. Consistency bias is the desire to be (and appear to be) consistent with our past behaviour.
Most people think that our thinking drives our behaviour; it's actually the opposite. Our past commitments and actions form the mental model we have of ourselves. Nothing influences your self-images (you have more than one) as much as your past behaviour.
As with all biases, consistency and commitment bias developed from the usefulness of automating decisions - instead of remaking decisions, we look to our past actions for guidance on what to do next.
The problems arise when we feel pressure to be consistent with sub-optimal past actions or commitments - such as eating poorly because 'that's what I do on a Friday', or the more deadly 'this is what I'm like'.
Commitment bias can be exploited using the foot-in-the-door technique - first asking for a small commitment, then a larger one. The small commitment changes how people view themselves - they become the kind of person who does that kind of thing (such as donating to charity). Be careful about agreeing to trivial requests!
Consistency and commitment is a fallacy that can be hacked to your advantage. The more effort you put into committing to who you want to be, the more likely you are to act that way.
It’s also worth catching moments when you make statements like ‘I am lazy’ or ‘I am confident’. These statements will impact how you act - make sure they affirm the qualities you want. Public commitments create additional pressure for consistency. The more you can make your commitment public, the better (or perhaps worse!).
Don’t believe everything you think.
Thubten Chodron
Understanding the problems with human memory is a significant step in understanding the limits of the human condition. The problems are numerous - here I'd like to focus on two.
False memory is the phenomenon of recalling something that either didn’t happen or differently from the way it happened. These false memories are remembered confidently, without conscious intention to deceive.
Suggestibility occurs when false but plausible information is used to fill in the gaps in memory. Suggestibility might be useful for fitting into social groups - recalling memories in a way that is consistent with others.
One value of false memory is being able to appear better to others - rather than saying you can’t remember, you instead present a picture of perfect recall. Another reason the brain might do this is to fill in gaps in memory that allow sequences to be used for learning (similar to how lagged features are used for time series forecasting).
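The lagged-features analogy can be sketched in code - a minimal, hypothetical example of turning a time series into (lags, target) pairs for supervised learning, where past values provide the context for predicting the next one:

```python
# Build lagged features for a time series - each row pairs past values
# (the "filled-in" context) with the next value to predict.
def make_lagged_features(series, n_lags):
    rows = []
    for i in range(n_lags, len(series)):
        lags = series[i - n_lags:i]   # the previous n_lags observations
        target = series[i]            # the value to predict
        rows.append((lags, target))
    return rows

temperature = [20, 21, 23, 22, 24, 25]
features = make_lagged_features(temperature, n_lags=2)
# The first pair is ([20, 21], 23) - a learnable sequence. The brain
# filling memory gaps may serve a similar role: keeping sequences
# intact so they can be learned from.
```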
The problems of false memory and suggestibility are obvious - especially given how highly eyewitness testimony is valued by our legal system. Treat your own memories with a degree of skepticism - don't believe everything you remember.
The great source of both the misery and disorders of human life seems to arise from over-rating the difference between one situation and another.
Adam Smith
Contrast bias is the distortion in judgment that occurs when we are sequentially presented with things that differ. The prior event influences our experience of the second, leaving us seeing the second thing as more different than it actually is.
The contrast effect is useful for making it clear what the best or worst decisions might be. However the contrast effect is perhaps the most frequently and easily exploited of all logical fallacies - especially because it can swing both ways.
An example of the contrast bias being exploited is a real estate agent showing run down houses before showing a good house. Another would be to show overpriced houses before showing the house the estate agent wanted to sell.
Distinction bias is the tendency to view two options as more distinctive when evaluating them simultaneously than when evaluating them separately.
Distinction bias is useful in allowing the brain to discard details that are similar and focus only on what is different.
The problem with distinction bias occurs when it encourages decision paralysis for trivial decisions - our inability to make a decision between two choices that are essentially the same.
An example of distinction bias is being unable to choose between two similar TVs presented side by side at a store, when neither offers any benefit over the other. The time and stress wasted on such a decision is not worth it.
Anchoring is a cognitive bias that occurs when people rely too heavily on an initial piece of information, known as the “anchor”, when making decisions. This often leads to a distortion in the decision-making process as the initial information holds more weight than it should.
When people are presented with an anchor value, they tend to adjust their estimates based on that value, but often insufficiently. This leads to biased judgments that are influenced more by the initial anchor than by the actual information available.
Anchoring can be useful in situations where there is limited information available, and the anchor provides a reasonable starting point for decision-making. However, it becomes problematic when the anchor is irrelevant, misleading, or biased.
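A toy simulation can make the insufficient-adjustment mechanism concrete - a minimal sketch, where the adjustment factor and the values are assumptions for illustration, not figures from any study:

```python
# A toy model of anchoring-and-adjustment: people start at the anchor
# and adjust toward the true value, but only partially.
def anchored_estimate(anchor, true_value, adjustment=0.6):
    # adjustment < 1.0 means the correction stops short of the truth
    return anchor + adjustment * (true_value - anchor)

true_price = 50
low = anchored_estimate(anchor=10, true_value=true_price)   # 34.0
high = anchored_estimate(anchor=90, true_value=true_price)  # 66.0
# Same item, same information - yet the estimate is pulled toward
# whichever anchor was presented first.
```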
No matter how far you’ve gone down the wrong road, turn back.
Turkish Proverb
The sunk cost fallacy arises when we choose to continue investing in a project based on how much we have invested in the past. In fact, only future costs and benefits should be used when making decisions. A subtlety of the sunk cost fallacy is that past investment usually means less future investment is needed. But this effect shows up as a reduction in future costs - we shouldn't take the past investment into account explicitly.
The sunk cost effect is useful because of that subtlety - often heavy past investment means there is much less left to invest, making continuing the correct decision. But this isn't always the case!
The problems with the sunk cost fallacy occur when we continue to invest in projects based only on how much we have invested in the past - even if that project has little chance of succeeding or would cost massive amounts to succeed.
An example is choosing to continue with a past relationship based on the amount of emotional effort you have expended in the past. In fact, only the future costs and probability of success should be taken into account.
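The decision rule - ignore past investment, compare only future costs against expected future benefits - can be sketched as follows (the project numbers are hypothetical):

```python
def should_continue(future_cost, payoff, p_success):
    """Continue only if the expected future benefit exceeds the
    future cost. Note that sunk (past) cost appears nowhere."""
    return p_success * payoff > future_cost

# A project with $90k already spent (sunk - irrelevant), needing $20k
# more, with a 30% chance of a $100k payoff:
decision = should_continue(future_cost=20_000, payoff=100_000, p_success=0.3)
# Expected benefit 0.3 * 100_000 = 30_000 > 20_000, so continuing is
# rational here - but because of the future numbers, not the $90k
# already spent.
```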
Related to the sunk cost fallacy is the goal gradient effect - where we put more effort into achieving a goal based on how close we are to it. We can usefully exploit this effect by breaking down tasks into smaller tasks, which we are closer to achieving.
… our first-person point of view of our own minds is not so different from our second-person point of view of others’ minds: we don’t see, or hear, or feel, the complicated neural machinery churning away in our brains but have to settle for an interpreted, digested version, a user-illusion that is so familiar to us that we take it not just for reality but also for the most indubitable and intimately known reality of all. That’s what it is like to be us.
From Bacteria to Bach and Back - Daniel C. Dennett
The introspection illusion is the mistaken belief that we have direct insight into the origins of our mental states. In reality, most of the mental processes that drive how we think, feel and act are hidden to our conscious awareness. Part of the human condition is to be unaware of this, and instead make confident but false explanations of our behaviour.
The usefulness of the introspection illusion is rather dark - it allows our subconscious to hide the reasons for our actions, allowing us to communicate to others a socially acceptable reason without knowing we are being deceptive.
The problem with the introspection illusion is that this false sense of security about the reasons for our actions leads to us not understanding why we think or feel the way we do.
Examples of the introspection illusion include donating money to charity for the social status it brings, or showing aggressive body language towards a social competitor - in both cases without being aware of the real motive.
… markets in which people are not completely sure of how to assess quality, use price as a stand-in for quality.
Robert Cialdini
The price-value bias is our tendency to associate higher prices with higher quality. It's evident in the common phenomenon of market share increasing when a business raises prices. People associate price with quality - if it's more expensive, it must be better!
The price-value bias can be useful when higher prices really do mean more value - the problem occurs when in reality it's just the cheaper stuff in different packaging. It's worth noting that value is customer dependent - if someone feels happier spending more money, then that is a form of value.
The availability fallacy is estimating the frequency of an event based on how easily it comes to mind. This causes the probability of more memorable events to be overestimated.
Thanks for reading!