Common Fallacies To Avoid in Daily Life

Motivation

As we live our daily lives, we make many decisions that are ruled by our subconscious mind. We can imagine ourselves as learning machines: every unique experience sets a new connection in our brain that helps us decide the next time we encounter the same situation. An excellent example is driving a car. Think of the first time you drove: it was a very taxing experience, given the hundreds of decisions one needs to make while driving a vehicle. Over time, however, driving becomes seamless. Our mind's ability to learn and make decisions almost instantaneously helps us get through day-to-day life while consuming the least amount of energy.

Daniel Kahneman is an Israeli psychologist and economist notable for his work on the psychology of judgment and decision-making, and on behavioral economics. In his book Thinking, Fast and Slow, he describes a dichotomy in the human brain consisting of two thought processes: a slow, conscious one, and a fast, subconscious one.

While the fast, subconscious mind has its advantages, it comes with certain disadvantages too. In this article, we discuss some of the biases that prevent us from thinking clearly. A natural question is why we should care about these biases. There are several reasons:

  1. Fast decision-making can be erroneous.
  2. Awareness of our biases can help us correct them.

The Survivorship Bias

Survivorship or survival bias is the logical error of generalizing a theory from an ordeal's survivors while failing to consider the entire population, including those who did not make it.

Examples

  • We often hear stories of college dropouts such as Bill Gates, Mark Zuckerberg, or Ritesh Agarwal building large enterprises. It is a logical fallacy to attribute their success to their dropping out of college. We may even conclude that a college education doesn't contribute to an individual's success. However, this reasoning runs into inconsistencies: we haven't considered all college dropouts and then looked at the percentage that made it big.
  • During World War II, the statistician Abraham Wald took survivorship bias into account in his calculations to minimize bomber losses to enemy fire. The Statistical Research Group (SRG) at Columbia University, of which Wald was a part, examined the damage done to aircraft that had returned from missions and, following his reasoning, recommended adding armor to the areas that showed the least damage. This contradicted the US military's conclusion that the most-hit regions of the plane needed additional armor. Wald noted that the military only considered the aircraft that had survived their missions; any bombers that had been shot down or otherwise lost were, by definition, unavailable for assessment. The bullet holes in the returning aircraft represented areas where a bomber could take damage and still fly well enough to return safely to base. Thus, Wald proposed reinforcing the areas where the returning aircraft were unscathed, since those were the areas that, if hit, would cause the plane to be lost. His work is considered seminal in the then-nascent discipline of operational research.
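Wald's insight can be sketched with a small simulation. The following is a toy model, not historical data: the section names and loss probabilities are made up purely for illustration. Hits are spread uniformly across the plane, but a plane hit in the engine rarely makes it home, so the engine shows the fewest holes among the returning aircraft.

```python
import random

random.seed(42)

# Toy model (all numbers are made up): each bomber takes one hit in a
# random section, and a hit to the engine is far more likely to bring
# the plane down than a hit anywhere else.
SECTIONS = ["fuselage", "wings", "engine", "tail"]
LOSS_PROB = {"fuselage": 0.1, "wings": 0.1, "engine": 0.8, "tail": 0.2}

all_hits = {s: 0 for s in SECTIONS}
returned_hits = {s: 0 for s in SECTIONS}

for _ in range(10_000):
    section = random.choice(SECTIONS)   # hits are uniform across sections
    all_hits[section] += 1
    if random.random() > LOSS_PROB[section]:  # the plane survives and returns
        returned_hits[section] += 1

# Among RETURNING planes the engine shows by far the fewest holes --
# exactly Wald's observation: armor the places where survivors show no damage.
print("all hits:     ", all_hits)
print("returned hits:", returned_hits)
```

If you only ever looked at `returned_hits`, the engine would appear to be the safest part of the plane, when in fact it is the most lethal place to be hit.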


The Confirmation Bias

Confirmation bias is the father of all biases. It is our tendency to accept facts that agree with the hypothesis we have in mind and to ignore data that contradicts it.

Examples

  • When reviewing a candidate we like, we tend to focus only on the skills he/she performed well on, and vice versa.
  • When trying to build a new habit, we tend to focus only on the days when we successfully followed the practice and overlook the days when it didn't work.
  • When voting for a particular party, we glorify the successes of their previous term and downplay their failures, while the opposition does just the opposite.
  • The dot-com bubble of the late 90s was also an example of confirmation bias: the tech industry downplayed the negative signals coming out of poorly run organizations and overplayed the internet's impact in solving real-life problems.

The Cultural Bias

Also known as the Herd Effect, or Social Proof.

The Cultural Bias posits that we believe something is right if a large group, or most of the people around us, believes it is true.

Examples (from History)

  • The Holocaust was an event in which a dictator convinced an entire nation to wipe out a specific section of the human race.
  • The Sati system was a practice in India in which a widow had to immolate herself on her deceased husband's funeral pyre.
  • Untouchability is a system that was (and in places still is) practiced in India, in which society disallows mingling with people of particular castes.

All these practices and incidents seem abhorrent in isolation, and yet they were widely adopted by society and considered legal at some point in human history.

Examples

  • You are walking down a road and see a crowd gathered by its side. All of a sudden, you stop to see what has drawn the crowd.
  • You are running a marathon and start to feel tired. Suddenly, a small group of runners passes by, and you begin to feel energized again.
  • A new social media app hits the app store, and although you hate downloading yet another social media app, you still try it anyway.
  • Sharing this article on LinkedIn is another example of the Social Proof or Cultural Bias that we succumb to daily 😉.

One common rationale for why this occurs is evolutionary. As hunter-gatherers, one of the most important survival traits was to stay and hunt in groups. It meant that humans had to conform to group thinking; otherwise, they would become outcasts. With the advancement of civilization, this no longer holds true (except in very extreme scenarios, e.g., war zones, mountaineering, etc.). However, our genetic makeup hasn't yet caught up, and we still believe in listening to the wisdom of the crowd.

The Sunk Cost Fallacy

The Sunk Cost Fallacy is one in which we are not ready to cut our losses once we have invested resources in an effort. The resources may be time, money, effort, etc.

Examples

  • You have worked on a project for multiple years and have now hit a dead end. You are still not ready to abandon it because of the cost already sunk into it.
  • You have invested emotionally in a relationship and no longer see it going anywhere. Still, you want to hold on to it, given the effort that has gone in.
  • You have invested in a stock that is going down the drain, yet you cannot part with it because of the time, money, and losses you have already incurred.

The fallacy lies in the fact that the future doesn't care about the losses of the past. In general, humans tend to cling to their past investments, get emotionally attached to losing battles, and fail to reason clearly in times of difficulty.

One prime example is the Vietnam War, which the US fought over multiple decades and eventually lost. Successive governments repeatedly argued that, given they had already invested so many men and so much money in the war, pulling out would make the present-day government look like a failure. This is a textbook Sunk Cost Fallacy.

There are multiple reasons why humans are not able to overcome the Sunk Cost Fallacy. These are as follows:

  • Emotional attachment to the past
  • Ego clashes when accepting defeat
  • Signaling inconsistency, and thus losing credibility

The Halo Effect

It is also related to Success Bias.

The Halo Effect is the human tendency to let a positive impression of a person, brand, or company in one dimension color our judgment of all the other dimensions of their personality. As humans, we tend to place successful people on a pedestal and start to worship them as gods. In reality, every one of us has strengths and weaknesses.

Examples

  • We see actors and sportspersons endorse products, services, etc., in media campaigns that they are not capable of judging.
  • We believe in our leaders' judgment in policy areas that are entirely outside their scope of expertise.
  • We tend to follow the policies adopted by large, successful companies even though they may not be related to our domain, may not have the same resources, or may not be operating in the same era.
  • We tend to hate or worship leaders like Mahatma Gandhi, Nelson Mandela, or Adolf Hitler without weighing both their positive and negative qualities.

Avoidance

Instead, the right way is to look at each individual, company, brand, etc., from a fresh perspective, with a uniform lens, and to gather data about the specific field in which they are being evaluated. They will be useful in the fields they are experts in; for the rest, an unbiased analysis will help you make the right judgment. A simple mental experiment is the blindfold test: if the endorsement had been made by a person you did not know, would you still be willing to accept the idea? If yes, then the idea, and not the vehicle of the idea, has merit and should be followed. Otherwise, it should be rejected.

The Information Bias

The Information Bias posits that information beyond a certain threshold is rendered meaningless. This tends to happen because we obfuscate or ignore the obvious facts and dig deep into questions that are not necessarily meaningful.

Examples

  1. Consider an investor who interacts with many entrepreneurs every single day. He/she has the best possible view of all the latest ideas and technologies being developed across his/her area of expertise. Yet he/she still fails to spot the next big social network, e-commerce platform, or chat application.
  2. "Del rigor en la ciencia" ("On Exactitude in Science") is a one-paragraph short story written in 1946 by Jorge Luis Borges about the map–territory relation, written in the form of a literary forgery. In it, Borges imagines an empire where the science of cartography becomes so exact that only a map on the same scale as the empire itself will suffice. The catch is that this excess of information renders the whole exercise meaningless.
  3. Think of all the work that many epidemiologists and economists did before the Covid-19 pandemic hit. Much of their work from the preceding decade was rendered useless as soon as the pandemic spread across countries. Most of them were not looking in the right direction, or couldn't surface the right set of problems to the world's governments.

The Action Bias

The action bias is our tendency to act even when there is no clear reason to believe the action will lead to a benefit. There is a popular saying that movement is not progress. Progress comes only from deliberate, well-thought-out action. Therefore, pausing for some time before deciding which direction to take is often better than acting first and figuring out the results later.

Individuals in high-stress environments often suffer from this fallacy. Common examples include hospital emergency wards, fast-moving startups, the week before an examination, etc. We have all felt the urge to move fast and make many decisions without thinking about their consequences or their impact in the long run.

Examples

  • The action bias is especially pronounced in the field of investing. The famous investor Warren Buffett says that all he needs is to make one or two right decisions a year, and that defines his success. Young investors often believe that the more companies they invest in, the higher their chances of success. Unfortunately, this falls flat in the face of reality.
  • There is a big examination coming up next month. What do you do? You could spend 16 hours a day studying every topic until you have learned it by heart. However, if you look at the questions from the last ten years, you can easily figure out which areas or topics to study. Quite often, 80% of the tough questions come from just 20% of the topics. This is also called the Pareto Principle.
  • Side note: I wrote a blog some time back on common prioritization techniques that one can use in daily life.
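The 80/20 intuition from the examination example can be made concrete with a tiny sketch. All the topic names and counts below are hypothetical, invented purely for illustration: rank topics by how often they appeared in past papers, and check how much the top fifth covers.

```python
# Hypothetical counts of how often each topic appeared in the last
# ten years of question papers (all numbers are made up).
topic_counts = {
    "calculus": 55, "algebra": 25, "probability": 7, "geometry": 5,
    "trigonometry": 3, "number theory": 1, "combinatorics": 1,
    "logic": 1, "statistics": 1, "set theory": 1,
}

total = sum(topic_counts.values())                 # 100 questions in total
ranked = sorted(topic_counts, key=topic_counts.get, reverse=True)
top_20_percent = ranked[: len(ranked) // 5]        # top 2 of 10 topics
coverage = sum(topic_counts[t] for t in top_20_percent) / total

print(f"Top 20% of topics cover {coverage:.0%} of the questions")
# → Top 20% of topics cover 80% of the questions
```

The deliberate move, then, is not to grind through all ten topics equally, but to spend most of your energy on the two that dominate the distribution.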

The Quantity Bias

The quantity bias is the human tendency to believe that the quantity of entities matters more than the quality of each entity. This fallacy is exacerbated especially by data-science-style measurement, where the emphasis is laid heavily on numbers rather than anecdotes. Furthermore, the advent of social networks has amplified virtue-signaling through quantitative measures: our success has become the number of re-tweets, likes, loves, and shares we get on each of our posts and status updates. On the contrary, the world is shaped heavily by a few major events, outliers, or Black Swans, as the author Nassim Nicholas Taleb distinctively puts it in his book The Black Swan.

Examples

  • While hundreds of thousands of books are published every year, only a handful end up making any sizeable impact on the conscience of the human population. This is because most published books do not add to the pyramid of knowledge or experience of human society.
  • In his research paper, The Mundanity of Excellence, Daniel F. Chambliss studied Olympic-level swimmers for more than six years to understand what leads to excellence. He concluded that excellence is not correlated with the number of hours spent at your art; it is closely related to the quality of the hours you spend doing the work. Time is precious, and you need to ask yourself, "What am I going to do today?" But more importantly, you need to ask yourself, "How am I going to do it?"

---------------------------------------------

Thank you for patiently reading through this article.

Originally posted on my blog here.

Related Reading

  1. Thinking Fast and Slow by Daniel Kahneman
  2. The Art of Thinking Clearly by Rolf Dobelli
  3. The Survivorship Bias
  4. The Confirmation Bias
  5. The Cultural Bias
  6. The Vietnam War
  7. The Sati System
  8. The Holocaust
  9. On Exactitude in Science
  10. The Action Bias