THE POWER OF THE DARK SIDE OF OPTIMISM IN PROJECTS

Optimism has been shown to help us in many situations in life. People with high expectations feel better, keep stress and anxiety levels lower, and tend to be more successful. It works as a "self-fulfilling prophecy", as explained in a TED talk by Tali Sharot, a cognitive neuroscientist (Sharot, 2012) and author of the book "The Optimism Bias: Why We Are Programmed to See the World on the Bright Side" (Sharot, 2015). But at what point does optimism become a problem? To think objectively, it is worth noting that about 80% of people are prone to optimism, according to a study by psychologist David Armor of Yale University. Going deeper into the subject, research confirms a tendency to overestimate the probability of experiencing good events and underestimate the probability of experiencing bad ones, as detailed in Dr. Tali Sharot's studies and books.

This behavior goes hand in hand with ignoring or underestimating risks, believing that bad outcomes will not happen to us, and overestimating our own capabilities, trusting that the future will be different even when the available information points in another direction. This pattern appears in several sectors of the corporate world, and the world of projects is no exception.

In the article "Delusions of Success: How Optimism Undermines Executives' Decisions", published in HBR by Dan Lovallo and Daniel Kahneman (Lovallo & Kahneman, 2007), the authors explain that most large projects are not successful, and that the standard justification from economic theory, namely that "the frequency of bad results is the inevitable result of companies taking rational risks in uncertain situations", is incorrect.

IT projects face difficulties that are similar to, and sometimes the same as, those found in large mergers, mega-constructions and the like. Each stakeholder has their own perception of how interesting the project is for them, and of what their benefits and burdens will be during and after its implementation; they often get lost in their own desires and egos, forgetting or unaware of what the company or client really needs. The scenario becomes more intriguing because most of those involved know the main causes of project failure, and we have an expressive range of tools at our disposal, such as the PMBOK, classic methodologies like Waterfall and RUP, and agile approaches like Lean, Kanban, Scrum, FDD, DevOps and XP. So why do projects still fail, and why so frequently?

The reality is that we can have all the processes defined, the best market practices and the best equipment; we can use AI, IoT, analytics and the latest technologies, but, for now, most decisions are still made by humans. Our nervous system makes mistakes from birth and throughout life: we are imperfect, and through evolution our brain seeks to save energy and automate our actions. Add the complexity of living every day with cognitive biases, and for these and other reasons the brain will keep playing tricks on us, compromising our decision making and leading us to ignore bad news.

In an experiment reported in Dr. Tali Sharot's book, using magnetic resonance imaging, she identified that the left inferior frontal gyrus responded to positive information and the right inferior frontal gyrus responded to negative information, which opens the possibility of intervening in these regions so that, for a few minutes, a person becomes more optimistic or loses this bias. The work also identified that the frontal lobe monitors prediction errors and does so much more efficiently when the information is positive than when it is undesirable, suggesting that we give more weight to good news.

When a project goes off track, it becomes fertile ground for the optimism bias, the loss aversion bias and the anchoring bias, the latter dissected by Kahneman and Tversky in their behavioral economics studies (Kahneman, 2011). When the numbers point to failure and risks become issues, loss aversion emerges like a force of nature: sponsors and key stakeholders consider it out of the question to take a step back and review what has been done, for fear of losing everything that has been "built". The third, the anchoring bias, is based on a starting point, a piece of information received prior to the decision, which can be real or random. For example, if you are asked how much television a child should watch, you will probably give values close to your own experience. But the effect also holds for random numbers. In a study by Dan Ariely, George Loewenstein and Drazen Prelec, participants wrote down the last two digits of their Social Security numbers as a price and then bid on different types of products. Part of the result was that participants anchored on higher digits submitted bids roughly three times higher than those with lower digits (Eduardo A. Schilman, n.d.).

To be honest, the optimism bias usually shows up at the beginning of a project, when the main people involved assume that the team, although inexperienced, can handle the job and can simply ask more senior colleagues when in doubt; that the business and operational rules are fully known and mastered by the business owners, who agree with this; that no tools are needed to speed things up, because activities can always be completed by adding more people or more hours; and that the schedule is more than enough to complete all activities, even when there is no historical data and no one really knows how long each activity takes.

These problems are common, contribute heavily to derailing the project, and become even more evident when we understand the Dunning-Kruger effect (Kruger & Dunning, 1999), in which most people confuse superficial knowledge with deep knowledge and do not realize the extent of their own ignorance. In one study, researcher Rebecca Lawson asked participants whether they knew how a bicycle works and then asked them to draw one. Sounds simple, right? Wrong: the harsh reality is that most did not know how it worked and could not draw it correctly. I have come across this kind of superficial knowledge several times in projects and throughout my life, and I believe you have had a similar experience. However, these deviations can be addressed and resolved as long as everyone involved understands and accepts that these biases exist and need to be dealt with at all levels. So what can we do?

One of the essential points in working with any bias is knowing that it exists and being aware that we can experience it at any time, working for us or against us. Several studies report that reducing or eliminating the optimism bias is very difficult, and that doing so is not necessarily desirable, since it is also the fuel for pushing boundaries. An example of this difficulty is a study in which smokers were shown the risks of dying: after the intervention, they actually reinforced the idea that they would not be negatively affected by their behavior.

Unrealistic optimism can lead us to make decisions that are unsafe or catastrophic for ourselves, for other people or for a project, and can reinforce risky behavior. So we should seek a balance between optimism and realism, separating decision making from action. Decision making should pursue realism, taking into account the two modes of thinking described by Daniel Kahneman in "Thinking, Fast and Slow" (Kahneman, 2011) and preferably using System 2, which demands a deeper, more analytical examination of the situation: looking at data and information from different points of view, seeking diverse and divergent opinions, avoiding hiding or steering information toward what you already believe, and being prudent in your decisions, with the aim of bringing goals and objectives close to reality. Action, on the other hand, needs optimism (Lovallo & Kahneman, 2007), so that leaders, teams, customers and suppliers can move forward consistently, and sometimes surprisingly, in such a competitive world.
