Recent scandals at top corporations – Facebook’s Cambridge Analytica data breach, United’s three pet-related debacles in one week, and Equifax discovering an additional 2.4 million data-breach victims – all exemplify the kinds of thinking errors that lead to disasters. You might be surprised to learn that all of these disasters were avoidable.

Researchers have found that our brains make systematic and predictable errors – what behavioral scientists call cognitive biases, such as the overconfidence effect, optimism bias, and the planning fallacy – that lead us to make poor decisions. If the leaders of United, Facebook, and Equifax are vulnerable to these biases, so is everyone reading this article: it’s just that when you experience a disaster due to cognitive biases, it doesn’t make the news.

Knowing about the well-publicized mistakes made by top corporations helps us understand the mistakes we might be making right now. Fortunately, recent research shows that we can improve our ability to make better decisions. As someone who researches how we can avoid these errors, and who consults and speaks for corporations on these topics, I want to show you an effective, research-based approach that you can use to avoid professional and business disasters (described in more detail in my Amazon bestseller The Truth-Seeker’s Handbook: A Science-Based Guide).
 

Overconfidence Effect

 
When asked whether they are more, less, or equally skilled compared to the average driver, 93% of Americans rate themselves as more skilled. And when study subjects said they were 100% confident in their answers, they were wrong 20% of the time. No wonder the overconfidence effect – our tendency to be excessively confident in our decision-making – has been found by researchers to harm performance in the workplace, for CEOs and ordinary professionals alike.

Consider Facebook’s excessive confidence as an example. When Facebook learned that Cambridge Analytica might have inappropriately received data from over 50 million Facebook users, it asked Cambridge Analytica to provide a “legal certification” – in other words, a promise – that it had deleted the data. Cambridge Analytica provided that legal certification, and Facebook accepted it as sufficient. After the scandal broke, Facebook’s CEO Mark Zuckerberg called accepting that certification as sufficient “one of the biggest mistakes that we made” and said that Facebook will “not just rely on certifications that we’ve gotten from developers, but… do a full investigation of every single app.”

Facebook’s excessive confidence in the good faith of the external developers it worked with illustrates why all of us need to be wary when we engage in professional collaborations. Don’t trust your gut reactions, as they will often lead you astray. Second-guess yourself – and those with whom you collaborate – to avoid professional disasters.
 

Optimism Bias

 
Overconfidence feeds into another thinking error, optimism bias – our tendency to be excessively optimistic about the future. For example, studies show we tend to believe our risk of suffering negative events is lower than it actually is, while we overestimate the likelihood of positive events. We fall into optimism bias frequently in the workplace, overstating the benefits of projects and understating their costs.

As an example, recall that United got into hot water last year when a passenger was dragged off one of its planes. The airline worked hard to rebuild trust among customers, and its favorability rating was recovering – until the three recent pet-related incidents sent it tumbling down again. United was too optimistic about its efforts to rebuild trust and failed to react quickly enough to the new round of bad PR. Indeed, only after the third incident did United’s CEO speak out to acknowledge the problem and suspend the airline’s pet transport program for review.

Don’t follow United’s example. If you notice a problem in your professional activities, don’t wait for it to repeat three times before you do something about it. Such excessive optimism about the quality of your work will not end well. Instead, notice when things go wrong and consider a variety of alternative explanations for the problem – both optimistic and pessimistic – to counteract optimism bias.
 

Planning Fallacy

 
The planning fallacy combines overconfidence and optimism bias as they apply to our plans for the future and our assessments of existing processes. We tend to assume our plans will go well, and as a result we fail to build in enough resources for potential problems. For instance, one study asked a group of students how long it would take to complete their senior theses in the best-case scenario (they estimated 27.4 days on average) and in the worst-case scenario (48.6 days). In reality, the average completion time was 55.5 days – substantially longer than even the worst-case estimates. Research shows that in professional settings, falling into the planning fallacy results in projects going over budget and past deadline.

The data breach suffered by Equifax illustrates the problem of the planning fallacy. Reportedly, several months before the breach, the Department of Homeland Security warned Equifax of a vulnerability in its computer systems. However, the company failed to follow its own process for fixing the security flaw, enabling hackers to access the data of over 140 million customers. Moreover, Equifax bungled its response to the breach: it waited six weeks to inform customers, set up an insecure website to do so, and hid the full extent of the damage.

Just because Equifax fell into the planning fallacy with both existing processes and new projects does not mean you have to suffer the same fate. As a rule of thumb, when you start a new project, build in twice as many resources – time, money, and energy – as you anticipate needing. Always be ready for your existing processes and practices to fail you, and have contingencies ready just in case. Finally, avoid denying negative information about your professional circumstances, and be proactive about dealing with problems.
 

Addressing Avoidable Disasters

 
One of the most effective ways to address avoidable disasters is the premortem, a technique research has shown to counter the cognitive biases that lead to disasters. To conduct a premortem, first gather a team of relevant stakeholders – a mix of people with decision-making authority and expertise in the matter under evaluation. If you are doing this by yourself, ask a couple of fellow professionals or friends who know you well to help you out.

Then, ask everyone to imagine that the project or process has definitely failed. Have everyone anonymously write out plausible reasons for the failure, especially reasons that might be seen as rude or impolitic to voice aloud. Next, reflect on the potential reasons for failure and brainstorm solutions. Following that, consider possible next steps for implementing those solutions.

Premortems conducted regularly to evaluate existing processes could have caught the kinds of issues that led to disasters for United, Equifax, and Facebook, and they can help you avoid professional disasters in any context. Good luck avoiding business disasters!