The Perils Of Assuming Everything Is Fine: Normalcy Bias And The Rushed Approval Of Boeing’s New 737 Max 10 Jet
tags: leadership,decision making,wise decision making,leadership development,cognitive bias,decision-making process,leaders,normalcy bias,boeing 737Max
Congress just cleared the Boeing 737 Max 10 jet for certification in the omnibus end-of-year spending package without further modifications and safety enhancements. That’s despite significant opposition from those demanding a safety upgrade: from the families of those killed in the two deadly crashes of 2018 and 2019, from the union representing the 15,000 pilots at American Airlines, and from Rep. Peter DeFazio, D-Ore., chair of the House Transportation Committee that led the key congressional investigation into the Max crashes, who said the language in the bill was included over his objection.
This rushed clearance stemmed from lobbying pressure by Boeing and its allies. It suggests neither Boeing nor Congress learned the lesson of Boeing’s earlier 737 Max fiasco, in which 346 people lost their lives; Boeing lost $5 billion in direct revenue and over $25 billion counting brand damage and lost customers; and Boeing fired its CEO, Dennis Muilenburg.
What caused the disaster for Boeing? At a high level, it was the company's desire to keep up with Airbus's newer, more fuel-efficient aircraft, the A320neo. To do this, Boeing rushed the production of the 737 Max and provided misleading information to the Federal Aviation Administration (FAA) to receive fast approval for the plane. In the process, Boeing disregarded the safety systems its own engineers had recommended and did not fix known software issues with the 737 Max, which ultimately led to the crashes.
The New Normal
The root cause of the disaster at Boeing can be traced back to a cognitive error known as normalcy bias. This bias causes people to overestimate the likelihood that things will continue as they have been and underestimate the potential consequences of a disaster occurring.
Ironically, the transformation of the airline industry in recent decades, which made airplanes much safer and accidents incredibly rare, is key to understanding Boeing’s disaster. The Boeing leadership felt utter confidence in the safety record of the airplanes it had produced over the previous couple of decades, deservedly so, according to crash statistics. From their perspective, it seemed impossible that the 737 Max could be less safe than those other recent-model airplanes. They saw the typical FAA certification process as just another bureaucratic hassle that got in the way of doing business and competing with Airbus, rather than as a necessary safety measure. That normalcy bias contributed to their decision to rush the production of the 737 Max and overlook known software issues.
Think it’s only big companies? Think again.
The normalcy bias is a big reason for bubbles: in stocks, housing prices, loans, and other areas. It’s as though we’re incapable of remembering the previous bubble, even if it occurred only a few years ago.
Similarly, the normalcy bias helps explain why leaders at companies of all sizes were so vastly underprepared for COVID-19 and its impact. While pandemics pose a major threat, a pandemic is a low-likelihood, high-impact, slow-moving disaster. The normalcy bias keeps tripping us up on such disasters unless we take effective steps to address the problem.
Normalcy Bias in a Tech Start-Up
Of course, the normalcy bias hits mid-size and small companies hard as well.
At one of my frequent trainings for small and mid-size company executives, Brodie, a tech entrepreneur, shared the story of a startup he founded with a good friend. They complemented each other well: Brodie had strong technical skills, and his friend brought strong marketing and sales capabilities.
Things went great for the first two and a half years, with a growing client list - until his friend got into a bad motorcycle accident that left him unable to talk. Brodie had to deal not only with the emotional trauma, but also with covering his co-founder’s work roles.
Unfortunately, his co-founder failed to keep good notes. He also did not introduce Brodie to his contacts at the client companies. In turn, Brodie - a strong introvert - struggled with selling. Eventually, the startup burned through its cash and had to close its doors.
The normalcy bias is one of many dangerous judgment errors, mental blindspots resulting from how our brains are wired. Researchers in cognitive neuroscience and behavioral economics call them cognitive biases. Fortunately, recent research in these fields shows how you can use pragmatic strategies to address these dangerous judgment errors in your professional life.
Preventing Normalcy Bias Disasters
It really helps to consider and address potential alternative futures that are much more negative than you intuitively feel is likely. That’s the strategy Brodie and I explored in my coaching with him after the training session, once he felt ready to get back to the startup world.
While Brodie knew he wouldn’t be up to starting a new business by himself, he also wanted to avoid the previous problems. So we discussed how, from the start, he would push for creating systems and processes that would enable each co-founder to back up the other in an emergency. Moreover, the co-founders would commit to sharing important contacts from their side of the business with each other, so that relationships could be maintained if one of them was out of commission for a while.
So what are the broader principles here?
1) Be much more pessimistic about the possibility and impact of disasters than you intuitively feel or can easily imagine, to get over the challenges caused by the normalcy bias.
2) Use effective strategic planning techniques to scan for potential disasters and try to address them in advance, as Brodie did with his plans for the new business.
3) Of course, you can’t predict everything, so retain some extra capacity in your system - of time, money, and other resources - that you can use to deal with unknown unknowns, also called black swans.
4) Finally, if you see a hint of a disaster, react much more quickly than you intuitively feel you should, to overcome your gut’s tendency to dismiss the likelihood and impact of disasters.
Unfortunately, Boeing - and Congress - did not appear to learn this lesson in the rushed approval of the new 737 Max model. The fact that they failed to make the safety upgrade demanded by so many diverse external stakeholders signals that more deadly lessons may be in store for us in the future.
Image Credit: Boeing plane crash AFP via Getty Images
Originally Published in Forbes on Jan 10, 2023
Bio: Dr. Gleb Tsipursky helps leaders use hybrid work to improve retention and productivity while cutting costs. He serves as the CEO of the boutique future-of-work consultancy Disaster Avoidance Experts, which helps organizations adopt a hybrid-first culture, instead of incrementally improving on the traditional office-centric culture. A best-selling author of 7 books, he is especially well-known for his global best-sellers Never Go With Your Gut: How Pioneering Leaders Make the Best Decisions and Avoid Business Disasters (Career Press, 2019) and The Blindspots Between Us: How to Overcome Unconscious Cognitive Bias and Build Better Relationships (New Harbinger, 2020). His newest book is Leading Hybrid and Remote Teams: A Manual on Benchmarking to Best Practices for Competitive Advantage (Intentional Insights, 2021). His writing was translated into Chinese, Korean, German, Russian, Polish, Spanish, French, and other languages. His cutting-edge thought leadership was featured in over 650 articles and 550 interviews in prominent venues. They include Harvard Business Review, Fortune, Inc. Magazine, Business Insider, Fast Company, Forbes, and elsewhere. His expertise comes from over 20 years of consulting, coaching, and speaking and training for mid-size and large organizations ranging from Aflac to Xerox. It also comes from his research background as a behavioral scientist. After spending 8 years getting a PhD and lecturing at the University of North Carolina at Chapel Hill, he served for 7 years as a professor at the Ohio State University’s Decision Sciences Collaborative and History Department. A proud Ukrainian American, Dr. Gleb lives in Columbus, Ohio (Go Bucks!). In his free time, he makes sure to spend abundant quality time with his wife to avoid his personal life turning into a disaster. 
Contact him at Gleb[at]DisasterAvoidanceExperts[dot]com, follow him on LinkedIn @dr-gleb-tsipursky, Twitter @gleb_tsipursky, Instagram @dr_gleb_tsipursky, Facebook @DrGlebTsipursky, Medium @dr_gleb_tsipursky, YouTube, and RSS, and get a free copy of the Assessment on Dangerous Judgment Errors in the Workplace by signing up for the free Wise Decision Maker Course at https://disasteravoidanceexperts.com/newsletter/.