You’ve probably heard the business advice that “failing to plan is planning to fail.” That phrase is a misleading myth at best and actively dangerous at worst. Making plans is important, but our gut reaction is to plan for best-case outcomes, ignoring the high likelihood that things will go wrong.
A much better phrase is “failing to plan for problems is planning to fail.” To address the very high likelihood that problems will crop up, you need to plan for contingencies.
When was the last time you saw a major planned project suffer a cost overrun? Coming in at or under budget is rarer than you might think, even for projects with a clear plan.
For instance, a 2002 study of major construction projects found that 86% went over budget. Similarly, a 2014 study of large IT projects found that only 16.2% met their originally planned resource expenditure. Among the 83.8% that did not, the average cost overrun was 189%.
Such cost overruns can seriously damage your bottom line. Imagine that a major IT project, such as implementing a new database at your organization, goes even 50% over budget, far below the average overrun. You could be facing many thousands or even millions of dollars in unplanned expenses, forcing you to draw on funds assigned for other purposes.
Moreover, cost overruns often spiral out of control, resulting in even bigger disasters. Say you draw the extra money from your cybersecurity budget. You have now left yourself open to hackers, who steal customer data, resulting in both bad PR and a loss of customer trust.
What explains cost overruns? They largely stem from the planning fallacy, our intuitive belief that everything will go according to plan, whether in IT projects or in other areas of business and life. The planning fallacy is one of many dangerous judgment errors: mental blind spots, rooted in how our brains are wired, that scholars in cognitive neuroscience and behavioral economics call cognitive biases. We make these mistakes not only at work but also in other areas of life, such as our shopping choices, as revealed by a series of studies conducted by a shopping comparison website.
Fortunately, recent research in these fields shows how you can use pragmatic strategies to address these dangerous judgment errors, whether in your professional life, your relationships, your shopping choices, or other life areas.
You need to evaluate where cognitive biases are hurting you and others in your team and organization. Then you can use structured decision-making methods: quick ones for making “good enough” daily decisions, more thorough ones for moderately important choices, and an in-depth process for truly major decisions.
Such techniques will also help you implement your decisions well and formulate truly effective long-term strategic plans. In addition, you can develop mental habits and skills to notice cognitive biases and prevent yourself from slipping into them.
For instance, we can address the planning fallacy by planning around it. Such planning involves anticipating what problems might come up and addressing them in advance through the research-based technique of prospective hindsight: envisioning yourself in the future, looking back at the challenges that arose along the way. It also involves recognizing that you can’t anticipate every problem and building in a buffer of at least 40% of the project’s budget in additional funds. If things go better than anticipated, you can always use the money for a different purpose later.
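As a minimal illustration of the buffer math, here is a short sketch in Python; the base estimate and the 40% rate are illustrative assumptions, not figures from any particular client project.

```python
def budget_with_buffer(base_estimate: float, buffer_rate: float = 0.40) -> float:
    """Return the budget to commit to: the base estimate plus a contingency buffer."""
    return base_estimate * (1 + buffer_rate)

# Hypothetical example: a database project estimated at $500,000
committed = budget_with_buffer(500_000)
print(f"Commit to ${committed:,.0f} rather than $500,000")  # Commit to $700,000 rather than $500,000
```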
First, break each project down into its component parts. An IT firm struggled with a pattern of taking on projects that ended up losing money for the company. We evaluated the specific components of the projects that had cost overruns and found that the biggest unanticipated money drain came from permitting the client to make too many changes in the final stages of a project. As a result, the IT firm changed its process to minimize changes at the tail end of a project.
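To show what that kind of component-level review can look like, here is a minimal sketch; the component names and dollar figures are hypothetical, not the IT firm’s actual numbers.

```python
# Hypothetical overruns by project component, aggregated across past projects (in dollars)
overruns_by_component = {
    "requirements gathering": 12_000,
    "development": 35_000,
    "late-stage client change requests": 90_000,
    "deployment and training": 18_000,
}

# Identify the biggest unanticipated money drain
worst = max(overruns_by_component, key=overruns_by_component.get)
print(f"Largest overrun driver: {worst} (${overruns_by_component[worst]:,})")
```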
Second, use your past experience with similar projects to inform your estimates for future ones. A heavy equipment manufacturer systematically underestimated project costs; in one case, a project estimated at $2 million ended up costing $3 million, a 50% overrun. We suggested requiring project managers to use past project costs to inform future projections. Doing so resulted in much more accurate cost estimates.
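One simple way to bake past experience into a new estimate is to scale it by the typical overrun ratio seen on completed projects, as in the sketch below; the historical figures are hypothetical, and this is a simplified take on reference-class forecasting rather than the manufacturer’s exact process.

```python
from statistics import median

# Hypothetical (estimated, actual) costs of completed projects, in dollars
past_projects = [
    (2_000_000, 3_000_000),
    (1_500_000, 1_950_000),
    (800_000, 1_100_000),
]

# Typical ratio of actual cost to original estimate
overrun_ratio = median(actual / estimated for estimated, actual in past_projects)

# Adjust a new project's raw estimate by that ratio
new_estimate = 2_500_000
adjusted_estimate = new_estimate * overrun_ratio
print(f"Adjusted estimate: ${adjusted_estimate:,.0f}")  # about $3.4 million with these numbers
```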
Third, for projects with which you have little past experience, seek an external perspective from a trusted and objective source. A financial services firm whose CEO I coached wanted to move its headquarters after the company outgrew its building. I connected the CEO with a couple of other CEO clients who had recently moved and were willing to share their experience. These conversations helped the financial services CEO anticipate contingencies he hadn’t previously considered, from the additional marketing expense of printing new collateral with the updated address to lost employee productivity as schedules shifted around a different commute.
If you take away one message from this article, let it be that “failing to plan for problems is planning to fail.” Use this phrase as your guide to prevent cost overruns and avoid falling prey to the dangerous judgment error of the planning fallacy.
Because we usually feel that everything is going to go according to plan, we don’t pay nearly enough attention to potential problems and fail to account for them in our plans. This problem is called the planning fallacy.
Questions to Consider (please share your thoughts in the comments section)
- Do you agree that “failing to plan is planning to fail” is misleading? If not, why not?
- Where have you seen the planning fallacy lead to problems for your team and organization?
- How might you help your team and organization address the planning fallacy? What are some next steps you can take to do so?
Image credit: Pixabay/Rawpixel
Bio: Dr. Gleb Tsipursky is on a mission to protect leaders from dangerous judgment errors known as cognitive biases. His expertise and passion lie in using pragmatic business experience and cutting-edge behavioral economics and cognitive neuroscience to develop the most effective and profitable decision-making strategies. A best-selling author, he wrote Never Go With Your Gut: How Pioneering Leaders Make the Best Decisions and Avoid Business Disasters (2019), The Truth Seeker’s Handbook: A Science-Based Guide (2017), and The Blindspots Between Us: How to Overcome Unconscious Cognitive Bias and Build Better Relationships (2020). Dr. Tsipursky’s cutting-edge thought leadership has been featured in over 400 articles and 350 interviews in Fast Company, CBS News, Time, Business Insider, Government Executive, The Chronicle of Philanthropy, Inc. Magazine, and elsewhere.
His expertise comes from over 20 years of consulting, coaching, and speaking and training experience as the CEO of Disaster Avoidance Experts. Its hundreds of clients, mid-size and large companies and nonprofits, span North America, Europe, and Australia, and include Aflac, IBM, Honda, Wells Fargo, and the World Wildlife Fund. His expertise also stems from his research background as a behavioral economist and cognitive neuroscientist with over 15 years in academia, including 7 years as a professor at the Ohio State University. He published dozens of peer-reviewed articles in academic journals such as Behavior and Social Issues and Journal of Social and Political Psychology.
He lives in Columbus, OH, and to avoid disaster in his personal life makes sure to spend ample time with his wife. Contact him at Gleb[at]DisasterAvoidanceExperts[dot]com, follow him on Twitter @gleb_tsipursky, Instagram @dr_gleb_tsipursky, Facebook, YouTube, RSS, and LinkedIn. Most importantly, help yourself avoid disasters and maximize success, and get a free copy of the Assessment on Dangerous Judgment Errors in the Workplace, by signing up for his free Wise Decision Maker Course.
Originally published at Disaster Avoidance Experts on October 27, 2019.