Slanted Thinking: Overcome the mental biases that weaken your forecasts

By Jonathan Thatcher, CSCP | November/December 2013

Reader F.W. writes, “Our team tends to disagree on our forecasts, but our disagreements largely stem from subjective reasons. How can we eliminate the risk of bias in forecasting?”

Even the most rigorous quantitative forecasts will not satisfy everybody. The numbers alone rarely convey the complexity of the forecast picture or the reasoning behind the determinations. And even when teams are confident in the numbers behind the forecast formula, forecasts always contain a degree of uncertainty, as they attempt to predict future events based on past trends.

Qualitative analysis attempts to address the issue of the ever-uncertain forecast. There are more than a few reasonable questions to ask from a qualitative standpoint: Should some values be weighted greater than others? How aggressive (or conservative) should the forecast be? Will the forecast have to be justified to a skeptical audience? The danger of qualitative analysis is that it opens the door to cognitive bias.
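The weighting question raised above can be made concrete. The sketch below uses hypothetical demand figures and a made-up `weighted_forecast` helper (neither comes from the article) to show how the same history produces different forecasts depending on a qualitative weighting choice, which is exactly the kind of judgment call where bias can creep in.

```python
# Hypothetical monthly demand history, oldest first, newest last.
demand = [100, 104, 98, 110, 120]

def weighted_forecast(history, weights):
    """Weighted moving average; weights align with history, newest last."""
    assert len(history) == len(weights)
    return sum(h * w for h, w in zip(history, weights)) / sum(weights)

# Equal weighting treats every period the same.
equal = weighted_forecast(demand, [1, 1, 1, 1, 1])   # 106.4

# A recency-heavy weighting leans on the latest periods. Whether that
# emphasis is justified is a qualitative judgment, not a mathematical one.
recent = weighted_forecast(demand, [1, 1, 2, 3, 5])  # about 110.8
```

Neither answer is wrong on its face; the point is that the choice of weights is a subjective input, and subjective inputs are where cognitive biases do their work.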

Cognitive bias is the tendency of the human mind to form well-intentioned but skewed conclusions. Some psychologists think of cognitive biases as “thought shortcuts.” The mind works through vast data sets by filtering out complex data and using rapid or simple thought models to form conclusions. However, shortcuts leave people prone to faulty memories, poor understandings of probability, and inappropriate use of rules of thumb. These are just some of the ways cognitive biases can alter judgments.

Cognitive biases that can negatively affect forecasts generally fall into one of the following four categories.

Probability-based biases. One example of a probability-based bias is the illusory correlation bias—the incorrect perception that a relationship exists between two variables. An exaggerated illustration would be if a team member said, “Our forecasts are always more accurate if we take notes at the meeting with our lucky pen.” This indeed would be a specious connection. Also consider the memory-probability bias—the tendency to consider what is most vivid or clear in a person’s memory as the most likely possibility. An emotionally charged speech is one way to create lasting memories that can warp evaluations.

Decision-making biases. This category includes the negativity bias—the tendency to weigh negative information or experiences more heavily than positive ones. An individual might devote more attention to people who say demand is falling than to those who say demand is rising, for example. Another decision-making predisposition is information bias, or the tendency to seek more and more information even when it will not affect the decision. This can lead to forecasting paralysis.

Memory-related biases. These can be subtle. Consistency bias is the tendency to incorrectly evaluate past attitudes and actions as similar to those in the present. It’s the effect of thinking, “I’ve been consistent all along; why would anyone think I’m performing differently now?” Another member of this category is hindsight bias, the propensity to view past events as predictable based on current knowledge. The classic statement that illustrates this bias is “I knew it all along.”

Social biases. These tend to revolve around interactions with other people. One example is the false consensus bias. This occurs when an individual overestimates the level of agreement others have for his or her opinion. After all, it often is easier and faster to assume agreement than to verify by asking.

These are only a few examples of the many cognitive biases that exist. The important thing to keep in mind is that most people are unaware of their cognitive biases and that they automatically, unconsciously form flawed conclusions. Nobody intends to make biased evaluations, but intention is not enough to correct the problem.

Every forecasting model, no matter how well designed, still leaves open ways for cognitive biases to distort the picture. While it is difficult, if not impossible, to eliminate bias from people’s minds, the good news is that the better we understand ourselves, the better we can improve the way we make evaluations. Find a forecasting model with the right balance of simplicity and complexity for your group—and stick with it. No forecast is perfect; but, over time, practice will sharpen your skills and techniques and, hopefully, improve accuracy.

Jonathan Thatcher, CSCP, is director of research for the APICS professional development division. He may be contacted at