Eat. Sleep. Think. Repeat.

By: Amy van Zuydam

Research shows that the higher you score on an IQ test, the more likely you are to fall victim to bias, because you are faster at recognizing patterns. While reading Adam Grant’s book, Think Again, this jumped off the page at me. It’s possible that my superiority bias was acting up, but nonetheless, it certainly got me thinking. Being equipped with a good brain, or at least being surrounded by individuals with good brains in your organization, doesn’t necessarily mean that the mental ‘heat you’re packing’ will be put to good use. Implementing checks and balances to assess your thinking and decision-making should form part of day-to-day business. This article addresses some of the common errors in judgment that individuals and teams make, together with practical tips for avoiding these mental traps.

Broadly speaking, people fall victim to two different types of decision-making errors, namely cognitive bias and noise.


Cognitive Bias

Cognitive bias refers to systematic errors in judgment. In essence, the decisions we make are influenced by the brain’s tendency to perceive information through a filter of personal experience and preferences. Cognitive biases stem from the numerous mechanisms the brain has adopted over time to conserve energy (thinking is tiring, so the brain creates shortcuts), and they compromise our ability to make good decisions.

There are hundreds of different biases. There’s probably a bias which accounts for the types of biases I’ve chosen to focus on in this article. I’m narrowing it down to a few examples which impact 1) our ability to process information as individuals, and 2) our ability to problem-solve in groups. It’s a drop in the ocean - and for the sake of my word count I’ll try to keep it brief - but increasing awareness of these common mistakes allows us to consider their potential knock-on impact, and more importantly, how we could go about reducing it.

Let’s start with a couple which relate to how individuals receive and process information. Confirmation bias causes you to seek evidence that confirms your existing beliefs and reject evidence that doesn’t, regardless of the quality of the information. This means you only see what you expect to see. Similarly, desirability bias causes you to see only what you want to see. The presence of these biases during your problem-solving process can have devastating effects. The ability to problem-solve is the ability to find answers to difficult questions, so it follows that these biases can lead us to sub-optimal solutions. However, one shouldn’t overlook the tendency of confirmation and desirability bias to make us focus on the wrong problems entirely. If a problem well-defined is a problem half solved, your ability to identify and clearly articulate the problem statement is arguably the most important step. When too much emphasis is placed on speed of execution, and not enough time is spent on understanding the problem, you may end up answering a different question - most likely one with a more accessible answer.

While many of us may fancy ourselves as autonomous thinkers (cue philosophical debate), our decisions are greatly influenced by those around us. This is known as conformity bias. Humans tend to mimic the behavior of those around them, even when it goes against their personal judgment. Thrown into group environments created for ideation, the desire for harmony results in groupthink, meaning divergent opinions become suppressed by self-censorship. On a similar note, authority bias shows that we are also more likely to support the opinions of authority figures. Even when other ideas are more creative and relevant, ideas coming from senior team members generally trump all. These interpersonal traps steer us away from creative thinking and smother the hopes and dreams of budding innovative ideas. Poor little guys.

Noise

Noise, in contrast to bias, is defined as unwanted random variability in judgment. Our judgments are strongly influenced by irrelevant factors, such as our mood, the time since our last meal, and the weather. Academic researchers have repeatedly confirmed that professionals often contradict their own prior judgments when given the exact same data on two different occasions. Kahneman et al. (2016) cite a study in which software developers were asked, on two separate days, to estimate the completion time for the same task; the hours they projected differed by 71% on average. It boils down to the fact that humans are a lot less reliable at making decisions than we think.
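To make the distinction between bias and noise concrete, here is a minimal sketch in Python (my own illustration with made-up numbers, not taken from the article or from the Kahneman et al. study). It simulates repeated estimates of a task whose true duration is assumed to be 40 hours: a fixed offset stands in for bias, and random scatter stands in for noise.

```python
import random
import statistics

# Hypothetical illustration: repeated estimates of one task by one judge.
# A constant offset models bias; random scatter models noise.
random.seed(42)

TRUE_HOURS = 40   # assumed "correct" answer
BIAS = -8         # assumed systematic underestimation
NOISE_SD = 12     # assumed day-to-day variability in judgment

estimates = [TRUE_HOURS + BIAS + random.gauss(0, NOISE_SD) for _ in range(10_000)]
errors = [e - TRUE_HOURS for e in estimates]

mean_error = statistics.mean(errors)           # recovers the bias
spread = statistics.stdev(errors)              # recovers the noise
mse = statistics.mean(e ** 2 for e in errors)  # overall squared error

print(f"bias  (mean error): {mean_error:6.2f} hours")
print(f"noise (spread):     {spread:6.2f} hours")
print(f"total squared error {mse:8.1f} vs bias^2 + noise^2 {mean_error**2 + spread**2:8.1f}")
```

Under these assumptions the overall squared error roughly splits into a bias term plus a noise term, which is why reducing either one improves decision quality even if the other stays put.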

Minimizing decision errors

Thankfully, while it’s not possible to eliminate errors in judgment completely, we can work at reducing them. Armed with an open mind (and the ability to admit that you may have been wrong), you can employ several strategies during problem-solving to minimize errors. As a starting point, challenge yourself to answer the following questions:

Frame the question - What problem am I trying to solve?

Ensure the problem statement or opportunity is clearly understood and correctly framed. Phrase the problem in multiple ways before you begin trying to approach it. This stimulates pre-thinking and results in better-quality thinking. Taking time to frame the question reduces the risk of answering the wrong question.

Create multiple options – What options do I have for solving the problem?

Learn to form your own second opinions. Kahneman refers to the concept of the wisdom of the inner crowd: when asked the same question twice, separated by time, the average of the two answers is more accurate than either answer alone. Thinking more than once about a problem increases your sample size and encourages consideration of differing perspectives, thereby adding valuable information. Part of the value of working in teams stems from the diversity of opinions. Teams should consider using lateral thinking techniques to stimulate diversity and make use of trained facilitators who are able to identify biases when they start popping up.
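As a rough sketch of why averaging helps, the short Python simulation below (my own toy example with assumed numbers, not from the book) compares the error of a single noisy guess with the error of the average of two guesses at the same true value.

```python
import random
import statistics

# Toy model of the "wisdom of the inner crowd": two noisy looks at the
# same quantity, taken at different times, then averaged.
random.seed(7)

TRUE_VALUE = 100   # assumed quantity being estimated
NOISE_SD = 20      # assumed spread of any single guess

def average_errors(trials: int = 10_000) -> tuple[float, float]:
    single, averaged = [], []
    for _ in range(trials):
        first = TRUE_VALUE + random.gauss(0, NOISE_SD)
        second = TRUE_VALUE + random.gauss(0, NOISE_SD)  # a later, partly fresh look
        single.append(abs(first - TRUE_VALUE))
        averaged.append(abs((first + second) / 2 - TRUE_VALUE))
    return statistics.mean(single), statistics.mean(averaged)

one_guess, mean_of_two = average_errors()
print(f"average error of a single guess:  {one_guess:.1f}")
print(f"average error of the mean of two: {mean_of_two:.1f}")
```

In practice your two answers are correlated rather than fully independent, so the real gain is smaller than this idealized simulation suggests, but the direction of the effect is the same - and polling genuinely different people, or genuinely different framings, widens the gap further.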

List assumptions - What can I safely assume?

Once you have identified possible solutions, work on identifying the things you are taking for granted in order for those solutions to come to fruition. Make a list of the things you consider unknowns. Listing these unknowns opens up the possibility of verifying your assumptions as more information becomes available, or alternatively, of managing the risk of having made an incorrect assumption.

Evaluate options against evidence - Based on available information, what is the best way to solve the problem?

Test your assumptions against evidence. Use data, facts, observations, and experiences to assess the validity of your assumptions. Where there is no evidence to support an assumption, assess the risk of proceeding without this information. To keep yourself honest, consider which conditions might cause you to rethink your assumptions in future, for example: “In the presence of x, I would need to think again about my solution.”

Challenge yourself to think again - How can I think differently about this?

Get into the habit of questioning your thoughts – and better still – invite others to do the same. This is easier said than done. As Adam Grant mentions in his book, we live in a world which mistakes confidence for competence and encourages us to avoid the discomfort of doubt. However, adopting a learning mindset as individuals and organizations gives us the freedom to perceive being wrong as a discovery that we’ve learnt something. There are various strategies which can be employed to help you do this. Be proactive about your learning by seeking out information and opinions that contradict your own. Try establishing a ‘challenge network’ of individuals whose purpose is to challenge you, give you critical feedback, and play devil’s advocate. Getting into the habit of generating counterfactuals to reflect on past decisions is another effective technique. The process involves revisiting a past event, modifying one or more of the factors that led to its outcome, and assessing the possible consequences of that modification.

Eat, sleep, think, repeat – How can I give myself the best chance of making the right decision?

Remember that you are human, and therefore, you are sensitive to random environmental factors. Do what you can to ensure the best possible conditions for decision-making. Eat a sandwich, take a break, get some sleep and see how you feel in the morning. When you are feeling tired and drained, your brain is more likely to take shortcuts. At a team level, organizations can adopt processes and procedures that promote consistency. Ensuring consistency across the tools, methods, and checklists that employees in the same role use can go a long way towards reducing random errors.

In summary, we find ourselves in a rather sticky situation. We are unreliable decision makers by default, and the “I’m not biased” bias is rife among us. We’ve been taught to have confidence in our convictions, but Dunning and Kruger might hint that we’re less competent than we think we are. The good news is that you can strive to be better. Taking the above pointers into consideration during your problem-solving process will hopefully serve as a reminder that you are human, and therefore, [slightly] less than perfect. Getting into the habit of questioning your own thinking can help you avoid mental traps. At the very least, taking the opportunity to reflect and rethink might introduce some fresh perspective. If it doesn’t, try eating a sandwich.

References:

Grant, A. (2021) Think Again: The Power of Knowing What You Don't Know. Viking.

Kahneman, D., Rosenfield, A. M., Gandhi, L., & Blaser, T. (2016) Noise: How to Overcome the High, Hidden Cost of Inconsistent Decision Making. Harvard Business Review.
