- Compare various biases and errors in decision making
There are two types of decisions: programmed and nonprogrammed. A programmed decision is routine and, within an organization, likely to be governed by rules and policies that help decision makers arrive at the same decision each time the situation presents itself. A nonprogrammed decision is more unusual and made less frequently. These are the decisions most likely to be subject to decision-making heuristics, or biases.
As we work through the rational decision-making model (or, as we discussed, the more realistic bounded rationality model), our attempts to shortcut the collection of data and the review of alternatives can lead us astray. These common distortions in our review of data and alternatives are called biases.
You only need to scroll through social media and look at people arguing politics, climate change, and other hot topics to see biases in action. They’re everywhere. Here are some of the more common ones you’re likely to see:
The overconfidence bias is a simple one to understand: people are overly optimistic about how right they are. Studies have shown that when people say they're 65–70% sure they're right, they're actually right only about 50% of the time. Similarly, when they say they're 100% sure, they're usually right only about 70–85% of the time.
Overconfidence in one's correctness can lead to poor decision making. Interestingly, studies have also shown that individuals with the weakest intellectual and interpersonal skills are the most likely to exhibit overconfidence, so managers should watch for this bias when they're trying to make decisions or solve problems outside their areas of expertise.
The anchoring bias is the tendency to fixate on initial information as the starting point for a decision and to fail to adjust for subsequent information as it's collected. For example, a manager may be interviewing a candidate who asks for a $100,000 starting salary. Once that number is stated, the manager's ability to ignore it is compromised, and subsequent information suggesting the average salary for that type of job is $80,000 carries less weight.
Similarly, if a manager asks you for an expected starting salary, your answer will likely anchor the manager’s impending offer. Anchors are a common issue in negotiations and interviews.
The rational decision-making process assumes that we gather information and data objectively, but confirmation bias is the tendency to gather only the information that supports our initial conclusions.
We seek out information that reaffirms our past choices and tend to put little weight on those things that challenge our views. For example, two people on social media may be arguing the existence of climate change. In the instance of confirmation bias, each of those people would look to find scientific papers and evidence that supports their theories, rather than making a full examination of the situation.
Hindsight bias is the tendency to believe, once an event's outcome is known, that we would have accurately predicted it. On the Saturday before the Super Bowl, few people are sure of the outcome, but on the following Monday, many more are willing to claim they knew all along which team would win.
Because we fool ourselves into thinking we knew more about an event before it happened than we actually did, hindsight bias restricts our ability to learn from the past and makes us overconfident about future predictions.
Representative bias occurs when a decision maker wrongly compares two situations because of a perceived similarity or, conversely, evaluates an event without comparing it to similar situations. Either way, the problem is not put in its proper context.
In the workplace, employees might assume a bias against white males when they see that several women and minorities have been hired recently. They may see the last five or six hires as representative of the company’s policy, without looking at the last five to ten years of hires.
On the other side of the coin, two high school seniors might have very similar school records, and it might be assumed that because one of those students got into the college of her choice, the other is likely to follow. That’s not necessarily the case, but representative bias leads a decision maker to think because situations are similar, outcomes are likely to be similar as well.
Availability bias suggests that decision makers use the information that is most readily available to them when making a decision.
We hear about terrorism constantly in the news and in fictional media. The coverage is out of proportion to the actual threat, making terrorism seem bigger than it is, so people invest time and effort in combating it. Cancer, however, kills 2,000 times more people, yet we don't invest in it the same way; it doesn't get as much news coverage, and it's not as "available" in our minds as information. Hence, the availability bias.
Escalation of commitment is an increased commitment to a previous decision in spite of negative information. A business owner may put money down on a storefront to rent DVDs and Blu-rays, start purchasing stock for the shelves, and hire a few people to staff the register. The owner may then review data indicating that people rarely go out to rent videos anymore, but, having already committed to the location, the stock, and the staff, the owner continues down that path and opens a movie rental location.
Managers sometimes want to prove their initial decision was correct by letting a bad decision run too long, hoping things will turn around. These are often costly mistakes.
If you are certain your lucky tie will help you earn a client’s business at a meeting later today, you’re committing a randomness error. A tie does not bring you luck, even if you once wore it on a day when you closed a big deal.
Decisions can become impaired when we try to create meaning out of random events. Consider stock prices. Financial advisors may feel they can predict the movement of stock prices from past performance, but on any given day, those price movements are essentially random. In reality, such advisors have been able to predict the direction of stock prices only about 49 percent of the time, or about as well as if they'd just guessed.
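That 49 percent figure is roughly what pure chance produces. As a quick illustration (this simulation is our own sketch, not from the text, and the "advisor" here is just a coin flip), if each day's price move is random, any guessing strategy is right about half the time:

```python
import random

random.seed(42)  # fixed seed so the demonstration is repeatable

def guess_accuracy(num_days: int) -> float:
    """Fraction of days a random 'advisor' guess matches a random market move."""
    hits = 0
    for _ in range(num_days):
        guess = random.choice(("up", "down"))  # the advisor's prediction
        move = random.choice(("up", "down"))   # the market's actual (random) move
        if guess == move:
            hits += 1
    return hits / num_days

print(f"accuracy over 100,000 trading days: {guess_accuracy(100_000):.3f}")
```

Over many simulated days the accuracy settles near 0.5, which is why "predicting" a truly random series is indistinguishable from guessing.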
In the case of the lucky tie, that's more of a superstition. Decision makers who are controlled by their superstitions can find it difficult or impossible to change routines or objectively process new information.
Managers who can objectively collect data and arrive at alternatives without being affected by these biases are already head-and-shoulders above other decision makers who aren’t aware of these pitfalls. Finding unique solutions to unique problems requires a little something more, though. Creativity in decision making can take you to the next step. We’ll talk about that next.