Unforced Errors: Cognitive Psychology for Risk Managers

Dr. Michael Lacroix, Ph.D.
Medical Director, The Hartford

I’ll bet this has happened to you – it happens to me every time. You buy a new car, and then you start noticing that there are a lot more of these cars around town than you had thought. Is it that all your neighbors are suddenly discovering what you discovered about this brand, that it’s really good?

Actually, it’s a form of what psychologists call Confirmation Bias. Because you’ve bought the car, you’re more aware of it, and your antennae are selectively tuned to spot the brand. Another example of confirmation bias, and one that is very relevant these days, has to do with where you get your news. If you tell me which cable news channel you watch and which thought leaders you follow or which newspapers you read, I’ll bet I can predict your political views. Like attracts like. How do you combat confirmation bias? Get your information from multiple, independent sources.

In his best-selling book, Thinking, Fast and Slow, Nobel Prize-winning psychologist Dr. Daniel Kahneman outlined a number of ways in which our brains’ wiring creates unconscious biases in how we evaluate information. Risk managers, whose jobs require evaluating evidence and who like to think of themselves as logical, analytical and data-driven, might be surprised at the myriad ways in which they can be fooled if they’re not extra careful. Confirmation bias is one; here are a few others.

Memory is one cognitive function that is very susceptible to being fooled. There is much research on the unreliability of eyewitness testimony (look up Elizabeth Loftus’ Wikipedia page), and the courts are beginning to pay attention. But did you know that merely imagining that you performed a particular action can cause you to misremember having actually performed it? There is also a frequency effect that impacts memory: If you hear something repeated often, it will come to “feel” true, as the repetition renders the statement familiar. This, again, is an unconscious process that takes place even if you knew the statement to be false initially. Comedian Stephen Colbert’s concept of truthiness (which was named Word of the Year in 2005) follows the same logic: What “feels” true must surely be true! Remember, just because something “feels” true does not obviate the need for more thorough investigation. To counter our fallible memories, double-check your “memory” against objective data.

Another interesting unconscious process that can take us down the wrong path is the Substitution Heuristic. When a satisfactory answer to a hard question cannot be found readily, our brains look for an easier question and answer that one instead. And so, “Is Competitor A likely to eat into our margins next year?” gets translated into “How much are they affecting our business this year?” – a much easier question to answer. Or ask yourself how you would have approached this question this past January: What are the risks associated with a possible pandemic? To counter this one, try to catch yourself making the substitution, and then laser-focus on doing the harder analytical work.

One last one: Anchoring. Anchoring refers to assigning a particular value to an unknown quantity before evaluating what that value should be. Retailers are masters at using Anchoring to fool us. Which would be a better value, a pair of shoes with a price tag of $149, or a pair of shoes with a price tag of $199, marked down to $149? Unconsciously, we would opt for the latter. But we would be fooled twice: first, because we would have anchored the value of the shoes to the suggested price, and second, because we would assume that the mark-down is genuine. Maybe they’re only worth $79, but we would have agreed to start the bidding at $149. How do you estimate risk – do you build the model from scratch, or do you start with existing estimates? Note that doctors are not immune to the anchoring error. All too often, a busy doctor will start the evaluation of a new patient with the diagnosis given by the previous treater – which speaks to the usefulness of independent evaluations.

Our brains are magnificent organs, but they’re not foolproof. Caveat emptor!


Dr. Michael Lacroix is Medical Director with The Hartford. He is a licensed psychologist in Florida and has worked in the overlapping areas of Disability and Workers' Compensation for the last 30 years, in a variety of clinical, management, program development, and consulting roles. Prior to joining the corporate world (Concentra, then Coventry, then Aetna, and now The Hartford), Dr. Lacroix held academic appointments, developed a large clinical practice focused on assessment, treatment, and rehabilitation of injured and disabled workers, and carried out grant-supported research over many years, resulting in over 100 peer-reviewed publications and papers. He is a frequent contributor at industry conferences, including PRIMA. He currently resides in Sarasota, FL.
