How 4 types of cognitive bias contribute to physician diagnostic errors — and how to overcome them


Diagnostic errors affect approximately 12 million U.S. adult patients each year, according to a 2011 study indexed by the U.S. National Library of Medicine. Such errors can harm patients and leave physicians more vulnerable to medical malpractice claims.

Various issues can lead to diagnostic errors, including misinterpretation of clinical studies, narrow diagnostic focus, inadequate or inappropriate testing, failure to adequately assess a patient's condition, failure or delay in obtaining a referral or consult, and overreliance on a previous diagnosis.

Cognitive biases — systematic errors in thinking that influence decision making and judgment — underlie many of these errors, which can ultimately lead to missed or inaccurate diagnoses and patient harm, as well as lawsuits.

However, it's important to note that cognitive biases are intrinsic to the human mind.

"Cognitive bias is not an illness. It's not something that is bad or people should be judged as bad for having," says Geri Amori, PhD, vice president of academic affairs at Coverys, a medical professional liability insurance firm. "It is a normal thought process in all human beings. It's inherent in everything that each of us does daily."

Despite the prevalence of cognitive biases, their influence on clinical decision making is rarely discussed and, as a result, rarely addressed. But as health systems and hospitals work to provide high-quality care in safe environments, it has become more important than ever to acknowledge the cognitive factors that shape clinicians' decision making.

Types of cognitive biases

Dr. Amori, who holds a doctorate in counselor education and a master's degree in counseling and human systems, says four types of cognitive bias are most common.

  1. Anchoring bias is the tendency to rely too heavily on one piece of information or idea — usually the first — when making decisions.
  2. Wishful thinking bias is the tendency to believe what one wants to be true. It can cause someone to overestimate the rewards and underestimate the risks of certain decisions.
  3. Confirmation bias is the tendency to look for information that confirms one's preconceptions, often while dismissing information that may challenge them.
  4. Availability heuristic is the tendency to overestimate the likelihood of events that come to mind most readily.

Everyone is susceptible to these and many other cognitive biases. But without understanding how these biases influence their thinking, clinicians may not recognize their own lack of objectivity in decision making.

"Clinicians believe they're just making knowledgeable decisions with the information at hand," says Dr. Amori. "And they are. They are just focusing on certain pieces of information and seeing it through the lens of what they remember."

Simply deploying a "debiasing" strategy is not a tenable solution, according to the article "From Mindless to Mindful Practice — Cognitive Bias and Clinical Decision Making" by Pat Croskerry, MD, PhD, which was published in The New England Journal of Medicine in 2013. "First, many decision makers are unaware of their biases, in part because our psychological defense mechanisms prevent us from examining our thinking, motivation and desires too closely," Dr. Croskerry wrote. "Second, many clinicians are unaware of, or simply don't appreciate the effect of, such influences on their decision making."

However, there are certain steps physicians and healthcare leaders can take to mitigate the influence of cognitive bias on diagnoses, according to Dr. Amori. In addition to acknowledging that cognitive biases exist, physicians must "force themselves to expand their view" during the diagnostic process.

This may mean adopting diagnostic decision tools, such as those that analyze a patient's medical history and current symptoms and compute a list of likely diagnoses. "A physician may think about the most common diagnosis" for a given set of symptoms, "but they may not think about the 10 or 15 possible others," says Dr. Amori. "When they see a list of possibilities, they may think, 'Oh, it could be that,' instead of where they stopped mentally."
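To make the idea concrete, the sketch below shows one toy way such a tool could work: scoring a small set of candidate conditions against a patient's reported findings and returning a ranked differential rather than a single answer. The `TOY_KNOWLEDGE_BASE`, the condition and symptom names, and the overlap-based scoring rule are purely illustrative assumptions, not the logic of any real diagnostic decision tool.

```python
# Illustrative sketch only: a toy differential-diagnosis ranker that scores
# candidate conditions by how many of a patient's findings match a simple,
# hypothetical knowledge base. Real diagnostic decision tools rely on far
# richer models and validated clinical data.

TOY_KNOWLEDGE_BASE = {
    "influenza":          {"fever", "cough", "myalgia", "fatigue"},
    "pneumonia":          {"fever", "cough", "dyspnea", "chest pain"},
    "pulmonary embolism": {"dyspnea", "chest pain", "tachycardia"},
    "anemia":             {"fatigue", "dyspnea", "pallor"},
}

def rank_differential(findings: set[str], top_n: int = 10) -> list[tuple[str, float]]:
    """Return candidate diagnoses ranked by overlap with the patient's findings."""
    scores = []
    for condition, typical in TOY_KNOWLEDGE_BASE.items():
        # Fraction of the condition's typical findings present in this patient
        overlap = len(findings & typical) / len(typical)
        if overlap > 0:
            scores.append((condition, round(overlap, 2)))
    # Highest-scoring first, so less-obvious possibilities still surface in the list
    return sorted(scores, key=lambda item: item[1], reverse=True)[:top_n]

if __name__ == "__main__":
    patient_findings = {"fever", "cough", "dyspnea"}
    for condition, score in rank_differential(patient_findings):
        print(f"{condition}: {score}")
```

Even a crude ranking like this surfaces possibilities beyond the first diagnosis a clinician might anchor on, which is the point Dr. Amori makes about forcing oneself to expand the diagnostic view.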

Another essential component of avoiding the effects of cognitive bias on the diagnostic process is creating a culture of safety that promotes open communication and learning.

To create a culture of safety, start by recognizing human imperfection

Dr. Amori defines a culture of safety as one that "encourages the sharing of learning, as well as communication among providers and all members of the care team." "In a culture of safety, there is open communication where care team members are able to help each other see where bias may have entered into a decision and how they can be avoided in the future," she says.

Errors that stem from cognitive bias are difficult for physicians to discuss because they're personal. Acknowledging them may feel like admitting failure, according to Dr. Amori. Unfortunately, traditional medical training and many healthcare environments have reinforced the mindset that clinicians must operate as lone, strong and infallible practitioners. This dangerous self-view sets physicians up for burnout, imposter syndrome and a profound sense of worthlessness.

"If you are in a culture where leadership sets the expectation that you always have to be perfect, always have to know what you're doing and can never make a mistake, you'll never have a true, open culture of safety," says Dr. Amori. "You may have one that has the trappings of a culture of safety, but it won't be there underneath."

By cultivating a culture that recognizes basic human vulnerabilities, clinicians can become more open to reflecting upon and discussing diagnostic errors that derive from cognitive biases. They can become more compassionate toward themselves and empathetic to peers with a greater understanding that mistakes are inevitable. Most importantly, they will create opportunities to learn from errors and devise strategies to avoid them in the future.

"The culture of medicine has never been one that encourages discussion of physician vulnerability," says Dr. Amori. "I'm seeing that change, and I'm hoping it continues in that direction so physicians can help each other provide the optimum care for patients while examining their personal feelings and thought processes."
