We are all influenced by many biases, and many of them make us good physicians. These cognitive biases are often quick tools in our thinking, like pattern recognition, that create essential efficiencies of thought, freeing our brains to spend energy on less familiar problems. But sometimes these quick-thinking shortcuts, also known as medical heuristics, can lead us astray. Are we able to recognize those situations?
As medical students, we first learned about different types of bias in our clinical epidemiology courses. Bias in study design could determine whether a study's conclusions were valid or vulnerable to confounding factors that had not been adequately considered. We learned how case-control studies contained inherent design features that exposed them to potential recall bias, susceptibility bias, and detection bias, to name just a few. Many of these biases referred to recognized patterns in how patients in clinical studies remembered information, or in behaviors that were later analyzed.
Such biases were characteristic of research studies. Other biases, in my mind, typically referred to the prejudicial ways other individuals thought or made decisions. For years, I thought that I did not carry biases. How wrong I was.
I readily acknowledge that I have used medical heuristics for years. These tools are strategies, applied either deliberately or implicitly, that lead us more quickly to decisions or conclusions. Sometimes, these shortcuts in thinking lead us to decisions using only part of the information we really need. Early in medical school, I learned “When you hear hoofbeats, think horses, not zebras” from my first internal medicine mentor, Theodore E. Woodward, MD, MACP, at the University of Maryland. Little did I know that I was using a medical heuristic that generally has served me well.
But the key word is generally. What if I were on safari in Africa, or even visiting a zoo? Such a “hoofbeats” generalization might steer me wrong. When we rely on heuristics, our thinking is limited by our own inherent cognitive biases. I am demonstrating an anchoring bias if I am unduly persuaded by features encountered early in a case and commit to a premature diagnosis: I hear hoofbeats, so I am thinking horses from the start. I might also exhibit availability bias, because the last hundred times I heard hoofbeats, they were horses. Yet if a given case turns out to be a zebra and not a horse, my heuristic steered me wrong. I was “biased” into thinking erroneously.
Biases do not occur only at the individual level. We all realize that there are societal and cultural biases, now routinely called implicit bias: the tendency for stereotype-confirming associations around gender and race to occur outside of our conscious awareness. These biases are everywhere, just below the surface and hidden from our conscious thinking. The challenge comes when they lead to negative evaluations of individuals, whether our colleagues or our patients, based on irrelevant characteristics. As physicians, we are not immune to this, and we must recognize that.
At every level of our own thinking, we need to pause and ask ourselves whether we could be subject to bias. This is not easy; it takes a special self-awareness involving metacognition, the process by which we reflect upon and analyze what and how we are thinking. At any point, we must pause and ask whether one of our many cognitive biases is keeping us from considering other factors adequately. Would we all agree that metacognition leads to improved diagnostic thinking and, in turn, better care of our patients?
Institutions, similarly, apply the pause of metacognition through policies that address omnipresent implicit bias. This first became evident to me at ACP as a member of the Clinical Guidelines Committee several years ago. The committee has adhered to the Institute of Medicine definitions for guidelines, which call out the need to identify and minimize conflicts of interest, placing bias in the context of self-interest, whether financial or intellectual. With the goal of having the best evidence drive guideline development, there is no role for potentially biased opinions.
College policy around conflict of interest and disclosure of interest has become more explicit in these areas and is highlighted in two recent papers published by Annals of Internal Medicine, “The Development of Clinical Guidelines and Guidance Statements by the Clinical Guidelines Committee of the American College of Physicians: Update of Methods,” published June 11, 2019, and “Disclosure of Interests and Management of Conflicts of Interest in Clinical Guidelines and Guidance Statements: Methods from the Clinical Guidelines Committee of the American College of Physicians,” published Aug. 20, 2019.
College governance recognized a need for a greater pause in considering implicit bias across the entire organization and recently established a Diversity, Equity, and Inclusion Subcommittee. In July 2019, the Board of Regents approved several recommendations from this subcommittee, including modifications of ACP's goals to include the following:
- To promote and respect diversity, inclusion, and equity in all aspects of the profession;
- To welcome, consider, and respect the many diverse voices of internal medicine and its subspecialties and work together for the benefit of the public, patients, our members, and our profession.
With the focus in recent years on implicit and systemic bias, we need to maintain a high level of self-awareness across our organizations, as well as in our individual thinking. As we gather information and draw conclusions, we need to recognize where different biases are leading us, often in helpful ways. But we must always be prepared to pause and ponder whether those same biases are keeping us from even considering less likely diagnostic possibilities, thereby increasing the risk of diagnostic error. The heightened awareness of what we are thinking and why is intrinsic to our expertise as physicians: our ability to recognize what we know, but also what we do not know.
We need many of these biases to make our thinking more efficient. As professionals, we need to be self-reflective and conscious of when they may be confounding our critical thinking and leading us down the wrong path. Bias, heuristics, metacognition, and conflicts of interest exist on a continuum and are all part of who we are. The better we understand and manage them, the better we will be as physicians caring for patients, and the better we will be in our personal lives.