Clinical Intuition & EBP? (Science in Practice, Part 3)

Evidence-based practice relies heavily on practitioners’ clinical judgment, but clinical judgment is fallible. Our fast thinking (System 1) helps us be fast and frugal as we judge and decide: we quickly attend to simplified aspects of complex situations to get a ‘good enough’ answer, and we are prone to biases and errors in perception and reasoning. Our slow thinking (System 2) kicks in to prevent and correct those biases and errors: we purposefully reason to consider more aspects of the situation and draw inferences more rigorously(1).

But not so much. Decision science has shown that, in fact, both fast and slow thinking are motivated: if quick impressionistic thinking doesn’t yield the answer we expect or want, we are prone to use our slower reasoning skills to fend off disconfirming evidence or construe it to fit our goals(2).

Consider one simple example: data show that 37-49% of clients do not respond to CBT for depression(3). That is the rate obtained by well-trained research therapists delivering the most empirically supported protocols under good clinical supervision. Now estimate: what percentage of your depressed clients will not respond to treatment?

Do you think, “probably somewhere between a third and a half of my clients won’t respond to CBT for depression”? Or do these data jar your intuitive sense that most of your clients get better? Purposeful reasoning probably kicks in: you ask which studies I am referring to and what exactly they did; you explain that what you do is different from, and better than, the studies’ protocols. Notice how naturally we are prone to reason purposefully to fit our goals. This, along with other fallibilities of clinical judgment, makes it problematic to rely on clinical judgment as we do evidence-based practice.

What to do about this?

Kahneman and Klein

In search of ways to overcome the weaknesses in clinical decision-making, I found a great article by Daniel Kahneman and Gary Klein, “Conditions for Intuitive Expertise: A Failure to Disagree”(4). Klein studies how amazing expert intuition can be in complicated situations (naturalistic decision making). Kahneman studies the predictable ways intuitive judgment fails (heuristics and biases).

They define “intuitive judgments” as those that come to mind without our awareness of their evoking cues. Some arise from cues we detect through experience and skill; others arise from simplifying heuristics that are prone to systematic biases. Because intuitive hunches and impressions arise from detecting patterns out of awareness, we seldom explicitly evaluate the validity of the cues we use.

What these scholars from opposing traditions agree upon is that intuitive expertise is possible. But it won’t develop unless you have two things:

1. An environment that has high validity, i.e., an environment with stable relationships between objectively identifiable cues and subsequent events, or between cues and the outcomes of possible actions.

2. Adequate opportunities to learn the environment.

Kahneman & Klein give examples of intuitive expertise: over years of observing, studying, and debriefing, a fireground commander learns to detect subtle cues that signal a building’s imminent collapse, and a neonatal intensive care nurse develops the ability to detect cues that signal imminent infection. The cues in their work environments signal the probable relationships among causes and outcomes of behavior (valid cues). Standard methods, clear feedback, and direct consequences for error make it possible to learn the rules of these environments.

In other words, some professions have ways of training pattern recognition and valid inference from observation. They calibrate unconscious processes and train them to select some hunches about suspected patterns for the attention of System 2’s deliberate analysis(5). Hunches based on invalid cues are likely to be detected and assessed for error. Hunches become smarter.

But our profession, like those of stockbrokers, college admissions officers, court judges, personnel selectors, and intelligence analysts, faces low-validity environments: outcomes are hard to predict, good feedback is scarce, and the cues are dynamic rather than static.

Our circumstances are “wicked environments”(7). Whereas kind environments offer plenty of valid cues, wicked environments are those without feedback, or where the regularities themselves are misleading. Hogarth gives the example of a doctor who had a miraculous ability to predict typhoid by examining the tongue; because he didn’t wash his hands, he was systematically causing the disease by infecting each person he examined! “(T)he fact that you take a particular action can prevent you from learning about possible outcomes associated with the actions you did not take.”(7)

To do evidence-based practice, we need ways to structure our environment to make the relationships between objectively identifiable cues and subsequent events, or between cues and the outcomes of possible actions, more learnable. In other words, if I say to you, “Based on controlled studies, one would guess 30-50% of our depressed clients won’t get better,” then you’d say, “Let’s look at our data. Here are the outcomes for the last 20 clients I worked with.” We’d have the routines, tools, and culture that make a kind environment, full of valid cues about the relationship between our actions and outcomes.
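To make that concrete, here is a minimal sketch in Python of the small routine that exchange implies: tally non-response among your own recent cases and set it beside the benchmark range from controlled trials. The outcome data and the response criterion are invented for illustration; none of it comes from the studies cited above.

    # Hypothetical routine: compare your own recent outcomes with the
    # benchmark non-response range from controlled trials (see text).
    # All data and the response criterion are invented for illustration.

    # Outcomes for the last 20 clients: True = responded to treatment
    # (by whatever standard measure you use), False = did not respond.
    outcomes = [True, False, True, True, False, True, False, True, True, False,
                True, True, False, True, False, True, True, False, True, True]

    non_response = outcomes.count(False) / len(outcomes)
    benchmark_low, benchmark_high = 0.30, 0.50  # rough range from the trials discussed above

    print(f"Observed non-response: {non_response:.0%} of {len(outcomes)} clients")
    print(f"Trial benchmark:       {benchmark_low:.0%} to {benchmark_high:.0%}")

    if non_response < benchmark_low:
        print("Below the benchmark range: check your criteria and follow-up data before celebrating.")
    elif non_response > benchmark_high:
        print("Above the benchmark range: worth a closer look at what is happening.")
    else:
        print("Within the benchmark range.")

The point isn’t the code; it’s that once outcomes are recorded and routinely compared against an external benchmark, the relationship between our actions and our results becomes an inspectable cue rather than an impression.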

Right? When it comes to getting clients good outcomes, we want to help each other be good, not look good.

What do you think?  Are there tools and routines you have in place to create a kind environment so you can better learn the regularities? 

-- Kelly Koerner, PhD

 

1 Kahneman, D. (2011). Thinking, fast and slow. New York, NY: Farrar, Straus and Giroux.

2 Kahan, D. M. (2012). Ideology, motivated reasoning, and cognitive reflection: An experimental study. Judgment and Decision Making, 8, 407-424.

3 Persons [insert link to talk]

4 Kahneman, D., & Klein, G. (2009). Conditions for intuitive expertise: A failure to disagree. American Psychologist, 64, 515-526. http://dx.doi.org/10.1037/a0016755

5 Kahan, D. M. (2013, July 24). Integrated, reciprocal dual process reasoning and science communication, Part 2 [blog post]. http://www.culturalcognition.net/blog/2013/7/24/integrated-reciprocal-dual-process-reasoning-and-science-com.html

6 Hogarth, R. M. (2001). Educating intuition. Chicago, IL: The University of Chicago Press.

7 Hogarth, R. M. http://www.econ.upf.edu/docs/papers/downloads/654.pdf