Barriers to Science in Practice (Part I)

snowflakes mean no science

I'm working on a chapter, "Science in Practice," for a forthcoming book on core competencies edited by Steve Hayes and Stefan Hofmann.

It's a practical chapter. I began by listing the barriers I encounter in using a science-based approach to therapy, so I can then spell out principled workarounds. Here begins a short series with some early thinking that I hope you'll mull over with me; I'd love to hear your thoughts via email or a post on this blog.

A first problem: scientific research doesn't help me out as much as I wish it did. I'm not arguing that "we're all unique snowflakes," so the scientific method can't be applied to clinical practice because it's "too complex, ill-defined, multifaceted and situational."1

Yet I am sympathetic to what’s valid in this family of arguments.
 
Say a person seeks help with depression. On the one hand, I am all for the actuarial2 application of evidence-based treatments—clinical judgment is fallible, and it could be that my clients would get better faster if I stuck, Checklist-Manifesto-style3, to the Behavioral Activation protocol.
 
But on the other hand, my decision-making rapidly branches beyond actuarially applied protocols, even in straightforward cases. Continuing the example, say ten minutes into a first assessment I note problems with insomnia and marital conflict. Already I am outside the scope of a single evidence-based manual for depression and considering whether and how to sequence interventions: treat depression first and see if the other two problems resolve? Or instead treat insomnia or marital difficulties first because they are driving the depression? Some evidence guides me to treat insomnia and depression concurrently4. Some evidence shows that combining depression treatment and marital therapy could help both depression and marital satisfaction5. Twenty minutes in, the client and I have added problems with alcohol use, excessive self-criticism, and experiential avoidance to our problem list.
 
Alrighty then! 
 
When there are multiple problems, I face very complicated decision branching. I want conditional plans with a high chance of success (if this client marker is present, then this intervention will reliably produce this change). Yet little research evidence is available to directly inform those choices. Rather than throw up my hands and start thinking "doesn't apply, unique snowflake," I fall back on heuristics and the scientific method: into principles and away from specific manual-based packages.
 
My first go-to heuristic is case formulation à la Persons6, 7, i.e., "Apply the scientific method to clinical practice. Be informed by research on problems and processes of change." But as self-evident as this move seems, case formulation itself is not well researched. Questions about its reliability, validity, and efficacy mean I cannot simply assume that tailoring via case formulation will improve outcomes8. (I'll return to unpack these findings in future posts, because smart routines might make a difference.)
 
This dilemma—that I need to use science to correct for weaknesses in clinical judgment, yet must use clinical judgment to apply the science, and that exactly where science can't inform me is where I'm most prone to error—is a true dialectic. It often collapses into an either-or position: Do It By The Book vs. The Book Doesn't Apply. Being committed to evidence-based practice means instead maintaining the dialectical tension and the humility to find workable solutions.
 
Know what I mean? 
 
A second problem: keeping up with science in a responsible way. It's a fire hose of information! How to separate the wheat from the chaff?
 
While I still struggle, I’ve finally accepted that I as an individual cannot keep up with all the newest research that should inform my practice. I have to trust others to distill scientific findings.
 
But this can introduce mighty biases into my information stream. Structural constraints of social networks (e.g., the Matthew Effect9) mean that highly cited articles become still more likely to be cited. Without my awareness, higher rankings on search engine results pages influence what I read10. And I'm more likely to see the cool new article shared on Facebook, amplified by the selection bias of like-minded friends.
 
To counter selection bias, I actively populate my feeds with diverse, dissenting voices. I seek out colleagues who can give me a good bounce: people who understand an idea and rapidly throw it back with a wider set of implications (like Curry to Iguodala to Barbosa!11)
 
So I'm passing the ball to you. Got thoughts, reactions, or solutions to either of these first two problems evidence-based therapists face? Write me at Kelly@ebpi.org or share your ideas via this blog.

---Kelly Koerner, PhD


References

1 Cox, K. (1995). Clinical practice is not applied scientific method. Australian and New Zealand Journal of Surgery, 65, 553-557.
2 Dawes, R. M., Faust, D., & Meehl, P. E. (1989). Clinical versus actuarial judgment. Science, 243, 1668-1674.
3 http://atulgawande.com/book/the-checklist-manifesto/
4 http://www.nytimes.com/2013/11/19/health/treating-insomnia-to-heal-depression.html?_r=0 
5 Jacobson, N. S., Dobson, K., Fruzzetti, A. E., Schmaling, K. B., & Salusky, S. (1991). Marital therapy as a treatment for depression. Journal of Consulting and Clinical Psychology, 59, 547-557. http://dx.doi.org/10.1037/0022-006X.59.4.547
6 Persons, J. B. (1989). Cognitive therapy in practice: A case formulation approach. New York: Norton.
7 Persons, J. B. (2012). The case formulation approach to cognitive-behavior therapy. New York: Guilford Press.
8 Kuyken, W., Fothergill, C. D., Musa, M., & Chadwick, P. (2005). The reliability and quality of cognitive case formulation. Behaviour Research and Therapy, 43(9), 1187-1201.
9 http://www.garfield.library.upenn.edu/merton/matthew1.pdf
10 http://www.sciencemag.org/news/2015/08/internet-search-engines-may-be-influencing-elections
11 http://bleacherreport.com/articles/2615470-warriors-vs-rockets-score-highlights-and-reaction-from-2016-regular-season