Lecture – Examining the Evidence for ‘Evidence-Based’ Therapy

Available with English captions.

Presented by Jonathan Shedler, PhD, University of California San Francisco – The Adam Corneel Major Teachers in Psychotherapy Lecture

Evidence-based therapies—brief, manualized therapies, usually involving cognitive behavior therapy (CBT)—are widely promoted as the gold standard of care. Research literature and stories in the mainstream press consistently present evidence-based therapy as scientifically proven to be extremely effective in addressing conditions such as depression, panic disorder, and PTSD.

But is evidence-based therapy as effective as the studies indicate?

Lecture highlights include:

  • Discussion of how the research on evidence-based and empirically supported therapies (e.g., manualized CBT) actually shows that these treatments are ineffective for most patients
  • Analysis of biases in psychotherapy research that may render findings inapplicable to real-world practice
  • An examination of the roles of bias and research design problems in inflating the reported benefits of evidence-based therapies
  • Suggestions for designing studies of evidence-based therapies that better assess outcomes and patient experiences

In this talk, Shedler explores the gap between the rhetoric and the reality of evidence-based therapy. He does a deep dive into several major studies on these treatments and finds that most patients who receive these therapies do not get well.

For example, Shedler discusses a comprehensive American Psychological Association report on evidence-based treatment for depression. The report offered clinical practice guidelines that strongly recommend clinicians use evidence-based therapies.

However, the report also found that more than half of the patients studied remained depressed after completing evidence-based treatment. And of those who did improve by the end of treatment, about 40% relapsed, typically within six months to a year.

Shedler cites several other studies of evidence-based therapies showing similar gaps between recommendations and results, and he discusses the factors that may explain this disconnect. He describes how study design and publication bias can lead to misleading conclusions, and he asserts that the criteria guiding many studies focus on research methodology and have nothing to do with how the therapy benefits patients.

To address the problem, Shedler offers strategies for making psychotherapy research more clinically relevant. Specifically, he calls for more naturalistic research that tracks patient progress throughout therapy. More useful studies of evidence-based therapy, he says, should cover a period of at least six to 18 months “and look at just how many sessions does it take before something meaningful starts to shift.”