Evidence in the Psychological Therapies: A Critical Guide for Practitioners (Book Review)

Editors:  Chris Mace, Stirling Moorey, and Bernard Roberts
Publisher:  New York: Brunner-Routledge, 2001
Reviewed By:  Brian Stagner, Winter 2005, pp. 46-47

Among the many challenges to psychodynamic psychotherapy, none has seemed as potentially ominous as the current drive toward empirically validated treatments (EVTs). Mace, Moorey, and Roberts are British psychiatrists who have assembled diverse authors to illuminate and critique the state of thinking about EVTs. The impetus for this compact volume was a conference on evidence in therapy. The discussion was held in the context of the National Health Service’s reforms intended to maximize quality assurance. As the editors note, “Evidence based practice is no longer a movement that any clinician can ignore.”

American readers, confronted on many sides by proponents of EVTs, may find the title of this small volume somewhat misleading, but for some it will be a welcome and useful contribution. Much of mainstream academic psychology is engaged in single-minded advocacy of EVTs. A good deal of that writing endeavors, by supplying extensive reviews of the research literature, to settle, once and for all, just which particular treatments are approved. The results of these reviews are often promulgated quite definitively, as though the hard thinking had already been done about the subtleties of inconsistent evidence, the multiplicity of goals of treatment (beyond the a priori goals of the researcher), or the experience of the client.

Reviewing and cataloging research findings is unquestionably a foundation of scientific progress, but it is not the only building block. The collection of essays under review is a critique in a broader sense. Mostly, the contributors are less interested in weighing the inventory of what we know than in puzzling over what it is we are thinking about. Several of the contributions discuss the nature of evidence. In the courtroom, it is observed, evidence is that which is necessary to resolve a case or dispute, whereas the scientific notion of evidence (ideally) rests on an evaluation of the totality of the evidence with the object of arriving at Truth. The contributors weigh in on various aspects of this problem. Various types of data are considered, from behavioral observations to neuroimaging to idiographic clinical data "from the couch." Each has strengths and weaknesses and each is consistent with different research and clinical goals. From a scientific viewpoint, the strongest data are those from randomised controlled trials (RCTs). The rationale for the (inevitable) elevation of this standard of evidence is nicely summarized here in a chapter that even skeptics will find persuasive.

Among practitioners, the juggernaut of RCTs is often resisted, on several grounds. Most notably, stringent research controls and inflexible operational definitions inexorably exclude some types of data. Although some contributors feel that the appreciation for all types of data is gaining ground, many readers will remain unconvinced. Attention is given to the possibility that narrowly focused, RCT-based outcome research may crowd out essential aspects of the matter at hand. To the extent that the data are selectively and partially represented, it is argued that the evidence has been marshaled more for a courtroom debate than for a comprehensive scientific investigation. Likewise, as RCTs (and the methods they entail) become reified as the only game in town, the use of other methodologies (including psychoanalytic investigation, among others) will wane. If so, this is not just a loss to treatment but also a sacrifice of inquiry into the theoretical underpinnings of treatments and, by extension, of new understandings of self and mind. Thus, the possible superiority of outcomes established by RCTs may be disconnected from another goal of treatment: to explain and give meaning.

What kinds of evidence are being discussed here? Several contributions examine particular research tools or theoretical traditions to discover the uses and limitations of the evidence. A surprisingly rich chapter reviews the use of case study and single subject experimentation, reminding readers that this strategy is neither outmoded nor necessarily limited to studies of cognitive and behavioral treatments. As with any clinical investigation, a single case experiment will succeed to the degree that outcomes are clearly operationalized. This chapter provides a cogent summary of the threats to validity in a single case study and a pragmatic discussion of how to design such investigations.

Three chapters illustrate the hypothesis testing approach to understanding psychotherapy. Some researchers have argued that the effective factors in psychotherapy can be reduced to one of four domains: mastery/coping, clarification of meaning, problem actuation, and resource activation. A case study is presented in which the psychoanalytic psychotherapy of a severely disturbed patient is reconceptualized in the vernacular of these domains. Readers may not agree that the author has met the standards of evidence and hypothesis testing presented by other contributors to this book, but the exercise is itself instructive.

By contrast, a discussion of hypothesis testing with cognitive/behavioral treatment is a much crisper analysis of how hypotheses might be formulated and tested. It is suggested that useful hypotheses must first explain the patient's present difficulties in a manner that is consistent with the patient's history and the precipitating events. However, it is also important that the effects of treatment can be used to test the hypotheses. Hypothesis testing is discussed within the frameworks of functional analysis, Beck's cognitive model, and Young's schema theory. The testing of hypotheses within analytic and cognitive worldviews is echoed in a subsequent dialogue contrasting cognitive-behavioral therapy with cognitive analytic therapy (which, like Young's model, is a cognitive model that is amplified and enhanced by psychoanalytic, i.e., relational, ideas).

Practitioners often view the EVT controversy from a distance, not translating the academic discussion to clinical practice. Two excellent chapters examine how practitioners may think about evidence. An evidence-based practitioner would be mindful of the quality of evidence that has demonstrated support for a given procedure and would prefer treatments for which the quality of evidence is superior. To establish efficacy under research conditions, the gold standard is evidence from RCTs, but it has been observed that laboratory efficacy does not necessarily translate into clinical effectiveness in the less controlled, messier world of clinical practice. Reliance on efficacy data from RCT evidence may obscure the effectiveness of psychodynamic treatments for several reasons that are reviewed here, including over-reliance on bio-behaviorally based DSM taxonomy, the tendency of RCTs to minimize the mutative importance of relationship factors, the simplistic application of the drug metaphor to psychotherapy, and so forth. The gold standard cannot therefore be the only standard.

Evidence-based practice requires more complex and nuanced evaluation of various kinds of data, and the move from efficacy studies to effectiveness data raises conceptual and methodological questions. Pristine efficacy studies generally try to isolate homogeneous client samples, while the messier conditions of clinical effectiveness entail clients of differing complexity and variegated diagnostic comorbidity. Further, differential access to resources, varied goals, and competing stakeholders (clients, therapists, payers, community need) skew the choices of outcome measures in many directions. Thus, prior to designing any data collection about real-world effectiveness, a researcher must weigh a priori decisions about whether the research questions should focus on competing techniques versus (for example) on effective resource utilization.

The evidence that is most directly meaningful for the practitioner may come from an audit of that clinician's own practice. A final chapter describes two workshops on practice audits that sought to identify the conceptual steps in conducting a useful audit. What domains of clinician behavior should be sampled? How will referral pathways be tracked? What initial (intake) information will be useful to the audit? What outcome measures are pertinent and feasible? Can critical events be defined that must always go well (e.g., management of suicidality), and how can these be operationalized? One useful outcome of these workshops was the realization that simply planning an audit can be a very useful exercise for the clinician.

There are things one might change in this book. The three chapters on hypothesis testing in psychodynamic, cognitive, and cognitive analytic therapies focus more on illustrating theory and on how hypotheses may be generated or reconceptualized in EVT vernacular than on actually testing hypotheses systematically in the course of treatment. That sort of effort is especially needed with the psychodynamic approaches and is essential if psychoanalytic therapy is to sustain its credibility.

One might also wish for a more systematic analysis of the factors underlying the blossoming of EVTs and the RCT standard. Has the RCT really become a self-perpetuating paradigm of the sort described by Thomas Kuhn? And if so, are the more psychoanalytic therapies inevitably going to be crowded out? Or is psychotherapy research beginning to embrace more complex hypotheses and more diverse data that will move the inquiry beyond the present winners-versus-losers paradigm? We need a discussion of the sort of evidence required to establish the legitimacy of psychodynamic treatments in the RCT-dominated world of EVTs. It is to be hoped that new research programs will surpass the present level of debate and advance the scientific basis of practice without sacrificing the uniqueness of each patient. This volume is intended to provoke abstract reflection on the many possible ways to think about validating evidence. It will certainly stimulate critical reflection.

Reviewer Note

Brian H. Stagner is clinical associate professor in the department of psychology at Texas A&M University and is president of Associates for Applied Psychology in College Station, Texas.


© APA Div. 39 (Psychoanalysis). All rights reserved. Readers therefore must apply the same principles of fair use to the works in this electronic archive that they would to a published, printed archive. These works may be read online, downloaded for personal or educational use, or the URL of a document (from this server) included in another electronic document. No other distribution or mirroring of the texts is allowed. The texts themselves may not be published commercially (in print or electronic form), edited, or otherwise altered without the permission of the Division of Psychoanalysis. All other interest and rights in the works, including but not limited to the right to grant or deny permission for further reproduction of the works, the right to use material from the works in subsequent works, and the right to redistribute the works by electronic means, are retained by the Division of Psychoanalysis. Direct inquiries to Henry Seiden, Publications Committee chair.