Proof positive?

Lynne Shallcross September 1, 2012

Offering counseling treatments that are backed by research is a personal passion for R. Trent Codd. When he founded the Cognitive-Behavioral Therapy Center of Western North Carolina 11 years ago, it was with the mission of delivering and disseminating evidence-based treatments. His practice hires only clinicians who are trained in and dedicated to delivering evidence-based treatments. It also offers training to other clinicians and agencies and produces a free podcast dedicated to evidence-based treatment and cognitive behavior therapy (CBT).

Codd believes a similar focus on evidence-based treatments should be more widely adopted throughout the counseling profession. Although the ACA Code of Ethics states that counselors will use empirically supported treatments, Codd asserts other aspects of the profession’s culture allow for training in and practice of non-validated and potentially harmful treatments.

As an example, Codd shares his viewpoint on critical incident stress debriefing (CISD). “The data here are clear that people recover following a trauma if this intervention is delivered. However, they do so more slowly than with no intervention. That is, this treatment has been shown to impede the natural recovery process,” says Codd, who is a diplomate in the Academy of Cognitive Therapy. “To be more explicit, this intervention is harmful. Delivering harmful interventions is certainly not congruent with the ACA Code of Ethics.”

The American Red Cross and other organizations promote CISD, which can contribute to confusion among counselors, Codd says. Counselors who don’t read the research literature might assume that a technique is safe and effective — even if research seems to indicate otherwise — simply because multiple organizations endorse that technique, he says. (There is ongoing debate about CISD within the mental health professions, and its proponents take issue with claims that there is no evidence of its effectiveness or that it has been proved to be harmful.)

To Codd, the ongoing use of CISD is just one illustration that research and evidence-based practice have yet to find the following they deserve within the counseling profession. “I wish I knew what to recommend to remedy this problem,” he says. “This is something that I’ve spent quite a bit of time thinking about over the years. I think the only thing that will make a difference is a change in the professional counseling culture. The bottom line is that we, as a profession, are going to have to agree that this is important. Unless that happens, I don’t think much change is going to occur.”

From his position as president of the American Counseling Association, Bradley T. Erford says he senses the push for evidence-based practice coming from multiple sides — and he hopes that push will continue to grow stronger. Externally, he says, health care providers and government organizations are increasingly demanding to see counseling practices with demonstrated effectiveness. Internally, Erford says, the counseling profession is constantly striving to identify what works, how well, with whom and under what conditions, as any scientific discipline should.

“Knowing and applying what works in counseling not only raises the integrity of professional counselors, it also serves to protect the public from ineffective or even dangerous interventions and treatments,” says Erford, a professor in the school counseling program at Loyola University Maryland.

In Erford’s view, conducting research and using evidence-based practices are important to the profession for two main reasons: adherence to professional ethics and economic survival. “The ACA Code of Ethics states [in Section C, Professional Responsibility] that ‘Counselors have a responsibility to the public to engage in counseling practices that are based on rigorous research methodologies,’” Erford says. “That statement pretty much says it all. Regarding economic survival, if professional counselors use the best available research-based approaches to help clients and students, counselor effectiveness, client satisfaction and third-party insurer satisfaction improve. When professional counselors provide effective services, our services become even more valued, and we create a market for more counseling jobs at higher pay.”

Kelly Wester, an associate professor in the Department of Counseling and Educational Development at the University of North Carolina at Greensboro (UNCG), cites credibility and accountability as two additional reasons that counselors should conduct research and then adhere to evidence-based practices. This would assure clients that whatever treatment a counselor is offering has been shown to be effective, says Wester, a member of ACA who co-chaired the development of research competencies for the Association for Counselor Education and Supervision in 2011.

“Using the medical profession as an example, if an oncologist told you that you required an invasive medical procedure to remove or minimize the cancer that was in your abdomen, you would want to know of the effectiveness of this procedure, the risks and the benefits,” Wester says. “You may even want to know who else has been through this procedure and their outcome so you [can] compare yourself, your demographics and your situation with those individuals to see how you may fare in the treatment. While counseling is typically not as invasive as some cancer treatments, our clients may have the same interests and concerns regarding their treatment. Thus, counseling research should be done so that our clients, as well as supervisees and students, know the benefits, risks and outcomes of engaging in the service we are offering them and can truly make an informed choice.”

ACA Chief Professional Officer David Kaplan says health care companies are beginning to suggest that they may stop reimbursing mental health practitioners who don’t use evidence-based practices. The danger if that scenario plays out, Kaplan notes — particularly if counselors don’t begin producing more evidence of effective counseling interventions — is that counselors might find themselves locked out of using helpful approaches because of a lack of research on those approaches.

CBT is often recognized as the most effective treatment in many situations, Kaplan says. This is not necessarily because CBT is the only approach that works, he says, but because it is the treatment that fits best into the prevailing research paradigm. Therefore, the evidence needed to support its effectiveness has been ample. “If we don’t generate outcome research across the entire gamut of counseling interventions,” he says, “the only approach the insurance companies are going to let us use and the only one the government will fund [in the future] will be CBT.”

‘A theoretical basis is not enough’

When it comes to conducting research and applying it to counseling techniques, a variety of terms are used. According to Codd, the term evidence-based has been applied more liberally in recent years. He understands the meaning to be “following approaches and techniques that are based on the best available research evidence.”

Kaplan says the technical definition of evidence-based research promoted by the National Institute of Mental Health and other federal agencies requires the inclusion of a manual with specific step-by-step protocols so the procedure can be replicated. The term best practices, on the other hand, implies that a counselor is looking for the one “right” approach that works better than all other approaches, he says. “That term is losing favor because we know that there’s not one absolute best approach to a problem. There are different interventions that can work,” Kaplan says.

Outcome research is another relevant term. According to Kaplan, it encompasses conducting research that speaks to Gordon Paul’s question posed in the 1960s: What works best with this particular client in this particular situation with this particular problem in this particular setting?

Wester views evidence-based practice as consisting of quality research findings, counselor skill and ability, and client desires. “I think the myth is that evidence-based counseling equates to using a manual that gives you Week One, Week Two and Week Three and that it does not allow you to account for individual clients who come into our office,” she says. “This is not my understanding of evidence-based practice. Evidence-based practice, to me, is what has been proven to work, and it typically provides more of an outline of interventions or steps that allow us to work with our clients from a method that has been proven to be accountable. Simply because the evidence-based practice indicates that we need to set goals in week one does not mean that we ignore the client who walks into our office during intake crying and in crisis. That wouldn’t be ethical on our part as counselors. It would mean that the ‘week one’ part of the evidence-based practice might take another week or two to finalize … while we stay with their emotion and work with the client to alleviate the crisis.”

Regardless of the terminology used, more research needs to be done to support the techniques counselors are using, Kaplan asserts. Historically, the counseling profession has been grounded in theory, he says, and as a result, many practitioners have thought that if they followed a particular theory, they were being successful, regardless of client outcomes. “With the push in recent years for accountability and to show that what you do works, having a theoretical basis is not enough,” Kaplan says.

As a whole, the counseling profession has been more resistant than other helping professions to the push from health care and government to back treatments up with research, Kaplan says, in part because counselors don’t generally like to do research. “Counseling tends to attract professionals who are interested in interacting with people and helping people directly,” he says. Those who are more interested in conducting research tend to gravitate toward other fields such as psychology, Kaplan says.

Counseling also attracts greater numbers of people who are creative and like to use creative interventions, Kaplan says. The downside to that is that creative interventions are often more difficult to research, he says. For example, behavior therapy approaches are more concrete — “do this, then this” — so they better lend themselves to the prevailing quantitative research model, he says.

Another factor in play is that it can be more complicated to determine what works in counseling than in other professions, Erford says. “Take medicine, for example. It is relatively simple to determine if one pill works better than another for treating a certain medical problem,” he says. “The personalities of the doctors and clients, while diverse, generally have little effect on the client’s physical system. Likewise, what the client does before and after taking the pill usually has little effect. The administration of the treatment and consequences are usually easily controlled. This is not the case in counseling. The treatment must be personalized to client needs, which means that even if a professional counselor is using a manualized treatment protocol, variations occur in how the treatment is administered. And the treatment is only a small piece of the puzzle when trying to understand clients’ complex change processes.”

Erford points to research from Michael Lambert 20-plus years ago showing that only 15 percent of the treatment outcome was due to specific techniques used. In comparison, 30 percent was due to the therapeutic alliance, 15 percent to the client’s expectations for change and 40 percent to factors outside of counseling. “So, in order to maximize client outcomes, all four facets should be the focus of the professional counselor, not just what evidence-based practice you are using,” Erford says. “On the other hand, while 15 percent may sound like a small amount, it makes a huge difference to overall client well-being and counselor effectiveness. That said, when clients perceive that counseling is working, their expectations improve, they are more likely to follow through on out-of-session activities and the therapeutic relationship improves. So, these change factors are not four discrete facets; they are synergistic and interconnected.”

No matter the reason for it, the profession’s dearth of research leaves counseling at a disadvantage in Codd’s opinion. “It pains me to say this about my profession, but I really believe we lag significantly behind these other disciplines in this area. I think it’s important for our field to catch up to these other disciplines if we are to truly mature as a field.”

Widening the scope

Finding middle ground on the topic of evidence-based practice will require a little give on both sides, Kaplan says. On one hand, counselors need to acknowledge that to advance the profession and to do the right thing for their clients, they must produce evidence that what counselors do is working, he says. On the other hand, organizations and agencies that fund research need to be more flexible concerning what constitutes acceptable research, he says. This could mean embracing qualitative research rather than focusing only on quantitative research and understanding that not all approaches will use “cut-and-dried protocols,” Kaplan says.

Wester agrees, adding that qualitative and quantitative research should be viewed on a continuum, where both have their own strengths. “Qualitative provides us more of an in-depth understanding and allows us to explore areas and opinions that we are unsure of, while quantitative provides us numerical support and evidence that something works or doesn’t,” she says. “No one methodology is better than another; they serve completely different purposes. Thus, what research should look like is less about the methodology and more about what research questions will benefit and impact our counseling field. What questions would help us to be better counselors, be more effective with our clients and train our students better? Once we have those questions, then the methodology that best answers those questions should follow.”

The counseling profession also needs to change the current focus of the research it conducts, Kaplan says. “We need to focus more on clients in research than ourselves,” he says. “The [current] research is often focusing on asking ourselves opinions about ourselves and has nothing to do with client outcomes. We need to find real clients who have real problems, and we need to find out if what practicing counselors are doing with their clients is working. And, yes, that’s hard to do.”

But before producing and applying the research these leaders say the profession needs, counselors must acquire the requisite skills, which Wester says they should be learning both in graduate school and through continuing education after graduation. “Graduate school training provides the basis and grounding for what we need to know as professionals, but the world keeps changing, our clients keep changing, and the interventions and treatments continually change — and so does research,” she says. “Thus, continuing education is important to stay abreast of knowledge and gain new skills.”

In Codd’s view, graduate programs need to up their game and better train future counseling researchers. “I think our curriculums should add course work and, even more importantly, require active participation in research projects — doing the behavior as opposed to just reading and hearing about how the behavior is acquired,” he says.

Making research relatable

Codd senses a divide in counseling between those in favor of increased research and evidence-based practice and those who do not want to see the profession rely so heavily on research. Among the objections he has heard is that certain theories cannot be researched and that scientific methodology is not valuable.

He suspects, however, that much of the resistance to research has to do with how hard it can be for human beings — including counselors — to let go of deeply held beliefs. “We cling to our pet theories [and have] perhaps even built our careers around writing, lecturing [and] delivering certain interventions,” he says. “Learning whether or not we’ve been correct can be hard to take.”

Throughout the history of the counseling profession, people have argued about whether counseling is a science or an art, Erford says. He believes it is both. “We are a scientific discipline that allows practitioners to creatively adapt to the individual needs of a client,” he says.

One obstacle that may keep more counselors from adopting a pro-research attitude is that many practitioners do not view the literature base as being particularly user-friendly or helpful, Erford says. “Some counseling journals, like the Journal of Counseling & Development, have tried to address that by requiring that authors provide a section called ‘implications for counseling practice.’ But what we know about what works in counseling today is so much broader and deeper than it was 20 or 30 years ago. Most practicing counselors don’t have time to keep up with all of the published literature. They want meaningful, easy-to-read summaries that will help them to hit the ground running and create effective client or student outcomes. Some counselor researchers have begun conducting meta-analyses and systematic research syntheses to try to pull together related literature, sort of like one-stop shopping. Many of the textbooks I write have a synthesis chapter, which addresses the question, ‘What works in counseling?’”

ACA is developing two initiatives intended to address this need, Erford points out. “First, we are exploring how best to provide summaries of research-based approaches to issues encountered by counselors. Once produced, these informational summaries will be available to ACA members and will be designed to help practitioners, students and counselor educators stay abreast of effective counseling practices. Also, the new ACA National Institute for Counseling Research Task Force will identify and recognize the best counseling research produced during each year as exemplars for the counseling profession.”

Wester points to a “practitioner-researcher gap” within the counseling profession that she says has yet to be successfully bridged. “Practitioners frequently will question the applicability of our findings and our research, indicating it does not allow them to use their creativity or speak to the uniqueness of each client,” Wester says. “Interestingly, we think about evidence-based practice as research [telling] us what to do. However, if one would really explore the literature on evidence-based practices, it is the combination of a) quality research findings, b) counselor skill and ability and c) client wants and desires.”

Erford agrees, saying the push for additional research and evidence-based practices in no way diminishes the importance of creative and innovative theories, interventions and treatments. “Instead, the emphasis is on subjecting innovative and creative treatments and new theories to rigorous study in order to determine treatment efficacy, just as currently accepted evidence-based practices have been rigorously tested,” he says. “In the classic sense, after the treatment has been proposed, the new treatments are studied using randomized controlled trials on real clients with a real target condition. If the results are positive, evidence emerges that the treatment is supported. Usually, multiple clinical trials are needed to support an evidence-based practice.” Having more than one evidence-supported approach expands options for clinicians and clients, Erford says.

‘Voices from the field’

Counselor practitioners should not only be using research to inform their practices with clients, they should also consider taking part in research themselves, Erford says. “Practitioner voices from the field are incredibly powerful,” he says. “Much of the progress we have made over the past century is because practitioners noticed important things about clients, the counseling process, and the strategies and techniques used, and then shared these insights with other practitioners and researchers.”

In general, however, counselor practitioners seem less likely to participate in research and collaboration with counselor researchers than do practitioners in related professions such as psychology and psychiatry, Erford says. “Part of this is a professional orientation issue, which we are addressing in counselor education,” he says. “We need to recruit and produce graduate students who are excited and knowledgeable about research and its application to practice, and then keep them excited and engaged as they enter practice. If practitioners understand how research can be applied to clients in the field, they will notice things and question their practices more actively, thus opening their curiosities to research opportunities.”

Erford says he and a few colleagues completed meta-studies between 2010 and 2012 of 10 ACA and division journals, learning that in nearly every case, practitioner contributions to the counseling literature have declined significantly during the past 20 years. “Professional counselors, regardless of setting, are supposed to be collecting data to substantiate effectiveness and outcomes with every client or student served,” he says. “This constitutes a huge pool of existing data. If we could develop a system for collecting and using this outcome data for research, we would leap ahead in our understanding of what works in counseling. Partnerships between counseling researchers and practitioners could be mutually beneficial, meeting the needs of the researcher for access to clients and data, and the practitioner for access to research or evidence-based practices and assessments that help with screening, diagnosis and accountability. If you are a practitioner with ready access to clients or the data they generate, please reach out to counseling researchers in universities and institutes. Through networking, we can build a powerful system for research and development.”

Before counselor practitioners can team up with researchers, the lines of communication need to be opened, Wester says. “One of the things our department did [at UNCG] was to send our internship site supervisors a survey on what was needed in terms of research and [asking if they would] be interested in collaborating with our department faculty on answering any questions they were interested in or needed answered through research,” she says. “They were able to indicate what they needed in terms of current literature, what they would like in terms of research relationships, topics they needed help researching and how we could help them and their agency. The first step is setting up the lines of communication between practitioners and researchers. But practitioners should feel able to contact the local universities, or even their alma maters, to inquire how to bridge the gap.”

Research in a humanities profession

James Hansen, professor and coordinator of the mental health specialization in the Department of Counseling at Oakland University in Rochester, Mich., agrees that research is a vital part of professional counseling. But he believes counseling should be “informed” by research — rather than “guided” or “determined” by it — for two fundamental reasons.

First, Hansen says, the essence of counseling is the relationship between the counselor and the client. “Indeed, one of the most consistent research findings over the past four decades is that the quality of the counseling relationship is the within-treatment variable that accounts for the majority of the variance in counseling outcomes,” says Hansen, a member of ACA and the Association for Humanistic Counseling, an ACA division. “Therefore, the research unequivocally informs us that the quality of the counseling relationship is the factor to which practicing counselors should be most attentive. However, every counseling relationship is unique, just like every marriage, friendship, etc., is unique. Therefore, although research informs us that the counseling relationship is vitally important, research cannot tell us how to deepen a particular counseling relationship because every counseling relationship is unique.”

Second, Hansen says, all research is conducted within a set of assumptions. “The set of assumptions in ‘evidence-based,’ ‘best practices’ or ‘empirically supported treatment’ outcome research is that researchers should attempt to find the best techniques to use with particular disorders. The findings can then be disseminated to practitioners, who will diagnose their clients and use the techniques that have been found to be most effective with their client’s disorder,” says Hansen, who wrote a “Reader Viewpoint” in the October 2010 issue of Counseling Today on this topic, as well as another article for a special issue of the Journal of Humanistic Counseling due out next month.

But the set of assumptions is essentially medical, Hansen argues, and although that makes sense for medicine, it doesn’t make sense for counseling. According to Hansen, meta-analytic research studies have consistently found that specific techniques account for less than 1 percent of the variance in counseling outcomes. “Specific techniques, generally speaking, appear to be relatively unimportant to outcomes,” he says. “Therefore, a counseling research agenda that is based on finding specific techniques for particular diagnostic conditions is focused on a factor that only accounts for a minuscule portion of the outcome pie. A general research agenda for the counseling profession should be focused on factors that we know to be highly important to outcomes, not factors that are relatively trivial.”

The bigger factors in the pie, Hansen says, are the quality of the therapeutic relationship, extratherapeutic factors such as social support, and positive expectations from the client about counseling.

The truth about techniques is complex and nuanced, Hansen says. “Specifically, the evidence strongly suggests that the ‘contextual model’ of counseling is the general way of thinking about treatment that counselors should adopt. There is an important role for techniques in the contextual model, but that role is related to the overall context of counseling, not as isolated, technical interventions.”

Hansen adds a second point to support his contention that the set of assumptions often relied upon in evidence-based counseling research is faulty. He asserts that the manual many mental health professionals use to identify client disorders, the Diagnostic and Statistical Manual of Mental Disorders (DSM), is “fundamentally unsound” yet is used in evidence-based research. Hansen calls the DSM highly unreliable and believes it has virtually no validity. “Because evidence-based research operates from these deeply flawed assumptions, it is generally a harmful trend in counseling,” he says.

In Hansen’s view, counseling is a humanities profession, akin to history, literary analysis or philosophy. The raw data of all of those professions is in human meaning systems, he says. On the other hand, the sciences, such as biology, chemistry and physics, deliberately attempt to remove subjective human meaning from their investigative efforts, aiming to be objective and impartial, he says.

“Even if counseling is considered a humanities profession, science still has a valuable role in counseling, just as it does in other humanities professions,” Hansen says. “For instance, although historians study human meaning systems, they rely on scientific methods to date historical documents. However, science does not dictate or determine the activities of historians. It is simply used as a tool to help the profession along. I envision the role of science in counseling in much the same way. Science is a vital tool to help counselors determine if their interventions are working, for example. However, science should not dominate and determine the professional life of counselors or historians, because both of those humanities professions are aimed at uncovering human meaning systems — a goal which science, as an enemy of subjectivity, is grossly unsuited to accomplish.”

Although Hansen reiterates that research is vital to the counseling profession, he believes it’s important for its focus to be on enhancing understanding of the factors most known to help clients. “For instance,” he says, “we know that the quality of the counseling relationship is an important factor in counseling outcomes. However, we have a lot to learn about the nuances of the counseling relationship, how it unfolds, the points at which it is most important, etc. The primary agenda then should be to focus research attention on factors that are known to be vital to counseling outcomes.”

Lynne Shallcross is the associate editor and senior writer for Counseling Today.

A home for research

In September, the American Counseling Association will launch its Center for Counseling Practice, Policy and Research. ACA Executive Director Richard Yep, one of the driving forces behind the center’s creation, discussed what counselors can expect from this new endeavor.

Where did the idea for the Center for Counseling Practice, Policy and Research originate?

The center concept was the result of input and commentary that I heard from leadership and members for many years. To have a dedicated unit within ACA that focused on areas of the counseling profession that could have both short- and long-term impact is something that we have wanted to do. With the support of the ACA Governing Council and the excellent input of those with whom I work on staff, we are now able to realize the launch of this new entity.

What will its goals be?

In the beginning, our hope is that the center will begin building a framework that will allow ACA to more deeply explore a number of issues that include how best to position counselors for job opportunities for which they are uniquely qualified through their education and experience. However, it will also be looking at the professional counselor who will be working in the middle of the 21st century to position them for whatever they may face. And an additional aspect of the center will encompass how we can host interns and scholars-in-residence here at ACA headquarters to work on projects of critical importance to the profession.

What do you hope to see the center accomplish?

In an ideal world, within three years, I hope that the center will have produced products, research and resources that result in more professional counselors being able to practice. An additional deliverable will encompass increased awareness by the public in terms of its understanding of the impactful and important work that these tireless mental health professionals do each and every day.

Why is this an important move at this time in the profession?

Professional counseling is at a crossroads. The services and support of the center are something that we hope will move the profession in a direction that will support more job opportunities, allow the public to better understand what counselors do and inform public policy decision-makers so that they help to create an environment that allows professional counselors to deliver the best possible services to clients and students. I am extremely excited about the work that I know the center can accomplish, and I look forward to the input, suggestions and feedback from our members in regard to the efforts we will make.

— Lynne Shallcross


Leaving room for creativity

Exploring creativity in counseling might sound at odds with following evidence-based counseling practices, but Thelma Duffey says that doesn’t have to be the case. Duffey, the founding president of the Association for Creativity in Counseling, a division of ACA, says evidence-based counseling and creative counseling interventions are largely complementary and developmentally aligned.

“Many creative interventions and techniques are founded in an established theory or theories and are implemented with these in mind,” says Duffey, a professor and chair of the Department of Counseling at the University of Texas at San Antonio. “For example, all best practices begin with a creative thought or idea. Many times, these may develop into models, techniques or interventions that emerge from our practices. We often talk through them and collaborate or share them with others. Finally, we assess and research their efficacy.”

“Now, one way that evidence-based counseling could interfere with creative approaches would be if we were to adopt a rigid, one-dimensional perspective on our work or endorse cookie-cutter recipes of treatment that don’t allow for context or counselor and client individuality,” Duffey says. “Evidence-based counseling practices could also interfere with creative approaches if we were to discredit spontaneity, creativity or innovation in our work. I see none of these as likely. Rather, I see counselors as embracing the idea that creativity involves using available resources, while ethically attending to best practices. Using music, the cinema and books are some excellent and ready resources that are compatible with evidence-based research paradigms.”

Duffey says she supports researching creative approaches, just as she would any other counseling approach. “The same quantitative research principles apply, such as adequate counselor training, valid and reliable measurement instruments, and clear methodology,” she says.

Although some counselors are more passionate about research and others are more passionate about practice, Duffey says there’s room for a global view that incorporates both sides. “I believe that when counselors and counselor educators are flexible in their thinking, able to look at a big picture, allow for developmental progress and acknowledge the role of creativity and innovation while respecting rigor in research, the dichotomy ceases to exist.”

To contact Thelma Duffey, email

— LS


  1. Raul Machuca, Ph.D

    I am really hopeful that we are now aiming to become a more effective discipline, one that is as competitive as other mental health fields. However, I am very disappointed that what ACA preaches in this sense seems to contradict what the organization actually does. A clear example is that CBT-related presentations were almost completely absent from the most important conference in the field (the San Francisco conference included only two poster sessions by students). I hope for the opposite, but I can almost bet that the next conference will not include a significant number either.

  2. Jim Trivelpiece LMHC

    I have mixed feelings about the article and about the topic.
    Much of my emotional response has been in the anger/disgust family of emotions. Much of it centers on how our profession can keep its head stuck in the sand to such a degree. The decision of whether to use results-based methods has been made. It is over; there is no further argument.
    I’m twenty-some years in practice at clinics here in the Pacific Northwest. I have used CBT and DBT to great effect with clients. And I believe I’ve done so without damaging the “therapeutic relationship” I’ve held with clients.
    The argument that results-based methods potentially limit the creativity of therapy is completely invalid. I am a practicing artist. My office is filled with my art. But when I meet with a client, the focus is on therapy, and we use results-based methods. I would counter that therapists who need to remain creative should do so on their own time, not inflict their needs on the client.
    The decision was made many years ago by insurance companies and third-party payers, who have a right to expect results. It was also made by clients, who have a right to expect results as well.
    Wake up, folks. The client is not sitting in the therapy room to be entertained by the force of our personality.
    jwt / lmhc

  3. Chris Saville

    I don’t understand the frenetic push to get certain approaches established as superior to others. In study after study, research demonstrates that no approach is superior to any other, particularly when researcher bias is controlled for. There is some, admittedly confusing, research that superficially appears to support the sentiment that CBT or other approaches are superior, but upon closer inspection the fallacy of this is seen: in every case, researcher bias is not controlled for, and/or the control group received no treatment (or a treatment that was intended to fail). When high-quality research is consulted, there is no difference between treatment modalities. In fact, one of the most consistent findings from research is that the vast majority of durable client change is due to client and therapist variables, the therapeutic relationship, and other common factors, with specific techniques accounting for very little of the measurable change. So why all the fuss? Why can’t we just admit that all legitimate counseling approaches work with equal efficacy?

    1. R. Trent Codd, III, Ed.S.

      Dr. Saville,
      I strongly, but respectfully, disagree with your analysis of the psychotherapy process literature and your perspective on the importance of evaluating psychotherapy. There are many suffering human beings in this world, and I can think of no greater responsibility for us as a profession than to learn to help them as effectively as we can. We also need to be confident that we aren’t providing harmful interventions (no matter how well-intentioned they may be or how counterintuitive that might seem). Quite importantly, I disagree with your assertion that the verdict is in and that it reads “…no approach is superior to any other…”

      I find it curious that you critique the evidence-based literature, but fail to provide any commentary on the common factors literature which is replete with difficulties (I will comment on some of this here, but more extensive commentary can be found by following the links I provided in another post).

      In your commentary you quickly get to the common factors argument (i.e., that common factors, such as the therapeutic relationship, account for most of the outcome in psychotherapy). While I agree the relationship is important, to maintain that it accounts for most of the variance in outcome is to misunderstand this literature. One can only make that argument by aggregating data across treatments, populations and disorders. While alliance does emerge ahead of the pack when you average in this way, it accounts for only 5% of the variance, and such averaging obscures important differences; it does not tell the full story. It’s undeniable that specific efficacy for specific treatments for specific disorders has been demonstrated empirically. I have additional comments about the difficulties with the common factors literature in a forthcoming CT letter to the editor, so I won’t repeat those comments here. I hope interested readers will be on the lookout for my remarks there.
      I’m always a bit confused as to how one “controls” for halo effects (researcher orientation bias/allegiance effects). This can be “controlled for” when different research groups obtain similar outcome data for a treatment. However, someone considered expert enough to deliver a treatment in a clinical trial would almost by definition have allegiance/orientation bias, so I’m not sure how any well-done study could avoid allegiance effects entirely. Where this is likely to have a negative impact is when the investigator’s less-favored treatment is implemented poorly (e.g., “experts” in CBT deliver the CBT while the alternative treatment is delivered by “novices,” or the alternative is basically a straw treatment). The way to overcome this is to have each treatment in a clinical trial represented by experts/strong proponents of that particular treatment so each treatment has “equal” allegiance. An even better move is to have the investigator’s less-favored treatment possess the allegiance. This has been done.

      A related issue you raise pertains to the use of Treatment as Usual (TAU) comparison groups. Using TAU or waitlist, etc. as an alternative treatment is a way of establishing initial efficacy of a treatment (e.g., CBT is better than “general therapeutic contact”). This is basically the equivalent of a psychotherapy placebo. It’s important to learn whether a treatment is superior to a TAU condition before embarking on head-to-head comparisons.

      In response to your claim that in every case the control group received no treatment (or a treatment that was intended to fail), one can see that CBT treatments are effective even when compared to medication treatments (although the result is typically a tie). A medication comparison group is certainly not one “intended to fail,” as you assert. Additionally, CBT tends to have an enduring effect beyond the end of treatment, especially relative to medication after discontinuation. Outside of CBT-oriented models, one does not see many other treatment modalities in head-to-head comparisons with medication treatments.

      Another issue is that in head-to-head comparisons of specific treatments for specific issues, differential outcomes have been demonstrated. Again, you will not see this effect when you combine all treatments for all disorders in one bucket, which tends to wash out any specific effects (e.g., pretty much anything is effective for mild depression, including placebo). I think the fallacy you allude to is that when there is not a significant difference in outcomes between two psychotherapies, then it must be common factors that account for that finding. If you do not get significant differences between CBT and medication for moderate depression, is that due to common factors? Doubtful. You may have two treatments that work, but that work in different ways (with a final common pathway) and thus are both effective; that is very different from saying it’s about the common factors.

      Another way to examine your claims is by evaluating a “common factors”-style treatment in clinical trials: how does it fare? Wampold’s argument would be that this type of treatment would not be perceived as a bona fide treatment. Yet that is what you are implying: that you just need the common factors and the rest is unnecessary smoke and mirrors. Why bother with the smoke and mirrors, then?

  4. Trent Codd, Ed.S.

    Here are a few resources (both free) that relate to these issues that might be of interest to the readers of Counseling Today:

    1) Psychotherapy Brown Bag – This site is run by my friends and colleagues Drs. Mike and Joye Anestis. You will find a wealth of information pertaining to evidence-based psychotherapy there. I recommend that you visit it often and that you pass this URL along to your colleagues as well as to your students.

    There are also a few items at Psychotherapy Brown Bag that relate to some of the specific issues discussed in the article and subsequent commentary:

    a) CISD:

    b) Common factors:

    2) CBT Radio – This free podcast is dedicated to evidence-based psychotherapy and contains interviews with a number of leaders in the field (e.g., Dr. Beck, Steven Hayes, etc.). You can access the podcast by visiting this URL, or within iTunes: click “store,” then “podcasts,” then search “CBTWNC.” You can select individual episodes to listen to, or subscribe and download them all. This resource, like Psychotherapy Brown Bag, is completely free. I hope that you will pass this along to your students and colleagues too.

  5. Sonia McC

    I agree completely with R. Trent Codd that the window of opportunity in which a client may decide to come to us is often very small. It is unethical for us, as clinicians, NOT to provide the most effective and proven forms of treatment to our clients. It’s a matter of respecting not only our clients but also their time, their effort in being well and their hard-earned money.

