CT Daily, Member Insights, Opinion

A new view of evidence-based practice

Stanley B. Baker December 1, 2012


Promotion of the evidence-based practice concept is widespread across the mental and behavioral health professions. Intrinsic motives include placing the well-being of our clients/patients/students at the forefront, desiring to discover and use the best practices available, and wanting to be respected as highly proficient professionals. Extrinsic motives include being eligible for insurance reimbursements, avoiding ethical and legal challenges, and saving one’s job from funding cuts or other negative employment decisions. Unfortunately, too few counselors either conduct research or read research findings. Although they may value research intellectually, many lack confidence in their ability to use research findings.

The responsibility for engaging in evidence-based practice falls primarily on counseling practitioners. Evidence-based practice requires application of practices for which the evidence was the product of rigorous scientific empirical studies — that is, outcome research. Outcome research is the domain and responsibility of trained researchers, who are usually employed in university settings. Therefore, counselor educators are included among those responsible for producing the evidence.

The corresponding responsibility for counselors is to be willing and able to locate and use evidence-based interventions. Consequently, the two concurrent challenges are 1) having counselor educators (outcome researchers) produce sufficient volumes of evidence and 2) training counselor practitioners to find, interpret and use the evidence. Ironically, the circumstances create a codependency.

Counselors are dependent on counselor educators to conduct the research and teach them how to find and use the evidence with confidence. Counselor educators are dependent on counselors to respond to training efforts enthusiastically, search for the evidence constantly, use the evidence appropriately and help the counselor educators to conduct the needed outcome studies. These challenges limit the range of interventions available to counseling practitioners.

Accountability and action research

Counseling practitioners who evaluate their local interventions can use the findings to improve their practices and to be accountable to their stakeholders. This accountability process involves action research as opposed to outcome research.

Action research focuses on generating local rather than generalized knowledge (as is the case with outcome research). There are numerous approaches to and definitions of action research, but the common theme seems to be that it is not outcome research. That is, the demand for attention to rigorous research design controls and inferential data analyses is often not a requirement in action research. Action research seems to cover all data collection activities that lead to findings that are useful for evaluating local programs. Goals for action research include acquiring useful local knowledge for program improvement, involving local stakeholders in the process, being open to the viability of a variety of data sources and anticipating that constructive actions/decisions will follow the data.

Although action research typically is less rigorous and sophisticated than outcome research, the brunt of the responsibility for conducting the accountability process is also on counseling practitioners. Historically, counselors have been perceived as resistant to evaluation and accountability and needing to be coaxed or assisted in the process by counselor educators and local supervisors. This resistance was usually attributed to a number of supposed impediments, including a perceived lack of the requisite sophistication, insufficient time to do it, uncertainty about the value of the kinds of data being collected, the perceived cost of the process, uncertainty (and possibly fear) about how stakeholders would use the findings and a dislike of being evaluated.

Counselor educators share the accountability challenges with counseling practitioners. Counselor educators can and should address all the causes of resistance when training entry-level counselors. The evaluation/accountability competencies covered in the standards of the Council for Accreditation of Counseling and Related Educational Programs fall within the action research domain. Therefore, the ability to conduct action research to achieve accountability goals is not beyond the sophistication of entry-level counselors. Teaching the necessary skills and influencing appropriate attitudes about action research and accountability are important responsibilities held by counselor educators.

Linking evidence-based practice with accountability and action research

Evidence-based practice and accountability appear to depend on different research paradigms and focus on different viewpoints. While evidence-based practice is a product of outcome research findings, accountability activities employ action research methods. Evidence-based practice is analogous to aptitude testing, having a focus on how previously collected data can be applied to future performances. On the other hand, accountability is akin to achievement testing. The focus is on past performance to seek evidence of how well interventions have worked. Therefore, evidence-based practice and accountability appear to be two different concepts, each of which is very important for the counseling profession but apparently difficult for counseling practitioners to do well.

My thesis is that the two concepts can be combined in a manner that might make it easier for counseling practitioners to be accountable and engage in evidence-based practice. The keystone of this idea is to view evidence-based practice more broadly than is currently the case.

As mentioned earlier, evidence-based practice is the product of rigorous, sophisticated outcome research studies. My view is that evidence-based practice can also be the product of local action research studies that are a part of the counseling practitioner’s evaluation/accountability function. If counseling practitioners are able to collect volumes of evidence that their local interventions work, then those data could also qualify as evidence to support their local evidence-based practice.

The typical format for outcome research is to conduct tightly controlled studies with random sampling from targeted populations, control groups and inferential statistical analyses. Often, these studies are not replicated, and the findings are generalized to a population similar to the one used in the sample. These samples and populations may or may not be similar to those in many local settings.

On the other hand, although local action research may be less rigorous, the samples and populations are indeed of interest to local counseling practitioners. And if the interventions are repeated and evaluated many times, evidence accumulates. Therefore, the action research paradigm provides volumes of relevant local data, as opposed to the findings of a single rigorous outcome study that may not always have applicable samples and populations.

To be clear, I do not intend to replace or diminish the value of outcome research. It is important to understand, however, that rigorous outcome research has its limitations as a source of evidence-based practice in the counseling profession. My goal is to add local action research data to the evidence-based practice information that counseling practitioners are already able to locate in outcome research publications.

Two examples

Two of the most common interventions are individual counseling and psychoeducational group interventions. Below, I briefly describe how the action research framework can be used for evidence-based practice in each case.

Individual counseling: My recommendation for individual counseling is to apply the AB single-subject design to counseling interventions and to encourage and assist clients in engaging in self-monitoring between counseling sessions. Counseling goals would determine what behaviors are to be changed (for example, reducing the number of negative thoughts per day) and what attitudes are to be influenced (for example, rating one’s negative or positive affect about his or her job on a scale of 1 to 10). The clients would record the self-monitoring data. The data could be presented graphically, starting with a baseline and then continuing throughout the counseling process. The axes of the graph would be number of data-gathering points (horizontal axis) and points on the behavior or attitude scale (vertical axis). The evidence would be visible in the graphic representations of the self-monitoring process.

This process is clearly within the sophistication domain of entry-level counselors and could be used in a number of individual counseling interventions. The data could be accumulated over time to provide both accountability data and evidence of the effectiveness of one’s practice.
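To make the AB design concrete, here is a minimal sketch in Python using only the standard library. The daily counts of negative thoughts are invented for illustration; the text chart stands in for the baseline-plus-intervention graph described above, with data-gathering points on one axis and the self-monitored counts on the other.

```python
from statistics import mean

# Hypothetical self-monitoring data: daily counts of negative thoughts.
# Phase "A" is the pre-intervention baseline; phase "B" spans the
# counseling intervention. All numbers here are illustrative.
baseline = [12, 14, 11, 13, 12]          # A phase
intervention = [10, 9, 7, 6, 5, 4, 4]    # B phase

def phase_summary(label, data):
    """Summarize a phase's level as its mean count."""
    return f"{label}: n={len(data)}, mean={mean(data):.1f}"

print(phase_summary("A (baseline)", baseline))
print(phase_summary("B (intervention)", intervention))

# A simple text chart in place of the graph: each row is one
# data-gathering point; bar length is that day's count.
for i, value in enumerate(baseline + intervention, start=1):
    phase = "A" if i <= len(baseline) else "B"
    print(f"{i:2d} {phase} {'#' * value}")
```

A visible drop in level from the A phase to the B phase is the kind of evidence the graphic representation makes plain to both counselor and client.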

Psychoeducational group interventions: I would recommend a pre-experimental pretest/posttest design for psychoeducational group interventions. Control groups are unnecessary in action research because multiple groups are presented with the same proactively planned interventions. Similar to the instruments that schoolteachers develop to test their students, practitioners can design instruments to assess knowledge and attitudes. Simulations can be established to assess acquisition of targeted behaviors. Pretest data can be collected before the intervention begins, and posttest data can be collected at the end of the intervention program.

Correlated t tests (also called paired or dependent t tests) can be used to compare the pretest and posttest scores to determine if desired changes occurred. Although the correlated t tests require application of statistical knowledge, that knowledge is within the competency range of entry-level counselors.
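As a sketch of what this computation involves, the correlated t statistic can be computed with nothing more than the Python standard library. The pretest/posttest scores below are invented for illustration; in practice one would likely use a statistics package (for example, scipy.stats.ttest_rel) that also reports the p value.

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(pre, post):
    """Correlated (paired) t statistic for pretest/posttest scores.

    t = mean(d) / (sd(d) / sqrt(n)), where d are the paired
    differences (post - pre) and n is the number of pairs.
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n))

# Hypothetical knowledge-test scores for one psychoeducational group.
pretest = [52, 61, 48, 70, 55, 63, 58, 66]
posttest = [60, 68, 55, 74, 63, 70, 61, 75]

t = paired_t(pretest, posttest)
print(f"paired t = {t:.2f} with df = {len(pretest) - 1}")
# Compare |t| against the critical value for df = n - 1 pairs at the
# chosen alpha level to decide whether scores changed significantly.
```

Because each group receives the same proactively planned intervention, results from repeated groups can be accumulated over time, which is precisely the volume of local evidence the action research paradigm provides.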

Answering the challenge

In his 2009 critique of the state of published research in counseling journals and of the attitudes of counselor educators toward research, David Kaplan, chief professional officer of the American Counseling Association and a past president of ACA, called on the profession to be primarily engaged in evaluating the effects of our counseling interventions. I do not know if he was promoting outcome or action research, but both paradigms are applicable to answering his call.

Publication in the counseling journals requires outcome research studies conducted primarily by counselor educators. To meet Kaplan’s challenge, however, a larger volume of counseling practice evaluations will likely need to be conducted via the action research paradigm — and done so by counseling practitioners.

It therefore behooves our profession to inform practitioners that action research is a road both to accountability and to evidence-based practice, and to encourage them to travel that road. It may currently be the road less traveled, but it does not have to remain so.

 

****

Stanley B. Baker is a professor of counselor education in the Department of Curriculum, Instruction and Counselor Education at North Carolina State University. Contact him at sbaker@ncsu.edu.

Letters to the editor: ct@counseling.org.

5 Comments

  1. Ron Del Hampton

    As a professional counselor of 20+ years, I would find the notion that I “depend” on counselor educators insulting, if I didn’t find it so comical. Your article highlights the gaping disconnect between the academy and the realities of practice. I live near two large CACREP counseling education departments, and frequently supervise interns from each. That the trainees have little in the way of training that is of any practical use to their clients is to be expected. More troubling is the fact that at both of these institutions less than half of the counselor education faculty are licensed, and none (zero) currently see clients. While an introduction to the field, basic skills, and ethical competence (especially) seem to be adequately addressed, the notion that counselor educators provide anything of use to counselors who have completed their training and are in the field is simply a fantasy.

    1. Michael Morad-McCoy

      Having just attended the 2015 ACES conference I was extremely disturbed at the way so-called EBP seems to be a driving paradigm of so many in the Counselor Education field. My disturbance began with a presenter who decried EBP’s critics as being “simplistic” and who claimed EBP does include consideration of PBE (practice-based evidence). Yet this presenter, and every other presenter I saw who was promoting EBP, focused exclusively on teaching counseling students so-called empirically-supported treatments with no discussion whatsoever of any other evidence that might be considered. Which leads to the obvious question of who, exactly, is the one being simplistic here. Given the large body of evidence that suggests that common factors, not any specific treatment approach or theory, accounts for most of the change in therapy, it was rather depressing that only one session at ACES addressed an evidence-based approach based on the common factors research. Sadly, it appears our profession is being taken over by those who cannot see the reality that what we do is just as much art as it is science.

    2. Dr. Robin Shepherd

      You are basing your opinion heavily on counselor educators who are not clinically involved at the time they are ‘training’ the counsellor.

  2. Karla Roeglin

    I believe the author’s main point might actually be more in line with the thoughts of the commentators who previously posted than they are realizing. From what I am reading, I believe his main assertion is that counselor educators may have the credentials and the “clout” to conduct and publish research (which can sometimes be helpful to counselors — or not), but that, because counselor educators are not usually practicing with clients, they lack the clients, so THEY are dependent upon practicing COUNSELORS. The author then goes on to say that counselors should be considered qualified to perform action research which, when repeated over time, can be more reliable than even outcome research, rendering the action research of counselors reliable and worthy of constituting evidence-based practices. In other words, I believe, if I am reading correctly, that the author is trying to be helpful in finding valid pathways to make the practical things counselors do and use (including common factors) be included in the list of legitimately researched and accepted evidence-based practices.

