PRESIDENT'S MESSAGE

Moving research into practice: Defining EBIs versus EBPs as a starting point

Karen Callan Stoiber discusses the importance of defining the difference between EBIs and EBPs and their usefulness in school psychology research and practice

By Karen Callan Stoiber

I began my Division 16 presidency with a commitment to further the understanding and promotion of evidence-based practices (EBPs) and the translation of research to practice within the School Psychology community. Now, as I approach the last month of my year as D16 president, the notion of, and need for, evidence-based practice (EBP) feels all the more real and intense, for both professional and personal reasons.

My professional basis for believing in EBPs has been strengthened by “think-tank” discussions and activities of the D16 Translation of Research to Practice Workgroup.1 These lively and engaging discussions helped clarify how little we know about the implementation of evidence-based interventions (EBIs) and EBPs in school settings, and how far we still have to go as a profession. In addition, the work I am doing with my colleague Maribeth Gettinger in fostering Head Start teachers’ implementation of evidence-based, early literacy practices in urban Milwaukee has made me realize how difficult it is to achieve such EBP goals as treatment integrity and progress monitoring in real-life, and often chaotic, classrooms. As researchers, we can’t assume that our vision for collaboration and facilitation of EBPs will be readily embraced, especially when it involves a great amount of teacher and classroom change! Together these professional activities have led me to believe School Psychology must address three big issues that have impeded our capacity to implement EBIs and EBPs: (1) the limited availability of high-quality research, especially in the areas of early intervention and social-behavioral concerns, that school-based practitioners can draw upon when selecting and implementing interventions; (2) the complex ecological characteristics of schools and the diverse students they serve, which may differ greatly from the characteristics of research settings; and (3) the lack of treatment integrity and other monitoring measures used to examine factors that may moderate or mediate the effects of interventions. We must also work toward developing consensus among school psychology trainers and practitioners regarding what evidence-based practices and interventions represent and whether and how research-based approaches should be taught and implemented. Obviously, it is difficult to expect school psychologists to routinely apply and evaluate research-based practices without greater attention to the above constraints.

Both evidence-based interventions (EBIs) and evidence-based practices (EBPs) refer to programs, products, practices, or policies that are research-based and intended to optimally increase the skills, competencies, or outcomes in targeted areas. To move forward an agenda for school-based use of EBIs and EBPs, perhaps the first step is to build consensus on what, specifically, we mean by EBIs versus EBPs. In past writings, I first argued for greater implementation of EBIs in our schools, defining EBIs as research-based prevention and intervention programs with a strong empirical basis. A set of criteria is typically applied by an objective party to determine the degree or level of evidence in support of the program or treatment approach (that is, whether it meets criteria to be considered an EBI). More recently, I have rethought and refined my beliefs about EBIs in school settings. Given the current urgency of improving educational outcomes for children and families and the constraints surrounding the use and implementation of EBIs, I have argued for both the need and desirability of evidence-based practice (EBP) or evidence-based applied-to-practice (EBAP) approaches whereby the “practitioner functions as researcher” (see, for example, Stoiber & DeSmet, 2010; Stoiber & Gettinger, 2011). An EBP or EBAP approach recognizes that different schools reflect diverse student bodies and ecological qualities, ones that often do not match laboratory-like procedures and methodologies. This approach acknowledges the importance of integrating science with clinical practice and judgment, and, importantly, it recognizes that practitioners play a key role in this integration. It also is consistent with the APA Policy Statement on Evidence-Based Practice in Psychology (EBPP), which was approved as policy by the APA Council of Representatives during its August 2005 meeting (APA, 2006):

Evidence-based practice in psychology (EBPP) is the integration of the best available research with clinical expertise in the context of patient (student) characteristics, culture, and preferences.2 Psychological services are most effective when responsive to the patient’s (student’s) specific problems, strengths, personality, sociocultural context, and preferences. … Some effective treatments involve interventions directed toward others in the patient’s (student’s) environment, such as parents, teachers, and caregivers. (p. 284)

An EBP approach incorporates forms of evidence stemming from diverse methodologies and sources, rather than relying primarily on randomized controlled trial (RCT) studies. This array of sources includes data from one’s own application of the intervention (e.g., via progress monitoring or outcome evaluation) as well as data from clinical observations, qualitative approaches, process-outcome studies, single-participant designs, RCTs, quasi-experimental program evaluations, and summary meta-analyses. Importantly, the 2006 APA Presidential Task Force on Evidence-Based Practice report emphasizes the essential role of clinical judgment and clinical expertise across the steps of evidence-based practice, including assessment and diagnosis, case formulation, intervention design and implementation, monitoring of progress, and decision making; thus, it is consistent with the concept of EBP/EBAP for School Psychology.

Thus, EBP/EBAP approaches potentially have broader application than EBIs in actual classrooms because they encompass intervention strategies based on scientific principles and empirical data, including data collected by the practitioner for progress-monitoring or program evaluation purposes. In this regard, prevention or intervention strategies with a strong theoretical base, such as specific self-regulation or anger-management strategies, may be evaluated using data-based decision making. When implementing interventions within an EBP/EBAP framework, a scientific basis informs practice, and practice outcomes inform ongoing and future decision making. As such, practitioners function as researchers by applying data-based approaches to systematically plan, monitor, and evaluate the outcomes of their own service delivery. As school-based practitioners witness the positive outcomes associated with their scientifically and data-informed practices, they are more likely to use and sustain them.

It is useful to note that, by design, the activities of practitioners as researchers within the described EBP/EBAP/EBPP framework differ somewhat from the goal of translational work as specified by the National Institutes of Health (NIH). More specifically, NIH describes translational work as follows: “improving the health and mental health of our nation requires taking new discoveries from basic ‘bench science’ and translating them into practical applications that can be used to prevent problems and help people function more effectively” (NIH, 2006a, 2006b). The notions of EBP/EBAP/EBPP rely less on basic “bench science” as the sole resource or basis of translation; rather, they specify that research-based prevention and intervention strategies be tested in classrooms and schools via the application of rigorous data-based methods, such as structured progress-monitoring and program evaluation protocols (Stoiber, 2011).

Now let me explain my personal basis for believing in the work surrounding EBIs and EBPs. Approximately midway into my year as Division 16 President, I was diagnosed with breast cancer and developed a deepened respect for the scientific knowledge stemming from randomized clinical trials in the medical field. My cancer treatment is based on the current state of medical science. Importantly, as I discussed treatment options and decisions with my doctor, I also came to understand the need for “individualized” or personalized treatment programs following an EBP approach, one that takes into account personal qualities such as overall health history, genetic dispositions and cancer typing, capacity to cope, and level of personal support. I am thankful for the advancements in medical science, which are documented on websites such as Science Daily, and I also appreciate the importance of personalized treatments and choices. In the medical arena, as in educational and clinical settings, EBIs and EBPs both hold an important role and function.

Yet to further advance the science underlying both EBIs and EBPs, regardless of the setting in which they occur (medical, educational, or clinical), more funding is urgently needed. The current annual budget for NIH is equivalent to two and a half months of current U.S. military spending in Afghanistan. Funding for the U.S. Department of Education continues to decrease, and corresponding funding cuts are occurring at many state and district levels. As a profession, School Psychology will need to prioritize research and practice activities that improve the science underlying EBPs and EBIs so that it is more closely aligned with the level of scientific knowledge in other fields, such as medical science and clinical research. Clearly, our work needs to incorporate a focus on policies aimed at improved funding for EBIs and EBPs. It is only through such commitment that our profession can achieve greater confidence in our prevention and intervention practices. I remain optimistic and hopeful that we will prioritize such a commitment, which has been endorsed and furthered by the current Division 16 Workgroups.

References

American Psychological Association Presidential Task Force on Evidence-Based Practice. (2006). Evidence-based practice in psychology. American Psychologist, 61, 271-285.

Stoiber, K. C. (2011). Translating knowledge of social emotional learning and evidence-based practice into responsive school innovations. Journal of Educational and Psychological Consultation, 21, 46-55.

Stoiber, K. C., & DeSmet, J. (2010). Guidelines for evidence-based practice in selecting interventions. In R. Ervin, G. Peacock, E. Daly, & K. Merrell (Eds.), Practical Handbook of School Psychology (pp. 213-234). New York, NY: Guilford.

Stoiber, K. C., & Gettinger, M. (2011). Functional assessment and positive support strategies for promoting resilience: Effects on teachers and high-risk children. Psychology in the Schools, 48, 686-706.

1) Participants in the D16 Translation of Research to Practice Workgroup include Sylvia Rosenfield and Susan Forman (Co-Chairs), Robin Codding, Jorge Gonzalez, Renee Jorisch, Gretchen Lewis-Snyder, Linda Reddy, Ed Shapiro, and Karen Stoiber.
2) Italics indicate the substitution of students for patients.