Borrowing from related fields to advance intervention implementation in education

How we can advance our understanding of intervention implementation in education by considering behavior change theory and research from related fields

By Lisa M. Hagermoser Sanetti, PhD

"I not only use all the brains I have, but all that I can borrow."
– Woodrow Wilson

It is now fairly widely accepted that evidence-based interventions (EBIs) should be prioritized for implementation in schools (American Psychological Association, 2005; Individuals with Disabilities Education Act of 2004; No Child Left Behind Act of 2001). Yet, these EBIs aren't likely to have their intended effect unless they are implemented as planned. Research results consistently indicate that we can't assume EBIs will be implemented as planned without systematic, on-going support (Gresham, 1989; Sanetti & Kratochwill, 2009). Unfortunately, we have largely assumed adequate levels of intervention implementation, and as a result, the field of education lags considerably behind other service-oriented fields (e.g., medicine, health psychology) in its understanding of how to efficiently and effectively support EBI implementation (Sanetti & Kratochwill, 2009). The purpose of this paper is to highlight how we can advance our understanding of intervention implementation in education by considering behavior change theory and research from related fields. To this end, I (a) review advances related to implementation processes in education based on behavioral theory; (b) discuss the Health Action Process Approach (Schwarzer, 1992), an empirically supported theory of adult behavior change from health psychology; and (c) provide an example of how this "borrowed" theory may yield additional advances in understanding implementation processes in education.

Behavioral theory

Two of the most widely researched and cited strategies for supporting intervention implementation are performance feedback and direct training with on-going support (Sanetti & Kratochwill, 2009). Generally, performance feedback is any information provided to an implementer about the quantity or quality of their intervention behavior (Noell, 2011). Typically, researchers have provided implementers with graphed intervention plan adherence data on a regular (i.e., daily, weekly) or response-dependent (i.e., only when implementation decreased below an acceptable level) basis (Noell & Gansle, in press). Research results consistently have demonstrated that performance feedback is an effective strategy for increasing teachers' intervention implementation (Noell, 2011), and a recent meta-analysis further supports its effectiveness (Solomon, Klein, & Politylo, 2012). Direct training, which includes modeling, behavioral rehearsal, and performance feedback, can lead to an intervention being implemented with a high level of treatment integrity (Sterling-Turner, Watson, & Moore, 2002); however, on-going support (e.g., coaching) is typically necessary to maintain intervention implementation (Joyce & Showers, 2002). Given their basis in behavioral theory, it is to be expected that evaluations of both strategies focus on observable intervention behaviors and social validity (e.g., acceptability of the intervention). Skill proficiency, however, is only one of many interventionist-level factors hypothesized to influence intervention implementation (Sanetti & Kratochwill, 2009); all of the others are cognitive in nature (see Table 1). This suggests that theories of behavior change that include a wider range of behavior determinants may be useful in developing strategies to promote intervention implementation.

Health Action Process Approach (HAPA)

The HAPA is an empirically supported theory of adult behavior change developed in the health psychology field (Schwarzer, 1992). There is extensive empirical support for the HAPA across a wide variety of health-related behaviors (e.g., breast cancer screening, exercise, diet modification; see Schwarzer, 2008, for a review). The HAPA is unique in that it predicts not only one's intention to change one's behavior (motivational stage), but also one's ability to initiate and maintain the new behavior across time (volitional stage). According to the HAPA (see top of Figure 1), in the motivational phase, one's intention to change one's behavior is directly influenced by one's (a) perception of a problem, or the belief that there is a problem to be addressed; (b) outcome expectations, or the belief that behavior change will have positive outcomes; and (c) action self-efficacy, or confidence in one's ability to perform the new behaviors. Once someone has an intention to change their behavior, the volitional phase begins. The HAPA posits that (a) action planning, or detailed logistical planning of behavior change (e.g., when, where, how long); and (b) coping planning, or identifying likely barriers to behavior change and strategies to address those barriers, are critical to bridging the gap between behavioral intention and implementation. Once the new behavior is demonstrated, the HAPA posits that (a) maintenance self-efficacy, or confidence in one's ability to continue the behavior across time; and (b) recovery self-efficacy, or confidence in one's ability to re-start the behavior after a lapse, are critical to sustaining behavior change.

Translation of the HAPA to education

The considerable empirical support for the HAPA, combined with the fact that it addresses the interventionist-level factors believed to influence intervention implementation (see Table 1), led to the development of Planning Realistic Intervention Implementation and Maintenance by Educators (PRIME). PRIME is a simple, feasible system of supports for adapting interventions to fit the implementation context, planning the logistics of implementation, and identifying and addressing barriers to implementation. Through this Institute of Education Sciences-funded grant project, we have (a) translated the HAPA model to education, (b) developed educator-friendly materials and a psychometrically sound measure to implement PRIME, and (c) conducted initial evaluations of PRIME components (see Sanetti, Kratochwill, & Long, in press, for a more detailed description); a second round of evaluations is on-going. In translating the HAPA to develop PRIME, we integrated the HAPA factors within a problem-solving process aligned with best practices in designing and implementing interventions (see bottom of Figure 1; Upah, 2008). More specifically, in the PRIME model, once an EBI is selected, the educator completes Implementation Planning, which is a structured process for (a) identifying all of the intervention steps (facilitates clarity on behavioral expectations), (b) making minor adaptations to intervention steps to better align with the implementation context (facilitates buy-in and controlled, documented adaptation of the intervention), (c) answering logistical questions regarding implementation of each intervention step (i.e., when, how often, for how long, where, resources needed), (d) identifying up to four potential barriers to implementation, and (e) developing strategies to maintain implementation when faced with each barrier.
In initial evaluations, teachers' adherence levels increased and were sustained at two-month follow-up after completing Implementation Planning collaboratively with a consultant (Sanetti et al., in press). Evaluations of a computer-based protocol that can be completed independently by implementers are underway. After Implementation Planning, direct training is provided (facilitating intervention skill development). With a complete understanding of the requirements of intervention implementation, implementers complete the Implementation Beliefs Assessment (IBA; Sanetti, Long, Neugebaur, & Kratochwill, 2012), which provides data on their behavioral intentions and sustainability self-efficacy. For those whose scores on the IBA indicate a need for further support, a host of empirically supported strategies (e.g., participant modeling), detailed in "strategy guides," are available for consultants or coaches to use.

Table 1: Components of HAPA and PRIME

Interventionist-level factor                | Perception of a Problem | Outcome Expectations | Self-Efficacy | Action & Coping Planning | Direct Training*
--------------------------------------------|-------------------------|----------------------|---------------|--------------------------|-----------------
Perceived Need for the Intervention         | X                       |                      |               |                          |
Perceived Effectiveness of the Intervention |                         | X                    |               |                          |
Motivation to Implement the Intervention    | X                       | X                    |               |                          |
Willingness to Try the Intervention         |                         | X                    |               |                          |
Perceptions of Role Compatibility           | X                       | X                    | X             |                          |
Perceptions of Relative Advantage           | X                       | X                    |               |                          |
Self-Efficacy                               |                         |                      | X             |                          |
Shared Decision Making                      |                         |                      |               | X                        |
Perceptions of the Intervention Recipient   | X                       | X                    | X             |                          |
Skill Proficiency                           |                         |                      |               |                          | X

Note: HAPA = Health Action Process Approach; PRIME = Planning Realistic Intervention Implementation and Maintenance by Educators.
*Direct training is not explicitly addressed in HAPA, but is included in PRIME.

Certainly, the HAPA is only one of many potential theories of adult behavior change that could be adapted for use in the education context. Considerably more research is needed, much of which is on-going, to further support the components of PRIME. However, the PRIME model and its initial empirical support provide a valuable example of how theories and research in related fields can facilitate a more comprehensive approach to addressing the numerous interventionist-level factors that may influence implementation. Paying attention to, and selectively borrowing from, implementation science as a field, as well as implementation advances in related human services fields, may be the most efficient method for rapidly advancing the sophistication with which educators actively address, rather than assume, implementation processes in the field of education.

Figure 1: The Health Action Process Approach and PRIME Models


Preparation of this article was supported by the Institute of Education Sciences, U.S. Department of Education, through Grant R324A10005 to the University of Connecticut. The opinions expressed are those of the author and do not represent views of the Institute or the U.S. Department of Education.

Correspondence regarding this article should be addressed to Lisa M. H. Sanetti at the University of Connecticut, Department of Educational Psychology, U-3064, Storrs, CT 06269-3064.


American Psychological Association. (2005). Policy statement on evidence-based practice in psychology. Retrieved from http://www.apa.org/practice/resources/evidence/evidence-based-statement.pdf

Gresham, F.M. (1989). Assessment of treatment integrity in school consultation and prereferral intervention. School Psychology Review, 18, 37-50.

Individuals with Disabilities Education Improvement Act, 20 U.S.C. § 1400 et seq. (2004).

Joyce, B., & Showers, B. (2002). Student achievement through staff development (3rd ed.). Alexandria, VA: Association for Supervision and Curriculum Development.

Noell, G.H. (2011). Empirical and pragmatic issues in assessing and supporting intervention implementation in school. In G.G. Peacock, R.A. Ervin, E.J. Daly, & K.W. Merrell (Eds.), Practical handbook in school psychology (pp. 513-530). New York, NY: Guilford Publications.

Noell, G.H., & Gansle, K. (in press). The use of performance feedback to improve intervention implementation in schools. In L.M.H. Sanetti & T.R. Kratochwill (Eds.), Treatment integrity: Conceptual, methodological, and applied considerations for practitioners and researchers. Washington, DC: American Psychological Association Press.

No Child Left Behind Act, 20 U.S.C. § 6301 et seq. (2001).

Sanetti, L.M.H., & Kratochwill, T.R. (2009). Toward developing a science of treatment integrity: Introduction to the special series. School Psychology Review, 38, 445-459.

Sanetti, L.M.H., Kratochwill, T.R., & Long, A.C.J. (in press). Applying adult behavior change theory to support mediator-based intervention implementation. School Psychology Quarterly.

Sanetti, L.M.H., Long, A.C.J., Neugebaur, S.R., & Kratochwill, T.R. (2012). Implementation Beliefs Assessment. Storrs, CT: University of Connecticut.

Schwarzer, R. (1992). Self-efficacy in the adoption and maintenance of health behaviors: Theoretical approaches and a new model. In R. Schwarzer (Ed.), Self-efficacy: Thought control of action (pp. 217-243). Washington, DC: Hemisphere.

Schwarzer, R. (2008). Modeling health behavior change: How to predict and modify the adoption and maintenance of health behaviors. Applied Psychology: An International Review, 57, 1-29. doi: 10.1111/j.1464-0597.2007.00325.x

Solomon, B.G., Klein, S.A., & Politylo, B.C. (2012). The effect of performance feedback on teachers' treatment integrity: A meta-analysis of the single-case literature. School Psychology Review, 41, 160-175.

Sterling-Turner, H.E., Watson, T.S., & Moore, J.W. (2002). The effects of direct training and treatment integrity on treatment outcomes in school consultation. School Psychology Quarterly, 17, 47-77.

Upah (2008). Best practices in designing, implementing, and evaluating quality interventions. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 209-224). Washington, DC: National Association of School Psychologists.

Certificate text

President Shane Jimerson (left) and Stacey Overstreet (right) present Lisa Hagermoser Sanetti with the Lightner Witmer award.

Dr. Lisa Sanetti's outstanding research program focuses on the treatment integrity of intervention implementation, including the assessment of treatment integrity as well as strategies to promote treatment integrity among educators. Given the impact that consultation can have on changing teacher behavior, a critical arena has been the development of strategies to support the implementation of interventions by teacher consultees. She approaches this set of issues from multiple perspectives, combining a practitioner's sensitivity to the difficulties and a researcher's perspective on quality methodology. The quality of her work has been recognized by prestigious organizations, institutions, and funding sources.