LESSONS FROM THE FIELD

Weighing methodological precision against student success

By Michael Frank, MA

Testing reading interventions presents a significant challenge to education researchers. With the high stakes of end-of-year testing, waitlists and control groups are a hard sell to the teachers and parents of children recruited for research. If a child is denied treatment for even six weeks, their opportunity for growth may be limited, and if the state has mandatory retention laws, the stakes are even higher. However, a multifaceted intervention whose components are delivered simultaneously makes effectiveness difficult to interpret.

This year, I collected data for a reading intervention study conducted by a special education teacher. The intervention featured auditory training, basal reading, reading-based games, and at-home parent practice, and it was evaluated based on improvement in the Big 5 reading skills (i.e., phonics, phonological awareness, fluency, vocabulary, and comprehension) as indicated by the Woodcock Reading Mastery Tests, Third Edition (WRMT-III). Due to the limited time available for Tier II intervention and the aforementioned pressure for reading improvement, the many facets of this intervention had to be implemented simultaneously.

When I began working on this project, I expressed concerns about interpreting the unique influence of each component: does one component (e.g., games) improve all five skills on its own, or does each component boost a single skill? The special educator and her co-author (my adviser) explained to me that the goal was not to create a taxonomy of multiple interventions and their unique effects, but rather to demonstrate the successes and challenges faced by a realistic intervention in a realistic school. Having only completed my first year of graduate school, I understood science far better than I understood practice. After observing interventions at my practicum site, I began to realize what they meant.

To understand best practice in the field of education, it is important to have reliable, valid, and clear data on effective programs. Once those programs are established, however, researchers are charged with making them accessible to schools with a variety of needs. As it turned out, each individual component of the intervention has its own research base. Games have strong evidence for increasing motivation to read (Charlton, Williams, & McLaughlin, 2005; Wells & Narkon, 2011), while basal reading has been shown to increase phonics, comprehension, and vocabulary (Briggs & Clark, 1997). Auditory training can increase comprehension and fluency (Hawkins et al., 2011), and parent involvement has positive effects on generalizing skills learned in school and maintaining gains (Lignugaris-Kraft et al., 2001). What if a teacher wants their student to improve in all of those areas?

It is true that simultaneous interventions are difficult to interpret separately, but I found that this was not the goal my co-authors were attempting to achieve. Rather, the goal was to implement a dynamic intervention that met the needs of students who presented varying challenges. Those who had difficulty paying attention were reined in by the games, while the students who had trouble decoding text benefited from the basal reading. Instead of having three groups meet separately to work on their various needs, an efficient environment was created in which everyone grew across the board.

No study is without flaw; a small sample size and confounded techniques reduced the empirical robustness of our study. However, we found remarkably positive effects that may be the impetus for future research on integrating intervention techniques. As a graduate student learning about science-informed practice, I find it difficult to compromise on the desire for statistical and methodological perfection. Nevertheless, I now recognize that such precision is not always necessary for an intervention. Of course, some precision is absolutely necessary: one needs to know that an intervention will work, and even the best intervention is unlikely to yield results if it is not implemented with fidelity. Those needs aside, it does not always matter which portion of the intervention was responsible for the growth, as long as the children get better. Furthermore, keeping the training of various skills separate for the sake of precision may prevent a student from reaching their full potential.

I am very grateful to have been given the opportunity to work alongside a practitioner in my research because it has taught me a great deal about how to transition from science to practice. As I conduct future research, I will continue to aspire toward pristine research methodology. However, I will bear in mind the difference between testing individual constructs and integrated teaching strategies. Most of all, I will not forget that the goal of all of our work in education research is not only to understand learning, but to create positive outcomes for students.

References

Briggs, K. L., & Clark, C. (1997). Reading programs for students in the lower elementary grades: What does the research say? Austin, TX: Texas Center for Educational Research.

Charlton, B., Williams, R. L., & McLaughlin, T. F. (2005). Educational games: A technique to accelerate the acquisition of reading skills of children with learning disabilities. International Journal of Special Education, 20, 66-72.

Hawkins, R. O., McCallum, E., McGuire, S., Barkley, E., Berry, L., & Hailley, J. (2011). Adding listening previewing to decrease reading errors during peer tutoring and increase reading fluency and comprehension. Journal of Evidence-Based Practices for Schools, 12, 151-175.

Lignugaris-Kraft, B., Findlay, P., Major, J., Gilberts, G., & Hofmeister, A. (2001). The association between a home reading program and young children's early reading skill. Journal of Direct Instruction, 1, 117-136.

Wells, J. C., & Narkon, D. E. (2011). Motivate students to engage in word study using vocabulary games. Intervention in School and Clinic, 47, 45-49.

About the author

Michael Frank, MA, is a second-year PhD-track graduate student at the University of South Florida who received his bachelor's degree in psychology from Edinboro University of Pennsylvania. He is currently the President of USF SASP and serves as a volunteer for NASP's Student Development Workgroup. Michael's current research interests include ADHD, positive psychology, school climate, and school-based mental health.

Special thanks to Cathy Pelzmann, MS, and Linda Raffaele Mendez, PhD, the intervention designer and my professor, respectively.