Shari Plowman, MPH, Senior Evaluator, University of Minnesota-Twin Cities, Minneapolis, MN
Jennifer Oliphant, EdD, MPH, Research Associate, University of Minnesota-Twin Cities, Minneapolis, MN
Kara Beckman, MA, Evaluator and Project Director, University of Minnesota-Twin Cities, Minneapolis, MN
Paul Snyder, MSW, MDiv, LGSW, Project Director, University of Minnesota-Twin Cities, Minneapolis, MN
Amy Gower, PhD, Research Associate, University of Minnesota-Twin Cities, Minneapolis, MN
Glynis Shea, BA, Communications Director, University of Minnesota-Twin Cities, Minneapolis, MN
Renee E. Sieving, PhD, FAAN, FSAHM, Professor, University of Minnesota-Twin Cities, Minneapolis, MN
Abigail Gadea, MSP, LSW, MPP, Program Coordinator, University of Minnesota-Twin Cities, Minneapolis, MN
Barbara McMorris, PhD, Associate Professor, University of Minnesota-Twin Cities, Minneapolis, MN
Introduction: Although implementation fidelity is critical for intervention effectiveness, implementation and delivery outcomes of youth-focused evidence-based interventions (EBIs) remain understudied. This poster discusses the importance and utility of incorporating implementation measures into EBI program planning and evaluation, using a unique case study to advocate for collecting process-level data during initial intervention implementation. Data come from a three-year comparative effectiveness trial in which core subject teachers in 6th, 7th, and 8th grades at two metro schools implemented an evidence-based social emotional learning (SEL) program. One year into the project, teachers at School 1 requested a program that was more teacher-friendly and relevant to their diverse students. We worked with teachers from both schools to select a new EBI for the second year and engaged local service clubs for financial support. The purpose of this study is to evaluate similarities and differences in implementation outcomes across the two years of this effectiveness trial, each with a different EBI.
Methods: Guided by the RE-AIM framework, implementation outcomes were assessed with measures of program adoption (e.g., teacher participation, belief in program efficacy, ease/clarity of the program) and delivery (quality, fidelity), gathered through teacher surveys and classroom observations; qualitative data were also collected. Quantitative analyses consisted of descriptive statistics, chi-square tests, and t-tests. We first examined differences in implementation outcomes between EBI 1 and EBI 2 overall, then conducted analyses for each school separately to determine how differences between the EBIs varied across schools. Qualitative data were analyzed using content analysis.
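For readers who want a concrete picture of the comparisons described above, the sketch below shows one way such tests could be run in Python with scipy. It is illustrative only: the counts, ratings, variable names, and scales are hypothetical stand-ins, not the study's actual measures or analysis code.

```python
# Illustrative sketch (hypothetical data): comparing implementation outcomes
# between two EBIs with a chi-square test for a categorical adoption measure
# and a t-test for a continuous delivery-quality rating.
import numpy as np
from scipy import stats

# Hypothetical counts of teachers who did / did not fully adopt each program
#                     adopted  not adopted
adoption = np.array([[12, 8],    # EBI 1
                     [18, 2]])   # EBI 2
chi2, p_adopt, dof, expected = stats.chi2_contingency(adoption)
print(f"Adoption chi-square: chi2={chi2:.2f}, p={p_adopt:.3f}")

# Hypothetical observer ratings of delivery quality (1-5 scale) per classroom
quality_ebi1 = [3.1, 2.8, 3.4, 3.0, 2.6, 3.2]
quality_ebi2 = [3.9, 4.2, 3.7, 4.0, 4.4, 3.8]
t, p_quality = stats.ttest_ind(quality_ebi1, quality_ebi2)
print(f"Delivery quality t-test: t={t:.2f}, p={p_quality:.3f}")
```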
Results: Adoption and delivery measures improved after the switch from EBI 1 to EBI 2, with data from School 1 primarily driving these improvements. School 2 had relatively good adoption and delivery outcomes for both EBIs, while School 1 had relatively poor adoption and delivery outcomes with EBI 1 and substantially better outcomes with EBI 2. Qualitative data from teachers confirmed the quantitative findings.
Conclusions: Results underscore the importance of measuring and evaluating the implementation fidelity of youth-focused EBIs in real-world settings. This process evaluation allowed us to document how well the EBIs translated to everyday practice in two metro middle schools, where our initial EBI choice was not an ideal match for teachers and students. The new EBI led to significantly better adoption and delivery outcomes. Assessing implementation was key to understanding how processes were operating, leading to adjustments and better implementation outcomes for the trial.