Abstract: Evaluating the Fidelity of the Cultivating Awareness and Resilience in Education (CARE) Program (Society for Prevention Research 22nd Annual Meeting)

143 Evaluating the Fidelity of the Cultivating Awareness and Resilience in Education (CARE) Program

Schedule:
Wednesday, May 28, 2014
Columbia A/B (Hyatt Regency Washington)
* noted as presenting author
Patricia A. Jennings, MEd, PhD, Research Assistant Professor, The Pennsylvania State University, University Park, PA
Sebrina L. Doyle, MS, Project Coordinator, Pennsylvania State University, University Park, PA
Anna DeWeese, MA, Project Coordinator, Garrison Institute, Garrison, NY
Jennifer L. Frank, PhD, Research Assistant Professor, Pennsylvania State University, University Park, PA
Introduction: Fidelity refers to the degree to which an intervention is delivered as designed and is critical for any comprehensive prevention trial. During the early stages of development, refinement, and evaluation, developers typically deliver the intervention themselves. When an intervention trial grows beyond the capacity of the developers alone, fidelity monitoring tools are needed that fully identify and operationalize the components necessary to achieve results. As new facilitators are trained to deliver the intervention, these tools can be used both for training and for monitoring how well they deliver the program as intended.
Method: This paper reports on the development of a fidelity measurement system to monitor the delivery of the CARE for Teachers professional development program in an IES-funded efficacy trial. The first cohort (C1), recruited in 2012, included 51 teachers from 8 schools. Teachers in C1 were randomly assigned within schools to treatment (N=25) or waitlist control (N=26). CARE was presented to C1 treatment teachers over five day-long sessions during the 2012-2013 school year by a program developer and two intern facilitators. Fidelity measures were developed and piloted in preparation for the 2013 cohort (C2; 171 teachers from 28 schools), which will require three CARE programs delivered concurrently, only one of them presented by a developer. The CARE Daily Session Fidelity Rating Form was created to evaluate the percentage of core intervention components covered, the degree to which participant objectives were achieved, and the time spent on each activity. Quality of delivery was assessed with the CARE Facilitator Skill Rating Form. Participant engagement, knowledge of concepts, and satisfaction with the program were also assessed. Two project staff who helped develop the fidelity measures observed and rated the C1 CARE program with high interrater reliability (> 80%).
Results: The C1 CARE program was presented with a high degree of fidelity to the intervention model. Ninety percent of the core components were covered, and most participant objectives were met each day. As expected, 6.5 of the 30 hours spent in training were devoted to experiential and mindfulness practices. Participant engagement was high: teachers attended an average of 4.22 of 5 days and scored high on the knowledge assessments (M = 95%). The overall mean facilitator rating for the entire program was 3.66 out of 4. Results from the second cohort will be available at the time of presentation.
Discussion: The CARE fidelity monitoring system functioned as intended, and the CARE program was presented with a high degree of fidelity. The instruments are ready for use during the next delivery of CARE for Cohort 2, which will involve three concurrent presentations of the program.