Abstract 464: Assessing Fidelity of Interventions in Schools Using a Multi-Tiered Prevention Framework (Society for Prevention Research 21st Annual Meeting)

Schedule:
Friday, May 31, 2013
Grand Ballroom C (Hyatt Regency San Francisco)
Katrina Joy Debnam, PhD, Research Associate, Johns Hopkins Bloomberg School of Public Health, Baltimore, MD
Elise Touris Pas, PhD, Assistant Scientist, Johns Hopkins University Bloomberg School of Public Health, Washington, DC
Catherine Bradshaw, PhD, Associate Professor, Johns Hopkins Bloomberg School of Public Health, Baltimore, MD
Introduction: Given growing concerns about behavior problems in schools, it is recommended that schools implement a set of integrated prevention programs consistent with the tiered public health framework. Specifically, universal prevention programs should be implemented in schools to ensure that consistent expectations and behaviors are taught and reinforced with all students. However, approximately 15% of students are expected to require selective interventions, and 5% will need intensive, indicated preventive interventions. While a multi-tiered framework is a desirable means for promoting positive student outcomes, measuring schools' adoption and implementation processes within this framework poses several challenges. Although many evidence-based interventions have their own fidelity measures, these measures are limited: they do not take into account the broader prevention framework within the school, they do not allow for cross-school comparisons, and they can be burdensome and difficult for practitioners to use. Currently, few measures adequately assess the implementation fidelity of multi-tiered systems of supports across schools.

Method: This presentation will provide a detailed description of two measures that can be used to assess a wide range of evidence-based practices at all three levels. The School-wide Evaluation Tool (SET) and the Individual Student Systems Evaluation Tool (I-SSET) were developed to measure the implementation of key features of universal prevention efforts and to document the characteristics of selective and indicated preventive interventions, respectively. Together, these measures provide a multi-tiered assessment of schools' preventive interventions, and they are intentionally broad so that they can assess the fidelity of a variety of programs. Data collected with these measures from 58 Maryland high schools at two time points will be presented to illustrate how they assess the quality of implementation of evidence-based programs.

Results: Preliminary analyses of the SET and I-SSET data show high internal consistency (α = .93 and .92, respectively). Overall SET scores ranged from 19.6% to 98.6% (M = 57.8%), with the highest ratings on schools' systems for responding to behavioral violations (M = 81.9%). Overall I-SSET scores ranged from 16.2% to 96.6% (M = 56.8%), with the highest ratings on the characteristics of schools' intensive, indicated preventive interventions (M = 77.1%).
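
For reference, the internal consistency coefficients reported above are presumably Cronbach's alpha values (the abstract reports only the α symbol); for a scale of k items with item variances \sigma_i^2 and total-score variance \sigma_X^2, the standard formula is

\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_i^2}{\sigma_X^2}\right)

Values approaching 1, such as the .93 and .92 reported here, indicate that the items within each instrument covary strongly and can reasonably be summed into an overall implementation score.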

Discussion: With increasing expectations that schools implement evidence-based practices, measures that can inform our knowledge in this area are needed. Implications for using the SET and I-SSET to document the evidence-based selective and indicated programs in place in schools, as well as schools' fidelity to the intervention design, will also be discussed.