Methods: We specified the activities (form) of PBL and their assumed functions, and subsequently operationalized these as questionnaire items. Data on the implementation of PBL activities and the presence of their functions were collected in four Studio Schools from October 2014 to January 2015. Data sources included qualitative and quantitative checklists completed by teachers and students, and quantitative checklists completed by researchers. All checklists consist of scales structured per PBL component, with separate subscales for activities and functions. Respondents rated whether they agreed that an item (activity or function) was implemented or otherwise present on a 4-point scale ranging from ‘completely disagree’ to ‘completely agree’.
Results: Factor analysis of student-reported data suggested that form- and function-focused fidelity can be measured simultaneously and validly, although limitations apply. Inter-rater reliability analysis of responses by participants and researchers (expressed as intraclass correlation coefficients, ICCs) revealed differences in interpretation between observers. This finding underscores the importance of actively involving target communities as partners in fidelity assessments.
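As an illustration of the reliability analysis reported above, the sketch below computes a two-way random-effects, absolute-agreement, single-rater intraclass correlation, ICC(2,1), from a subjects-by-raters matrix. The function name and the example ratings are hypothetical and not taken from the study's data; the study's exact ICC variant is not stated in this section, so this is one plausible choice.

```python
import numpy as np

def icc2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    ratings: array-like of shape (n_subjects, k_raters), e.g. checklist
    scores on the 4-point agreement scale. Hypothetical illustration only.
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-rater means

    # Two-way ANOVA decomposition of the total sum of squares
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_total = ((ratings - grand) ** 2).sum()
    ss_err = ss_total - ss_rows - ss_cols

    ms_r = ss_rows / (n - 1)            # between-subjects mean square
    ms_c = ss_cols / (k - 1)            # between-raters mean square
    ms_e = ss_err / ((n - 1) * (k - 1)) # residual mean square

    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)
```

With identical ratings from two raters the function returns 1.0 (perfect agreement); systematic differences between raters, as found in the study, pull the coefficient down.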
Conclusion: This study demonstrates that the specification and subsequent monitoring of program functions is feasible. This approach may allow greater flexibility in implementation across sites without necessarily compromising implementation fidelity or its monitoring in prevention science. Overcoming this challenge is paramount to building an evidence base for programs and theories that can be adapted for specific and/or marginalized communities.