Abstract: Implementing Early Childhood Preventive Programs in Community Settings: Validating an Implementation Framework Utilizing Qualitative Methods (Society for Prevention Research 24th Annual Meeting)

Schedule:
Tuesday, May 31, 2016
Pacific D/L (Hyatt Regency San Francisco)
Karyn Hartz-Mandell, PhD, Postdoctoral Fellow, Bradley Hospital/Brown University, Providence, RI
Rebecca Silver, PhD, Assistant Professor (Research), Bradley Hospital/Brown University, Providence, RI
Rebecca Newland, PhD, Postdoctoral Fellow, Bradley Hospital/Brown University, Providence, RI
Ronald Seifer, PhD, Professor of Psychiatry & Human Behavior, Brown University, Riverside, RI
Barbara Jandasek, PhD, Assistant Professor (Research), Rhode Island Hospital/Brown University, Providence, RI
Leandra Godoy, PhD, Predoctoral Resident, Brown University, East Providence, RI
Introduction: Implementation of evidence-based programs (EBPs) in community settings could increase access to early childhood prevention practices that promote healthy development. To fully realize the potential of taking EBPs to scale in the “real world,” it is critical to understand what promotes or hinders high-quality implementation. Using qualitative methods, this study examines the validity of a framework for identifying such implementation processes.

We utilize program evaluation data from two large-scale projects implementing EBPs in the community: 1) Implementation of universal developmental/behavioral health screening and integrated behavioral health in pediatric primary care (LAUNCH); 2) Implementation of evidence-based home visiting programs (MIECHV).

Methods: Implementation science points to many processes associated with successful community implementation. From this literature, we developed a framework of ten potential implementation processes (e.g., Material Resources, Staff Resources, Financial Resources), which guided qualitative analysis of Key Informant interviews.

Key Informants were identified from project collaborators, including a State Health Department, a research center, and community partners (e.g., clinic staff, home visitors, administrators). LAUNCH informants from the Health Department and research center were interviewed six times (Ns = 7-16), and clinic staff were interviewed twice (Ns = 29, 28). To date, MIECHV informants have been interviewed twice (Ns = 25, 18); third interviews with these informants are ongoing (anticipated N = 20).

Interviewers conducted semi-structured interviews with open-ended prompts. Completed interviews were transcribed, coded using the framework described above, and synthesized following qualitative analysis best practices (e.g., assessing coder concordance, using NVivo). Ongoing interviews will be transcribed, coded, and synthesized by the time of the conference.

Results: We assessed validity by using qualitative methods to examine whether Key Informants identified similar processes in their descriptions of implementation. Data from one project have been fully examined. Although some aspects of the framework were coded more prominently than others, similar implementation processes were identified across settings (different clinics), roles (e.g., pediatrician, administrator), implementation activities (screening, integrated behavioral health), and time points (beginning, middle, and end of implementation). We will examine similarities and differences in the implementation processes identified in the second project.

Conclusion: Preliminary findings support the validity of this framework for understanding processes contributing to implementation. We will describe this framework in detail and explore possible applications for implementation science research, practice, and policy.