130 Abstract of Distinction: The Identification of Valuable Implementation Science Constructs Among Federal Agencies (Society for Prevention Research 25th Annual Meeting)

Schedule:
Wednesday, May 31, 2017
Yosemite (Hyatt Regency Washington, Washington, DC)
* noted as presenting author
Allison Dymnicki, PhD, Senior Researcher, American Institutes for Research, Washington, DC
David Osher, PhD, Vice President & Institute Fellow, American Institutes for Research, Washington, DC
Abraham H Wandersman, PhD, Professor, University of South Carolina, Columbia, SC
Michelle J Boyd, PhD, Social Science Analyst, U.S. Department of Health and Human Services, Washington, DC
Amanda Cash, DrPH, Senior Advisor for Evaluation and Evidence, U.S. Department of Health and Human Services, Washington, DC
Daniel Duplantier, MA, Social Science Analyst, U.S. Department of Health and Human Services, Washington, DC
Robin E Bzura, BS, Research Associate, American Institutes for Research, Waltham, MA
Introduction: Federal agencies in the United States have been interested for several years in understanding how to successfully implement and scale up evidence-based interventions and programs (EBIs). Multiple initiatives led by the Department of Health and Human Services (DHHS) have been designed to increase the department’s knowledge of EBIs in different program areas. The Office of the Assistant Secretary for Planning and Evaluation (ASPE) within DHHS has been engaged in efforts to understand more about the factors associated with successful implementation of federally funded initiatives. The goal of the current contract, awarded to the American Institutes for Research (AIR), was to identify a set of implementation science constructs that would help federal staff select, support, and monitor grantees in federally funded initiatives.

Methods: With the guidance of ASPE, AIR completed three phases of work to identify the set of implementation science constructs. First, we conducted a systematic literature review, in which more than 1,600 abstracts were reviewed and 125 articles were coded, to identify the most frequently studied implementation science constructs. Second, we conducted 18 individual or group interviews with federal and non-federal experts about the results of this review, with the goal of validating its findings. Third, using a Delphi process, we convened 16 individuals representing seven divisions within DHHS and one additional federal agency. This approach helped the group reach consensus on a set of implementation science constructs that they would find useful when selecting, supporting, and monitoring grantees in federally funded initiatives.

Results: This three-phase approach led to the identification of 11 implementation science constructs, which will be defined and described during the presentation. These were: evidence strength, evidence relevance, implementation complexity, general capacity, intervention-specific capacity, perceived advantage, contextual fit, adaptation, fidelity of implementation, theoretical clarity, and data-driven quality improvement.

Conclusions: This project’s findings highlight implementation science constructs that are of value to U.S. federal staff when selecting, supporting, and monitoring grantees, as well as the scientifically rigorous approach used to identify and define this set of constructs. This type of applied research has the potential to advance the extent to which dissemination and implementation science are incorporated into the practices and policies of federal agencies and the systems in which they work.