Methods: Twenty-five SafeCare sessions between home visitors and parents were video-recorded. Trained coders were randomly assigned to score sessions using either both the video and audio portions of the recording or only the audio. Sessions were coded using fidelity checklists consisting of 11 process items and 17 content items. Each item was coded as having occurred or not. In addition, coders could rate an item as a “technological limitation,” indicating that the item could not be coded because of the recording method. Analyses compared levels of agreement and disagreement between audio and video coders across process and content items.
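The item-level agreement comparison described above can be illustrated with a minimal sketch; the data layout, function name, and coding values below are hypothetical assumptions for illustration, not drawn from the study.

```python
# Illustrative sketch only: item-level percent agreement between paired
# video and audio-only coders. Data layout and names are hypothetical.
# Codes: 1 = item occurred, 0 = did not occur,
# "TL" = technological limitation (item could not be coded from the recording).

def agreement_summary(paired_codes):
    """paired_codes: dict mapping item id -> list of (video_code, audio_code)
    tuples, one per session coded by both a video and an audio coder."""
    summary = {}
    for item, pairs in paired_codes.items():
        n = len(pairs)
        agree = sum(1 for v, a in pairs if v == a)
        tech_limited = sum(1 for v, a in pairs if a == "TL")
        summary[item] = {
            "pct_agreement": 100.0 * agree / n,
            "pct_tech_limited": 100.0 * tech_limited / n,
        }
    return summary

# Example with two hypothetical checklist items across three sessions.
example = {
    "process_01": [(1, 1), (1, "TL"), (0, 0)],
    "content_01": [(1, 1), (1, 1), (0, 0)],
}
print(agreement_summary(example))
```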
Results: Analyses indicated overall agreement between coders of 71.3% (SD=10.8). Agreement was higher for content items (M=78.8%, SD=13.1) than for process items (M=59.2%, SD=12.9). Disagreements due to technological limitations among audio coders were noted for 13 items; technological limitations were reported much more frequently for process items (54.5%) than for content items (11.8%). Video and audio coders experienced 100% disagreement due to these limitations on 3 items, all of which were process items.
Conclusions: Compared with video, monitoring fidelity via audio recordings alone is associated with some loss of information on process-related fidelity. One possible solution is to monitor fidelity with audio recordings in conjunction with supplementary methods, such as participant surveys, to better capture process items. Research should also examine the extent to which content and process fidelity relate to changes in family behavior, to further inform optimal fidelity monitoring methods for program use.