Abstract: Explaining Item Bias in a Student Risk and Needs Assessment Using School and Community Data (Society for Prevention Research 27th Annual Meeting)

141 Explaining Item Bias in a Student Risk and Needs Assessment Using School and Community Data

Wednesday, May 29, 2019
Seacliff B (Hyatt Regency San Francisco)
Brian French, PhD, Professor, Washington State University, Pullman, WA
Thao T. Vo, B.S., Graduate Student, Washington State University, Pullman, WA
Introduction: Differential item functioning (DIF) refers to differences in item responses between groups that are matched on ability. Such differences represent construct-irrelevant variance that can influence item responding and test scores and threaten score validity. DIF analyses typically examine person-level variables (e.g., gender, race) to ensure items are fair. The challenge with DIF, however, is understanding the causes of such differences. Our work tackles this issue in the context of school truancy, which has been shown to negatively impact students' social, financial, and psychological well-being (Rocque et al., 2017). We examine the Washington Assessment of Risk and Needs of Students (WARNS) for DIF in a two-step method in which (a) DIF is detected for groups with high rates of truancy (i.e., African American and Hispanic students) and (b) state-level proxy variables are applied to explain sources of DIF through an ecological multilevel model (Zumbo et al., 2015). We showcase how this novel confirmatory approach of combining contextual state-gathered data with local assessment data can be used to understand DIF.
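The ecological step can be sketched as a two-level logistic model in which the group effect on an item is itself predicted by contextual variables. The following is a schematic sketch only, not the authors' exact specification; the level-two predictors stand in for whatever state-level proxy variables are available:

```latex
% Level 1 (students i within school j): item response modeled from the
% latent trait \theta and a group indicator G; \beta_{2j} carries the DIF.
\operatorname{logit} P(y_{ij} = 1) = \beta_{0j} + \beta_{1j}\,\theta_{ij} + \beta_{2j}\,G_{ij}

% Level 2 (between schools): the DIF effect is regressed on contextual
% proxy variables, e.g., school climate and school SES.
\beta_{2j} = \gamma_{20} + \gamma_{21}\,\mathrm{Climate}_j + \gamma_{22}\,\mathrm{SES}_j + u_{2j}
```

Under this sketch, a nonzero \(\gamma_{21}\) or \(\gamma_{22}\) would indicate that the contextual variable accounts for part of the group difference in item responding, leaving less unexplained DIF in the residual term \(u_{2j}\).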

Methods: Data were obtained from a sample of Washington and Georgia high school students (n = 1,468: n = 337 African American, n = 597 Hispanic, n = 534 Caucasian) who completed the WARNS as standard practice in their schools. First, an item response theory graded response model was used to screen WARNS items for DIF across six domains using statistical significance and effect sizes. Items identified with large effects will then be analyzed with a multilevel logistic regression model in which contextual proxy variables (e.g., school climate rating, school SES) serve as level-two predictors to explain the sources of DIF.
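As a self-contained illustration of the DIF-screening logic, the sketch below tests one simulated dichotomous item for uniform DIF with the logistic-regression approach (item response predicted from a matching score plus a group indicator, compared by a likelihood-ratio test). This is a simplified analogue, not the authors' implementation: the study fits a graded response model to real WARNS data, whereas all data, names, and effect sizes here are synthetic.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, n_iter=1200, lr=0.5):
    """Fit logistic regression by batch gradient ascent on the mean
    log-likelihood; return (coefficients, maximized log-likelihood)."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(n_iter):
        grad = [0.0] * p
        for xi, yi in zip(X, y):
            resid = yi - sigmoid(sum(b * x for b, x in zip(beta, xi)))
            for j in range(p):
                grad[j] += resid * xi[j]
        beta = [b + lr * g / n for b, g in zip(beta, grad)]
    ll = sum(
        yi * math.log(sigmoid(sum(b * x for b, x in zip(beta, xi))))
        + (1 - yi) * math.log(1.0 - sigmoid(sum(b * x for b, x in zip(beta, xi))))
        for xi, yi in zip(X, y)
    )
    return beta, ll

# Simulate one dichotomous item with uniform DIF: at the same ability
# theta, the focal group (g = 1) is 0.8 logits more likely to endorse it.
random.seed(1)
theta = [random.uniform(-2.0, 2.0) for _ in range(800)]
group = [i % 2 for i in range(800)]
y = [
    1 if random.random() < sigmoid(-0.5 + 1.2 * t + 0.8 * g) else 0
    for t, g in zip(theta, group)
]

# Reduced model: ability only.  Full model: ability plus group membership.
X_reduced = [[1.0, t] for t in theta]
X_full = [[1.0, t, float(g)] for t, g in zip(theta, group)]
_, ll_reduced = fit_logistic(X_reduced, y)
beta_full, ll_full = fit_logistic(X_full, y)

# Likelihood-ratio test for uniform DIF (1 df; critical value 3.84 at alpha = .05).
chi2 = 2.0 * (ll_full - ll_reduced)
print(f"group coefficient = {beta_full[2]:.2f}, LR chi-square = {chi2:.1f}")
```

Adding a score-by-group interaction to the full model would extend this test to nonuniform DIF; the study instead bases its screening decisions on IRT-based significance tests and effect sizes.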

Results: On the Aggression-Defiance domain, 4 of 8 items showed DIF with large effect sizes (> .80) between African American, Caucasian, and Hispanic students. The Family Environment domain had 1 of 5 items with large DIF. A single item with large DIF was identified on the Peer Deviance domain between African American and Caucasian students, and one on the School Engagement domain between African American and Hispanic students. The presentation will report on the use of an ecological model of item responding (Zumbo et al., 2015), focusing on contextual variables as explanatory sources of DIF.

Conclusion: We demonstrate a novel method in which school and community data are combined with local assessment data to better understand item bias in a confirmatory manner.