Employing differential item functioning analysis in survey research


Date

2019-04-06

Publisher

American Educational Research Association Annual Meeting

Abstract

One of the key assumptions involved in using any survey or measurement instrument is that it works consistently across groups. This is particularly important in survey research, where group comparisons are ubiquitous. Differential item functioning (DIF) analysis examines whether an instrument is systematically biased in favor of one group. Such findings cannot be obtained through traditional approaches to examining instrument validity, and yet DIF analysis is rarely applied to surveys. This paper illustrates DIF analysis with logistic regression using the Faculty Survey of Student Engagement (FSSE). We find that FSSE items did not show evidence of DIF, giving users of the instrument confidence that it measures the same constructs in the same way across different groups.
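For readers unfamiliar with the procedure, the sketch below shows one common form of logistic-regression DIF screening (in the style of Swaminathan and Rogers): each item response is regressed on a matching score, a group indicator, and their interaction, and likelihood-ratio tests between the nested models flag uniform and non-uniform DIF. The Python code, simulated data, and variable names are illustrative assumptions only; they are not the authors' analysis and do not use the FSSE data.

```python
# Minimal sketch of logistic-regression DIF screening for one dichotomous item.
# All data are simulated; "item", "total", and "group" are hypothetical names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(0)
n = 1000

trait = rng.normal(size=n)                     # latent trait the item is meant to measure
group = rng.integers(0, 2, size=n)             # focal vs. reference group indicator
p = 1 / (1 + np.exp(-(1.2 * trait - 0.3)))     # response depends on the trait only (no DIF built in)
item = rng.binomial(1, p)
total = trait                                  # stand-in for the matching/total score

df = pd.DataFrame({"item": item, "total": total, "group": group})

# Three nested models: matching score only, + group (uniform DIF), + interaction (non-uniform DIF).
m1 = smf.logit("item ~ total", df).fit(disp=0)
m2 = smf.logit("item ~ total + group", df).fit(disp=0)
m3 = smf.logit("item ~ total + group + total:group", df).fit(disp=0)

# Likelihood-ratio tests between the nested models flag each type of DIF.
lr_uniform = 2 * (m2.llf - m1.llf)
lr_nonuniform = 2 * (m3.llf - m2.llf)
print("uniform DIF     LR chi2 =", round(lr_uniform, 2),
      "p =", round(stats.chi2.sf(lr_uniform, 1), 3))
print("non-uniform DIF LR chi2 =", round(lr_nonuniform, 2),
      "p =", round(stats.chi2.sf(lr_nonuniform, 1), 3))
```

In practice, polytomous survey items such as Likert-type FSSE items would typically call for an ordinal rather than binary logistic model, with the test repeated item by item and a significance or effect-size criterion applied before flagging DIF.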

Description

Paper presented at the 2019 American Educational Research Association Annual Meeting, Toronto, Canada.

Type

Presentation