Employing differential item function analysis in survey research

dc.contributor.author: Paulsen, J.
dc.contributor.author: Merckle, R.
dc.contributor.author: BrckaLorenz, A.
dc.date.accessioned: 2019-09-10T17:36:06Z
dc.date.available: 2019-09-10T17:36:06Z
dc.date.issued: 2019-04-06
dc.description: Paper presented at the 2019 American Educational Research Association Annual Meeting, Toronto, Canada.
dc.description.abstract: One of the key assumptions involved in using any survey or measurement instrument is that it works consistently across groups. This is particularly important in survey research, where group comparisons are ubiquitous. Differential item functioning (DIF) analysis examines whether an instrument is systematically biased in favor of one group. The findings from such an analysis cannot be obtained through traditional approaches to examining instrument validity, and yet DIF analysis is rarely found in survey research. This paper illustrates DIF analysis with logistic regression using the Faculty Survey of Student Engagement (FSSE). We found that FSSE items did not show the presence of DIF, giving users of this instrument confidence that it measures the same constructs in the same way across different groups.
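The abstract describes detecting DIF with logistic regression. As a hedged illustration of that general approach (a minimal sketch on simulated data, not the paper's actual FSSE analysis; all variable names are hypothetical), one common formulation compares a model predicting an item response from a matching score against a model that adds a group indicator, testing the improvement with a likelihood-ratio chi-square:

```python
# Sketch of uniform-DIF detection via logistic regression on SIMULATED data.
# Assumptions (not from the paper): one dichotomous item, a continuous
# matching score, two groups, and a 1-df likelihood-ratio test.
import math
import numpy as np

def fit_logit(X, y, iters=25):
    """Fit logistic regression by Newton-Raphson; returns coefficients."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ beta))
        grad = X.T @ (y - p)                       # score vector
        hess = (X * (p * (1 - p))[:, None]).T @ X  # observed information
        beta += np.linalg.solve(hess, grad)
    return beta

def loglik(X, y, beta):
    """Bernoulli log-likelihood; logaddexp keeps log(1 + e^z) stable."""
    z = X @ beta
    return float(np.sum(y * z - np.logaddexp(0.0, z)))

rng = np.random.default_rng(42)
n = 4000
ability = rng.normal(size=n)                       # matching criterion
group = rng.integers(0, 2, size=n).astype(float)   # 0 = reference, 1 = focal

# Simulate one item that depends only on ability, i.e. NO DIF built in.
p_true = 1 / (1 + np.exp(-(0.5 + 1.2 * ability)))
item = rng.binomial(1, p_true).astype(float)

ones = np.ones(n)
X_base = np.column_stack([ones, ability])          # matching score only
X_dif = np.column_stack([ones, ability, group])    # + uniform-DIF term

ll_base = loglik(X_base, item, fit_logit(X_base, item))
ll_dif = loglik(X_dif, item, fit_logit(X_dif, item))

lr = 2 * (ll_dif - ll_base)                        # chi-square, 1 df
p_value = math.erfc(math.sqrt(max(lr, 0.0) / 2))   # chi2(1) survival function
print(f"LR = {lr:.3f}, p = {p_value:.3f}")         # large p: no evidence of DIF
```

A significant group term here would indicate uniform DIF; non-uniform DIF is typically checked by also adding an ability-by-group interaction and spending a second degree of freedom on the test.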
dc.identifier.uri: https://hdl.handle.net/2022/23775
dc.language.iso: en
dc.publisher: American Educational Research Association Annual Meeting
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.title: Employing differential item function analysis in survey research
dc.type: Presentation

Files

Original bundle

Name: Employing differential item function analysis in survey research.pdf
Size: 318.03 KB
Format: Adobe Portable Document Format