Employing differential item function analysis in survey research
dc.contributor.author | Paulsen, J. | |
dc.contributor.author | Merckle, R. | |
dc.contributor.author | BrckaLorenz, A. | |
dc.date.accessioned | 2019-09-10T17:36:06Z | |
dc.date.available | 2019-09-10T17:36:06Z | |
dc.date.issued | 2019-04-06 | |
dc.description | Paper presented at the 2019 American Educational Research Association Annual Meeting, Toronto, Canada. | |
dc.description.abstract | One of the key assumptions involved in using any survey or measurement is that the instrument works consistently across groups. This is particularly important in survey research, where group comparisons are ubiquitous. Differential item functioning (DIF) analysis examines whether an instrument is systematically biased in favor of one group on particular items. The findings from such an analysis are unattainable through traditional approaches to examining instrument validity, and yet it is rare to find DIF analysis applied to surveys. This paper illustrates DIF analysis with logistic regression using the Faculty Survey of Student Engagement (FSSE). We find that FSSE items did not show DIF, which gives users of this instrument confidence that it measures the same constructs in the same way across different groups. | |
dc.identifier.uri | https://hdl.handle.net/2022/23775 | |
dc.language.iso | en | |
dc.publisher | American Educational Research Association Annual Meeting | |
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | |
dc.title | Employing differential item function analysis in survey research | |
dc.type | Presentation |
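The abstract describes detecting DIF with logistic regression. A common form of that approach (a likelihood-ratio test comparing a model with only a matching variable against one that adds group membership) can be sketched as below. This is a minimal illustration on synthetic data, not the authors' analysis; the FSSE data and the paper's exact model specification are not reproduced here.

```python
# Sketch of uniform DIF detection via logistic regression on SYNTHETIC data.
# Assumption: one dichotomous item, a continuous matching variable (e.g. a
# total score), and two groups; DIF is flagged by a likelihood-ratio test.
import numpy as np

rng = np.random.default_rng(0)
n = 400
ability = rng.normal(size=n)        # matching variable
group = rng.integers(0, 2, size=n)  # 0 = reference group, 1 = focal group

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Simulate one item WITH uniform DIF: the focal group is disadvantaged
# by 0.8 logits at every level of the matching variable.
y = rng.binomial(1, sigmoid(1.2 * ability - 0.8 * group))

def fit_logistic(X, y, iters=100):
    """Plain Newton-Raphson logistic regression; returns the log-likelihood."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = sigmoid(X @ beta)
        grad = X.T @ (y - p)
        hess = X.T @ (X * (p * (1 - p))[:, None])
        beta += np.linalg.solve(hess + 1e-8 * np.eye(len(beta)), grad)
    p = sigmoid(X @ beta)
    return np.sum(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))

ones = np.ones(n)
# Model 1: intercept + matching variable. Model 2: adds group membership;
# a significant improvement indicates uniform DIF.
ll1 = fit_logistic(np.column_stack([ones, ability]), y)
ll2 = fit_logistic(np.column_stack([ones, ability, group]), y)

# Likelihood-ratio statistic, compared to the chi-square(1) critical
# value of 3.84 at alpha = .05.
G2 = 2 * (ll2 - ll1)
print(f"LR statistic: {G2:.2f}  (uniform DIF flagged: {G2 > 3.84})")
```

A non-significant statistic across all items, as the abstract reports for FSSE, supports measurement equivalence across the compared groups. Testing non-uniform DIF would add an ability-by-group interaction term as a third model in the same comparison.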
Files
- Employing differential item function analysis in survey research.pdf (318.03 KB, Adobe Portable Document Format)