The Effects of Collapsing Ordered Categorical Variables on Tests of Measurement Invariance
Date
2019
Publisher
Taylor & Francis
Abstract
Cross-cultural comparisons of latent variable means demand equivalent loadings and intercepts or thresholds. Although equivalence is generally evaluated for items as originally designed, researchers sometimes modify the response options of categorical items. For example, substantive research interests may drive decisions to reduce the number of item categories. Further, categorical multiple-group confirmatory factor analysis (MG-CFA) methods generally require that the number of indicator categories be equal across groups; however, categories with few observations in at least one group can pose estimation challenges. In the current paper, we examine the impact of collapsing ordinal response categories in MG-CFA. An empirical analysis and a complementary simulation study suggest meaningful effects of collapsing categories on model fit. We also found reduced scale reliability, measured as a function of Fisher's information. Our findings further illustrate artifactual fit improvement, pointing to the possibility of data dredging for improved model-data consistency in challenging invariance contexts with large numbers of groups.
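As a rough illustration of the kind of preprocessing the abstract refers to (not the authors' procedure or syntax), the sketch below merges sparsely observed ordinal categories into their nearest lower neighbor before a multiple-group analysis. The function name, the sparsity threshold, and the example data are assumptions for illustration only.

```python
# Minimal sketch (illustrative, not the authors' code): collapse ordinal
# response categories observed fewer than `min_count` times into the
# nearest lower category, preserving category order.
import pandas as pd

def collapse_sparse_categories(series: pd.Series, min_count: int = 10) -> pd.Series:
    counts = series.value_counts().sort_index()
    levels = list(counts.index)
    mapping = {}
    for i, level in enumerate(levels):
        if counts[level] < min_count and i > 0:
            # Merge the sparse level into the previous (possibly already remapped) level.
            mapping[level] = mapping.get(levels[i - 1], levels[i - 1])
        else:
            mapping[level] = level
    return series.map(mapping)

# Hypothetical 5-point item with a sparse top category in one group:
item = pd.Series([1, 1, 2, 2, 2, 3, 3, 3, 3, 4, 4, 5])
print(collapse_sparse_categories(item, min_count=2))  # category 5 collapses into 4
```

In practice, the same collapsing rule would need to be applied in every group so that the number of indicator categories remains equal across groups, as noted in the abstract.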
Description
This is supplementary material, including appendices, data, and syntax to reproduce all empirical and simulation results.
Keywords
multiple-groups models; ordinal variables; collapsing categories; model fit
Rights
Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0)
Type
Dataset
Software
Other