Psychometric Portfolio
Permanent link for this collection: https://hdl.handle.net/2022/24482
Browsing Psychometric Portfolio by Author "Paulsen, Justin"
Now showing 1 - 5 of 5
Item: Construct Validity: 2017 Scales (Faculty Survey of Student Engagement, 2018)
Paulsen, Justin; BrckaLorenz, Allison
FSSE 2017 grouped 50 survey items into several scales: Higher-Order Learning, Reflective and Integrative Learning, Learning Strategies, Quantitative Reasoning, Collaborative Learning, Discussions with Diverse Others, Student-Faculty Interaction, Effective Teaching Practices, Quality of Interactions, and Supportive Environment. The purpose of this study was to evaluate the quality of these scales, with particular focus on their internal structure. (A minimal factor-analysis sketch illustrating this kind of check follows the list.)

Item: Internal Consistency (Faculty Survey of Student Engagement, 2017)
Paulsen, Justin; BrckaLorenz, Allison
One way to estimate the reliability, specifically the internal consistency, of FSSE results is to calculate Cronbach's alphas and intercorrelations for the FSSE scales. Internal consistency is the extent to which a group of items measures the same construct, as evidenced by how well the items vary together, or intercorrelate. A high degree of internal consistency allows the researcher to interpret the composite score as a measure of the construct (Henson, 2001). Assuming the FSSE scales effectively measure an underlying construct, we would expect to find high estimates of their internal consistency. (A short Cronbach's alpha calculation is sketched after the list.)

Item: Internal Consistency Statistics (Faculty Survey of Student Engagement, 2017)
Paulsen, Justin; BrckaLorenz, Allison

Item: Known-Groups Validity: 2017 FSSE Measurement Invariance (Faculty Survey of Student Engagement, 2018)
Paulsen, Justin; BrckaLorenz, Allison
A key assumption of any latent measure (any questionnaire attempting to assess an unobservable construct) is that it functions equivalently across different groups. In psychometrics this is called measurement invariance: members of different groups understand and respond to the scales similarly, and the items have the same relationship with the latent measure across groups (Embretson & Reise, 2000). Once invariance has been established, data users can be confident that observed differences between groups reflect actual differences in the construct rather than measurement artifacts. (A rough group-comparison sketch appears after the list.)

Item: Relations to Other Variables: 2018 Convergent & Divergent Correlations (Faculty Survey of Student Engagement, 2018)
Paulsen, Justin; BrckaLorenz, Allison
One of the primary sources of validity evidence identified in the Standards for Educational and Psychological Testing (AERA, APA, & NCME, 2014) comes from relations between the measure of interest and other established constructs, examined through convergent and divergent correlations. The idea behind this type of evidence is that an instrument should correlate positively with similar, established constructs and should not correlate with unrelated constructs. Ideally, convergent correlations would be positive but not extremely high, since correlations that are too high would suggest the instrument under review offers little beyond what the established instrument already measures. FSSE's relationship to other variables had not previously been demonstrated; thus, this study examines whether the FSSE scales relate to other variables as expected. (A correlation sketch closes out the examples below.)
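The construct-validity item concerns the internal structure of the FSSE scales. The report does not specify its analysis tooling, but internal structure is commonly examined with factor analysis; the sketch below is purely illustrative, using simulated data and the third-party factor_analyzer Python package (an assumption, not the authors' method) to check whether items load on their intended factors.

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer  # third-party package; assumed available

# Simulate responses to eight items driven by two correlated latent constructs,
# standing in for FSSE-style scale items (purely illustrative data).
rng = np.random.default_rng(2)
n = 1000
f1 = rng.normal(size=n)
f2 = 0.4 * f1 + rng.normal(scale=0.9, size=n)   # correlated factors
items = {}
for i in range(4):
    items[f"hol_{i+1}"] = f1 + rng.normal(scale=0.8, size=n)  # "Higher-Order Learning"-like
    items[f"cl_{i+1}"] = f2 + rng.normal(scale=0.8, size=n)   # "Collaborative Learning"-like
df = pd.DataFrame(items)

# Fit a two-factor model with an oblique rotation and inspect the loadings;
# items should load strongly on their intended factor and weakly elsewhere.
fa = FactorAnalyzer(n_factors=2, rotation="oblimin")
fa.fit(df)
loadings = pd.DataFrame(fa.loadings_, index=df.columns, columns=["factor1", "factor2"])
print(loadings.round(2))
```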
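For the internal-consistency item, Cronbach's alpha has a standard closed form: alpha = (k / (k - 1)) * (1 - sum of item variances / variance of the total score). The sketch below computes it for simulated items standing in for a single FSSE-style scale; the data and function name are hypothetical.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of items assumed to measure one construct.

    alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total score)
    """
    items = items.dropna()               # listwise deletion, for simplicity
    k = items.shape[1]                   # number of items in the scale
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical example: four items from a single FSSE-style scale.
rng = np.random.default_rng(0)
latent = rng.normal(size=500)
scale_items = pd.DataFrame(
    {f"item{i}": latent + rng.normal(scale=0.8, size=500) for i in range(1, 5)}
)
print(f"Cronbach's alpha: {cronbach_alpha(scale_items):.2f}")
```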
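For the measurement-invariance item, a formal analysis would fit a multigroup confirmatory factor model with increasingly strict equality constraints (configural, metric, scalar). The sketch below is only a rough descriptive check under that framing, not a formal invariance test: it uses simulated data and the third-party factor_analyzer package (an assumption) to fit a one-factor model separately in two hypothetical groups and compare the loading patterns.

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer  # third-party package; assumed available

rng = np.random.default_rng(3)

def simulate_group(n, loadings):
    """Simulate one group's responses to a four-item scale with given loadings."""
    latent = rng.normal(size=n)
    return pd.DataFrame(
        {f"item{i+1}": l * latent + rng.normal(scale=0.8, size=n)
         for i, l in enumerate(loadings)}
    )

# Hypothetical data: the same four-item scale answered by two faculty groups.
groups = {
    "group_a": simulate_group(600, [0.9, 0.8, 0.85, 0.75]),
    "group_b": simulate_group(600, [0.9, 0.8, 0.85, 0.75]),
}

# Rough descriptive check (NOT a formal invariance test): fit a one-factor
# model within each group and compare the loading patterns. A formal test
# would impose equality constraints on loadings and intercepts across groups.
for name, data in groups.items():
    fa = FactorAnalyzer(n_factors=1, rotation=None)
    fa.fit(data)
    print(name, np.round(fa.loadings_.ravel(), 2))
```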
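For the relations-to-other-variables item, convergent and divergent evidence amounts to inspecting correlations between scale scores and external measures. The sketch below uses simulated, hypothetical variables to show the expected pattern: sizable positive correlations with a similar established measure and near-zero correlations with an unrelated variable.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 500

# Hypothetical scale scores: two FSSE-style engagement scales plus one
# established "similar" measure and one unrelated variable.
engagement = rng.normal(size=n)
df = pd.DataFrame({
    "fsse_collaborative_learning": engagement + rng.normal(scale=0.7, size=n),
    "fsse_student_faculty_interaction": engagement + rng.normal(scale=0.7, size=n),
    "established_engagement_measure": engagement + rng.normal(scale=0.7, size=n),  # convergent
    "unrelated_variable": rng.normal(size=n),                                      # divergent
})

# Convergent evidence: moderate-to-strong positive correlations with the similar
# established measure; divergent evidence: near-zero correlations with the
# unrelated variable.
print(df.corr().round(2))
```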