Analysis of the ratings and interrater reliability at high school choral festivals in Indiana


Date

2015-05-07

Abstract

This study investigated the interrater reliabilities of large-group choral festivals and provides data for comparison with similar events. Data for this study included the ratings and points awarded by a total of 58 panels (of three adjudicators each) to 925 choir performances by 689 discrete high school choirs at the Indiana State Schools Music Association-sponsored choral festivals in 2012, 2013, and 2014. The research investigated (a) frequency distributions for ratings, types of choirs, and group-level self-selection; (b) pairwise interrater correlations of ratings and points awarded; (c) interrater reliability and panel internal consistency; and (d) differences in points awarded between adjudicators in each panel. Results indicated that a significantly higher proportion of choirs were awarded Gold (77%) and Silver (22%) ratings than other types of awards. There were significantly more mixed (60%) and treble (34%) choirs than men's (6%) choirs. There were also more choirs entering at the Group I (39%) and Group III (30%) levels than at the Group II (17%) and Group IV (15%) levels. Percentage agreements of ratings were mainly high: 41 of the 58 panels (71%) had a mean percentage agreement above 70%, and 155 of the 174 pairs of adjudicators (89%) had pairwise percentage agreement above 60%. While mean Interrater Reliability Coefficients (IRCs) were almost all positive, correlation coefficients ranged widely, from weak (rs = .155) to very strong (rs = .939). Internal consistency ranged from moderate (α = .48) to high (α = .96) over the three years. A majority of the Intraclass Correlation Coefficients (ICCs) were in the strong (0.7 - 0.8) to almost perfect (> 0.8) agreement ranges, indicating very good agreement by panel. While there were instances of significant differences (p < .01) over the three years, in general the panel members appeared to agree on the points awarded.
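As a rough illustration of the statistics reported in the abstract (not the study's own analysis code), the following Python sketch computes pairwise percentage agreement on ratings, pairwise Spearman correlations on points, Cronbach's alpha, and ICC(2,1) for a single hypothetical three-adjudicator panel. All data, ratings, and column names are invented for the example.

"""
Illustrative sketch of the interrater-reliability statistics named in the
abstract, applied to one hypothetical three-adjudicator panel. Data are
invented; this is not the thesis's actual analysis code.
"""
from itertools import combinations

import numpy as np
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical panel: points awarded by three adjudicators to ten choirs,
# plus the categorical rating (G = Gold, S = Silver) each adjudicator gave.
points = pd.DataFrame({
    "judge_a": [88, 92, 75, 81, 95, 70, 84, 90, 77, 86],
    "judge_b": [85, 94, 78, 80, 93, 72, 82, 91, 75, 88],
    "judge_c": [90, 91, 73, 83, 96, 69, 85, 89, 78, 84],
})
ratings = pd.DataFrame({
    "judge_a": list("GGSGGSGGSG"),
    "judge_b": list("GGSGGSGGGG"),
    "judge_c": list("GGSGGSGGSG"),
})

# Pairwise percentage agreement on ratings and Spearman rs on points awarded.
for a, b in combinations(points.columns, 2):
    agreement = (ratings[a] == ratings[b]).mean() * 100
    rs, p = spearmanr(points[a], points[b])
    print(f"{a} vs {b}: agreement = {agreement:.0f}%, rs = {rs:.3f} (p = {p:.3f})")

# Cronbach's alpha: internal consistency of the panel's point awards.
k = points.shape[1]
item_vars = points.var(axis=0, ddof=1)
total_var = points.sum(axis=1).var(ddof=1)
alpha = k / (k - 1) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")

# ICC(2,1): two-way random effects, absolute agreement, single rater,
# computed from the two-way ANOVA mean squares (Shrout & Fleiss).
x = points.to_numpy(dtype=float)
n, k = x.shape
grand = x.mean()
ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)        # targets
ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)        # raters
ss_err = ((x - x.mean(axis=1, keepdims=True)
             - x.mean(axis=0, keepdims=True) + grand) ** 2).sum()
ms_err = ss_err / ((n - 1) * (k - 1))
icc_2_1 = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                                + k * (ms_cols - ms_err) / n)
print(f"ICC(2,1) = {icc_2_1:.2f}")

In the study itself these statistics were computed per panel and per year; the sketch shows only the per-panel calculations on made-up data so the reported quantities (percentage agreement, rs, α, ICC) are concrete.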

Keywords

ratings, interrater reliability, choral festivals, Indiana

Rights

No license.

Type

Thesis