The Academic Experiences Survey (AES): Measuring Perceptions of Academic Climate in Liberal Arts Institutions

In today’s educational climate, liberal arts institutions must demonstrate that their educational goals are being met. This paper presents reliability and stability testing of a concise, research-based survey instrument designed to examine student perceptions of academic experiences, one particularly suited to institutions rooted in the liberal arts. The Academic Experiences Survey (AES) consists of five scales: Comfortable in College, Skills, Interdisciplinary Understanding, Liberal Arts Understanding, and Future Planning. We present data from administrations at two institutions, examining the instrument's validity as a predictor of college retention and graduation and of enrollment in STEM courses, as well as longitudinal changes over the course of students' first year in college.

The development of the AES was embedded in longitudinal studies examining the academic experiences and decision-making processes of first-year students. This afforded us the ability to examine the correlates of AES scale scores, as well as to explore their predictive power.
Given the prominence of the NSSE (Kuh, 2001) as an institutional-level assessment tool, further comparison between it and the AES is warranted. Recent revisions of the core survey and topical modules for the current NSSE provide only very modest overlap with the AES. For example, the Development of Transferable Skills module (Center for Postsecondary Research, 2016) asks about the frequency with which students have written something that "included ideas from more than one academic discipline." In contrast, the AES "Interdisciplinary Understanding" scale, described below, prompts student responses to nine elements and lends itself to a fuller examination of student perceptions of interdisciplinarity. For institutions already using the NSSE, the AES could provide an important complement for examining epistemological development more fully. For institutions committed to assessment instruments other than the NSSE, the AES may provide an even more critical complement to their existing survey instruments.

Participants
Participants were first-year students from Carleton College and Colby-Sawyer College. The first cohort from Carleton College, recruited in the fall of 2009, consisted of 101 first-year students (39 male) who responded to a letter of invitation sent to all 520 first-year students. Of these 101 students, 94 (35 male) returned for a second session in the winter of 2010; 93 (34 male) returned for a third session in the spring of 2010; and 88 (33 male) returned for a fourth session in the fall of 2010. Although specific racial/ethnic data were not collected on the sample, the overall population of first-year Carleton College students in the fall of 2009 included 22% who self-reported their racial group as African American, Asian American, Hispanic/Latino/Chicano, or Native American (Lawrence, 2010a).
The second cohort of participants from Carleton College, recruited in the fall of 2010, consisted of 48 first-year students (12 male). Of these, 45 (12 male) returned for a second and third session in the winter and spring of 2011 (respectively), and 44 (12 male) returned for a fourth session in the fall of 2011. Students were recruited through a letter of invitation sent to 200 quasi-randomly selected first-year students. The composition of the first-year class of students entering in 2010 was also 22% students of color, not including international students (Lawrence, 2010b). Participants from both cohorts were paid $10.00 for the first two sessions, $12.50 for the third session, and $25.00 for the fourth session.
The third and fourth cohorts of participants, from Colby-Sawyer College, were recruited in the fall of 2011 and fall of 2012 (respectively). The third cohort consisted of 375 first-year students (98 male) of the entire class of 476. The fourth cohort consisted of 386 first-year students (85 male) of the entire class of 437. Students were recruited from a first-year experience course required of all incoming students. Students younger than 18 years of age (the legal age to sign an informed consent document) were excluded, as were any students who declined to participate (less than 1%) or were absent the day data were collected. The overall population of all Colby-Sawyer College students included approximately 9% international students and 12% who identified themselves as ethnic minority students (Colby-Sawyer, 2013).
Journal of the Scholarship of Teaching and Learning, Vol. 16, No. 5, October 2016. josotl.indiana.edu

Materials and Procedure
At Carleton College, trained undergraduate research assistants ran sessions in small groups. A number of different instruments were administered during the different sessions. At Colby-Sawyer College, sessions were run in class, where each student received a packet with several surveys, counterbalanced for order. Each student was given as much time as needed to complete the questionnaires, typically in the range of 20 to 25 minutes. A brief description of the instruments used in this paper follows.
Academic Experiences Survey (AES). This instrument, developed for this study, consists of 45 items, divided equally into five scales. These scales, with example items, include the following: Comfortable in College ("I feel comfortable/at ease most of the time when I am at [College 1]"), Skills ("I have strong critical thinking skills"), Interdisciplinary Understanding ("I understand the connections among different disciplines, such as the humanities, sciences, and social sciences"), Liberal Arts Understanding ("My classes this term offer me a reasonable and broad view of intellectual inquiry"), and Future (Academic) Plans ("In the future I plan to take classes that build on what I'm learning in my current classes"). This measure was used at both Carleton College (during sessions two through four) and Colby-Sawyer College (during a single session).
Need for Cognition (NFC). This instrument, developed by Cacioppo and Petty (1982), consists of 34 statements that measure a person's tendency to engage in and enjoy effortful cognitive activity, such as reasoning or problem solving. Example items include "I really enjoy a task that involves coming up with new solutions to problems" and "I prefer my life to be filled with puzzles that I must solve." This measure was used at both Carleton College (in two sessions, the first and fourth) and Colby-Sawyer College (once).
Planning Survey (PS). The Planning Survey (Simons & Galotti, 1992) presents statements about the respondent's self-reported use of planning strategies and organization of information. Example statements include "I write down appointments and meetings on a calendar" and "I have a written list of goals for each day." This measure was used at both Carleton College (in two sessions, the first and fourth) and Colby-Sawyer College (once).
Factors and Options Worksheet (FAOW). This instrument, adapted from previous research (Galotti, 1999, 2007; Galotti & Tinkelenberg, 2009), was used to provide a systematic way for participants to describe the options under active consideration, as well as the criteria they reported using to evaluate those options for different upcoming decisions. These decisions were: choosing courses for the following term (participants completed an FAOW for this decision in the first three sessions), choosing a major (done twice, once in the second session and once in the fourth), choosing housing for the following year, and choosing plans for the summer (these last two decisions were both surveyed during the third session). The worksheet consisted of a grid containing ten columns of blanks. In the second column, participants were asked to list the criteria by which they were currently evaluating their options. Each criterion was rated for its importance on a ten-point scale ("1" = "Not very important", "10" = "Extremely important"), and these weights were placed in the first column. At the top of the third through tenth columns, the options under active consideration were listed. Participants then rated each option on how well it fulfilled each of these criteria, using a ten-point scale. Table 1 provides a fictitious example of a version of this instrument for the decision of choosing a major. This measure was used only at Carleton College.

Internal Reliabilities of AES Scales
We first ran internal reliability analyses on the five AES scales and the overall total AES score from the two Carleton College cohorts combined (the total number of included participants was 137 or 138, depending on the specific scale, due to missing responses).
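For readers who wish to run comparable analyses on their own administrations of the AES, internal consistency of a scale can be computed with standard Cronbach's alpha. The following is a minimal sketch; the function name and array layout are our own, not part of the AES materials:

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (n_respondents, n_items) array of scale items.

    Respondents with missing responses should be dropped (listwise)
    before calling, as was done for the analyses reported here.
    """
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]                         # number of items in the scale
    item_vars = x.var(axis=0, ddof=1)      # variance of each item
    total_var = x.sum(axis=1).var(ddof=1)  # variance of respondents' scale totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

Alpha approaches 1.0 as the items covary strongly relative to their individual variances; values in the .70s and above are conventionally taken as acceptable for research scales.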

Longitudinal Stability of AES Scale Scores
We next looked to see how stable the AES scale scores were over time, using just the Carleton College sample, who responded to the scale on three occasions (sessions 2-4). Table 2 presents correlations among AES scores over different intervals of time. Unsurprisingly, correlations are strongest over adjacent intervals. Scores for all scales except Future Planning are moderately strong, even across long intervals. We also analyzed the changes in scale scores for the 131 students who participated in the second, third, and fourth sessions. Table 3 presents these data.

Correlation of AES Scores with GPA, Retention, and Graduation Rates
The Colby-Sawyer sample included a little less than 80% of the fall 2011 entering class, for whom we also had data on grade point averages, retention, and graduation. None of the five scale scores correlated significantly with first-year GPA. The Skills score and the Overall score correlated with high school GPA (r = .13, p < .05, and r = .12, p < .05, respectively), but no other AES score showed statistically significant correlations. The Comfortable in College score correlated with fall-to-spring retention during the first year (r = .14, p < .01) as well as with spring-fall (sophomore year) retention (r = .13, p < .05). The Overall AES score also correlated significantly with spring-fall (sophomore year) retention (r = .11, p < .05). No other AES scores correlated significantly with retention measures. The Interdisciplinary Understanding scale (r = .11, p < .05), the Comfortable in College scale (r = .12, p < .05), and the Future Planning scale (r = .12, p < .05) correlated significantly with 4-year graduation rates.

Correlation of AES Scores with Enrollment in STEM Courses
We calculated the number of science, technology, and mathematics courses each Carleton College participant took as a fraction of the total number of courses they enrolled in during their first year of college (Carleton College does not offer engineering courses). Two of the five AES scores correlated at a statistically significant or marginally significant level with this measure: Skills (r = .17, p < .06) and Interdisciplinary Understanding (r = .19, p < .05).
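The enrollment measure itself is a simple proportion; a sketch of the calculation follows, with department codes that are hypothetical and for illustration only:

```python
def stem_fraction(courses, stem_depts):
    """Fraction of a student's first-year courses taught in STEM departments.

    courses: list of department codes, one per enrolled course
    stem_depts: set of codes counted as science, technology, or mathematics
    """
    stem_count = sum(1 for dept in courses if dept in stem_depts)
    return stem_count / len(courses)

# Hypothetical example: 2 of 4 courses fall in STEM departments.
stem_fraction(["MATH", "ENGL", "BIOL", "HIST"], {"MATH", "BIOL", "CHEM"})
```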

Correlation of AES Scores with Decision-Making Performance
Carleton College students were surveyed about seven different decisions they were in the process of making over the interval studied. For each of these decisions, we calculated the "decision map size," consisting of the number of different options the participant reported under active consideration, multiplied by the number of different criteria or factors they reported using to decide among these options. Decision map size indexes the number of pieces of information a person considers when making a decision. We computed the mean decision map size across all seven decisions. All five of the AES scales correlated with this measure as follows: Comfort, r = .31, p < .001; Skills, r = .27, p < .01; Liberal Arts, r = .26, p < .01; Interdisciplinary Understanding, r = .30, p < .001; Future Planning, r = .33, p < .001. The Overall AES score also correlated significantly with this measure, r = .34, p < .001. Table 4 shows a breakdown of correlations among AES scores and decision map sizes for individual decisions. As can be seen, correlations are stronger for academic decisions made later in the first year (course selection and major), that is, during the period in which students can reasonably be expected to be thinking actively about their major decision.
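The decision map size defined above reduces to a product and a mean; a minimal sketch of the computation, with function and variable names that are ours rather than from the FAOW materials:

```python
def decision_map_size(n_options, n_criteria):
    """Pieces of information in one decision: options times criteria."""
    return n_options * n_criteria

def mean_map_size(decisions):
    """Mean map size across decisions, given (n_options, n_criteria) pairs."""
    sizes = [decision_map_size(n_opt, n_crit) for n_opt, n_crit in decisions]
    return sum(sizes) / len(sizes)

# A student weighing 3 majors on 4 criteria has a map size of 12 for that decision.
```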

Correlations of AES Scores with Other Individual Differences Variables
We computed correlations among the five AES scale scores and the other self-reported individual differences measures (Need for Cognition and Planning), separately for each institution. Table 5 presents these correlations and shows moderately strong correlations throughout. The pattern of correlations differs slightly between the two institutions, and the correlations are uniformly stronger in the Colby-Sawyer sample.

Discussion
Data from two independent and relatively large samples indicate that we have developed an instrument with reasonable internal and test-retest reliability, with scales that predict some relevant behaviors in college students in liberal arts settings. Different scales show different patterns of correlations with behavioral variables, and different patterns of stability and change.
The Comfortable in College scale scores, for example, correlated significantly with retention measures, whereas no other AES scale scores did. These scale scores were also particularly stable over time, showing very little change from session to session. This in turn suggests that the initial level of ease students feel on a campus (their perception that they can speak freely, approach instructors, and feel included as part of a learning community) remains stable from early in their academic career. Whether this reflects something more than the common finding that first impressions carry a disproportionate weight in the formation of attitudes remains for future investigation. It might be possible to use these scores to identify, early in the first-year experience, students who are at risk of dropping out.
The Skills scale scores, in contrast, tended to decline over time. Inspection of Appendix 1 shows that these scores reflect students' sense of their own reading, problem-solving, data interpretation, writing, communication, and other academic skills. These scores showed particularly strong correlations with other self-reported measures of cognitive preference, specifically Need for Cognition and Planning scores. They also correlated marginally significantly with STEM course enrollment. Why would students perceive their academic skills as declining? We speculate that experience in college broadens and deepens their view of what counts as strong writing or clear understanding. Put another way, the liberal arts college experience may help students come to grips with the idea that there is a vast amount to learn, and may temper some initial overconfidence in their own knowledge and self-assessment.
Liberal Arts Understanding scale scores reflect a student's comprehension of what a liberal arts education is, as distinguished from a pre-professional course of study. They also indicate a student's self-reported ability to see a "big picture" of an institution's curriculum-why different parts of the curriculum are included and how they fit together. Liberal Arts scale scores showed the highest correlation over the longest intervals, but a very different pattern of longitudinal change, that of an inverted-U. We wonder again if this reflects a maturation of students' understanding and a lessening of their initial overconfidence in their own appreciation of what a liberal arts education entails.
The Interdisciplinary Understanding scale scores reflect students' self-reported ability to draw connections across individual courses and to understand that different questions can be meaningfully approached from multiple vantage points. These scores correlated significantly with STEM enrollments, which in turn might suggest a way of encouraging students to take more STEM courses: having instructors make explicit note of the interdisciplinary connections between STEM fields and the humanities and social sciences.
Interdisciplinary scale scores declined slightly over time, and showed moderate correlations with decision map sizes for important academic decisions faced in the latter part of the students' first year of college, such as choosing a major or choosing courses late in the first year and early in the second. One explanation for these correlations is that toward the end of their first year, students are thinking about their major, and those who are thinking about it in the most complex ways, considering the most information, are also likely to seek out connections across courses. Further work is required to explore and test this possibility.
Lastly, Future Planning scale scores showed some of the least longitudinal stability, but showed a pattern of overall growth over time. One likely explanation of the low stability is that as students progress from their first to their second year, their plans for future courses naturally change as they complete different requirements, explore different majors, and discover new interests (one of the goals of a liberal arts education). These scores also showed moderately strong correlations with decision map sizes for important academic decisions faced in the latter part of the students' first year of college, likely because as students start considering possible majors, they focus on requirements more intentionally.
In the study, we included possible correlates of two different kinds: self-report and behavioral. We believe that both kinds of measures are important. Self-report measures provide information about how students describe their own attitudes, values, abilities, and proclivities; in short, they reflect the internal insights of the participant. Behavioral measures provide less insight but offer more publicly observable indices of performance in real-life contexts.
Of course, standard cautions apply when evaluating our work. We collected data from students at only two institutions, both of which were residential liberal arts colleges. How well the scales would do at a larger university or other institution is as yet unknown. Further adoption of the AES would help us to address this question.
Insights derived from the AES have the potential to serve as an important complement to existing assessment efforts in cases in which colleges and universities rooted in the liberal arts seek to examine student perceptions of the purposes and goals of a liberal arts education as they progress through their undergraduate careers. We hope that this instrument can be of both theoretical and practical help to faculty and staff working on assessment issues within liberal arts contexts. We believe the AES measures a student's perception and understanding of important dimensions of a liberal arts education, and predicts some important educationally relevant behaviors. Longitudinal and cohort responses to the AES could help institutions to understand how well they accomplish the goals of liberal education in the eyes of the student, as well as measure any changes in how well those goals are met as program or curriculum changes are implemented.