Leveling Up an Award-Winning Undergraduate Research Program: A Case Study From Furman University

This case study delineates the process that a small, private liberal arts university employed to amplify its high-impact practices in an already award-winning undergraduate research (UR) program. The process was catalyzed by combined institutional factors: the start of a new accreditation cycle and the launch of our university's strategic vision, The Furman Advantage (TFA). Established in 2016–2017, TFA ensures all students have access to a high-impact engaged learning experience: UR, study away, and/or an internship. This institutional imperative provided an opportunity to assess the degree to which Furman's UR program was meeting high-impact criteria. We compared Furman's summer UR program against the emerging research on high-impact practices and made changes to enhance learning and to close equity gaps in access. We reoriented our UR program to focus on the characteristics of high-impact practices, particularly the mentoring relationship between faculty and students and the importance of student self-reflection. We reviewed improvements to our summer fellowship program, namely, changes in the application and review process, professional development for faculty, pre-experience training for summer research fellows, and modifications to our survey and self-analysis instruments. Broader programmatic changes included articulating common learning outcomes for engaged learning experiences and creating an evidence-driven assessment mechanism to help us meet learning outcomes and institutional objectives. Implementation of these changes required sustained collaboration at the institutional level between the Offices of Undergraduate Research, the Center for Engaged Learning, the Office of Institutional Research and Assessment, and the Faculty Development Center. In addition to measuring changes within UR over time, we have also been able to make comparisons across different engaged learning experiences, principally study away and internships, and then use these data to continue TFA improvements. Preliminary findings indicate that we have successfully enhanced our implementation of high-impact practices.

learning, to oversee it. This restructuring not only centralized space and logistics, allowing us to better track student participation and meet our every-student promise, but also encouraged each office within the CEL to find common ground, namely, around the characteristics of HIPs.
Focusing on the HIPs literature promoted the student-centeredness necessary to improve upon our already robust UR program. Nonetheless, while HIPs show promise for creating equitable learning experiences for students, the unevenness with which some of their components (active learning, institutional commitment to structural support of HIPs) are implemented makes scaling up challenging (Kuh, 2008; Kuh, O'Donnell, & Reed, 2013). For example, while we know that UR, now one of 11 HIPs, promotes student self-efficacy, disciplinary knowledge, research skills, and critical thinking (Linn, Palmer, Baranger, Gerard, & Stone, 2015; Kilgo, Sheets, & Pascarella, 2015; Lopatto, 2006), we also know that, historically, underrepresented students are less likely to engage in UR (Finley & McNair, 2013; Finley, 2019; O'Donnell, Botelho, Brown, González, & Head, 2015; Shanahan, 201 ). The recognition of these disparities in student participation spurred a national conversation about making HIPs, and UR specifically, more student centered (Kinzie & Zilvinskis, 2016). To address this issue, Furman opted to reorient its UR program toward mentorship, in part because of the mutual benefits of mentoring reported by faculty and students: Faculty saw gains in research and satisfaction in helping students develop, and students perceived gains in developing a scholarly identity (Linn et al., 2015; Potter, Abrams, Townson, & Williams, 2009), especially at a small liberal arts college, where mentorship is structured by faculty rather than graduate students (Behar-Horenstein, Roberts, & Dix, 2010). Centering the mentoring relationship resulted in, but was not limited to, the following changes: amending the application for both mentors and mentees; revising the criteria employed by the summer research faculty-review committee; educating mentors on HIPs; creating a common language around the characteristics and assessment of high-impact experiences; introducing a mandatory training and enhancement program for the summer research fellows and changing the survey and self-reflection instruments that they are required to complete; and creating an evidence-driven assessment mechanism that allows us to determine if we are meeting our objectives. We describe each of these in more detail below.

Leveling Up Furman's UR Program: Application, Review Process, Funding, and Student Engagement
We refer to our goal of improving the student research experience and assessing our progress along the way as leveling up. We set out to increase quality, enhance access, and use data to make programmatic changes. One of our main leveling-up strategies was to place the mentoring relationship at the center of our initiatives.

Application Changes
Although many faculty and students collaborate on UR projects during the academic year, we focus here on our summer research program. Changes to the application and review process offer a clear example of our shifting priorities toward mentoring and the characteristics of HIPs. Previously, our application process focused on the faculty member's project and its merits as measured against leading scholarship in the field. The review committee consisted of as many as 10 faculty members from across campus representing the four academic divisions (humanities, fine arts, social sciences, and natural sciences). The construction of the committee was designed to ensure that one or more experts in an associated field would assess every application. The committee functioned, essentially, as a grant-providing agency. It had a defined budget and an excess of applications, so its goal was to determine which projects merited funding. The main standards for award were the project's potential to advance scholarship and the faculty member's accomplishments in research and publication.
Notably absent from the application were questions related to mentorship and other features of HIPs, in regard to either the faculty members' ability to provide evidence of quality mentorship in the past or their intended approach to mentoring their current research fellows. Regardless, and much to the credit of Furman's faculty members, high-quality mentorship occurred, as evidenced by the number of awards our students received for conference presentations, the number of joint publications between students and faculty, and Furman's receipt of the Council on Undergraduate Research's AURA in 2016. The convergence of the QEP (Quality Enhancement Plan), TFA, and the hiring of new administrative personnel (a new director of undergraduate research and the new associate provost for engaged learning) provided the opportunity to assess the intended outcomes of our summer research program and to amend the application and review process accordingly. In the first year (2018), we retained much of the application content as it existed but added a series of seven questions related to HIPs and the faculty applicant's intention to meet their criteria (see Table 1).

Table 1. Furman undergraduate research faculty application questions.
Preparation: Describe the types of preparatory work (e.g., training, describing experience timeline, etc.) the fellow(s) will do before or at the beginning of the experience to ensure they get the most out of it. Be sure to include how you will make student-learning outcomes clear to the participant(s).

Relationships: Describe how this experience will help the fellow(s) build substantive (ongoing, meaningful) relationships, e.g., with faculty, staff, mentors/supervisor(s), peers, community members, etc. Also describe any opportunities the fellow(s) will have to collaborate with these parties.

Diversity: Describe how this experience will facilitate the fellow(s)' engagement across differences, through contact with people with different ideas, backgrounds, and experiences.

Feedback: Describe how you will provide the fellow(s) with feedback about their performance, including the frequency and level of detail and formality of the feedback. Will you give the fellow(s) the opportunity to make changes/adjustments based on the feedback you provide?

Real-world application: Describe how the fellow(s) will apply, integrate, and synthesize knowledge in the context of this experience. Will they have the opportunity to apply their knowledge to a novel problem or setting? If so, please describe.

Reflection: Describe how you will ask the fellow(s) to reflect on their learning and development, including the frequency, format (e.g., video diary, journal, etc.), and topic (e.g., problems encountered, how problems were dealt with, connection to academic work, knowledge/skills gained, etc.) of these reflections.

Presentations: Describe the frequency and format of oral presentations (formal and informal) you will expect your fellow(s) to make about their experience and/or the knowledge they have gained. At the minimum, this would include presenting at Furman Engaged Day on Tuesday, April , 20 .

Note. HIP = high-impact practice. Each entry pairs a HIP characteristic with the application item used to assess it.

The new questions drew broadly on conversations among several Furman faculty and staff (described in a later section), the work of George Kuh, among other scholars, and, as noted later, the reflection questions asked of students before and after their UR experience (Kuh, 2008; Kuh et al., 2013). Of course, we could not assume that every prospective faculty mentor was familiar with said scholarship, so this change to the application was preceded by outreach. The outreach efforts included but were not limited to focusing the preceding annual faculty fall retreat on ELEs and HIPs; integrating summary descriptions of HIPs into the application process and requiring both faculty and student applicants to read them before completing their applications; and holding targeted meetings by the director of undergraduate research with prospective faculty mentors, especially in disciplines comparatively underrepresented in summer research, for example, the humanities and fine arts.

Application Review Process and Funding Changes
In the 1st year of the transition, members of the review committee retained the traditional scholarship standards of the past but also considered the applicants' responses to these new items. Furthermore, they focused less on reasons to deny an applicant and instead provided constructive feedback such that, in a newly introduced revise-and-resubmit process, a slightly subpar application could be raised to the standard required for approval. In short, the review process became more of an educational process to instruct our campus community on mentorship and HIPs. To acknowledge faculty effort in addressing these areas and the care with which they responded to these items, once a faculty member's responses to the HIP questions are approved, they only have to complete them once every 3 years.
This transition to focusing on providing feedback rather than denying funding requests was facilitated by an infusion of extramural funds from The Duke Endowment as part of TFA. We found ourselves in the privileged position of potentially funding any application that rose to the merit of approval, rather than looking for reasons to deny applications because of a lack of funding. Regardless, even if that infusion of funding had not existed, we were shifting our priorities and our campus community's consciousness and culture toward mentorship and HIPs. While not abandoning standards of scholarship, we were trusting faculty members, and the departments that hire them and evaluate them for promotion, to serve as the arbiters of scholarship.
In the 2nd year (2019) of the revised application and review process, we made additional changes that further emphasized mentorship and HIPs. We reduced the questions relating to scholarship on the faculty application to a single 250-word summary and instead relied on the faculty member's curriculum vitae to provide evidence of their ability to contribute to their respective field. In the 3rd year (2020), we eliminated the curriculum vitae requirement and instead introduced new questions on the student portion of the application that related to the project's content and its potential to make an original contribution. We presumed that the student fellows could answer those questions only after consulting with their faculty mentors, so our intention was to create infrastructural conditions that would promote mentorship.

Student Engagement
Prior to these various transitions, the level of direct contact between the Office of Undergraduate Research (OUR) and the various summer research fellows was rather limited. The primary point of contact for fellows was their respective faculty mentor. We introduced a few modest but substantive requirements into the fellowship program aimed at priming students for reflection. First, we changed the April contract-signing meetings to emphasize HIPs. All summer research fellows are required to gather in a room together with the director of undergraduate research to sign their contracts. We retained this tradition for practical purposes, but we shifted the purpose of the meeting toward introducing the students to self-reflection practices and career competencies, such as those outlined by the National Association of Colleges and Employers (NACE).
We built upon these changes to the contract-signing meetings by creating a new training/enhancement program that required every summer research fellow to attend three 1-hr sessions focused on strategies of self-reflection and on recognizing the ways in which a summer research experience would contribute to the fellows' growth beyond disciplinary confines. The third of these sessions consisted of the fellows being divided into interdisciplinary subgroups of 10 each and then describing the ways in which their project enhanced their career competencies in a 4-min elevator pitch. Anecdotal evidence suggests that these interventions succeeded in reframing the students' approach. At the very least, compared to prior years' research fellows, they were able to describe their research experience in broader and more diversified ways, which we believe will enhance their ability to make their research experience more relevant in job interviews and graduate school applications.
We also looked for ways to celebrate mentorship and reinforce best practices without imposing on faculty members' time or academic freedom. One method was to establish a Faculty Mentors Appreciation Luncheon. During this luncheon, we recognize accomplishments in mentoring during the prior year, such as faculty and student awards and joint publications, and high-level administrators extend their appreciation to the mentors for their efforts. A faculty member with a proven record of high-quality mentoring also gives a keynote address.
As we have refocused our summer UR program on mentoring students, the number of summer research fellows has grown steadily over the past 4 years (see Table 2). In 2020, our growth would have been even greater had it not been for the COVID-19 pandemic, which forced Furman to close its campus for the summer and allowed us to fund only those projects that could convert to an entirely remote format. We still had a substantive increase of 32 projects over 2019, even though 46 projects had to be canceled owing to the pandemic.
The faculty in Table 2 represent each of our university's academic divisions. During this 4-year period, on average 60% of projects came from the natural sciences, 20% from the arts and humanities, and 20% from the social sciences. Every one of our 26 academic departments hosted at least one UR project in each of those years. Various factors account for the steady growth shown in Table 2, not the least of which is ample funding. But previously, one of the main hindrances to growth in UR was the number of faculty members willing and able to take on summer research fellows. The increase in the number of faculty mentors is perhaps the most striking aspect of Table 2. It reflects, among other things, an institutional commitment to and culture of providing students with HIPs as part of TFA. As just one small but representative example, nontenured faculty members account for much of the increase in fellows, likely because prospective faculty members are asked during their job interviews about their ability to incorporate students into research projects. For ease, we summarize the changes made to our UR program in Table 3.

Although we have increased student participation in summer research, one issue facing many UR programs is participation by underrepresented students, and Furman is no exception, despite our efforts to ensure that all students have access to HIPs. Traditionally, the Furman mentor-mentee relationship is established through informal mechanisms or interpersonal relations developed during the academic year. Reliance on these methods can hinder access for some students, especially those traditionally underrepresented. We attempted to rectify this by creating a central place, a link on the UR website, where faculty mentors who had a project but not yet an established fellow could advertise their availability. Participation in the 1st year was modest, only 10 of more than 100 mentors, but it is a start.
The sum of these changes is that an already strong UR program has become even more robust in a relatively short period of time, in part because of the focus on HIP alignment detailed above and in Table 3. An unexpected benefit of shifting our focus to HIPs and centering on the mentor relationship is that our faculty were primed to respond with agility to the COVID-19 pandemic. Indeed, most faculty mentors revised their original projects, in some cases to areas outside their expertise, to provide a remote experience for their students. Anecdotal evidence indicates that a portion of those mentors would have canceled their summer research program if their sole focus had been their own research agenda, but because the student experience took precedence, they made the pivot. To facilitate this process, the Faculty Development Center provided support workshops in April, May, and June 2020 to help faculty envision conducting research remotely for Summer 2020; 42 faculty participated in them.
Ironically, the pandemic provided us with the opportunity to become more proactive about tracking mentoring and encouraging best practices. For example, the OUR distributed a survey to all the faculty mentors asking them to describe their research plans, including any particular professional strengths they possessed that they would be willing to share with colleagues and any training needs they might have as a result of taking on a research program outside their area of expertise. The results of the survey were shared with all mentors. The objective was to encourage collaboration among faculty mentors. Indeed, in addition to reports of numerous faculty mentors reaching out to one another regarding research methodologies and mentoring strategies, often across disciplinary lines, we saw the emergence of some cross-disciplinary research communities in which student researchers collaborated and reported out to one another. The qualitative-research group, for example, included faculty and students from mathematics, modern languages, sociology, sustainability studies, and the Faculty Development Center. One outgrowth of these activities was a professional development webinar, cohosted with neighboring Wofford College, entitled "How the COVID-19 Pandemic Made Me a Better Research Mentor." Eight research mentors shared vignettes of new mentoring strategies they adopted as a consequence of shifting to virtual projects and, notably, indicated which ones they intend to retain even after they are able to return to in-person mentoring.
While the pandemic was costly, our end-of-summer student surveys reveal that the high quality of our faculty mentoring was not only retained but even improved in key areas. For example, in response to the question, "How often did you receive substantive feedback (either in person or virtual) from your faculty mentor?" the percentage of students responding "very often" increased to 82% in 2020, up from 2019. Similarly, in response to the prompt, "The preparation for this experience from my faculty mentor...," the percentage of students responding "was about right" increased from 82% in 2019 to 91% in 2020. While various factors might account for these improvements amid the pandemic, we like to think that our efforts to center mentoring in our leveling-up activities bear some responsibility.

How We Count and Track UR Experiences
Not only have we shifted the UR culture to integrate student-centeredness and HIPs, but we have also redesigned the way we assess experiences by hearing from students and making programmatic changes. A first step is to accurately identify which students engage in HIPs. The CEL, in partnership with the Office of Institutional Assessment and Research, created a tracking system that involves several data collection points (see Table 4 for these data). For UR, tracked experiences include semester credit-bearing experiences (e.g., senior thesis) and the full-time summer experiences that are vetted by the CEL and described in the section above. On a senior survey completed right before graduation (averaging a response rate above 80%), students also self-report whether or not (and when) they had an ELE (including a summer or academic-year UR experience). Finally, students (except for 1st-year students) complete an engaged learning checklist during their fall meetings with their academic advisors. Using a comprehensive list of possible experiences, advisors and students discuss (and check off) which ELEs the student had during the previous academic year and summer (including UR). These forms have a response rate of approximately 54% and are submitted to the CEL. Student self-reports via the senior survey and the checklist data allow us to triangulate on experiences and check the accuracy of our tracking data and methodology. Throughout the assessment portion, unless otherwise indicated, we report data on UR (and other ELEs) based on tracking data. As depicted in Table 4, tracking data provide a more conservative approach to counting ELEs. (Table 4 note: The Class of 201  is not included because the senior survey was not administered that year. Tracked experiences rely on data from transcripts and participation data from the Center for Engaged Learning, thereby making these data more reliable in terms of the nature and quality of the experience.) We use the discrepancies between self-report and tracking data to refine and check our tracking system. We do not report the engaged learning checklist data here because of the low response rate; those data are similarly used to refine and check our tracking system.
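To make the triangulation step concrete, the sketch below shows one minimal way, with hypothetical column names and made-up records rather than Furman's actual data, that tracked and self-reported UR participation could be cross-tabulated so that discrepancies surface for review.

```python
# Illustrative sketch only (not Furman's actual pipeline): reconciling tracked and
# self-reported UR participation. Column names and records are hypothetical.
import pandas as pd

tracked = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5],
    "ur_tracked": [True, True, False, False, True],      # transcript/CEL participation data
})
survey = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5],
    "ur_self_report": [True, True, True, False, False],  # senior-survey self-report
})

merged = tracked.merge(survey, on="student_id", how="outer")

# Cross-tabulate agreement between the two sources; off-diagonal cells flag
# records to investigate when refining the tracking system.
print(pd.crosstab(merged["ur_tracked"], merged["ur_self_report"], dropna=False))
```

In a sketch like this, the disagreements are the cases one would hand-check against transcripts and participation records, which is the spirit of how we use the discrepancies described above.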

Overview of Our Assessment Plan
Our assessment plan (which addresses the outcomes we proposed to monitor and improve upon in our QEP) includes measuring (1) student perceptions of how well their experience aligned with the characteristics of HIPs, (2) the impact of the experience, based on students' expectations before and perceptions after the experience, as well as a postexperience reflection, and (3) postgraduation outcomes such as having a job at graduation or being enrolled in postgraduate study. Including these diverse constructs as well as assessment types provides a robust, evidence-based, student-centered approach to continue to improve upon our UR program. Furthermore, both the quantitative and the qualitative aspects of our assessment process foster student reflection, thereby enhancing students' tendency to think critically about their experience. Although it may be the case that certain types of students pursue UR experiences, and thus assessing the impact on those students provides a biased or perhaps inflated view of the impact of UR, below we compare key UR outcomes to outcomes for students who had internships and other ELEs, putting these data into a broader context. Furthermore, for those who do have a summer UR experience, we have several mechanisms in place to ensure a high response rate. For example, students receiving summer research fellowships and/or those who choose to document their ELE on their official transcript as a zero-credit course are required to complete a presurvey, a postsurvey, and a written reflection about their experience.

Assessment of How Well UR Experiences Conform to Furman HIPs or Engaged Learning Characteristics
The postsurvey asks students whether the experience conformed to Furman-defined characteristics of HIPs. We determined these characteristics by consulting the literature on HIPs (Kuh, 2008; Kuh et al., 2013), experiential learning (Evans, Forney, Guido, Patton, & Renn, 2010; Kolb, Boyatzis, & Mainemelis, 2001), and applied learning (National Society for Experiential Education, 1998); note that these characteristics match the questions in Table 1 used for summer research applications. Although these different categories of immersive learning overlap more than they differ, they use different labels to represent similar concepts (e.g., monitoring experiences vs. receiving substantive feedback), they emphasize different elements of the experience (e.g., interaction with diverse groups or ideas), and some have unique features (e.g., applied learning includes authenticity as a key element; the others do not). Using this information, a committee of faculty and staff, as well as a team that attended the Association of American Colleges and Universities' HIP Institute, contributed to the discussion and final determination of the HIP characteristics that best fit the goals and learning objectives outlined in the QEP. These were shared with faculty and staff in various forums to obtain feedback.
These characteristics and the items used to assess them are presented in Table 5. A large percentage of students report that their mentor prepared them for the experience and gave the right amount of feedback to allow them to make changes and improve, which suggests that faculty are engaged in the mentoring process. Indeed, the results of a simple linear regression show that, on the senior survey, participation in UR positively predicted students' responses to the item, "At Furman, I had a mentor who encouraged me to pursue my goals and dreams" (rated on a 1 to 5 scale from strongly disagree to strongly agree), R² = .02, F(1, 500) = 10.0, p < .01. The number of reported undergraduate research experiences significantly predicted mentorship (β = .13, 95% confidence interval [.048, .205], p < .01). These data also show areas for improvement, including incorporating more reflection during the experience. Note that students having summer UR experiences were required to reflect at the completion of their experience as part of our effort to incorporate reflection systematically into ELEs. Furthermore, because we use the same assessment for all ELEs, we can compare UR experiences with other ELEs to provide more context (as presented in Table 5). In sum, measuring how students perceive the presence or absence of Furman-valued characteristics of HIPs provided a way to check whether our efforts to move UR toward a more student-centered approach are working and focused our efforts on professional development for both faculty and students.

Table 5. Student postsurvey ratings of Furman HIP characteristics, by engaged learning experience. (Only a representative item is reproduced here: "How often were you asked to write reflections on your learning and development?" with the percentage responding "weekly or daily" ranging from 29% to 79% across the four experience types; percentages with different subscripts differ based on 99% confidence intervals. Other items were rated on scales such as hours per week [fewer than 10; 10-19; 20-30; more than 30] and experience length [was far too short; was too short; was about right; was too long; was far too long]. HIP = high-impact practice; ELE = engaged learning experience.)
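For readers who wish to run a comparable analysis, the following is a minimal sketch of a simple linear regression of the form described above, written in Python with statsmodels; the variable names and data are invented for illustration, and the coefficient reported in the text is presumably standardized, so the sketch shows the shape of the analysis rather than reproducing our results.

```python
# Illustrative only: regressing a senior-survey mentorship rating on the number of
# tracked UR experiences. Data and variable names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "n_ur_experiences": [0, 0, 1, 1, 2, 0, 3, 1, 2, 0],  # tracked UR experiences per graduate
    "mentor_rating":    [3, 4, 4, 5, 5, 2, 5, 4, 4, 3],  # 1 = strongly disagree ... 5 = strongly agree
})

model = smf.ols("mentor_rating ~ n_ur_experiences", data=df).fit()
print(model.rsquared)              # variance explained (analogous to the reported R-squared)
print(model.params)                # intercept and slope for n_ur_experiences
print(model.conf_int(alpha=0.05))  # 95% confidence intervals for both coefficients
```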

Assessment of Impact
Self-report. In addition to understanding how students perceive factors such as the engagement of their mentor or the helpfulness of feedback, we also assessed their perception of the impact the experience had on them. The survey described above asks students to self-report what level of impact their research experience had on their future plans. This assessment is unique in that it asks students in the presurvey to reflect on and indicate, on a rating scale, what level of impact they expect the experience to have and then asks in the postsurvey whether the experience met that expectation. A slight majority of UR students reported the experience met those expectations; 25% reported their UR experience had a higher level of impact than expected, while fewer than 20% chose a lower level, which is about the same as for the other experiences cataloged. To reliably examine the presurvey versus postsurvey impact data and compare UR to other ELEs, as well as to examine whether the characteristics of HIPs reported in Table 5 predict UR impact, we needed to collect more data. One challenge of this analysis strategy is that it requires student responses for both the pre- and postassessment. Below, we report more impact data, but only for the postassessment, for which we have more responses.
In the postsurvey, we also asked students how much the experience changed their worldview, to what extent it allowed them to apply what they learned in the classroom, and how much it influenced their career plans. See Table 6 for the results on these items and how they compare to students' reactions to internship, study away, and 1st-year writing seminar experiences. These results suggest that impact is not a singular construct and that each kind of ELE may have a unique kind of impact on students. For UR, these data, taken together with the data on reflection in Table 5, may indicate that we could be more intentional about having students reflect on how their research fits into a broader context and how their experience can influence their perception of their place in the world, regardless of the content of their research.

Table 6. Postsurvey self-report of impact (impact items by engaged learning experience).
Impact on careers and transition to life after college: First Destinations and Clearinghouse data. In addition to using student self-reports about their experience to assess the impact of UR, as part of Furman's QEP we proposed that ELEs should affect postgraduate plans, providing a concrete measure of impact. Here, we include preliminary results showing that students who participate in UR are more likely to pursue postgraduate education. Data for the graduating classes of 2018 and 2019 were compiled by comparing self-reported postgraduate plans on Furman's senior survey; the First Destinations Survey, which is completed 6 months after graduation; National Student Clearinghouse records, which indicate subsequent enrollment at another institution of higher education; and LinkedIn records where available. This multidata approach should provide an accurate representation of postgraduate outcomes that is often absent from other articles reporting similar outcomes (see Table 7 for a summary).

Table 7. Postgraduate outcomes by ELE participation, based on tracking data.
Table 7 shows that students who participated in at least one research experience were more likely to continue their studies (in graduate, medical, or law school) than graduates who had study away or internship experiences, but they were less likely to pursue employment. Note that students could be included in more than one category of ELE because they often participate in multiple ELEs while at Furman.
Another way to capture the effect of the number of ELEs (specifically research, internship, and study away) on postgraduation outcomes was to conduct a binary logistic regression, using continuing education versus full-time employment as the outcome (0 for employment, 1 for continuing education) and the three ELE types (research, study away, and internship) as predictors. As shown in Table 8, of the three ELE types included, only research was a significant positive predictor of continuing education, such that as the number of research experiences increases, so too does the likelihood of continuing on to graduate or professional school.
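As an illustration of the form this model takes, here is a minimal sketch in Python with statsmodels, using made-up data and hypothetical variable names rather than our records; it simply shows how the binary outcome coding and the three ELE counts enter a logistic regression of this kind.

```python
# Illustrative only: logistic regression of postgraduation outcome (0 = full-time
# employment, 1 = continuing education) on counts of three ELE types. Data are invented.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "continuing_ed": [1, 0, 1, 0, 0, 1, 0, 1, 0, 1, 0, 1],
    "research":      [2, 0, 1, 0, 1, 0, 0, 2, 2, 1, 0, 1],  # number of UR experiences
    "study_away":    [1, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 0],
    "internship":    [0, 1, 1, 2, 0, 0, 1, 1, 2, 0, 1, 1],
})

model = smf.logit("continuing_ed ~ research + study_away + internship", data=df).fit(disp=0)

# Coefficients are log-odds; a positive, significant coefficient for research would mean
# that each additional research experience raises the odds of continuing education.
print(model.params)
print(model.conf_int())
```

The same specification, restricted to graduates who did not continue their education and with employment as the outcome, corresponds to the analysis reported in the following paragraphs.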

Table 8. Predicting postgraduation outcomes from the sum of ELEs.
For those graduates who do not continue on to graduate school (or pursue another professional degree), the remaining options are (primarily) employment or nonemployment. We could then also test whether UR predicts full-time employment postgraduation. To this end, we conducted a binary logistic regression, excluding all graduates who went on to graduate school, in which full-time employment (coded as 1) or not employed (coded as 0) was the outcome and the same three ELE types were the predictors. Research experiences do not predict employment, but internship experiences do; that is, as the number of internships increases, so too does the likelihood of finding full-time employment.

Reflection as Assessment
As described at the outset of the paper, one way we modified our approach to UR was to emphasize to students the importance of reflecting on their UR experience before, during, and after it and to provide them with some guidance on how to reflect through required professional-development sessions. Although the UR students' self-reports of how often they engaged in reflection suggest that this is an area we can improve upon, reflection is an integral part of our assessment process. We have been able to collect robust quantitative data on students' perceptions of the UR experience. The structure of our assessments fosters reflection because students articulate their expectations for the experience at the outset and then reflect at the end of the experience on whether it met those expectations. Students respond to one of two prompts aimed at encouraging reflection on either (1) their sense of purpose or (2) integrative learning. Both of these learning outcomes were part of our QEP application. Thus, the reflection assignment ensures that students reflect on their experience, and it provides a way to assess their experience. We are in the process of reviewing the reflections, of which to date we have more than 200.
To summarize, our assessment of UR, which includes preexperience expectations and postexperience perceptions as well as reflections and postgraduate outcomes, (1) mirrors the changes we made to focus ELEs on the student experience and (2) provides us with an evidence-based approach to continue to refine our student-centered approach.

Conclusion

Higher education institutions face integrated yet competing challenges to demonstrate their value. Whether it is by demonstrating student learning and career preparedness or by continuing to find innovative ways to tell the institutional story, they are struggling, now more than ever in the wake of the COVID-19 pandemic, to illustrate their value to the public. By placing students at the center via HIP alignment at every level of collaboration (students, faculty, department units, and university programs), we were able to accomplish positive outcomes in a compressed time frame, although we still have work to do. We share this case study as an example of one way to refine rapidly (by higher education standards) an already strong program in support of student success. As we collectively embark on reimagining higher education in the post-COVID years, keeping students and HIPs at the center of the story will serve us well.