Browsing by Author "McCormick, A."
Now showing 1 - 18 of 18
Item: Assessing and enhancing student engagement and success (2009-10-26)
Kinzie, J.; McCormick, A.

Item: Assessing inclusiveness and engagement with cultural diversity: Assuring success for all (2018-01-25)
Kinzie, J.; McCormick, A.; Gonyea, R.; BrckaLorenz, A.
Institutional support for diversity, inclusivity, and cultural responsiveness is an imperative for higher education, given demographic projections and the needs of a pluralist society. In 2017, the National Survey of Student Engagement (NSSE) added an optional question set asking students about inclusive teaching practices in courses, intercultural learning, and perceptions of their institution's cultural responsiveness. This session highlights findings from this item set, discusses the relationship between these activities and other effective educational practices, examines how these relationships vary between traditionally marginalized students and more privileged students as well as by major field, and includes a discussion of the opportunities and challenges educators face as they seek to improve inclusion, engagement with diversity, and cultural responsiveness. Discussion includes how campus leaders can use these findings to create environments that more fully support students of all backgrounds, leverage the educational benefits of diversity, and promote transformative learning outcomes.

Item: Civic learning and effective educational practice: A focus on service-learning and civic engagement (Association of American Colleges & Universities Annual Meeting, 2014-01-24)
Kinzie, J.; McCormick, A.; Stevens, M.
Fostering civic learning is a central purpose of higher education. As campuses seek to strengthen democratic engagement, it is useful to understand the extent to which students have access to the experiences that develop civic engagement skills. This session will promote discussion about educational practices that build students' capacity for civic learning by exploring National Survey of Student Engagement (NSSE) findings from an in-depth service-learning item set and the new civic engagement module. Join us to exchange ideas about results in relation to efforts to enhance service learning, strengthen connections between course content and service experiences, and examine outcomes associated with civic engagement skills.

Item: College students' experiences with writing: What do we know, and how are institutions applying local findings? (Association of American Colleges & Universities Annual Meeting, 2016-01-21)
Kinzie, J.; Gonyea, R. M.; McCormick, A.; Paine, C.; Blake, L. P.
AAC&U's Essential Learning Outcomes and the Degree Qualifications Profile identify writing as a key outcome, and virtually all colleges and universities aim to develop proficient writers. Recent evidence suggests that experiences critical to developing writing competence correspond to broader benefits in student learning, and "Writing Across the Curriculum" initiatives make clear that the responsibility to develop this important competency is shared across departments. Consistent with AAC&U's emphasis on enhancing institutional structures and practices to support student success, many institutions monitor students' exposure to and participation in effective educational practices through student surveys such as the National Survey of Student Engagement (NSSE). To enable deeper examination of specific practices and experiences, in 2013 NSSE began offering a menu of topical modules to complement the core survey's breadth of focus. Results from topical modules provide a fresh opportunity to "drill down" on educational quality and make targeted improvements in teaching and learning. One of the new modules investigates students' experiences with writing. Whereas the core survey focuses on the number of assigned papers of various lengths, the writing module probes a range of activities and experiences promoted by those who teach composition: interactive writing processes, meaning-making tasks, and clarity of instructor expectations for writing assignments. This research-informed panel session (1) highlights recent findings from NSSE's Experiences with Writing module, including how these experiences vary across subpopulations and major fields, and (2) provides examples of how institutions are making productive use of their results. NSSE researchers report on large-scale findings, and two panelists share what they have learned and how they are using results to guide improvement.

Item: Exploring how course evaluation outcomes are collected, shared, and used (Association for Institutional Research Annual Forum, 2014-05-28)
BrckaLorenz, A.; McCormick, A.; Peck, L.
End-of-course evaluations are a widely used means of assessing student learning experiences and provide opportunities for faculty to refine their teaching and course content. However, the ways institutions collect and share those results vary. Using data from the 2013 National Survey of Student Engagement and Faculty Survey of Student Engagement administrations, this presentation examines how different types of institutions collect and distribute course evaluation results, how much students access course evaluation information, and how much faculty use course evaluation information to improve their courses and teaching. Student use of external evaluation sources (e.g., ratemyprofessor.com) to select courses is also examined.

Item: Getting to use: What stimulates and impedes use of student engagement results? (Association for Institutional Research Annual Forum, 2015-05-27)
Kinzie, J.; McCormick, A.; Olsen, D.; Blaich, C.; Wise, K.
The ultimate goal of assessment projects, including the National Survey of Student Engagement (NSSE), is not to gather data; it is to catalyze improvement in undergraduate education. Yet moving from data to campus action is challenging. This session addresses the challenges of data use, blending expert panelist insights with focused audience discussion about what stimulates and impedes action. With the updated NSSE in mind, panelists and the audience consider broad topics about using evidence, including sharing results, anticipating evidence use, striving for perfect data, involving students, and planning for action, and also discuss what promotes effective data use.

Item: High school and expected first-year engagement: A motivation perspective (Association for Institutional Research Annual Forum, 2009-06-03)
Cole, J.; McCormick, A.

Item: Influences on course evaluation: Student and faculty perspectives (Professional & Organizational Development Conference, 2013-11-07)
BrckaLorenz, A.; Peck, L.; McCormick, A.
End-of-course evaluations are often used as stand-alone indicators of an instructor's overall effectiveness and as input for faculty promotion and tenure decisions. In this session, we will examine results from a large-scale, multi-institution survey of students and faculty responding to questions about the uses of and influences on students' ratings on course evaluations. Participants will discuss findings on the perceptions and uses of course evaluations from both the student and faculty perspectives, share examples of course-evaluation issues and solutions on their campuses, and generate ideas for future research on course evaluations.

Item: Making the most of NSSE: An overview of updates, customization options, reports, and applications (Association for Institutional Research Annual Forum, 2016-06-01)
Gonyea, R.; Kinzie, J.; McCormick, A.
This session highlights new features, including the mobile-optimized survey and LMS/portal promotion, as well as uses of student engagement results. Participants and NSSE staff will exchange ideas about the project and its reports. Current and new users are encouraged to attend!

Item: Measuring change: Using multi-year analysis of National Survey of Student Engagement results to assess educational improvement (2009-06-03)
Kinzie, J.; McCormick, A.; Korkmaz, A.; Buckley, J.

Item: Putting NSSE Results to Use: Don't Stop with Your "Average Student" (2015 NSSE User Workshop at Bucknell University, 2015-04)
McCormick, A.

Item: Rating my professors: Influences on student ratings and faculty beliefs about those influences (American Educational Research Association Annual Meeting, 2014-04-05)
BrckaLorenz, A.; McCormick, A.; Peck, L.
Little research exists on the differences between student and faculty perceptions of what influences how students respond to end-of-course evaluations. This study provides evidence of both similarities and differences between what influences students' course evaluation ratings and faculty members' perceptions of what influences those ratings. The differences offer insight into how both faculty and students perceive the purpose, use, and results of these evaluations. Findings also indicate that different types of students have different perceptions of how evaluation results are used, and that different types of faculty use these results more or less frequently. Finally, connections are made between perceptions of campus support and perceptions of whether or not results are used to improve courses and teaching.

Item: Supporting first-generation students: The role of learning communities to promote deep learning (Association of American Colleges & Universities Annual Meeting, 2009-01-23)
Cole, J. S.; McCormick, A.; Kinzie, J.

Item: The updated NSSE: Fresh opportunities to engage faculty in assessment results (Association of American Colleges & Universities Annual Meeting, 2015-01-23)
Dueweke, A.; Hutchings, P.; Kinzie, J.; McCormick, A.
The updated National Survey of Student Engagement (NSSE) provides greater specificity on measures that matter for improving student learning. Yet results too often reside at the institution level, serving as broad measures of educational quality and only occasionally reaching faculty in ways that influence teaching and learning practice. The promise of assessment depends on growing and deepening faculty involvement and use of results. This session explores the question: "What do NSSE results mean for faculty?" Panelists and participants will address this topic and discuss ways to leverage student engagement results to inform instruction and efforts to enhance high-impact practices, guide faculty development initiatives, and connect to the scholarship of teaching and learning and to projects to improve educational quality.

Item: Using evidence to promote effective educational practice and the success of all students (SHEEO Higher Education Policy Conference, 2018-08-07)
Hayek, J.; Kinzie, J.; McCormick, A.
Combining findings from the National Survey of Student Engagement (NSSE) with insights from a system-level chief academic officer, this session first provides an overview of public institutions' student engagement results by race/ethnicity and first-generation status, including results suggesting progress in American higher education toward providing welcoming, supportive environments for all students, and a positive association between participation in high-impact practices and higher levels of satisfaction and perceived support for all racial/ethnic groups. The presenters then highlight new evidence on students' experiences with a variety of inclusive and culturally engaging practices, showing, by student characteristics such as racial/ethnic identity, gender identity, and sexual orientation, how these activities relate to educational practices that promote learning and development and to students' perceived gains in areas such as informed, active citizenship and understanding people of other backgrounds. The session concludes by discussing how institutions and states can best promote equitable experiences and what the findings on inclusivity and cultural diversity suggest for preparing students to participate in a diverse workplace and a globally interconnected world.

Item: Using NSSE to Explore Campus Issues and Take Action! (2015 NSSE User Workshop at Bucknell University, 2015-04)
Kinzie, J.; McCormick, A.; Harring, K.; Liu, R.; Horissian, K.
The ultimate goal of NSSE is not to gather data; it is to catalyze improvement in undergraduate education. What fosters the shift from data to action? This session provides an opportunity to practice applying NSSE results, to learn about the approaches two NSSE institutions have taken to make effective use of results, and to discuss strategies for action. Please be prepared to share your NSSE challenges and successes, and to ask questions!

Item: What influences end-of-course evaluations? Teaching and learning vs. instrumental factors (American Educational Research Association Annual Meeting, 2015-04-18)
McCormick, A.; BrckaLorenz, A.
Student evaluations of courses and teaching, in the form of end-of-course surveys, are ubiquitous in higher education, and at many institutions they serve as the primary basis for evaluating teaching effectiveness in the promotion and tenure process. Course evaluations of teaching are also controversial: it is often asserted that students use them to reward professors for easy courses and punish them for demanding ones, and many faculty believe that students' evaluations are influenced by their expected grade. This study investigates the relative influence of teaching-and-learning factors versus instrumental factors on students' overall course evaluation ratings, using data from a diverse sample of 44 four-year institutions.

Item: What's your threshold? How international students use vague quantifiers of behavioral frequency in student surveys (Association for Institutional Research Annual Forum, 2019-05-30)
McCormick, A.; Rocconi, L.; Dumford, A.
Many student surveys on the undergraduate experience ask respondents to quantify various aspects of their collegiate experiences. Studies of the survey response process have shown that enumeration is cognitively challenging and subject to various forms of error. As an alternative to enumeration, many surveys use vague quantifiers (VQs) such as "very often," "often," "sometimes," "rarely," or "never." Previous research generally supports the use of VQs from a concurrent validity standpoint. One important area that has gone unexplored involves international students: how they interpret and use VQs, and whether these interpretations and uses differ from those of domestic students. A related concern involves heterogeneity within the international student population. To the extent that the use of VQs differs between international and domestic students, or among international student populations, users of student survey data should be aware of these differences and their implications for the validity and interpretation of results. Informed by these concerns, we ask the following research questions: (1) To what extent do domestic and international students interpret VQs differently? (2) Among international students, to what extent do students from Western and non-Western countries interpret VQs differently?