Deconstructing the Testing Mode Effect: Analyzing the Difference Between Writing and No Writing on the Test

Daniel M. Settlage
Jim R. Wollscheid (https://orcid.org/0000-0002-0667-7536)

Abstract

The testing mode effect has received increased attention as higher education shifted to remote testing during the COVID-19 pandemic. We argue that the testing mode effect should be broken into four distinct subparts: the ability to physically write on the test, the method of answer recording, the proctoring/testing environment, and the effect testing mode has on instructor question selection. This paper examines an area largely neglected by the testing mode literature: the ability (or lack thereof) to write on the test. Using a normalization technique to control for student aptitude and instructor bias, we find that removing students' ability to physically write on the test significantly lowers their performance. This finding holds across multiple question types classified by difficulty level and by Bloom's taxonomy, as well as on figure/graph-based questions, and it has implications for testing in both face-to-face and online environments.
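The abstract does not spell out the normalization procedure; the full paper does. As a rough illustration of the general idea only, the sketch below z-scores each exam score within its own course section, so that section-level differences in student aptitude and instructor grading norms are removed before the two testing modes are compared. The (section, mode, score) layout, the section labels, and every score value are hypothetical, and this is not the authors' published code.

```python
# A minimal sketch of within-section score normalization, assuming a
# simple (section, mode, score) data layout. Illustration only; not the
# authors' actual procedure or data.

from statistics import mean, stdev

# Toy records: (section_id, testing_mode, exam_score) -- all made up.
records = [
    ("A", "write", 82), ("A", "write", 74),
    ("A", "no_write", 69), ("A", "no_write", 71),
    ("B", "write", 91), ("B", "write", 88),
    ("B", "no_write", 84), ("B", "no_write", 80),
]

# Compute each section's mean and standard deviation so each score is
# judged relative to that section's own students and grader.
scores_by_section = {}
for section, _, score in records:
    scores_by_section.setdefault(section, []).append(score)
section_stats = {s: (mean(v), stdev(v)) for s, v in scores_by_section.items()}

# Z-score every record within its section, then pool by testing mode.
pooled = {"write": [], "no_write": []}
for section, mode, score in records:
    mu, sd = section_stats[section]
    pooled[mode].append((score - mu) / sd)

for mode, zs in pooled.items():
    print(f"{mode}: mean normalized score = {mean(zs):+.3f}")
```

With real data, the pooled normalized scores for the two modes would then be compared with a standard difference-in-means test.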

Article Details

How to Cite
Settlage, D. M., & Wollscheid, J. R. (2024). Deconstructing the Testing Mode Effect: Analyzing the Difference Between Writing and No Writing on the Test. Journal of the Scholarship of Teaching and Learning, 24(2). https://doi.org/10.14434/josotl.v24i2.35209
Author Biographies

Daniel M. Settlage, University of Arkansas-Fort Smith

Professor of Economics, University of Arkansas-Fort Smith College of Business

Jim R. Wollscheid, University of Arkansas-Fort Smith

Professor of Economics, University of Arkansas-Fort Smith College of Business and Industry

References

Abdul, Baba, Olusola O. Adesope, David B. Thiessen, and Bernard J. Van Wie. 2016. Comparing the effects of two active learning approaches. International Journal of Engineering Education 32, no. 2: 654–669.

Backes, Ben, and James Cowan. 2019. Is the pen mightier than the keyboard? The effect of online testing on measured student achievement. Economics of Education Review 68: 89–103. https://doi.org/10.1016/j.econedurev.2018.12.007

Bretz, Stacey Lowery, and LaKeisha McClary. 2015. Students’ Understandings of Acid Strength: How Meaningful Is Reliability When Measuring Alternative Conceptions? Journal of Chemical Education 92, no. 2: 212–219. https://doi.org/10.1021/ed5005195

Clariana, Roy, and Patricia Wallace. 2002. Paper-based versus computer-based assessment: key factors associated with the test mode effect. British Journal of Educational Technology 33, no. 5: 593–602. https://doi.org/10.1111/1467-8535.00294

Clark, Richard E. 1994. Media will never influence learning. Educational Technology Research and Development 42, no. 2: 21–29. https://doi.org/10.1007/BF02299088

Hake, Richard R. 1998. Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics 66, no. 1: 64–74. https://doi.org/10.1119/1.18809

The International Test Commission. 2006. International Guidelines on Computer-Based and Internet-Delivered Testing. International Journal of Testing 6, no. 2: 143–171. https://doi.org/10.1207/s15327574ijt0602_4

Katz, Irvin R., Randy Elliot Bennett, and Aliza E. Berger. 2000. Effects of Response Format on Difficulty of SAT-Mathematics Items: It’s Not the Strategy. Journal of Educational Measurement 37, no. 1: 39–57. https://doi.org/10.1111/j.1745-3984.2000.tb01075.x

Marshman, Emily, and Chandralekha Singh. 2016. Interactive tutorial to improve student understanding of single photon experiments involving a Mach–Zehnder interferometer. European Journal of Physics 37, no. 2: 1–22. https://doi.org/10.1088/0143-0807/37/2/024001

Prisacari, Anna Agripina, and Jared Danielson. 2017a. Rethinking testing mode: Should I offer my next chemistry test on paper or computer? Computers & Education 106: 1–12. https://doi.org/10.1016/j.compedu.2016.11.008

Prisacari, Anna Agripina, and Jared Danielson. 2017b. Computer-based versus paper-based testing: Investigating testing mode with cognitive load and scratch paper use. Computers in Human Behavior 77: 1–10. https://doi.org/10.1016/j.chb.2017.07.044

Randall, Jennifer, Stephen Sireci, Xueming Li, and Leah Kaira. 2012. Evaluating the Comparability of Paper- and Computer-Based Science Tests Across Sex and SES Subgroups. Educational Measurement: Issues and Practice 31, no. 4: 2–12.

Settlage, Daniel M., and Jim R. Wollscheid. 2019. An Analysis of the Effect of Student Prepared Notecards on Exam Performance. College Teaching 67, no. 1: 15–22. https://doi.org/10.1080/87567555.2018.1514485

Sherman, Tyler J., Tanner M. Harvey, Emily A. Royse, Ashley B. Heim, Cara F. Smith, Alicia B. Romano, Aspen E. King, David O. Lyons, and Emily A. Holt. 2019. Effect of quiz format on student performance and answer-changing behaviour on formative assessments. Journal of Biological Education 55, no. 3: 1–15. https://doi.org/10.1080/00219266.2019.1687106

Siegfried, John J., and Peter E. Kennedy. 1995. Does Pedagogy Vary with Class Size in Introductory Economics? The American Economic Review 85, no. 2: 347–351.

Smolinsky, Lawrence, Brian D. Marx, Gestur Olafsson, and Yanxia A. Ma. 2020. Computer-Based and Paper-and-Pencil Tests: A Study in Calculus for STEM Majors. Journal of Educational Computing Research 58, no. 7: 1256–1278. https://doi.org/10.1177/0735633120930235

Vispoel, Walter P., Carrie A. Morris, and Sara J. Clough. 2019. Interchangeability of Results From Computerized and Traditional Administration of the BIDR: Convenience Can Match Reality. Journal of Personality Assessment 101, no. 3: 237–252. https://doi.org/10.1080/00223891.2017.1406361

Wainer, Howard, and David Thissen. 1993. Combining Multiple-Choice and Constructed-Response Test Scores: Toward a Marxist Theory of Test Construction. Applied Measurement in Education 6, no. 2: 103–118. https://doi.org/10.1207/s15324818ame0602_1

Walstad, William B. 1998. Multiple choice tests for the economics course. In W. B. Walstad & P. Saunders (Eds.), Teaching undergraduate economics: A handbook for instructors, 287–304. McGraw-Hill.

Walstad, William B., and William E. Becker. 1994. Achievement Differences on Multiple-Choice and Essay Tests in Economics. The American Economic Review 84, no. 2: 193–196.