Scantron and Modern MCQs

I spent thirteen years in university, and Scantron tests were a part of the woodwork. This Atlantic article says that with digital tests they are a thing of the past. And so are multiple choice questions. I doubt it.

Sure, the big exams are on computer, but that doesn’t apply to every exam. Universities probably hold hundreds of tests on some days, computer labs have limited space, and many tests certainly don’t require a computer. Not all schools or countries have equal access to computers, either. And multiple choice persists because it is easy to mark, digitally or otherwise, while doing an adequate job of testing concepts. Essays are still tougher to mark.

So am I wrong? Is the article right that today’s kindergarteners may never see a Scantron, or even a multiple choice question?

Given that virtually all students have their own laptops, I assume the tests are on those, rather than on university computers in the computer lab. And the tests may in fact be taken remotely, though proctoring would be easier if the tests were conducted in the classroom. (But I haven’t been in college in decades, so I may have no idea how things work now.)

I attended college from 2017 to 2021. One class had scantrons.

During COVID, tests were ultimately restructured to explicitly allow open notes (on the assumption that students were going to look at their notes no matter what). These were done online through the school’s selected coursework management system (Canvas).

When I took my fundamentals of engineering exam in 2020 (an MCQ exam), it was done on a computer at a testing facility.

I don’t have any problem believing that people would just take tests using computers. Even if you don’t have a laptop, a tablet would suffice.

But it would seem odd to me for multiple choice to die out. It is still easier to grade than the alternatives. I could see fill-in-the-blank becoming a larger portion, since that wouldn’t be too hard to grade automatically (assuming you can handle spelling issues and synonyms), but I wouldn’t expect it to take the place of all multiple choice.
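
To make that concrete, here is a rough sketch of the kind of matching an auto-grader could do for fill-in-the-blank answers, assuming a per-question list of accepted forms and a simple fuzzy match for typos. The function name and the 0.85 threshold are invented for illustration; no particular courseware product works exactly this way.

```python
# Hypothetical fill-in-the-blank auto-grader: accept any listed synonym,
# and tolerate small spelling errors via fuzzy string matching.
from difflib import SequenceMatcher

def is_acceptable(student_answer: str, accepted: list[str], threshold: float = 0.85) -> bool:
    """Return True if the answer matches any accepted form closely enough."""
    normalized = student_answer.strip().lower()
    for correct in accepted:
        similarity = SequenceMatcher(None, normalized, correct.lower()).ratio()
        if similarity >= threshold:
            return True
    return False

# Example: typo-tolerant match plus a synonym list.
accepted_answers = ["mitochondria", "mitochondrion"]
print(is_acceptable("mitochondira", accepted_answers))  # True  (minor transposition)
print(is_acceptable("nucleus", accepted_answers))       # False
```

Synonyms here are just extra entries in the accepted list; the cutoff is a judgment call the instructor would have to tune.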

And any other style I can think of would not be significantly easier to grade accurately in digital form. I guess the computer could help out by highlighting key words and such, but you’d still need to manually grade anything involving sentences or essays.

Won’t be long until AI is grading short answer tests. No need for MCQ at that point.

After students use AI to generate the answers to the tests, of course.

In response to the suggestion that AI “will be the death of learning,” the philosopher Slavoj Žižek has allegedly said “NO! My student brings me their essay, which has been written by AI, & I plug it into my grading AI, & we are free! While the ‘learning’ happens, our superego satisfied, we are free now to learn whatever we want.” While my usual response to Žižek is “meh,” this has some value.

I was required to buy scantrons this past spring semester. The professor handed out test packets and we used #2 pencils.

My ballot (for elections) is a scantron.

~Max

Lots of standardized tests are still multiple choice, at least in part. The AP calculus tests, for instance, are half MC, half FRQ (Free Response Questions). And it’s taken on paper.

I’m sure Scantron really is on its way out or gone by now, but how would that equate to multiple choice tests being outdated? That seems like a complete non sequitur, especially given how easy they are to administer and mark online. (FWIW, the only time I do use multiple choice quizzes is when I’m teaching online, because it’s the only way to make sure students are actually listening to the lecture material. In a face-to-face class, I can see whether they’re there and listening, so I don’t bother.)

Take my experience with a grain of salt, but my online tests are mostly true/false. Apparently there’s a real problem with students copying the online test and posting it online. I’m guessing true/false questions are easier to compose.

It’s kind of annoying because, growing up, we were taught to make a best guess when stuck because you could always go back and change the answer. In online tests, once you submit an answer the question is forever locked and you can’t go back and change it later. Also, we aren’t getting our graded tests back - because of cheating - so there’s no way to know which questions we missed. Only the final score comes back. We are advised to write down the questions we have trouble with and email the professor, but there’s a time limit of about a minute per question, so I wonder what students who have lots of trouble with a test are supposed to do. The test isn’t supposed to be the end of the learning experience, IMO.

I think the online test format, at least as I’ve experienced it, is still experiencing growing pains. :frowning:

~Max

Computers have been grading essays for 25 years or more. I’m sure AI-based grading will do a better job than some of the old keyword-based computer grading.

Scantron and multiple choice tests are two different issues - it is very possible to give a multiple choice test on a computer, either at a designated site or online. My city has changed all or nearly all of its civil service tests to be online. Way back when, you were told to report to a particular high school at 8am on some Saturday or Sunday. Now, you are given a date and time to report to a testing center, where you take the exam on a computer. Since everyone isn’t tested at exactly the same time, the testing can be done Mon-Fri, so there is no need to pay proctors for Saturday and Sunday. Candidates are given scores immediately, so there is no need for any human involvement in the initial scoring, not even to collect the tests.

“You can’t go back and change previous answers” is a feature of the way some tests are set up, but in my experience, not most of them. And for the ones that are set up that way, it’s usually because the test is adaptive: it gives you more or less difficult questions based on how you’re doing so far, so as to get a more precise measurement of your ability.
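
To make the adaptive point concrete, here is a toy sketch of how a test might pick each new question from the last response; the step-up/step-down rule is my own stand-in, not a real item-response-theory model, and all names are invented.

```python
# Toy adaptive test: the difficulty of each new question depends on how the
# student did on the previous one, which is why earlier answers must lock.
import random

def run_adaptive_test(question_bank: dict[int, list[str]], num_questions: int,
                      answer_fn) -> list[tuple[str, bool]]:
    """question_bank maps a difficulty level (1-5) to a list of questions."""
    difficulty = 3                      # start in the middle
    history = []
    for _ in range(num_questions):
        question = random.choice(question_bank[difficulty])
        correct = answer_fn(question)   # stand-in for the student's response
        history.append((question, correct))
        # Step difficulty up after a correct answer, down after a miss.
        difficulty = min(5, difficulty + 1) if correct else max(1, difficulty - 1)
    return history

if __name__ == "__main__":
    bank = {d: [f"level-{d} question #{i}" for i in range(3)] for d in range(1, 6)}
    # Simulate a student who answers correctly about 60% of the time.
    print(run_adaptive_test(bank, num_questions=5, answer_fn=lambda q: random.random() < 0.6))
```

Because question N+1 is chosen using the result of question N, the system has to commit each response the moment it is submitted, which is exactly why going back is disallowed.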

As for tests being copied and shared, that’s much less of an issue nowadays than it used to be, at least in math and the sciences. Most textbooks now come with an online question system, where each question is actually a question template: It’ll give the same basic question to everyone, but with everyone getting different numbers in that question, and hence a different final answer. And of course, even in the humanities, it’s trivially easy for an online test to randomize which letter goes with which answer option.
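
As a rough illustration of both tricks (an invented example, not taken from any particular publisher’s system): a templated question that draws different numbers for each student, plus a per-student shuffle of the answer options.

```python
# Hypothetical question template: each student gets different numbers and a
# different ordering of the answer choices, so copying a neighbor is useless.
import random

def make_area_question(seed: int) -> dict:
    """Generate one student's version of a 'rectangle area' template question."""
    rng = random.Random(seed)                 # per-student seed, reproducible
    width, height = rng.randint(2, 12), rng.randint(2, 12)
    correct = width * height
    # Plausible wrong answers; the set drops any accidental duplicates.
    distractors = {correct + width, correct - height, width + height} - {correct}
    options = [correct, *distractors]
    rng.shuffle(options)                      # randomize option order per student
    return {
        "prompt": f"What is the area of a {width} x {height} rectangle?",
        "options": options,
        "answer_index": options.index(correct),
    }

# Two students get different numbers and different option orders.
print(make_area_question(seed=101))
print(make_area_question(seed=202))
```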

It probably varies. We had one course over the summer where the professor was doing her first semester in this system and the curtain was drawn all the way back - she had loaded the test bank with questions directly from the publisher, a number of which were poorly written (like the textbook) and required corrections. Apparently the set of questions the student sees is a random sample from the test bank. Also, our online multiple choice answers aren’t marked by letter.

My professors are explicit that the questions lock after each answer because of rampant cheating, and that they’re tired of rewriting the questions. All the courses in my degree program have been this way. (On the other hand, the Spanish courses I took in the fall and spring used the publisher’s website for assignments and tests, except for the midterm, which was written. And the speech class, mentioned above, used scantrons.)

~Max

I’m sure that Covid greatly increased digital use and that most university students and many younger ones have tablets, Chromebooks, or laptops. Scantrons would seem to offer a little more security, but I can see them becoming rarer.

Multiple choice questions, however, will always be a big part of testing precisely because they are easy to mark in any format. I think they can do a reasonably good job of testing concepts, disagreeing with the article in this respect. I assure you medical MCQs can be very tricky even when you understand the concept well.

It’s been a few decades, but during my first degree the chemistry department was very fond of multiple choice tests where most questions had twelve to fourteen possible answers (only one right one, of course). Guessing would barely help. With so many answer options, they also claimed (credibly) to be able to detect cheating by comparing the Scantrons of people seated close to each other.
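
A quick back-of-the-envelope check of both claims, using my own assumption of 13 options per question (one right, twelve wrong):

```python
# Why guessing barely helps, and why matching wrong answers is suspicious,
# assuming 13 options per question and independent, uniform guessing.
options = 13
p_guess = 1 / options                    # a blind guess is right ~7.7% of the time

# If two students both miss a question and guess independently among the
# twelve wrong options, the chance they pick the SAME wrong option is 1/12.
p_same_wrong = 1 / (options - 1)
# Chance of matching on, say, 8 independently missed questions by luck:
p_match_8 = p_same_wrong ** 8            # roughly 2e-9
print(p_guess, p_same_wrong, p_match_8)
```

Real collusion screening uses more careful statistics than uniform guessing, but the arithmetic shows why a long run of identical wrong answers is hard to explain by chance.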

AI is unlikely to change the need for MCQs. If anything, it might make them more widely used. Education seems to change very slowly, especially in professional testing, though I can’t claim to be au fait with the latest trends. But essays will always take more effort to read and mark, unless the grader is phoning it in. Academia already has enough superegos, surely.

I will say that multiple choice questions are relatively rare nowadays on online math and physics assessments, because it’s fairly easy to just leave a blank space where the student can type in whatever answer they got, and still grade it automatically. The good systems are also capable of recognizing multiple forms of the correct answer, like 2(x+3) or 2x+6 (and also know when only one specific form is acceptable, like when the question says to perform a specific sort of simplification).

Sometimes a student does still lose points from an answer that should be correct, or for some minor syntax detail like missing a comma. But the teacher can still go in and tweak the grade, if necessary.
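
For the curious, here is a minimal sketch of how that kind of equivalence check can be done with SymPy; whether any given courseware system actually works this way internally is an assumption on my part.

```python
# Accept algebraically equivalent answers such as 2(x+3) and 2x+6 by
# parsing both and checking that their difference simplifies to zero.
from sympy import simplify
from sympy.parsing.sympy_parser import (parse_expr, standard_transformations,
                                        implicit_multiplication_application)

# Allow students to write "2(x+3)" or "2x" without explicit * signs.
TRANSFORMS = standard_transformations + (implicit_multiplication_application,)

def answers_match(student: str, reference: str) -> bool:
    """True if the two expressions simplify to the same thing."""
    s = parse_expr(student, transformations=TRANSFORMS)
    r = parse_expr(reference, transformations=TRANSFORMS)
    return simplify(s - r) == 0

print(answers_match("2(x+3)", "2*x + 6"))   # True: equivalent forms
print(answers_match("2x + 5", "2*x + 6"))   # False
```

A grader that insists on one specific form (say, fully expanded) would compare the parsed structure rather than just the simplified difference.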

Actual, name-brand Scantrons are all but gone. The machines are expensive, and the answer documents are expensive. They are a pain in the ass, because you have to travel to whichever workroom has the reader in it, and then hand-enter the number correct into whatever system you are using to record grades. For $7 a year, you can download Zipgrade, an app that lets you create answer sheets of whatever configuration you want (including grid-in numbers) and print them on your own printer; the students bubble in their ID numbers, and you scan the sheets with your phone. The results auto-populate a roster you uploaded, and then you access the spreadsheet on the web, or download it in whatever format you want. Any half-decent learning management system will have these same capabilities built in, though Zipgrade is shockingly user-friendly for a cheap app.

The SAT has been 100% digital, and adaptive, internationally since last spring. The PSAT is digital in the US starting next month, and the SAT in the spring. Because they are adaptive, they will be two hours long instead of three.

My son is 11 and in 6th grade. Every standardized test he’s ever taken was on computer.

AP exams will all be digital in the near future. Eight are already available in digital format, including the largest ones, AP English Lang and AP US History. Right now, only Chinese is all digital; for the others, it’s an opt-in system. But we did digital last year and I LOVED it. The kids loved it. One real surprise to me was how many students reported saving time: filling in bubbles takes up more time than you realize, and flipping back and forth in a test document is also a time sink. We are just used to having that baked in.

If anyone wants to see what a modern digital MC test is like, the app College Board developed is called Bluebook, and anyone can download it and answer the sample questions. There’s lots of demo material. College Board uses it for the AP exams and the SAT. I talked to the designers, and they are aware that 1) it’s a great name for the app but 2) no kid gets that. It really is a nice app, and it addresses most of the problems listed here. You can move around, you can answer out of order, you can take notes, you can flag questions, you can cross out answer choices. It’s fast, with no lag at all.

The big hurdle in digital testing for AP is in subjects like chemistry, calculus, or economics, where you pretty much have to draw models or show work for the free response questions. They will probably eventually go to a sort of generic blank answer document, where the question is on the screen but students put the answers in the, well, old-school blue book, and that’s all we ship back.

The real impetus for doing this is test security. Right now, the big testing companies really have no plan for if, say, someone steals a copy of an AP test out of a school office, photographs each page, and posts them all over social media. They have additional tests written, but they can’t get half a million tests printed and distributed. Electronic tests are more secure, and easier to change if security is breached.

A secondary, but significant, concern is the cost. Handling paper tests, hundreds of thousands of paper tests, is a nightmare for everyone involved. It’s not just the printing and the shipping; it’s the counting, documenting the chain of custody, following protocol, and following up on incident reports.

And yes, there are technical problems with digital tests, but apparently a UPS truck full of AP or SAT tests catches fire on a yearly basis. So there’s an offset there.

As an English teacher, it’s all really interesting. It profoundly changes the way we teach writing. The old writing cycle, which emphasized careful planning before you started, was really an attempt to deal with the limitations of paper and pen. On a screen, it’s okay to dive in, to have a provisional thesis and see where it goes, to leave a paragraph unfinished and go back to flesh it out. In fact, those are good strategies. When I am preparing them to take a digital writing test, it feels more like preparing them to write in general.

Anyway, I for one welcome our digital overlords.

Wow, I would have thought it happened much earlier.

In 1992(!), I was in one of the first groups to take the GRE in a digital format, during a computerized testing beta. That was either the first or second year it was offered on a computer, and it was an option when registering. I took it in a small office with five or so other people sitting at computers.

The computers ran Windows 3.1, and I was terrified I would do something to crash mine. They sat me down at the one computer that happened to have the modem they used to upload everyone’s results. I really hope they had somehow disabled the ability to ALT-TAB out of the test and dial in to the early internet to look up answers, but I was there to take the test, not red-team it.

I believe the test was simply a reproduction of the paper version put into a digital format.

Using the computer did raise my score. There was a countdown timer in the corner of the screen, so with 30 seconds left, on sections where wrong answers carried no penalty, I could randomly choose an answer for any questions I hadn’t gotten to. That tactic can be used on the paper test, but probably not with the timing precision the computer allowed.

Anyway, that whole anecdote is just my long way of saying, wow, 30+ years later, and we’re just now getting rid of the last paper tests.

Those are good points. But I know teachers who taught essentially the same lesson plan each year for thirty years. MCQs are often validated across studies. It may be true that one can simply input a bunch of similar written answers, but that does not mean it will overcome reluctance to change or tradition. MCQs have the advantage of possibly cueing the right answer, which is helpful if you want the majority of people to actually pass the exam.