Robot Eyes As Good As Humans When Grading Essays
MELISSA BLOCK, HOST:
A computer program designed to score student essays would hate what I've written here: short sentences, no five-dollar words, verb-less fragments. And another demerit right there for a sentence starting with the word and. But a recent study finds that computers do just as well as humans at scoring essays on standardized tests. We learned all this in an article by the New York Times education columnist Michael Winerip, who joins me now.
Michael, welcome to the program.
MICHAEL WINERIP: Thank you, Melissa. Nice to be here.
BLOCK: And let's talk about that study I mentioned. It's from the University of Akron. What did it conclude? It looked at the performance of computer scoring software on essays for standardized tests.
WINERIP: It looked at 16,000 essays from students in six states, had nine automated readers score them, compared those results with human-scored essays, and found a high correlation. In terms of consistency, the automated readers might even have done a little better.
BLOCK: Consistency meaning they would look at the same essay and grade it the same way?
WINERIP: That's correct.
BLOCK: And what are they looking for? Are they looking for traits that we would consider to be good writing or what exactly is it?
WINERIP: Well, they're looking for sentence structure. They're looking for usage, agreement between subjects and verbs. They're looking for sophistication of word use, a lot of the same things that a human editor or reader would look for.
BLOCK: Are they looking for anything that would approximate critical thinking? You know, something that shows, aha, this is a person who has drawn conclusions or has a certain elegance, maybe, in how they write about them?
WINERIP: That may be a weakness. As the people at ETS who work with eRater say, they're not very good on poetry. They're not very good on comprehension. They truly don't understand what they're reading.
BLOCK: Michael, you talked to a director of writing at MIT who tested one of these software programs, the eRater from the Educational Testing Service, and he found a lot of flaws and a lot of ways he could really trip the system up. Right?
WINERIP: Yes. If it had anything to do with understanding truth, the system had a big problem. You could say, as he did, that the War of 1812 started in 1945. You could say that the biggest problem we have in America is candy bars that are too long. There are all kinds of things you could say that have little or nothing to do with reality and still receive a high score.
BLOCK: And there are a lot of companies that offer this technology. Right? It's not new.
WINERIP: No, no. It's been going on for quite a while. Pearson is the biggest education company in America, and virtually every education company has a model; there's lots of money to be made on this stuff. You know, we're moving toward what are called the common core standards, and, under President Obama's Race to the Top, more and more standardized testing. And so there's got to be some way to keep up with all of it, and the more homogenized and standardized the tests, the more the corporations stand to make on all this.
BLOCK: You know, Michael, listening to you describe this, I'm not sure I can see a value to these computer programs beyond speed and just cranking out results really, really fast. What do you think?
WINERIP: I think it's scary, Melissa. What's scary about it is, if this becomes the standard way of scoring these, then teachers are going to teach to it, and a lot of the juice of the English language is going to disappear. A sentence fragment or, you know, a short paragraph can sometimes be very dramatic. If those count as breaking the rules, you're going to get a more and more homogenized form of writing, when the joy of writing is surprise.
BLOCK: Well, Michael Winerip, thanks for talking to us about it.
WINERIP: Melissa, thank you for having me.
BLOCK: Michael Winerip is national education columnist for the New York Times.
NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.