Automated Grading Software In Development To Score Essays As Accurately As Humans
April 30 marks the deadline for a contest challenging software developers to create an automated scorer of student essays, otherwise known as a roboreader, that performs as well as a human expert grader. In January, the Hewlett Foundation, of Hewlett-Packard fame, introduced the Automated Student Assessment Prize (ASAP…get it?), offering $100,000 in awards to “data scientists and machine learning specialists” who can develop the application. In sponsoring the contest, the Foundation has two goals in mind: improving the standardized testing industry and advancing technology in public education.
The contest is only the first of three; the others are aimed at developing automated graders for short answers and for charts and graphs. But the first challenge for the nearly 150 participating teams is to prove their software has the spell-checking capabilities of Google, the insights of Grammar Girl, and the English-language chops of Strunk’s Elements of Style. Yet the stakes in developing automated essay-scoring software are far higher than the relatively paltry $60,000 first-place prize suggests.