Less Grading, More Teaching, Deeper Learning

Less grading, more teaching. More feedback, less waiting. Fewer worksheets, more writing. Less multiple choice, deeper learning. These are the reasons I'm spending a good portion of 2012 working on online assessment. Better assessment tools mean better state tests and richer teaching and learning.

I've had the good fortune to spend six months working with experts in the field. They are sick of the dumb jokes about robots. They just want students to do more authentic writing with feedback.

A few years ago, Pearson's Peter Foltz was a university professor. He used automated essay scoring to give a weekly writing assignment to a class of close to 200 students. Students were allowed to revise and resubmit their essays several times. If they were unhappy with the computer's grade, they could send Dr. Foltz the essay and he would grade it himself. Over three years, no student took him up on the offer.

I think a good deal of the skepticism about essay scoring lies with reporters, not teachers. A Wired reporter wrote, "it's difficult to listen to Foltz boast about his essay grader without feeling trespassed upon by a machine."

But it is clear that automated essay scoring works. Dr. Foltz notes that with about four iterations of feedback and revision using Pearson's WriteToLearn, "the quality of a student's writing on that essay improves by almost a whole letter grade and that skills can transfer to later writing tasks."

"One middle school teacher tells us that he assigns 23 essays a year to 142 students and each student does about 8 revisions. That's about 27,000 essays," Foltz adds. "He gets more time to focus on students that need help with writing and instantly can monitor what kind of problems that the individual students and the class as a whole are having. When students write more, they learn more." See what the kids say in this video.

A large meta-analysis of writing instruction found two strategies that produced a big effect size: 1) teaching students strategies for planning, revising, and editing their compositions (effect size 0.82) and 2) explicitly and systematically teaching students how to summarize texts (effect size 0.82). Products like WriteToLearn support these important strategies for reading and writing. Immediate and specific feedback helps students practice writing through multiple cycles.

There have been a number of other efficacy studies.  One examined performance gains in statewide administration of formative writing assessment in South Dakota.

The Hewlett-funded Automated Student Assessment Prize (ASAP) will be reporting out impressive results of a vendor demonstration next week and will be awarding a $100,000 prize purse on May 9.

Online assessment–particularly automated essay scoring–holds the promise of better state tests and, more importantly, better teaching and learning.

Open Education Solutions, where Tom is CEO, is managing ASAP with support from the Hewlett Foundation. Pearson is a limited partner in Learn Capital, where Tom is a partner.

Tom Vander Ark

Tom Vander Ark is founder and CEO of Getting Smart. He is also a partner in Learn Capital and a director of iNACOL, Digital Learning Institute, Imagination Foundation, Charter Board Partners, Strive for College, and Bloomboard.

2 Comments

Keven Kroehler

I would be curious to see the computerized results of Mr. Vander Ark’s blog post. Would it get an “A”? Were the ideas well organized? Did the author’s intent come through? … Just what would be included in the computer’s analysis?

Tom Vander Ark

I've tried several engines, and my blogs (as you suspect) could always use some work. It depends on the engine, but they usually provide feedback on sentence structure, word choice, and the extent to which the writing stays on topic. Current engines aren't great at grading the logic of an argument.
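The kinds of surface checks described above can be illustrated with a toy sketch. This is not how WriteToLearn or any real engine works (commercial systems use much richer statistical models); it's a minimal, hypothetical example of computing a few surface features an engine might look at, with invented function and parameter names.

```python
import re

def essay_feedback(essay, prompt_keywords):
    """Toy surface-feature checker (hypothetical, for illustration only).

    Computes three crude signals a grading engine might consider:
    average sentence length, vocabulary variety, and overlap with
    keywords from the prompt as a rough proxy for staying on topic.
    """
    # Split into sentences on terminal punctuation, dropping empties.
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]
    # Lowercase word tokens.
    words = re.findall(r"[a-zA-Z']+", essay.lower())

    avg_sentence_length = len(words) / len(sentences) if sentences else 0.0
    # Type-token ratio: unique words over total words.
    vocabulary_variety = len(set(words)) / len(words) if words else 0.0
    # Fraction of prompt keywords that appear in the essay.
    topic_coverage = (
        len(prompt_keywords & set(words)) / len(prompt_keywords)
        if prompt_keywords else 0.0
    )

    return {
        "avg_sentence_length": round(avg_sentence_length, 1),
        "vocabulary_variety": round(vocabulary_variety, 2),
        "topic_coverage": round(topic_coverage, 2),
    }
```

Note what this sketch cannot do: it has no notion of whether an argument is coherent, which is exactly the limitation mentioned above. Real engines close part of that gap with semantic models, but logic remains hard.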