Grading the Common Core Graders
The New York Times' Motoko Rich today explains who is scoring the PARCC assessment.
So the standardized tests given in most states this year required fewer multiple choice questions and far more writing on topics like this one posed to elementary school students: Read a passage from a novel written in the first person, and a poem written in the third person, and describe how the poem might change if it were written in the first person.
Pearson, which operates 21 scoring centers around the country, hired 14,500 temporary scorers throughout the scoring season, which began in April and will continue through July. About three-quarters of the scorers work from home.
But the results are not necessarily judged by teachers.
Some are retired teachers with extensive classroom experience, but one scorer in San Antonio, for example, had one year of teaching experience, 45 years ago.
I'll interject here that the Common Core standards were not even written by teachers, so why would anyone be concerned if teachers weren't the ones grading the assessments?
There was a onetime wedding planner, a retired medical technologist and a former Pearson saleswoman with a master’s degree in marital counseling. To get the job, like other scorers nationwide, they needed a four-year college degree with relevant coursework, but no teaching experience. They earned $12 to $14 an hour, with the possibility of small bonuses if they hit daily quality and volume targets.

At times, the scoring process can evoke the way a restaurant chain monitors the work of its employees and the quality of its products.
Concerns from real teachers:
Still, educators like Lindsey Siemens, a special-education teacher at Edgebrook Elementary School in Chicago, see a problem if the tests are not primarily scored by teachers.

“Even as teachers, we’re still learning what the Common Core state standards are asking,” Ms. Siemens said. “So to take somebody who is not in the field and ask them to assess student progress or success seems a little iffy.”
For exams like the Advanced Placement tests given by the College Board, scorers must be current college professors or high school teachers who have at least three years of experience teaching the subject they are scoring.
One scorer says:
She acknowledged that scoring was challenging. “Only after all these weeks being here,” Ms. Gomm said, “I am finally getting it.”
Good luck to you kids whose essays were her "test" scores.
Comments (or not).
As if the whole excessive exercise wasn't nutty enough!
So now the 3rd grader's essay written in Seattle is going to be judged by a non-teacher wedding planner in Texas looking to grind through X number of essays in order to make quota and earn a bonus on top of their $14/hour.
Maybe they could use TFAers in training, and have them, like monkeys, whip through 1,000 of these to earn their 'stripes' and carry on into our actual classrooms. It's a thought. File it under vertical integration. Synergistic cost efficiency.
What has any of this to do with teaching my child? With educating my child??
How would any of this nonsense 'help' my kids' teachers modify or adapt their instruction to my individual children?
Are my kids' teachers so fallible, so inept, that they need this 'help' from Texas to understand where my kids are at and what they need support with?
This seems like it would make for one FANTASTIC Monty Python Mockumentary.
Maybe with technology, the essays could be machine-read, Google-translated into Urdu or Hindi, and dialed over to India, where a mass call center could be used to outsource the grading, with the 'marks' electronically emailed directly to parents, teachers, JSCEE, OSPI, Arne Duncan, Disney, Nickelodeon, McDonald's, the military, the FBI, and whoever else is entitled to these or pays for them. (Yes, I am being facetious.)
Sooooooo glad we opted our kids out. Not so much dodged a bullet, but, skipped the idiotic pointless corporate meat grinder.
We will now be opting out of Amplify Beacon as well.
Maybe next year, instead of the 3 kids in one of my children's classrooms who opted out, it will be more like 15. I certainly hope so.
This is a gigantic waste of teacher and student time (and administrator and JSCEE staff time), money, computing resources, and instructional hours.
I am not against standardized testing. A straightforward, well-normed, multiple-choice test as a sort of 'temperature reading' of where a student, classroom, grade, building, and district is at would be okay by me. As imperfect as MAP is (not aligned to curriculum, etc.), it was 40 straightforward multiple-choice questions that could be done in less than 30 minutes, had a smooth interface, was adaptive, never crashed, and results were immediate. Not perfect, but again, it is way better than these other time-heavy, confusing and confused tests that are nothing more than a bad beta run gone amok. And no, using test scores to grade teachers makes no sense, and is really quite destructive.
NE family
BTW: now it's Amplify, and that has checkpoint tests that are given between the 3 regular administrations.
Stop the Insanity.
At another one of our schools it was worse than useless. The teachers viewed it as, at best, redundant validation of things they already knew, and when the results did not comport with their opinions of students' abilities (either high or low), they took that as proof the test was flawed. The school was small (two classes per grade), so they didn't need any more help balancing the classes. I eventually started opting my kids out at that school, because it was a complete waste of everyone's time. Which I think these tests mostly are unless the teachers are on board, as I know at least these teachers were with the MSP (which was not better written, though it had none of these awful technological problems that really add up to a fatal flaw). Right now I think the state is trying to get them on board by force, which is rarely effective. Having educators help design questions would be a good step, as would having them at least be involved somehow in the grading process.
-sleeper
As for the group of people scoring writing, those examples are abominable. I've had many interns, and our students - even at the college and intern-teacher level - simply do not write well. Yes, they are passable, but not really excellent like the teachers of old. To think the examples in the article reflect the kinds of people scoring student papers for these standardized tests is kind of frightening. There cannot be consistency, nor can there be an expectation that these scorers really know what to look for. But as with everything else in American society, it comes down to wide profit margins and cheap goods. We will be the worse for it.