What Other People Say

The NY Times has had several articles on teacher evaluation. One of them, "Formula to Grade Teachers' Skill Gains Acceptance, and Critics," was about using value-added data. The letters to the editor on this story were quite interesting, and I thought I'd put in some snips and see what you think. (All italics and bold mine.)

First, a few quotes from the story. Arne Duncan on the LA Times teacher assessment project:

Education Secretary Arne Duncan weighed in to support the newspaper’s work, calling it an exercise in healthy transparency. In a speech last week, though, he qualified that support, noting that he had never released to news media similar information on teachers when he was the Chicago schools superintendent.

On The Los Angeles Times’s publication of the teacher data, he added, “I don’t advocate that approach for other districts.”

Arne? Yes or no?

About value-added itself:

William L. Sanders, a senior research manager for a North Carolina company, SAS, that does value-added estimates for districts in North Carolina, Tennessee and other states, said that “if you use rigorous, robust methods and surround them with safeguards, you can reliably distinguish highly effective teachers from average teachers and from ineffective teachers.”

But when the method is used to evaluate individual teachers, many factors can lead to inaccuracies.
  • For example, two analysts might rank teachers in a district differently if one analyst took into account certain student characteristics, like which students were eligible for free lunch, and the other did not (see the sketch just after this list).
  • Millions of students change classes or schools each year, so teachers can be evaluated on the performance of students they have taught only briefly, after students’ records were linked to them in the fall.
  • In many schools, students receive instruction from multiple teachers, or from after-school tutors, making it difficult to attribute learning gains to a specific instructor.
  • Another problem is known as the ceiling effect: advanced students can score so highly one year that standardized state tests are not sensitive enough to measure their learning gains a year later.
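
The first point, about two analysts reaching different rankings, is easy to see with a toy simulation. The sketch below is not any district's actual value-added model; it is a minimal ordinary-least-squares version in Python run on made-up student data, comparing one analyst who controls for free-lunch eligibility with one who does not. Every name and number in it is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Made-up data: 400 students, 4 teachers (every number here is invented) ---
n_students, n_teachers = 400, 4
teacher = rng.integers(0, n_teachers, n_students)        # which teacher each student has
lunch_rate = np.array([0.8, 0.6, 0.4, 0.2])              # free-lunch share differs by classroom
free_lunch = rng.binomial(1, lunch_rate[teacher])
prior_score = rng.normal(0.0, 1.0, n_students)           # last year's test score
true_effect = np.array([0.25, 0.20, 0.15, 0.10])         # "true" teacher contributions
current_score = (prior_score
                 + true_effect[teacher]
                 - 0.40 * free_lunch                      # a disadvantage the teacher did not cause
                 + rng.normal(0.0, 0.5, n_students))      # test noise

teacher_dummies = np.eye(n_teachers)[teacher]             # one indicator column per teacher

def value_added(extra_columns):
    """OLS regression of current score on prior score, any extra covariates,
    and teacher indicators; returns the estimated teacher coefficients."""
    X = np.column_stack([prior_score, *extra_columns, teacher_dummies])
    coef, *_ = np.linalg.lstsq(X, current_score, rcond=None)
    return coef[-n_teachers:]

analyst_a = value_added([])             # ignores free-lunch eligibility
analyst_b = value_added([free_lunch])   # controls for it

print("Ranking without free-lunch control:", np.argsort(-analyst_a))
print("Ranking with free-lunch control:   ", np.argsort(-analyst_b))
```

Because low-income students are not spread evenly across classrooms in this simulation, the two rankings generally disagree even though both analysts ran a reasonable-looking regression on the same data, which is exactly the disagreement the article describes.
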
Comments from letters to the editor:
  • Advocates of the value-added model to evaluate teachers justify their position by claiming that is how business works. Yet all prospectuses of mutual funds warn in bold letters that past performance is no guarantee of future results.
  • Such data, combined with effective in-class observation of teaching skills, dramatically increases our understanding of how well teachers and schools help to advance students’ learning. The fact that it is not a perfect system should not disqualify it from use; no evaluation system in any profession is perfect.
  • The value-added method to grade teachers sounds fantastic! Now how about applying it to school administrators themselves, highly paid consultants brought in for “professional development” and expensive, corporation-developed textbooks?
  • Regarding the usefulness of value-added scores for teachers, why shouldn’t parents be given the opportunity to enroll their children for individual teachers, not just schools, using such a tool?

Comments

dan dempsey said…
So where are the Directors and TEAM MGJ on this:

"The value-added method to grade teachers sounds fantastic! Now how about applying it to school administrators themselves, highly paid consultants brought in for “professional development” and expensive, corporation-developed textbooks?

Then we could see where vast amounts of money are needlessly wasted by our decision-makers.

Comparing the costs of Singapore Math vs. Everyday Math is absolutely obscene. So where are the high-priced results from the expensive product? How much value was added by the "Added Price" paid for Consultants, Coaches, Consumable materials, and the higher initial cost?
SC Parent said…
Yes, we should absolutely apply value-added to administrators - especially where there is a broad enough sample to generate meaningful data. For example - the 5 regional directors.
dan dempsey said…
I am not so sure about ways to evaluate individual coaches using data, but looking at what has happened with MGJ's spending on (1) coaches for teachers and the district, (2) instructional materials choices, and (3) instructional practices, and at the overall results it brought, sure makes her massively expensive coaching and unproven instructional practices look mighty weak.

So what has happened to "everyone accountable"?

Can the "accountability" start at the top and not the bottom for a change?

Oh ... I forgot MGJ just got another year on her contract ... maybe we can start accountability in three years. The board sure did not want to start at the top now.

===========
State law prohibits the public appeal of a school board decision that involves the Superintendent's contract.
