I try really hard to be contemptuous, disparaging and condescending.
In real science, when a real scientist puts his/her name to some result or idea, then other real scientists look critically and ... if they accept those results or ideas, they accept 'em and move on.
In education "research" charlatans who can kind of do some algebra, and who can fiddle with spreadsheets, and who can muck with stat programs - these charlatans make idiotic claim after idiotic claim, and the charlatans defend their idiocy with "the research shows".
ewing is a real scientist.
the political incompetents of the NEA / WEA are allowing charlatans and their bought and paid for lackeys to define the terms of debate.
Real scientists are doing their jobs, charlatans and hucksters are doing their jobs - when will the unions do their jobs?
John Ewing says Seattle can’t use the data to draw conclusions about student growth. "One year of growth for one teacher is meaningless," he says. A teacher typically only has a student in their class for one year. The district is taking two of those sets of one-year growth data per teacher, and averaging them. =====
Several studies have reported that the annual student growth of a teacher's students ... varies considerably from year to year. In fact teachers scoring in the top 1/5 one year may be in the bottom 2/5 the next year.
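If you want to see how that kind of churn can arise, here is a minimal simulation sketch (Python). The 35% "signal" share is a made-up illustrative assumption, not the district's model or an estimate for SPS; the point is only that when most of a single year's measured growth is classroom-level noise, quintile rankings reshuffle heavily from one year to the next.

```python
import numpy as np

rng = np.random.default_rng(0)
n_teachers = 10_000

# Hypothetical split: a stable "true" teacher effect plus independent yearly noise.
# The 35% signal share is an invented value used only for illustration.
true_effect = rng.normal(0.0, 1.0, n_teachers)
noise_sd = np.sqrt(1 / 0.35 - 1)            # so var(true) / var(observed) = 0.35
year1 = true_effect + rng.normal(0.0, noise_sd, n_teachers)
year2 = true_effect + rng.normal(0.0, noise_sd, n_teachers)

# Quintile rank each year: 0 = bottom fifth, 4 = top fifth.
q1 = np.digitize(year1, np.quantile(year1, [0.2, 0.4, 0.6, 0.8]))
q2 = np.digitize(year2, np.quantile(year2, [0.2, 0.4, 0.6, 0.8]))

top_year1 = q1 == 4
print("Of the year-1 top-quintile teachers:")
print("  still in the top fifth in year 2 :", round((q2[top_year1] == 4).mean(), 2))
print("  fell to the bottom two-fifths    :", round((q2[top_year1] <= 1).mean(), 2))
```

Averaging two such years, as the district plans, dampens the noise a little but nowhere near enough to make the rankings stable.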
BUT hey -- this is education and politics drives policy .... until decisions are made in a rational manner .... policies like this piece of nonsense will continue.
This policy generates good revenue for testing companies.
The Common Core State Standards are based on thinking that the unproven and untested will work.
CCSS will be great for vendors.
==== Allen says:
It’s not a matter of “Until decision makers concern themselves with how much the students are learning” because, unless the structure of public education changes, that’ll never happen.
Those decision-makers are acting rationally by ignoring educational outcomes. They’re operating within the context of the current model of public education and that model has no incentive for anyone to concern themselves with educational outcomes. Since there’s no incentive to concern themselves with education they don’t.
What’s irrational is the expectation that any positive changes can be made if only the proper tinkering about the periphery of public education occurs. It won’t and the institutional indifference to educational outcomes is why no teacher accountability scheme or curriculum improvement or any other such gee-gaw will ever work; the lack of incentive to pursue educational efficacy remains and that will result in the undoing of every effort to improve the extant public education system.
========= There are lots of teachers concerned with student learning .... but most find themselves at odds with leadership. .... Too many leaders are pushing the "completely ineffective company line" .... which is great for vendors and badly underserves students.
So the teachers get "different" evaluations (whoop-de-do)... but as W. Edwards Deming stated, at most 14% of a system's problems are due to inadequacies of the worker. The problem is the system, and that is why these "different" evaluations are a "bogus" attempt.
---- Where is there a mechanism that holds the "administrative" decision-makers or "union" leaders responsible for producing an improved system?
===== Notice that the claim that the US ranks 24 out of 32 countries on PISA testing is based on "typical crappy" statistical analysis..... It is all politics.
NO experts anywhere in the field of K-12 educational assessment consider just two years of this type of data as valid.
The idea that the leaders of our LMO (Labor Management Organization, the Seattle Education Association) would agree to this just shows how little concern they have for science, mathematics, statistics, or teachers in classrooms.
Seattle Public Schools should not release the two year VAM data on its teachers as planned.
“A study examining data from five separate school districts [over three years], found for example, that of teachers who scored in the bottom 20% of rankings in one year, only 20-30% had similar ratings the next year, while 25 – 45% of these teachers moved to the top part of the distribution, scoring well above average.” - National Research Council, Board on Testing and Assessment (2009). Report to the U.S. Department of Education.
Links to the full reports at The National Academy Press for source research publications and books:
http://www.nap.edu/openbook.php?record_id=12780&page=1
http://216.78.200.159/Documents/RandD/Other/Getting%20Value%20out%20of%20Value-Added.pdf
http://www.naeducation.org/Background_Paper_Getting_Teacher_Evaluation_Right.pdf
The list below of peer-reviewed, academically sound research and reports on the use and abuse of VAM in K-12 is long and compelling. We don’t understand how or why anyone whose job it is within a school system to collect and meaningfully apply teacher and student assessments to improve student learning is allowed to keep their job without ever doing the needed due diligence to inform themselves about the core facts of their work. Absurd, really. Virtually all the research on VAM as applied to teacher evaluation indicates that the planned SPS action will seriously mislead the public - not inform them, as apparently has been falsely assumed.
Economic Policy Institute: “Analyses of VAM results show that they are often unstable across time, classes and tests; thus, test scores, even with the addition of VAM, are not accurate indicators of teacher effectiveness. Student test scores, even with VAM, cannot fully account for the wide range of factors that influence student learning, particularly the backgrounds of students, school supports and the effects of summer learning loss…. Furthermore, VAM does not take into account nonrandom sorting of teachers to students across schools and students to teachers within schools.” http://www.epi.org/publication/bp278/
Annenberg Institute: this is an excellent recent and major review of current principles and practices of VAM measures as relevant to K-12 educational reform. “At least in the abstract, value-added assessment of teacher effectiveness has great potential to improve instruction and, ultimately, student achievement. However, the promise that value-added systems can provide such a precise, meaningful, and comprehensive picture is not supported by the data.” http://annenberginstitute.org/pdf/valueaddedreport.pdf
The Kappan: PDK International From “The Kappan,” the excellent magazine of PDK International (a must subscription for SPS board members and administrators in my view): after reviewing the critical problems with VAM, it does not abandon the idea of improving teacher evaluations as part of the effort to improve K-12 education, and instead presents practices that are more likely to actually accomplish those goals. 1. Value-added models of teacher effectiveness are inconsistent… 2. Teachers’ value-added performance is affected by the students assigned to them… 3. Value-Added Ratings Can’t Disentangle the Many Influences on Student Progress… http://www.edweek.org/ew/articles/2012/03/01/kappan_hammond.html
National Bureau of Economic Research: Student Sorting and Bias in Value Added Estimation: “The results here suggest that it is hazardous to interpret typical value added estimates as indicative of causal effects… assumptions yield large biases…. More evidence, from studies more directly targeted at the assumptions of value added modeling, is badly needed, as are richer VAMs that can account for real world assignments. In the meantime, causal claims will be tenuous at best.” http://www.nber.org/papers/w14666
Test Score Ceiling Effects of Value Added Measures of School Quality From: U. of California, U. of Missouri, and the American Statistical Association This is pure research that is often cited by experts but is not an easy read for a non-educator or lay person. Its critical findings concern test score ceilings and non-random populations of students (think Roosevelt vs. Rainier Beach), which create statistical problems and misconceptions when amalgamating or disaggregating student/teacher data from test scores. http://www.amstat.org/sections/srms/proceedings/y2008/Files/301495.pdf
The Problems with Value-Added Assessment - Diane Ravitch With her perspective as an education historian, this is a recent, thoughtful, and fact-based review of VAM use. “I concluded that value-added assessment should not be used at all. Never. It has a wide margin of error. It is unstable. A teacher who is highly effective one year may get a different rating the next year depending on which students are assigned to his or her class. Ratings may differ if the tests differ. To the extent it is used, it will narrow the curriculum and promote teaching to tests. Teachers will be mislabeled and stigmatized. Many factors that influence student scores will not be counted at all.” http://blogs.edweek.org/edweek/Bridging-Differences/2010/10/dear_deborah_you_asked_what.html
Research Calls Data-Driven Education Reforms into Question Recent reports by National Academies, National Research Council and the National Center on Education and the Economy. “Both organizations are respected for their high quality, comprehensive, and non-ideological research. Together, they reach the undeniable conclusion that today's array of testing and choice fails to meet the instructional needs of American students and the national goal of broadly-based high academic achievement.” http://www.huffingtonpost.com/david-bloomfield/education-reform-standardized-testing_b_882718.html
Why Teacher Evaluation Shouldn’t Rest on Student Test Scores FairTest.Org has a clearly stated agenda, but that does not discount this excellent list of the practical problems applying VAM (as currently used) to teacher evaluation and concludes with a list of solid, academically sound research references. http://www.fairtest.org/sites/default/files/Teacher-Evaluation-FactSheet-10-12.pdf
Edutopia An excellent, unbiased resource on educational issues and the relevant research. The George Lucas Educational Foundation is dedicated to improving the K-12 learning process by documenting, disseminating, and advocating for innovative, replicable, and evidence-based strategies that prepare students to thrive in their future education, careers, and adult lives. Edutopia’s byline is, “What Works in Education.” “Value-added modeling” is indeed all the rage in teacher evaluation: The Obama administration supports it, and the Los Angeles Times used it to grade more than 6,000 California teachers in a controversial project. States are changing laws in order to make standardized tests an important part of teacher evaluation. Unfortunately, this rush is being done without evidence that it works well.” http://www.edutopia.org/groups/race-top/30432
"Unfortunately, this rush is being done without evidence that it works well."
Always leaping before looking --- for a look might reveal the real inadequacies and there is no interest in that on the part of decision makers.
I just posted data on "Algebra for all 8th graders" in Tacoma under Open Thread Wednesday -- zero evidence was used to make this decision and zero is being used to evaluate its progress.
The SEA leadership made not a peep about Everyday Math or Discovering .... Yet the SEA leaders figured teacher-members should have part of their evaluation based on their ability to make defective materials work. Hey the defective evaluative system corresponds to the defective materials - very consistent.
After my campaign experience, I, too, wonder about union leadership. There must be something I missed, because these people are neither stupid nor inept. It does not all compute.
And, as Dan and Eric have forcefully pointed out, the teacher assessments using VAM don't compute either.
Also, I still have to do my thread on what Finland is doing but no, they don't assess their teachers like this.
The key is respect, not threats.
Anonymous said…
I would like to know how the district is handling teacher evaluations for those working with APP students. The MAP test is adaptive, but it still has a ceiling and some students are hitting it in 4th and 5th grade. Once a student hits the ceiling, measures of growth have little meaning.
For the reading MAP, NWEA states its ceiling is a RIT score of 245. Above that number, they can only say a student scored high, but not that one score above 245 is any more meaningful than another score above 245. When a 5th grader is getting questions on Shakespeare sonnets, can you fault the teacher for not showing growth in that student's scores? If you can't measure growth, how do you evaluate growth?
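As a rough illustration of that ceiling problem, here is a sketch (Python). The 245 cap is the figure cited above; the score distribution, the 8-point true gain, and the hard-cap behavior are invented simplifications, not NWEA's actual scaling. The point is only that capping observed scores erases measured growth for students who start near the top:

```python
import numpy as np

rng = np.random.default_rng(1)
CEILING = 245   # the reading ceiling cited above, treated here as a hard cap (a simplification)

# Invented cohort: fall RIT-like scores plus a uniform 8-point true gain by spring.
fall_true = rng.normal(230.0, 12.0, 5_000)
spring_true = fall_true + 8.0

# Observed scores cannot report above the ceiling.
fall_obs = np.minimum(fall_true, CEILING)
spring_obs = np.minimum(spring_true, CEILING)
measured_growth = spring_obs - fall_obs

near_ceiling = fall_true > CEILING - 5
print("mean measured growth, students well below the ceiling:",
      round(measured_growth[~near_ceiling].mean(), 1))
print("mean measured growth, students at or near the ceiling:",
      round(measured_growth[near_ceiling].mean(), 1))
```

Every student in this toy cohort truly gained the same amount; the ones sitting at the ceiling simply cannot show it, and their teacher's "growth" suffers accordingly.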
I do like many things about what I hear about Finland's approach to education .... but as the above analysis points out .... Finland has a 96% native-born, non-immigrant population - the lowest share of immigrants in Europe.... That fact makes a huge difference in the PISA test results and other assessments used to indicate Finnish superiority.
In the USA the level of teacher dissatisfaction is on the rise .... but apparently no one cares about that ... as blaming the teachers is in vogue.
The political tribes that control education decision making refuse to even discuss the following: Once we correct (even crudely) for demography in the 2009 PISA scores, American students outperform Western Europe by significant margins and tie with Asian students.
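For what "correcting for demography" means arithmetically, here is a toy sketch (Python) with invented numbers - not actual PISA results: instead of comparing each country's raw average, reweight the subgroup averages to one common demographic mix and compare those.

```python
# Toy numbers only -- invented for illustration, not real PISA results.
# Each country: {subgroup: (mean_score, share_of_test_takers)}
country_a = {"group1": (530, 0.55), "group2": (460, 0.45)}
country_b = {"group1": (525, 0.90), "group2": (455, 0.10)}

def raw_mean(country):
    return sum(mean * share for mean, share in country.values())

def adjusted_mean(country, reference_shares):
    # Reweight each country's subgroup means to one common demographic mix.
    return sum(country[group][0] * weight for group, weight in reference_shares.items())

reference = {"group1": 0.70, "group2": 0.30}   # an arbitrary common mix

print("raw means      (A, B):", round(raw_mean(country_a), 1), round(raw_mean(country_b), 1))
print("adjusted means (A, B):", round(adjusted_mean(country_a, reference), 1),
      round(adjusted_mean(country_b, reference), 1))
```

With these invented numbers the country with the lower raw average comes out ahead once both are weighted to the same mix - the composition effect the claim above is pointing at.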
Because the tribes are all about manipulation to push an agenda. {{So like what's up with the NEA and WEA and SEA?}} {{ bought and sold?}}
Anonymous said…
What tests are they basing the evaluations on? And if students opt out of them, how will that affect the teachers' scores?
wondering
seattle parent said…
Dan, what do you know about the claim in the linked CCSS video that geometry will be taught by an "experimental method?" Looking at the OSPI links, I see that Common Core includes probability in the geometry standards (which obviously isn't included in the 2008 WA standards for geometry).
Okay, it's just confusing. In the OSPI transition documents, it shows adding probability into Geometry, though in the CCSS, Probability is listed separately. Is WA adding a bit of probability into each year of high school math, whether it be Geometry, Algebra I, or Algebra II, etc?
Anonymous said…
Testing should be used to evaluate the student, not the teacher. When my sons were in high school, I never felt test results were used to help them individually. A wasted opportunity.
Using these results to punish teachers is an ed reform idea that will backfire. It will chase teachers out of the profession, and many will be excellent teachers.
As Dan noted, teachers should not be penalized for using lousy curricula. In math, it is the textbooks that should receive the failing grades. They are like lead weights attached to the students. This is the area that should be getting attention.
S Parent
Anonymous said…
I watched a web-based video on the new value-added evaluation model, along with the rest of the teaching staff at my school, on the last early release day. The narrator spoke very slowly in a calm and soothing voice, perhaps as a way to convince us that there was nothing to be afraid of with these brilliant plans the SEA and district have in mind for us. Call to mind HAL, the computer from 2001. Those unfortunate teachers who did not manage to produce student growth would be actively nurtured through increased observations and up to $500 in professional development money. We learned about a little girl named Emily, who had "a" learning disability, and how we teachers can add value to her scores. The narrator explained lots of charts and graphs, and was quite convincing that value added was a fair and honorable way to evaluate teachers. It was sickening and frightening.
One can't really question a smoothly narrated and slick video, and sadly no one tried.
I'm betting that Strat360 was paid $100Ks to produce that video. Meanwhile Korsmo and Morris want to know WTF you losers are doing with that $500 stipend!
Anonymous said…
Every certificated teacher in the district had to watch a series of videos explaining the student growth ratings. They were atrocious. The explanations were delivered in an exceedingly slooooooooooooooow monotone. The videos assumed that we teachers understood nothing, that we were very small children who needed to be taught what growth was...you know... a change over time. It was insulting as well as painful to sit through.
Only teachers of tested subjects (math and reading) in grades 4-8 and 9th grade Algebra teachers are affected. I felt really sorry for all the primary teachers, specialists, and other staff with certificates who had to endure an hour of this nonsense.
Beyond the well documented and substantial statistical flaws with VAM, there are some other issues. The most obvious is the shift occurring next year to Common Core standards AND a new test. How can student growth possibly be measured across a shift in standards and assessments? I am not a statistician by any stretch of the imagination, but I am fairly certain you have to have some baselines and norms to work with. All the MSP data will become meaningless next year. Does this mean we start all over? If that is the case, why spend so much money and time rolling out a system that becomes irrelevant? Oh wait...that's what the district does every few years.
Ostensibly the main point of the new evaluation system is to improve teaching quality by enabling administrators to identify and then support and potentially fire ineffective teachers. Yet, the process for removing a bad teacher is just as time consuming, complicated, and bureaucratic as it was before. Again, it seems like a huge waste of resources to deal with a very small group of teachers. Assuming we had a 100% competent and effective teaching corps, would we need this system? How exactly will a student growth rating help a teacher improve their practice if they are already an effective teacher who is actively working to improve every year?
Finally, I am not at all convinced that this information will remain private. The potential implications of it leaking out are huge. Imagine that your child is entering 4th grade. There are 2 teachers with "high" growth ratings, 2 with "typical" and 1 with "low". If you had access to this information, would you be okay if your child was in the "low" teacher's classroom? Many parents have been in this situation. I think everyone knows who the best and worst teachers are in any given school. Now there is another layer and it has numbers attached to it.
It has been said a million times, but I will say it again. Good teaching is dependent on cooperation, collaboration, the free exchange of information and ideas, and mutual professional respect. I am deeply concerned that a system that rates teachers, categorizes them, and ultimately encourages competition will undermine the relationships we have with one another. This will only hurt our students.
There has got to be a way to identify bad teachers and expedite their departure from the profession that doesn't cost millions of dollars, waste valuable time, and pit teachers against each other. Seriously.
-dismayed teacher
Anonymous said…
It was my understanding that NWEA MAP scores were the basis of teacher evaluations, not MSP. Isn't this why students that didn't have a Spring MAP score were pulled for Fall testing, so they had data points needed for year-to-year growth measurements?
The NWEA MAP isn't any more aligned to the WA standards than the CCSS, so the transition to new standards may be a moot point.
Did the videos/training not say what test would be used for evaluations?
dismayed parent
Anonymous said…
Dismayed Parent,
They use both the MSP and the MAP. Linking the MAP to the evaluation system all but ensures we have to continue our expensive contract with NWEA and continue to use MAP scores to evaluate teachers, which they are not intended or designed to do.
For a condescending video, put together with help from the Center For Teaching Quality, that explains the PGE evaluation process in baby talk, see the Seattle Education Association. The Center For Teaching Quality was given over three million dollars by the Gates Foundation in August. So I guess the SEA is in bed with Gates, too. Who knew? From the CBA (Collective Bargaining Agreement):
“1. Teachers of tested subjects and grade levels are those for whom two or more common state or district assessments are available.
2. Teachers of tested subjects and grade levels will receive a rating on their student academic growth of either low, typical, or high based on the assessments available to that teacher. Students will be compared to their academic peers – e.g. students in the same grade who performed at a similar level in the subject in previous years.
3. Student growth ratings will be based on a two-year rolling average.
4. Students must be enrolled 80% of the time and must be in attendance 80% of that time to have their assessments counted in the teacher’s growth rating.
5. SPS will calculate each teacher’s rating by using a valid, reliable and transparent methodology as agreed upon by SEA and SPS.
6. To ensure that teachers of challenging student populations are evaluated fairly, aggregate student growth results will factor in the student composition of the teacher’s classroom(s), including the proportion of English learners, students who qualify for free/reduced lunches, and students with disabilities…"
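For what rules 3 and 4 above would mean operationally, here is a minimal sketch (Python). The record fields and the per-student growth number are hypothetical stand-ins, since the CBA only promises "a valid, reliable and transparent methodology" without publishing one:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class StudentRecord:
    growth: float          # per-student growth number (a stand-in; the CBA doesn't define it)
    enrolled_pct: float    # share of the year enrolled with this teacher
    attendance_pct: float  # attendance during that enrollment

def counts_toward_rating(s: StudentRecord) -> bool:
    # Rule 4: enrolled at least 80% of the time and in attendance 80% of that time.
    return s.enrolled_pct >= 0.80 and s.attendance_pct >= 0.80

def yearly_growth(students: list[StudentRecord]) -> float:
    eligible = [s.growth for s in students if counts_toward_rating(s)]
    return mean(eligible) if eligible else float("nan")

def two_year_rating(year1: list[StudentRecord], year2: list[StudentRecord]) -> float:
    # Rule 3: the student growth rating is a two-year rolling average.
    return mean([yearly_growth(year1), yearly_growth(year2)])
```

Rule 2's comparison to "academic peers" would sit inside whatever produces that per-student growth number, which is exactly the part the contract leaves unspecified.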
Note especially this part of the PGE eval on value added: "6. To ensure that teachers of challenging student populations are evaluated fairly, aggregate student growth results will factor in the student composition of the teacher’s classroom(s), including the proportion of English learners, students who qualify for free/reduced lunches, and students with disabilities…"
So Blackness will be assigned a value, as will poverty. A student who is "Asian" and "poor" will be pro-rated by a certain amount, whilst a student who is an English Language Learner and "White" will be assigned another value.
Opt out. These systems are racist and dehumanizing.
Anonymous said…
I'm a specialist, so I don't think this will personally affect me yet. However, I do share students with many challenges (academic and social/emotional) with general education teachers. Very few of my students have "a" learning disability. Most have a whole list of areas in which they need to be served. Even those who do not necessarily qualify may be significantly behind. Often my students have the capacity (I'm not sure if capacity is the right word) to produce higher scores, but they are impulsive and rush through. Kids really can finish the MAP in under 6 minutes. They are also quite capable of handing in their MSP within minutes of hearing the instructions. MAP scores can bounce all over the place, and seem to reflect the moods of my students rather than their skills.
It can be hard enough, especially with the large class sizes in this district, for general education teachers to meet the educational and emotional needs of all the students in their classes. Even with special services, there is only so much I as a specialist can do. This new evaluation system is certainly not doing any teacher or student any favors.
As for the up to $500 of PD money, I bet that there's a similar catch as to the career ladder positions. I was excited when I learned that I qualified for a career ladder position because I was interested in doing some mentoring. Later, I learned that qualifying merely gave me the opportunity to compete with other "innovative" teachers for a limited number of career ladder positions. Well, maybe I'll win a million dollars if I keep ordering magazines.
One more thing: the amount of PD time that has been spent and will be spent on learning the new evaluation system -- Charlotte Danielson and VAM -- really revolts me. I can think of a lot of other ways to actually spend time adding value to my practice.
"the student composition..." What ARE students composed of? What does it mean to identify a student as "Black" or White" or "F/RL" or "disabled"? Will the number crunchers merely make assumptions about a student based on these categories, or will they talk to the student and actually see what makes them tick? I'm guessing the former.
So teachers are evaluated on assumptions made about who a student is based on the checkboxes the student's parent/guardian checks.
How very regressive. How classist. How racist. How dismissive of the range of disabilities.
Speaking of "reform," here comes Lynne Varner - in the guise of the Times' editorial board - spouting about charters.
Anonymous said…
Are only teachers that teach math, science, reading, and writing evaluated according to students' test scores? What about elementary schools that do walk-to-math or walk-to-reading, in which students are ability grouped and the groups are flexible throughout the year? pt
Anonymous said…
It will be interesting to see how this pans out. The evaluation system is highly subjective as it is. I am a former SPS teacher who received passing ratings from one principal and all failing ratings from another. I was told I was failing a month into the school year and received minimal help (at times verbal abuse from my administrator) and no support from the union (they told me their hands were tied and they didn't like the new system). I'm skeptical about this $500 PD support...
-EvaluationIsFun!
Anonymous said…
Hmm. With an ineffective union, I am wondering if I should have my "dues" directed elsewhere. Why pay the piper if he won't come to the dance?
Molly
Fifth Grade Teacher said…
The Union is such a disappointment. How could they have bargained this? And the monthly dues are not cheap. Anyone on here want to share their student growth scores? I was given a growth score of 56, which places me as a "typical" teacher. The growth scores are based entirely on an average of the MAP and MSP scores for two years. Here's the kicker -- a student can score a "4" (advanced) one year and a "4" (advanced) the following year and be counted as negative growth because the advanced score was not advanced enough. If they missed no problems on the MSP in fourth grade but missed one problem in fifth grade (it is a much harder test in fifth grade), then they get negative growth. So stressful!
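That "advanced but not advanced enough" outcome is what any peer-referenced growth measure produces. A toy sketch (Python, with invented scores - not the district's formula): a student who repeats the same high score lands below the median gain of the "academic peers" who started at the same level, so the flat score reads as negative growth.

```python
from statistics import median

# Invented scores for students who all started at the same high level last year:
# (prior_score, current_score) pairs for the "academic peer" group.
peers = [(420, 432), (420, 428), (420, 425), (420, 430), (420, 420)]
peer_median_gain = median(current - prior for prior, current in peers)

student_prior, student_current = 420, 420      # "advanced" both years, no change
student_gain = student_current - student_prior

# Peer-referenced growth: the student's gain relative to similar students' gains.
relative_growth = student_gain - peer_median_gain
print("peer median gain:", peer_median_gain)           # 8
print("student's relative growth:", relative_growth)   # -8, which reads as "negative growth"
```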
Statistically speaking, this stuff is meaningless. It is the worst kind of anti-intellectual-posing-as-educated play acting to wave around these numbers and pretend like they have meaning.
It is the EMPEROR'S NEW CLOTHES.
Teachers should know this research and refute it every time a puppet from Oz gets on stage to blow more smoke & thunder.
Which raises the question: if you know this is manure, and you don't say or do anything, who are you, exactly?
Anonymous said…
On Thurs. at 1:58 SEA sent out a CYA email about the upcoming data release.
I did love this gem:
"If you have questions about your student growth rating you can email studentgrowth@seattleschools.org for more information. You can also use the same email and request a personal conversation with a district official who can explain how your particular growth rating was calculated. "
Why aren't the formulas public? Oh! Wait! Why isn't there a blog for SEA members to discuss issues? Why is everything from Jonathan as last-minute as possible? Why did 60% of SEA members not bother voting in the March SEA officer elections?
My problem with most teachers is not that they are ineffective in the classroom, but that they are politically complacent. They had an opportunity to vote out the current union leadership earlier this year, but how many actually even bothered to vote?
It's time to wake up teachers--you got the leadership you deserve, and that's bad enough for you all, but it has severe negative impacts on the rest of SPS. You have a responsibility not just to yourselves but to the whole community.
"Good teaching is dependent on cooperation, collaboration, the free exchange of information and ideas, and mutual professional respect. I am deeply concerned that a system that rates teachers, categorizes them, and ultimately encourages competition will undermine the relationships we have with one another. This will only hurt our students."
This is precisely what Sahlberg (the Finnish Education department guy) said at his talk.
Anonymous said…
Amen, Jack Whelan.
However, I wouldn't say "teachers" in general are politically complacent--Wisconsin and Chicago are just two examples that refute that idea.
However, teachers in Seattle often act like the politics of the job are beneath them. Then, when the chickens come home to roost, they act like victims.
What a pathetic shame they chose Jonathan Knapp over Eric Muh--that is, the ones who even bothered to vote.
Hmmm. I think these videos would be great on YouTube. Will have to look into that...
WV: edutosu - no kidding!
No Longer Complacent Teacher said…
There is a union meeting tomorrow at 5PM at Blaine Elementary. I'm planning on going. Yes, I didn't vote. I wasn't even aware we had a vote. Our building doesn't even have a union rep! I sent an e-mail to SEA asking about that fact and they never responded. I believe I'm paying $80 a month in union dues, if not more. That's a lot for me, who is on a single income with a family of four. Will the bloggers on here go to the meeting tomorrow to voice their concerns? I plan to go. Is there a way to get the word out about the meeting tomorrow?
My student's MAP scores did decline when she had an inexperienced teacher one year. The teacher was not bad - just inexperienced. I immediately hired a tutor to take up the slack and her MAP scores shot up again.
The MAP score fluctuations cannot be applied to rate teachers for precisely this reason. Most of the students in my student's school receive massive assistance and backup training from their parents. The district's students are not a controlled population.
However, I do not support boycotting the exams because these data are useful to me as a parent to know when to take action.
I ran for president, after 3 years of activism. I have already taken one (many) for the team, and as far as I'm concerned, can sit on the sidelines for a while, until a few more teachers wake up.
3/4 of the SEA-represented staff couldn't be bothered even to vote. That's while working under a super-crappy, concession-filled current contract, without even a penny's worth of cost-of-living allowance, delivered by this SEA leadership.
All signs point to another concession-filled SEA collapse at the negotiating table this August.
I'm depressed, and done with SEA-directed activism, until I see something more than the same small handful of teachers step up and resist.
Anonymous said…
No Longer Complacent ...
There is an alex someone on SEA's webpage in charge of member stuff.
If your building had representatives, then those people would have received the last-minute, last-second agenda for the Monday meeting late Thursday or late Friday, so that no one has time to discuss anything and no one has time to mount counter-proposals to whatever Knapp-WEA-Crap is getting pushed down our throats.
I'd recommend sending an allstaff asking if people are interested in running for the AR positions (once you find out how many are open), putting the election date about 3 weeks out, and giving people at least a week to nominate / step up.
At my school, not enough people step up for an election, so whoever is willing gets 1 of the 4 AR spots we have.
There are all kinds of "duties" of Knapp-WEA-Crap which come down the pike which I'm supposed to do - I forward it to allstaff and forget about it. I tell ALL my staff if they want things done differently, they can be the AR - in 3 years NO ONE steps up - except for a few newbies & ... how long will they last at monthly meetings which are designed to ensure that whoever is around at the end of the rambling Arne-love-fest wins?
pt writes, "Are only teachers that teach math, science, reading, and writing evaluated according to student' test scores? What about elementary schools that do walk to math or reading in which students are ability grouped and the groups are flexible throughout the year?"
and "word" wrote, "My student's MAP scores did decline when she had an inexperienced teacher one year. The teacher was not bad - just inexperienced. I immediately hired a tutor to take up the slack and her MAP scores shot up again. The MAP score fluctuations cannot be applied to rate teachers for precisely this reason"
Exactly. I have heard that someone asked the District this question: If a student is in a remedial reading class and a Language Arts class, who gets the credit (or discredit) for student growth or decline? Answer: The two teachers share it!
Now imagine: Student last year had two parents at home, one a reader with lots of books. Parents divorce. Reader leaves. THIS year, student isn't supported at home, but has an after-school tutor, a remedial reading class, and a regular LA class. On day of test, student has the flu.
What would you make of THOSE scores?
Lastly, the district is pushing for literacy across disciplines - standards-based literacy tools used in many different disciplines. So if History and Science are using tools to help students become more literate, generally, then how do they divide up THAT score? Why is district asking teachers to all teach literacy, then attributing growth in reading (and writing, as a corollary) to just one teacher?
And what DOES a value-added score look like in an art class? What test will they design to rate the art teacher?
I back up Eric M here. After fighting costly, burdensome battles re: MAP, TFA, etc. and getting NO interest from SEA representation, I have to agree: why the H*ll should I worry if those directly impacted aren't interested enough to vote?! To care?! Most of us work fulltime, have mortgage$$ and worrisome responsibilities. But parents cannot be expected to bear the burdens of those teachers who do not wish to ensure their interests are fairly represented. Even now, more teachers unions in the area are too weak-kneed to ensure administrators follow terms of contracts...
Anonymous said…
Word, Miramac and Eric M. Word.
Teachers, you have made your bed; now you get to lie in it. Please don't bother to boo hoo to me (and I think that this evaluation system SUCKS!) if you haven't voted in your own damn union elections.
Moose
Parent of Two said…
I have two sons. One son had Teacher X and had a miserable year. He would come home with headaches, confused about his school work, and spend hours crying about how he was failing. We helped him at home to explain the work he was doing so that he would not get so frustrated and have these melt-downs about not wanting to go to school. We requested the other teacher for our other son this year. This other teacher is just wonderful. Our younger son looks forward to going to school, comes home excited about what he is learning, has become a passionate reader and loves math for the first time. So, guess who gets the "poor teacher" rating and who gets the "strong" teacher rating? Our son's teacher confided at Parent-Teacher conferences on Friday that he was rated "Poor" and that he was asked to watch Teacher X so that he could become more like her! He was really torn up about this low rating because you can see how much work he puts into his teaching. He is always there late at night and pays home visits to students to make sure they are doing well. So, I predict we will be producing some really boring, dull teachers in Seattle! It's not the teachers in the end who will suffer--it's the next generation of children. I'm sure glad I don't have to go to school anymore!
Anonymous said…
This contract to which we are referring (with the VAM and Creative Approach Schools included) was brokered under Superintendent Maria Goodloe-Johnson's tenure (with Glenn Bafia, Olga Addae, and new VP Jonathan Knapp, who replaced the late Brian Lindquist, on board).
What we are dealing with now is what happens when you don't have a chance to read the contract before you sign it. Some folks figure that what they don't know won't hurt them! Union representatives had a briefing on the tentative agreements two days before the general meeting. School had not started, so even though there was a large group at the G.A., not all representatives were able to have meetings with building people to discuss it.
The members of the bargaining team will get up and start singing the blues about how hard they worked to get the tentative agreement that they did: "We got the best deal we could." The framing here is that voting against the proposed agreement means that we are being disrespectful of the bargaining team's hard work--so that if they are sent back to the drawing board this means that their time has been wasted.
Some of us tried to hold off the vote by a week so that there would be more time to go over the contract. Nothing doing.
At the looooooooong meeting, those who wanted to stop the examination of those troublesome contract areas tried to end debate. Those who were simply tired of being there voted to end debate. Sometimes when certain members get up to talk, it annoys other members who will automatically vote against them. This is all business as usual.
When asked about the VAM, Glenn Bafia's response was that principals can already look at that data.
Bilingual IAs along with Special Ed IAs were thrown under the bus.
Please note that the vote on this current contract was done by voice and not by secret ballot. Yes, it passed. Look at the context of its passage. And yes, to our peril.
I usually go to RA unless I am sick. I participate and will continue to do so. I contribute to this blog using this name and one other. I voted for Eric.
CORE in Chicago has won two hard battles: 1) They won control over their union. This was a building-by-building approach. It took a lot of work to defeat the embedded crew in power. CORE did not even think it would win. 2) They went on strike with a 90% vote -- and won this strike working with parents and communities. There is still work to do there. It doesn't stop.
The work that needs to be done in Seattle to build and strengthen us has to happen at the building level. Some buildings are completely apathetic. They want someone to do the work but they cannot be counted on to help with anything. It is going to be slow. It cannot just be at the top. Democracy doesn't trickle down. We need it to trickle up.
--Modern Sound in Rio de Janeiro
Anonymous said…
Parent of Two:
Your kiddo's teacher will probably also have to find their own time to do that observation (during their plan or break time). I'm surprised he confided in you; it killed me to tell families that I was just moving on at the end of the year when they asked why I was leaving. I told two families the truth about my evaluation, as they stopped by when I was cleaning out my room. My administrator told me that my students were going to have wasted a whole year by having me as their teacher (among other things; funny enough, almost all of my kids made progress as judged by end-of-year tests and reading levels).
Dan, what do you know about the claim in the linked CCSS video that geometry will be taught by an "experimental method?"
YUP-- here is the scoop on Geometry =>
Hung-Hsi Wu is a professor at Cal Berkeley and this Geometry approach is his baby. Completely unproven and untried as far as I know. Web page for Wu.
Looking at the OSPI links, I see that Common Core includes probability in the geometry standards (which obviously isn't included in the 2008 WA standards for geometry). I have no idea what is happening at the HS math level in CCSS. Given that Phil Daro of the Dana Center, which brought WA state such math crap during the Bergeson years, was on the original math design team (back when this was all secret), and Dr. Joe Wilhoft (of WASL assessment years) is now the head of the Smarter Balanced Assessment Consortium, my interest in following this is near zero.
As far as I know, nothing has actually been done yet at the high school level to assign the standards to actual courses. The whole idea that 100% of the student population should be required to complete high levels of mathematics seems absurd to me. I see this as likely to lower the quality of both classes and instruction (as many uninterested students will be forced into classes in which they have little interest).
I think anyone who believes this will be a good use of dollars and will produce a reasonable academic bang for the buck ... needs to take a closer look at the assumption that CCSS is likely to drive improvement.
Mathematics Common Core State Standards WA
Okay, it's just confusing. In the OSPI transition documents, it shows adding probability into Geometry, though in the CCSS, Probability is listed separately. Is WA adding a bit of probability into each year of high school math, whether it be Geometry, Algebra I, or Algebra II, etc?
I have no idea.
math guy said…
The "experimental method" of teaching geometry appears to amount to a desperate plea to use a definition of congruence that works through to the college level.
While all change involves risk, I'm thinking the risk/benefit analysis comes out positive on this one.
Comments
In real science, when a real scientist puts his/her name to some result or idea, then other real scientists look critically and ... if they accept those results or ideas, they accept 'em and move on.
In education "research" charlatans who can kind of do some algebra, and who can fiddle with spreadsheets, and who can muck with stat programs - these charlatans make idiotic claim after idiotic claim, and the charlatans defend their idiocy with "the research shows".
ewing is a real scientist.
the political incompetents of the NEA / WEA are allowing charlatans and their bought and paid for lackeys to define the terms of debate.
Real scientists are doing their jobs, charlatans and hucksters are doing their jobs - when will the unions
DoTheirJobs?
=====
Several studies have reported that the annual student growth of a teacher's students ... varies considerably from year to year. In fact teachers scoring in the top 1/5 one year may be in the bottom 2/5 the next year.
BUT hey -- this is education and politics drives policy .... until decisions are made in a rational manner .... policies like this piece of nonsense will continue.
This policy generates good revenue for testing companies.
The Common Core State Standards are based on thinking that the unproven and untested will work.
CCSS will be great for vendors.
====
Allen says:
It’s not a matter of “Until decision makers concern themselves with how much the students are learning” because, unless the structure of public education changes that’ll happen never.
Those decision-makers are acting rationally by ignoring educational outcomes. They’re operating within the context of the current model of public education and that model has no incentive for anyone to concern themselves with educational outcomes. Since there’s no incentive to concern themselves with education they don’t.
What’s irrational is the expectation that any positive changes can be made if only the proper tinkering about the periphery of public education occurs. It won’t and the institutional indifference to educational outcomes is why no teacher accountability scheme or curriculum improvement or any other such gee-gaw will ever work; the lack of incentive to pursue educational efficacy remains and that will result in the undoing of every effort to improve the extant public education system.
=========
There are lots of teachers concerned with student learning .... but most find themselves at odds with leadership. .... Too many leaders are pushing the "completely ineffective company line" .... which is great for vendors and badly underserves students.
So the teachers get "different" evaluations (woopi do)... but as W. Edwards Deming stated at most 14% of a system's problems are due inadequacies of the worker. The problem is the system and that is why these "different" evaluations are a "bogus" attempt.
---- Where is there a mechanism that holds the "administrative" decision-makers or "union" leaders responsible for producing an improved system?
=====
Notice the claim that US ranks 24 out of 32 countries on PISA testing is based on "typical crappy" statistical analysis..... It is all politics.
- José Banda, Superintendent Seattle Public Schools
NO experts anywhere in the field of K-12 educational assessment consider just two years of this type of data as valid.
The idea that the the leaders of our LMO (Labor Management Organization, the Seattle Education Association) would agree to this just shows how little concern they have for science, mathematics, statistics, or teachers in classrooms.
Seattle Public Schools should not release the two year VAM data on its teachers as planned.
“A study examining data from five separate school districts [over three years], found for example, that of teachers who scored in the bottom 20% of rankings in one year, only 20-30% had similar ratings the next year, while 25 – 45% of these teachers moved to the top part of the distribution, scoring well above average.” -
National Research Council, Board on Testing and Assessment (2009).
Report to the U.S. Department of Education.
Links to the full reports at The National Academy Press for source research publications and books:
http://www.nap.edu/openbook.php?record_id=12780&page=1
http://216.78.200.159/Documents/RandD/Other/Getting%20Value%20out%20of%20Value-Added.pdf
http://www.naeducation.org/Background_Paper_Getting_Teacher_Evaluation_Right.pdf\
The list below of peer-reviewed, academically sound research and reports on the use and abuse of VAM in K-12 in long and compelling.
We don’t understand how or why anyone whose job it is within a school system to collect and meaningfully apply teacher and student assessments to improve student learning is allowed to keep their job without ever doing the needed due diligence and to inform themselves about the core facts of their work. Absurd, really.
Virtually all the research VAM as applied to teacher evaluation indicates that the planned SPS action will seriously mislead the public - not inform them as apparently has been falsely assumed.
Economic Policy Institute:
“Analyses of VAM results show that they are often unstable across time, classes and tests; thus, test scores, even with the addition of VAM, are not accurate indicators of teacher effectiveness. Student test scores, even with VAM, cannot fully account for the wide range of factors that influence student learning, particularly the backgrounds of students, school supports and the effects of summer learning loss…. Furthermore, VAM does not take into account nonrandom sorting of teachers to students across schools and students to teachers within schools.”
http://www.epi.org/publication/bp278/
Annenberg Institute: this is an excellent recent and major review of current principles and practices of VAM measure as relevant to K-12 educational reform.
“At least in the abstract, value-added assessment of teacher effectiveness has great potential to improve instruction and, ultimately, student achievement. However, the promise that value-added systems can provide such a precise, meaningful, and comprehensive picture is not supported by the data.”
http://annenberginstitute.org/pdf/valueaddedreport.pdf
The Kappan: PDK International
From “The Kappan,” the excellent magazine of PDK International (a must subscription for SPS board members and administrators in my view) is that after reviewing the critical problems with VAM… it does not abandon the idea improving teacher evaluations as part of the effort to improve K-12 education and instead presents practices that are more likely to actually accomplish those goals.
1. Value-added models of teacher effectiveness are inconsistent…
2. Teachers’ value-added performance is affected by the students assigned to them…
3. Value-Added Ratings Can’t Disentangle the Many Influences on Student Progress…
http://www.edweek.org/ew/articles/2012/03/01/kappan_hammond.html
“The results here suggest that it is hazardous to interpret typical value added estimates as indicative of causal effects… assumptions yield large biases…. More evidence, from studies more directly targeted at the assumptions of value added modeling, is badly needed, as are richer VAMs that can account for real world assignments. In the meantime, causal claims will be tenuous at best.”
http://www.nber.org/papers/w14666
Test Score Ceiling Effects of Value Added Measures of School Quality
From: U. of California, U. of Missouri, and the American Statistical Association
This is a pure research that is often cited by experts but is not an easy read for a non-educator or lay person. Its critical findings are around test score ceilings and non-random populations of students (think Roosevelt vs Rainier Beach). This creates statistical problems and misconception when amalgamating or disaggregating student/teacher data from test scores.
http://www.amstat.org/sections/srms/proceedings/y2008/Files/301495.pdf
The Problems with Value-Added Assessment - Diane Ravitch
With her perspective as an education historian this is a recent, thoughtful and fact based review of VAM use.
“I concluded that value-added assessment should not be used at all. Never. It has a wide margin of error. It is unstable. A teacher who is highly effective one year may get a different rating the next year depending on which students are assigned to his or her class. Ratings may differ if the tests differ. To the extent it is used, it will narrow the curriculum and promote teaching to tests. Teachers will be mislabeled and stigmatized. Many factors that influence student scores will not be counted at all.”
http://blogs.edweek.org/edweek/Bridging-Differences/2010/10/dear_deborah_you_asked_what.html
Research Calls Data-Driven Education Reforms into Question
Recent reports by National Academies, National Research Council and the National Center on Education and the Economy.
“Both organizations are respected for their high quality, comprehensive, and non-ideological research. Together, they reach the undeniable conclusion that today's array of testing and choice fails to meet the instructional needs of American students and the national goal of broadly-based high academic achievement.”
http://www.huffingtonpost.com/david-bloomfield/education-reform-standardized-testing_b_882718.html
Why Teacher Evaluation Shouldn’t Rest on Student Test Scores
FairTest.Org has a clearly stated agenda, but that does not discount this excellent list of the practical problems applying VAM (as currently used) to teacher evaluation and concludes with a list of solid, academically sound research references.
http://www.fairtest.org/sites/default/files/Teacher-Evaluation-FactSheet-10-12.pdf
Edutopia
An excellent, unbiased resource on educational issues and the relevant research. The George Lucas Educational Foundation is dedicated to improving the K-12 learning process by documenting, disseminating, and advocating for innovative, replicable, and evidence-based strategies that prepare students to thrive in their future education, careers, and adult lives. Edutopia’s byline is, “What Works in Education.”
“Value-added modeling” is indeed all the rage in teacher evaluation: The Obama administration supports it, and the Los Angeles Times used it to grade more than 6,000 California teachers in a controversial project. States are changing laws in order to make standardized tests an important part of teacher evaluation. Unfortunately, this rush is being done without evidence that it works well. “
http://www.edutopia.org/groups/race-top/30432
They are political tools, not evaluative tools.
And the idea being floated that SPS/SEA has come up with a better way is risible.
Always leaping before looking --- for a look might reveal the real inadequacies and there is no interest in that on the part of decision makers.
I just posted data "Algebra for all 8th graders" in Tacoma under Open Thread Wednesday.-- zero evidence used to make this decision and zero used to evaluate its progress.
The SEA leadership made not a peep about Everyday Math or Discovering .... Yet the SEA leaders figured teacher-members should have part of their evaluation based on their ability to make defective materials work. Hey the defective evaluative system corresponds to the defective materials - very consistent.
How much are those monthly union dues?
And, as Dan and Eric have forcefully pointed out, the teacher assessments using VAM don't compute either.
The key is respect, not threats.
For the reading MAP, NWEA states its ceiling is a RIT score of 245. Above that number, they can only say a student scored high, but not that one score above 245 is any more meaningful than another score above 245. When a 5th grader is getting questions on Shakespeare sonnets, can you fault the teacher for not showing growth in that student's scores? If you can't measure growth, how do you evaluate growth?
-wants to know
"Sahlberg... is right to criticize our failure to build smart tests that can diagnose learning problems and measure the quality of teaching."
Okay, maybe she doesn't outright SAY they do, but she sure implies they do - or that they really wish they could, or something.
http://super-economy.blogspot.com/2010/12/amazing-truth-about-pisa-scores-usa.html
I do like many things about what I hear about Finland's approach to education .... but as the above analysis points out .... Finland has a 96% native born non-immigrant population = The lowest in Europe.... That fact makes a huge difference on the PISA test results and other assessments used to indicate Finnish superiority..
In the USA the level of teacher dissatisfaction is on the rise .... but apparently no one cares about that ... as blaming the teachers is in vogue.
The political tribes that control education decision making refuse to even discuss the following:
Once we correct (even crudely) for demography in the 2009 PISA scores, American students outperform Western Europe by significant margins and tie with Asian students.
Because the tribes are all about manipulation to push an agenda. {{So like what's up with the NEA and WEA and SEA?}} {{ bought and sold?}}
wondering
Mathematics Common Core State Standards WA
Using these results to punish teachers is an ed reform idea that will backfire. It will chase teachers out of the profession, and many will be excellent teachers.
As Dan noted, teachers should not be penalized for using lousy curricula. In math, it is the textbooks that should receive the failing grades. They are like lead weights attached to the students. This is the area that should be getting attention.
S Parent
One can't really question a smoothly narrated and slick video, and sadly no one tried.
--Sorrel
I'm betting that Strat360 was paid $100Ks to produce that video. Meanwhile Korsmo and Morris want to know WTF you losers are doing with that $500 stipend!
Only teachers of tested subjects (math and reading) in grades 4-8 and 9th grade Algebra teachers are affected. I felt really sorry for all the primary teachers, specialists, and other staff with certificates who had to endure an hour of this nonsense.
Beyond the well-documented and substantial statistical flaws with VAM, there are some other issues. The most obvious is the shift occurring next year to Common Core standards AND a new test. How can student growth possibly be measured across a shift in both standards and assessments? I am not a statistician by any stretch of the imagination, but I am fairly certain you have to have some baselines and norms to work with. All the MSP data will become meaningless next year. Does this mean we start all over? If that is the case, why spend so much money and time rolling out a system that becomes irrelevant? Oh wait...that's what the district does every few years.
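For what it's worth, here is a minimal sketch of the baseline problem, using invented numbers rather than anything the district has published: growth models compare a student's position against norms on a fixed scale, and when both the test and the norms change, last year's and this year's scores cannot simply be subtracted.

```python
# Hypothetical illustration of why growth needs a stable scale and norms.
# The "norm" samples below are invented, not real MSP or new-test data.
from statistics import mean, stdev

def standardize(score, norm_scores):
    """Express a score relative to a norm group (z-score)."""
    return (score - mean(norm_scores)) / stdev(norm_scores)

msp_norms_2013 = [375, 390, 400, 410, 425]            # old-test norm sample (made up)
new_test_norms_2014 = [2410, 2450, 2480, 2510, 2550]  # new-test norm sample (made up)

# The raw difference 2510 - 410 = 2100 is meaningless across two different tests.
# Only standardized positions are even arguably comparable, and that already
# assumes the two tests measure the same thing on comparably normed populations --
# exactly the assumption a simultaneous standards-and-test switch breaks.
print(standardize(410, msp_norms_2013), standardize(2510, new_test_norms_2014))
```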
Ostensibly the main point of the new evaluation system is to improve teaching quality by enabling administrators to identify and then support and potentially fire ineffective teachers. Yet, the process for removing a bad teacher is just as time consuming, complicated, and bureaucratic as it was before. Again, it seems like a huge waste of resources to deal with a very small group of teachers.
Assuming we had a 100% competent and effective teaching corps, would we need this system? How exactly will a student growth rating help a teacher improve their practice if they are already an effective teacher who is actively working to improve every year?
Finally, I am not at all convinced that this information will remain private. The potential implications of it leaking out are huge. Imagine that your child is entering 4th grade. There are 2 teachers with "high" growth ratings, 2 with "typical" and 1 with "low". If you had access to this information, would you be okay if your child was in the "low" teacher's classroom? Many parents have been in this situation. I think everyone knows who the best and worst teachers are in any given school. Now there is another layer and it has numbers attached to it.
It has been said a million times, but I will say it again. Good teaching is dependent on cooperation, collaboration, the free exchange of information and ideas, and mutual professional respect. I am deeply concerned that a system that rates teachers, categorizes them, and ultimately encourages competition will undermine the relationships we have with one another. This will only hurt our students.
There has got to be a way to identify bad teachers and expedite their departure from the profession that doesn't cost millions of dollars, waste valuable time, and pit teachers against each other. Seriously.
-dismayed teacher
The NWEA MAP isn't any more aligned to the WA standards than the CCSS, so the transition to new standards may be a moot point.
Did the videos/training not say what test would be used for evaluations?
dismayed parent
They use both the MSP and the MAP. Linking the MAP to the evaluation system all but ensures we have to continue our expensive contract with NWEA and continue to use MAP scores to evaluate teachers, which they are not intended or designed to do.
-dismayed teacher
The Center For Teaching Quality was given over three million dollars by the Gates Foundation in August.
So I guess the SEA is in bed with Gates, too. Who knew?
From the CBA (Collective Bargaining Agreement):
“1. Teachers of tested subjects and grade levels are those for whom two or more common state or district assessments are available.
2. Teachers of tested subjects and grade levels will receive a rating on their student academic growth of either low, typical, or high based on the assessments available to that teacher. Students will be compared to their academic peers – e.g. students in the same grade who performed at a similar level in the subject in previous years.
3. Student growth ratings will be based on a two-year rolling average.
4. Students must be enrolled 80% of the time and must be in attendance 80% of that time to have their assessments counted in the teacher’s growth rating.
5. SPS will calculate each teacher’s rating by using a valid, reliable and transparent methodology as agreed upon by SEA and SPS.
6. To ensure that teachers of challenging student populations are evaluated fairly, aggregate student growth results will factor in the student composition of the teacher’s classroom(s), including the proportion of English learners, students who qualify for free/reduced lunches, and students with disabilities…"
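To make items 2 through 4 concrete, here is a minimal sketch of what a peer-compared, two-year rolling-average growth rating could look like. This is a hypothetical illustration only: the actual SPS/SEA methodology (item 5) has not been published, and the peer banding, cut points, and item-6 demographic adjustment below are invented placeholders.

```python
# Hypothetical sketch of a growth rating along the lines of CBA items 2-4.
# Peer banding, cut points, and any item-6 adjustment are placeholders, not
# SPS's actual method, which has not been made public.
from statistics import mean, median

def growth_vs_peers(prior, current, peer_gains):
    """One student's gain relative to the median gain of similar-starting peers."""
    return (current - prior) - median(peer_gains)

def yearly_teacher_growth(students):
    """Average relative growth over students meeting the 80% enrollment and
    80% attendance rules (item 4); None if no student qualifies."""
    eligible = [s for s in students
                if s["enrolled_pct"] >= 0.8 and s["attended_pct"] >= 0.8]
    if not eligible:
        return None
    return mean(growth_vs_peers(s["prior"], s["current"], s["peer_gains"])
                for s in eligible)

def rating(two_year_scores, low_cut=-2.0, high_cut=2.0):
    """Two-year rolling average (item 3) bucketed into low/typical/high (item 2).
    The cut points are invented for illustration."""
    avg = mean(two_year_scores)
    return "low" if avg < low_cut else "high" if avg > high_cut else "typical"
```

Even in this toy version, the rating hinges on choices teachers have not been shown: how "academic peers" are banded, where the low/typical/high cut points sit, and how the item-6 adjustment is actually applied.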
WV says this is all an educarb
"6. To ensure that teachers of challenging student populations are evaluated fairly, aggregate student growth results will factor in the student composition of the teacher’s classroom(s), including the proportion of English learners, students who qualify for free/reduced lunches, and students with disabilities…"
So Blackness will be assigned a value, as will poverty. A student who is "Asian" and "poor" will be pro-rated by a certain amount, whilst a student who is an English Language Learner and "White" will be assigned another value.
Opt out. These systems are racist and dehumanizing.
It can be hard enough, especially with the large class sizes in this district, for general education teachers to meet the educational and emotional needs of all the students in their classes. Even with special services, there is only so much I as a specialist can do. This new evaluation system is certainly not doing any teacher or student any favors.
As for the up to $500 of PD money, I bet that there's a similar catch as to the career ladder positions. I was excited when I learned that I qualified for a career ladder position because I was interested in doing some mentoring. Later, I learned that qualifying merely gave me the opportunity to compete with other "innovative" teachers for a limited number of career ladder positions. Well, maybe I'll win a million dollars if I keep ordering magazines.
One more thing: the amount of PD time that has been spent, and will be spent, on learning the new evaluation system -- Charlotte Danielson and VAM -- really revolts me. I can think of a lot of other ways to actually spend time adding value to my practice.
Sorrel
What ARE students composed of?
What does it mean to identify a student as "Black" or White" or "F/RL" or "disabled"?
Will the number crunchers merely make assumptions about a student based on these categories, or will they talk to the student and actually see what makes them tick?
I'm guessing the former.
So teachers are evaluated on assumptions made about who a student is based on the checkboxes the student's parent/guardian checks.
How very regressive. How classist. How racist. How dismissive of the range of disabilities.
How twisted.
What about elementary schools that do walk to math or reading in which students are ability grouped and the groups are flexible throughout the year?
pt
-EvaluationIsFun!
Molly
http://schoolfinance101.wordpress.com/2012/11/17/air-pollution-in-ny-state-comments-on-the-ny-state-teacherprincipal-rating-modelsreport/
Statistically speaking, this stuff is meaningless. It is the worst kind of anti-intellectual-posing-as-educated play acting to wave around these numbers and pretend like they have meaning.
It is the EMPEROR'S NEW CLOTHES.
Teachers should know this research and use it to refute the claims every time a puppet from Oz gets on stage to blow more smoke & thunder.
Which raises the question: if you know this is manure, and you don't say or do anything, who are you, exactly?
I did love this gem:
"If you have questions about your student growth rating you can email studentgrowth@seattleschools.org for more information. You can also use the same email and request a personal conversation with a district official who can explain how your particular growth rating was calculated. "
Why aren't the formulas public? Oh! Wait! Why isn't there a blog for SEA members to discuss issues? Why is everything from Jonathan as last-minute as possible? Why did 60% of SEA members not bother voting in the March SEA officer elections?
What ARE
WePayingDuesFor?
It's time to wake up teachers--you got the leadership you deserve, and that's bad enough for you all, but it has severe negative impacts on the rest of SPS. You have a responsibility not just to yourselves but to the whole community.
This is precisely what Sahlberg (the Finnish Education department guy) said at his talk.
However, I wouldn't say "teachers" in general are politically complacent -- Wisconsin and Chicago are just two examples that refute that idea.
However, teachers in Seattle often act like the politics of the job are beneath them. Then, when the chickens come home to roost, they act like victims.
What a pathetic shame they chose Jonathan Knapp over Eric Muh--that is, the ones who even bothered to vote.
--enough already
WV: edutosu - no kidding!
The MAP score fluctuations cannot be applied to rate teachers for precisely this reason. Most of the students in my student's school receive massive assistance and backup training from their parents. The district students are not a controlled population.
However, I do not support boycotting the exams because these data are useful to me as a parent to know when to take action.
I ran for president, after 3 years of activism. I have already taken one (many) for the team, and as far as I'm concerned, can sit on the sidelines for a while, until a few more teachers wake up.
3/4 of the SEA-represented staff couldn't be bothered even to vote. That's while working under a super-crappy, concession-filled current contract -- without even a penny's worth of cost-of-living allowance -- delivered by this SEA leadership.
All signs point to another concession-filled SEA collapse at the negotiating table this August.
I'm depressed, and done with SEA-directed activism, until I see something more than the same small handful of teachers step up and resist.
There is an alex someone on SEA's webpage in charge of member stuff.
IF your building had representatives, then those people would have received the last-minute agenda for the Monday meeting late Thursday or late Friday, so that no one had time to discuss anything and no one had time to mount counterproposals to whatever Knapp-WEA-Crap is getting pushed down our throats.
I'd recommend sending an allstaff asking if people are interested in running for the AR positions (once you find out how many are open), putting the election date about 3 weeks out, and giving people at least a week to nominate themselves / step up.
At my school, not enough step up for an election, so whoever is willing gets 1 of the 4 AR spots we have.
There are all kinds of "duties" of Knapp-WEA-Crap which come down the pike that I'm supposed to do -- I forward them to allstaff and forget about it. I tell ALL my staff that if they want things done differently, they can be the AR -- in 3 years NO ONE steps up, except for a few newbies & ... how long will they last at monthly meetings which are designed to ensure that whoever is around at the end of the rambling Arne-love-fest wins?
BestWishes
"What about elementary schools that do walk to math or reading in which students are ability grouped and the groups are flexible throughout the year?"
and "word" wrote, "My student's MAP scores did decline when she had an inexperienced teacher one year. The teacher was not bad - just inexperienced. I immediately hired a tutor to take up the slack and her MAP scores shot up again.
The MAP score fluctuations cannot be applied to rate teachers for precisely this reason"
Exactly. I have heard that someone asked the District this question:
If a student is in a remedial reading class and a Language Arts class, who gets the credit (or discredit) for student growth or decline?
Answer: The two teachers share it!
Now imagine: Student last year had two parents at home, one a reader with lots of books. Parents divorce. Reader leaves. THIS year, student isn't supported at home, but has an after-school tutor, a remedial reading class, and a regular LA class. On day of test, student has the flu.
What would you make of THOSE scores?
Lastly, the district is pushing for literacy across disciplines -- standards-based literacy tools used in many different subjects. So if History and Science are using tools to help students become more literate, generally, then how do they divide up THAT score? Why is the district asking teachers to all teach literacy, then attributing growth in reading (and writing, as a corollary) to just one teacher?
And what DOES a value-added score look like in an art class? What test will they design to rate the art teacher?
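A minimal sketch of the shared-credit problem, with invented numbers; the district's actual attribution formula has not been made public. Once more than one adult "shares" a student's growth, someone has to choose the weights, and the weights drive the ratings.

```python
# Hypothetical illustration of splitting one student's growth among teachers.
# The weights and the +6 "growth" figure are invented, not district data.
def split_growth(growth, teacher_weights):
    """Divide one student's growth score among teachers by arbitrary weights."""
    total = sum(teacher_weights.values())
    return {t: growth * w / total for t, w in teacher_weights.items()}

# Equal split vs. a split weighted by instructional minutes gives different
# answers for the same student -- and neither accounts for the tutor, the
# divorce, or the flu on test day.
print(split_growth(6, {"LA teacher": 1, "remedial reading": 1}))
print(split_growth(6, {"LA teacher": 250, "remedial reading": 100, "history": 50}))
```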
Teachers, you have made your bed; now you get to lie in it. Please don't bother to boo hoo to me (and I think that this evaluation system SUCKS!) if you haven't voted in your own damn union elections.
Moose
What we are dealing with now is what happens when you don't have a chance to read the contract before you sign it. Some folks figure that what they don't know won't hurt them!
Union representatives had a briefing on the tentative agreements two days before the general meeting. School had not started so even though there was a large group at the G.A., not all representatives were able to have meetings with building people to discuss it.
The members of the bargaining team will get up and start singing the blues about how hard they worked to get the tentative agreement that they did: "We got the best deal we could." The framing here is that voting against the proposed agreement means that we are being disrespectful of the bargaining team's hard work--so that if they are sent back to the drawing board this means that their time has been wasted.
Some of us tried to hold off the vote by a week so that there would be more time to go over the contract. Nothing doing.
At the looooooooong meeting, those who wanted to stop the examination of those troublesome contract areas tried to end debate. Those who were simply tired of being there voted to end debate.
Sometimes when certain members get up to talk, it annoys other members who will automatically vote against them.
This is all business as usual.
When asked about the VAM, Glenn Bafia's response was that principals can already look at that data.
Bilingual IAs along with Special Ed IAs were thrown under the bus.
Please note that the vote on this current contract was done by voice and not by secret ballot. Yes, it passed. Look at the context of its passage.
And yes, to our peril.
I usually go to RA unless I am sick. I participate and will continue to do so. I contribute to this blog using this name and one other. I voted for Eric.
CORE in Chicago has won two hard battles:
1) They won control over their union. This was a building-by-building approach. It took a lot of work to defeat the embedded crew in power. CORE did not even think it would win.
2) They went on strike with a 90% vote -- and won that strike working with parents and communities.
There is still work to do there. It doesn't stop.
The work that needs to be done in Seattle to build and strengthen us has to happen at the building level. Some buildings are completely apathetic. They want someone to do the work but they cannot be counted on to help with anything.
It is going to be slow. It cannot just be at the top. Democracy doesn't trickle down. We need it to trickle up.
--Modern Sound in Rio de Janeiro
Your kiddo's teacher will probably also have to find their own time to do that observation (during their plan or break time). I'm surprised he confided in you; it killed me to tell families that I was just moving on at the end of the year when they asked why I was leaving. I told two families the truth about my evaluation, as they stopped by when I was cleaning out my room. My administrator told me that my students were going to have wasted a whole year by having me as their teacher (among other things; funny enough, almost all of my kids made progress as judged by end-of-year tests and reading levels).
-EvaluationisFun!
Dan, what do you know about the claim in the linked CCSS video that geometry will be taught by an "experimental method?"
YUP-- here is the scoop on Geometry =>
Hung-Hsi Wu is a professor at Cal Berkeley and this Geometry approach is his baby. Completely unproven and untried as far as I know. Web page for Wu.
Looking at the OSPI links, I see that Common Core includes probability in the geometry standards (which obviously isn't included in the 2008 WA standards for geometry). I have no idea what is happening at the HS math level in CCSS. Given that Phil Daro of the Dana Center, which brought WA state such math crap during the Bergeson years, was on the original math design team (back when this was all secret), and Dr. Joe Wilhoft (of WASL assessment years) is now the head of the Smarter Balanced Assessment Consortium, my interest in following this is near zero.
As far as I know, nothing has actually been done yet at the high school level to assign standards to actual courses. The whole idea that 100% of the student population should be required to complete high levels of mathematics seems absurd to me. I see this as likely to lower the quality of both classes and instruction (as many disinterested students will be forced into classes in which they have little interest).
I think anyone who believes this will be a good use of dollars that will produce a reasonable academic bang for the buck ... needs to take a closer look at the assumption that CCSS is likely to drive improvement.
Mathematics Common Core State Standards WA
Okay, it's just confusing. The OSPI transition documents show probability being added to Geometry, though in the CCSS, probability is listed separately. Is WA adding a bit of probability into each year of high school math, whether it be Geometry, Algebra I, or Algebra II, etc.?
I have no idea.
While all change involves risk, I'm thinking the risk/benefit analysis comes out positive on this one.
Judge these PG&E videos for yourself:
PG&E Intro
Video 1
Video 2
Video 3
Video 4