Fake Numbers Update Tomorrow
There will be some statement about two of the misleading statistics on the school reports (students making gains and advanced learners) at the quarterly Strategic Plan Update tomorrow.
For some reason, my hopes are not high.
Let's review the language from the superintendent's [cough]apology[cough]. Do you see anything here that suggests that they will:
Ensure Definitions Are Clear – We need to be crystal clear around the definitions of the performance measures in our district scorecard and school reports and make adjustments in language as necessary.

But they are in no hurry to make those changes. They can wait until Wednesday.

Proactively Communicate with Stakeholders – It is our responsibility to ensure the community understands our district scorecard and school reports.

Again, this can wait. There's no hurry.
We are responding to community feedback and sharing what we learn together about how to improve the way we measure and report on our progress.

This is a bad decision. Instead of making it clear that they are measuring student progress relative to other students, they should measure student progress relative to the Standards.
Examples include:
Adjusting the language for the student gains measure to be clear that this is a measure of gains relative to students' academic peers
Preparing our community for changes in how we report on advanced learning, based in part on their feedback with suggestions for improvement.

I'm not sure what this means. It appears that they are going to announce that changes are coming in advance of making the changes. What we really want them to do is count APP- and Spectrum-eligible students in ALOs as advanced learners in their schools.
No. There is nothing in the public statements by the District that gives me cause to believe that they are going to follow an honorable path.
Comments
I'd like funding pulled from Research, Evaluation and Assessment.
Here is the honorable path that I shall follow.
I sent this letter, which clearly outlines the intentional failure of the school directors to fulfill their oath of office.
If the directors do not on Wednesday do something to correct their intentional failure to do the jobs they were elected to do, then comes Recall attempt #3.
Recall #1 failed because we did not show and describe specific actions of each director that were sufficient for recall. The state auditor's report was damning but not specific enough to meet the law's requirement.
Recall #2 listed numerous violations of policies and laws and attributed these to specific directors, but the judge stated that we failed to show intent by the directors in these failures. Note: the written law for recall has no mention of an intent requirement. We may appeal, but the judge has yet to write a decision. It is ridiculous to attempt to appeal an oral decision.
Recall #3 will be coming if Charlie's predictions about the Directors' actions are correct. We will show failure to fulfill the oath of office and intent to do so.
It is amazing that the majority of Directors absolutely refuse to direct the Superintendent.
-- Dan
Keep up good work and keep them busy...
"Fraudulently or deceptively imitative"
as opposed to bullshit
"excretus of the male bovine variety"
I really want to see if anyone on the Board takes any interest in these false numbers.
I really want to see if anyone on the Board says: "No one cares if students outperform the bottom third of their test score peers. We want to know if they are making at least a year of progress."
See what a dreamer and an idealist I really really am?
Of course it contains some numbers.
Let us see if they or the Superintendent can explain the numbers and can answer my six questions.
They know where to write me.
Spare your thumbs and keyboard and go straight to slide #42.
They need five slides to essentially say:
Yes, it would be best if we could report students' progress against the absolute scale of grade level expectations instead of their gains against the relative scale of their peer group, but... [mumble mumble whine whine]
So, instead of doing what would be best, we're going to continue to do the half-assed job we've been doing, only we're going to be slightly less opaque about it.
Bottom line: they are going to keep the statistic exactly how it is, but they will change the label on it to read: "Students making at least typical growth relative to academic peers within SPS."
It will continue to be a meaningless statistic that doesn't tell anybody anything.
Two Board members specifically asked if they could get tutoring on the statistic, how it is derived, and what it means.
HINT TO SEATTLE PUBLIC SCHOOLS: If it takes five slides to explain the statistic and two Board members want private instruction on it, it is inappropriate to include in a handout to the general public.
On the other hand, the District will do a better job of correctly reporting the number of advanced learners in schools. They will include all APP- and Spectrum-eligible students participating in ALOs, the same group that they have been reporting on the MSP pass rates for those schools.
I don't know if they will be counting APP- and Spectrum-eligible students who are NOT participating in ALOs, or what they will count in middle schools and high schools, but this is a positive first step with regard to elementary schools.
They will also write more and better FAQs to go with the school reports.
That's it.
On the good side, it has taken them only a few weeks to respond to getting caught on these statistics - instead of the two and a half years they took to come clean about the 17%. Then again, they haven't actually responded yet. They haven't done the corrections that they said they would do.
1) We need to measure student progress.
2) Student progress is an important measure of a school.
3) People want to know that students working at or beyond Standards are advancing at least one grade in knowledge and skills each year and that students working below Standards are advancing more than one grade each year.
4) The goal is for every student to be at least proficient with the grade level Standards and then work towards mastery. Consequently, the only meaningful measure of student progress is relative to the Standards.
5) The students are not in a race with each other. A measure of student progress relative to other students is simply not a meaningful statistic.
6) The District freely acknowledges that it would - without a doubt - be best to measure student progress relative to an absolute measure.
It comes down to this: they say that measuring student progress against the absolute scale of state test scores would not be good because the math test is harder than the reading test and some years the test is harder than other years.
No kidding. They swear that they would totally use it - that it is the very best way to measure progress - if it were not for this.
Of course, they have a chart. The chart purports to show that the reading test varies in difficulty rating from a low of 23 on the 4th grade test to a high of 49 on the 7th grade test. They also claim that the math test varies in difficulty from a low of 36 in the third grade to a high of 59 in the 7th grade.
I'm not even going to question their method of measuring test difficulty. Mostly because it just doesn't matter.
For them to accept this as a valid reason to reject measuring student progress against an absolute scale is essentially to invalidate the MSP. They are saying that the failure rate on the 7th grade MSP isn't as bad as you think because the 7th grade tests are, like, totally hard.
But you know me. I don't complain unless I can offer a solution.
Here's my solution: They should provide stacked bar graphs that show the number of students who got Level 1, 2, 3, and 4 scores plus the number of students who did not take the test. There should be a bar for each of these groups. Each of the bars should be made of five different colors showing the distribution of the students' scores on the previous test.
My profile picture provides an example.
Then we could see, at a glance, the score distribution and the change in student scores: are 1's becoming 2's, or are 2's becoming 1's? Are 4's becoming 3's, or are 3's becoming 4's?
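The cross-tabulation behind those stacked bars is simple to build. The sketch below uses invented student records (these are not actual SPS data): each bar is a current-year group, and each color segment within a bar is the count of students who came from a given previous-year level.

```python
from collections import Counter

# Hypothetical (previous_level, current_level) pairs for one school;
# None means the student did not take that year's test.
records = [
    (1, 2), (1, 1), (2, 3), (2, 2), (3, 3),
    (3, 4), (4, 4), (4, 3), (None, 2), (2, None),
]

# One bar per current-year group; within each bar, counts broken out
# by previous-year level -- exactly the color segments described above.
bars = {}
for prev, curr in records:
    bars.setdefault(curr, Counter())[prev] += 1

for curr in sorted(bars, key=lambda k: (k is None, k)):
    print(curr, dict(bars[curr]))
```

From `bars` you can read off at a glance whether 2's are becoming 3's or 3's are becoming 2's, which is the whole point of the chart.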
This would be meaningful.
This would answer how well the school serves students working below, at, and beyond grade level.
Concerned about sending your high-performing student to the school? Just take a look and see how many 4's are there and whether they are becoming 3's or whether the company of 4's is growing.
One point for:
* students moving from Level 1 to Level 2
* students at Level 3 staying at Level 3
* any student who did not test
* any student who did not test in the previous year
Two points for:
* students moving from Level 2 to Level 3
* students moving from Level 3 to Level 4
* students at Level 4 staying at Level 4
Three points for:
* students moving from Level 1 to Level 3 or 4
* students moving from Level 2 to Level 4
No points for:
* students at Level 2 staying at Level 2
Minus one point for:
* students moving from Level 4 to Level 3
* students at Level 1 staying at Level 1
Minus two points for:
* students moving from any level to Level 1
* students moving from Level 3 or 4 to Level 2
The points are totaled and averaged for all students. The schools will come out with growth ratings like 1.8 or 1.75 depending on how many significant digits are appropriate.
How would that be?
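The point scheme above reduces to a lookup table plus an average. This is only a sketch of the proposal as written: the `POINTS` table encodes the list of transitions exactly, the sample data is invented, and `None` marks a student who did not test in that year.

```python
# Points for each (previous, current) level pair, per the scheme above.
POINTS = {
    (1, 1): -1, (1, 2):  1, (1, 3):  3, (1, 4): 3,
    (2, 1): -2, (2, 2):  0, (2, 3):  2, (2, 4): 3,
    (3, 1): -2, (3, 2): -2, (3, 3):  1, (3, 4): 2,
    (4, 1): -2, (4, 2): -2, (4, 3): -1, (4, 4): 2,
}

def growth_points(prev, curr):
    """Score one student's year-over-year level transition."""
    if prev is None or curr is None:  # no test in either year: one point
        return 1
    return POINTS[(prev, curr)]

def school_rating(transitions):
    """Total the points and average over all students."""
    return round(sum(growth_points(p, c) for p, c in transitions)
                 / len(transitions), 2)

# Made-up example: a school where most students hold or gain a level.
sample = [(1, 2), (2, 3), (3, 3), (3, 4), (4, 4), (2, 2), (4, 3), (None, 3)]
print(school_rating(sample))
```

Running this on the sample gives a school growth rating of 1.0, on the same scale as the 1.8 or 1.75 figures mentioned above.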
A student moving from a Level 3 score to a Level 2 score should only be minus one point instead of minus two.
Might this actually happen? Which board members are interested? Will the public be able to attend the potential tutorial? I think I'm one of several number nerds who would want to be there.
I actually think I *get* what they are trying to do with the Colorado Growth Model, but I have a bazillion questions about the underlying assumptions and appropriate interpretation of the results that I'd love someone from the district to explain.
I would also like to be at the tutorial, primarily to point out all of the different ways that this measure is meaningless.
It's funny, but you take something that's completely ridiculous and stupid that no one would accept. "Hey, let's count the kids who outscore the bottom third of their test score peers and call that growth." Then give it a cool name, like "Colorado Growth Model". The name suddenly gives the stupid idea all kinds of instant credibility.
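To illustrate that caricature with invented numbers (this is the post's simplified framing, not the actual Colorado Growth Model computation): if "making gains" means outscoring the bottom third of your peers, then roughly two-thirds of students "make gains" by construction, even in a year when every single score drops.

```python
# Toy illustration with made-up score changes: every student's score
# actually falls, yet two-thirds still beat the bottom third of peers
# and so count as "making gains" under a peer-relative cutoff.
growths = [-30, -25, -20, -15, -10, -8, -5, -3, -1]  # all declines

bottom_third = sorted(growths)[:len(growths) // 3]
making_gains = [g for g in growths if g > max(bottom_third)]

print(f"{len(making_gains)} of {len(growths)} students 'made gains'")
```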
In addition to the problems that the system causes for good employees in high performing units (who get culled anyway), it may eliminate too FEW bad employees in bad units, and it is bad in a school setting because it rampantly encourages divisiveness and competition between, rather than cooperation among, employees (or schools, as the case may be). Really, who wants the complicated special ed child when a system like this puts your job on the line for taking them on? Who wants to share lesson plans, techniques, outlines, and strategies if it only means that the teacher down the hall keeps his job at the direct expense of yours? What games does your school have to play so as not to get "stuck" with the homeless kids, or those returning from re-entry schools, or -- whatever.
The other problem is -- the only way there is data to "feed" this system is the aggressive, high-stakes, incessant testing of our kids, to their detriment in love of learning, in available class time, and -- in cases where good instruction stops so kids can "cram" for tests like the MAP -- in quality of instruction.
They call it "Growth" -- though it is disconnected from any real growth measures -- because that is what we want for our kids -- growth. If they called it the "Washington Teacher Assessment for Termination or Promotion Model," we would realize what a waste of time and money it is for our kids, and we might riot in the streets. (I know, I know -- Sahila reasonably wonders why we are all not there already!)
WV says -- there are "manyli" in Chicago Growth Model numbers.