Dr. Enfield has told us two things:
1) Mr. Floe was fired from his job as principal of Ingraham High School because student test scores had stagnated. This was offered as the primary rationale for his dismissal.
2) Students at Ingraham have been making gains on state tests at a higher rate than the district average. Ingraham's School Report shows 70% of Ingraham students making gains on the state reading test and 63% of them making gains on the state math test compared to 65% and 66% on the tests for the district as a whole.
These statements cannot both be true.
Further statements from Dr. Enfield pointed to low HSPE pass rates by specific student groups. While the pass rates for these student groups were significantly lower than the pass rates for the district as a whole, they were not significantly lower than the pass rates for those groups district-wide. Moreover, any reasonable person would consider that the nine years of schooling those students had before they arrived at high school would have a greater influence over those pass rates than the two years of high school the students had before taking the tests.
Trends in pass rates, which were also mentioned, reflect differences between cohorts rather than gains made by individual students. It is a stupid way to measure change because it measures shifts in school demographics more than anything else.
Give this situation even a moment's thought and you can see that the statements from Dr. Enfield are irreconcilable. The only possible conclusion is that she is a liar. Either she is lying about the rationale for Mr. Floe's dismissal or she is lying about the statistic in the School Report. Of course, there is a third possibility... that both statements are lies.
Let's start with the statistic.
The statistic used on the school reports is the notorious "Colorado Growth Model". To derive this number, the District creates hundreds of peer groups of students with similar test histories. A student who got a score of 304 on the 7th grade reading WASL in 2007 and a score of 297 on the 8th grade reading WASL in 2008 is put into a pool with all of the other students who got similar scores on those tests in those years. For simplicity's sake, let's say there were 99 of them. When these students take the 10th grade reading HSPE in 2010, their scores are ranked. Those with the lowest 33 scores are marked as not making gains and those with the top 66 scores are marked as making gains. Do this for every student in the district - put them in a peer group based on test history and rank their scores on the next test. The bottom third are marked as not making gains (whether their score actually rose or not) and the top two-thirds are marked as making gains (whether their score actually rose or not). Then reckon the percentage of students at each school who are marked as making gains, and that is the figure reported on the School Report.
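The ranking-and-labeling procedure described above can be sketched in a few lines. The function name and the student data below are hypothetical; this is only an illustration of the mechanic, not the District's actual calculation.

```python
# Illustrative sketch of the growth-model labeling described above.
# All names and scores here are made up for demonstration.
def label_peer_group(scores):
    """Rank a peer group's scores on the next test; mark the bottom
    third as 'not making gains' (False) and the top two-thirds as
    'making gains' (True) - regardless of whether any individual
    score actually rose or fell."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1])
    cutoff = len(ranked) // 3  # size of the bottom third
    return {sid: (rank >= cutoff) for rank, (sid, _) in enumerate(ranked)}

# A hypothetical peer group of 99 students with similar test histories.
peer_group = {f"student{i}": 200 + i for i in range(99)}
labels = label_peer_group(peer_group)

# By construction, two-thirds are "making gains" no matter what the
# scores were: 66 marked as making gains, 33 as not.
print(sum(labels.values()))
```

Note that the outcome is fixed before a single test is scored: feed in any 99 scores at all and exactly 66 students will come out labeled as "making gains."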
This number, of course, not only doesn't mean what the District claims it means, it doesn't mean anything at all.
This method of identifying students making gains guarantees that two-thirds of the students will be counted as making gains on the state tests - whether they are or not - and that one-third of students will be counted as not making gains - whether they are or not. The District will always be able to claim that they are doing a good job, since two-thirds of students are making progress on state tests. But they will also always be able to demand more funding, because a third of students are not making gains on state tests. In the original versions of the explanation of this statistic, the District wrote that they want all students to make gains on state tests, even though their method of counting predetermined that 33% would not. How honest is that?
The statistic is flawed in a number of other ways that I've already written about (relative vs. absolute, doesn't distinguish between high performers and low performers, etc.). The District was confronted with these flaws. Did they admit the idiocy of the statistic and back away from it? Heck, no! They doubled down on it, declaring it their measure of choice and the best one available.
I would point out that in December the District promised to promptly update the FAQ on School Reports and make some other changes to the way they reported and explained this statistic to make its meaning more clear (but not change the statistic). The District has not kept that commitment. The Board, other than Director Smith-Blum, has shown no interest in following up on the commitment.
The statistic is clearly false and misleading. Given that it is a lie, could it be that test scores are at the root of Mr. Floe's dismissal?
How bad are the student academic outcomes at Ingraham? Are they really that much worse than other schools? Take a look at the District Summary and skip ahead to page 8.
The HSPE pass rates for each high school and for the average of all the high schools show Ingraham to be number 8 of 12 schools for Math HSPE pass rate, higher than Rainier Beach, Cleveland, West Seattle, and Franklin. Ingraham is 9 of 12 for Reading HSPE pass rate, higher than Rainier Beach, Cleveland, and Franklin. Ingraham is near the bottom for the Writing HSPE, out-scoring only Sealth. Ingraham is 7th in Verbal and Writing SATs and 5th in the Math SAT. It is a middle-of-the-pack school.
Given that Ingraham has more FRE students than the District average, more ELL students than the District average, more single-parent students than the District average, more students with IEPs than the District average, and fewer advanced learning students than the District average, these outcomes are not extraordinary.
A review of the school's performance relative to the District and the state can be seen at the OSPI's web site. Ingraham has slightly underperformed the district on the HSPE on just about every demographic slice: all students, African-American students, Hispanic students, Limited English, and Low Income.
No one is suggesting that Ingraham is a top-performing school by any measure (other than US News and World Report). But Ingraham isn't an embarrassment either. It is not as far behind as some other schools. Let's remember that not only are Rainier Beach and Cleveland further behind Ingraham in a lot of these measures, but they are further behind AFTER getting all of that District-level support that Ingraham didn't get.
I think it bears stating that high school HSPE pass rates are determined more by the nine years of schooling that came before high school than the two years of high school before the test. Holding the high school responsible for student outcomes is like holding the shipping clerk responsible for the product design.
Let's also remember that these test scores have been out for almost a year. If they were the reason for Mr. Floe's dismissal, why wait until now to act on them?
Ingraham's scores aren't good, but they aren't dismal either. There are schools with much worse scores (Rainier Beach, Aki Kurose, McClure, Hamilton, Dunlap, Highland Park, Muir, Northgate, etc.) but I don't hear that any other principals have been fired, have you? In light of the uneven treatment, we can only conclude that the test scores are not really the reason for Mr. Floe's dismissal. So that is also a lie.
Telling lies is no way to build trust.