Dr. Enfield is a Liar

Dr. Enfield has told us two things:

1) Mr. Floe was fired from his job as principal of Ingraham High School because student test scores had stagnated. This was offered as the primary rationale for his dismissal.

and

2) Students at Ingraham have been making gains on state tests at a higher rate than the district average. Ingraham's School Report shows 70% of Ingraham students making gains on the state reading test and 63% making gains on the state math test, compared to 65% and 66%, respectively, for the district as a whole.

These statements cannot both be true.

Further statements from Dr. Enfield pointed to low HSPE pass rates by specific student groups. While the pass rates for these student groups were significantly lower than the pass rates for the district as a whole, they were not significantly lower than the pass rates for those groups district-wide. Moreover, any reasonable person would consider that the nine years of schooling those students had before they arrived at high school would have a greater influence over those pass rates than the two years of high school the students had before taking the tests.

Trends in pass rates, which were also mentioned, reflect differences between cohorts rather than gains made by the students themselves. It is a stupid way to measure change because it measures the change in school demographics more than anything else.

Give this situation even a moment's thought and you can see that the statements from Dr. Enfield are irreconcilable. The only possible conclusion is that she is a liar. Either she is lying about the rationale for Mr. Floe's dismissal or she is lying about the statistic in the School Report. Of course, there is a third possibility... that both statements are lies.

Let's start with the statistic.

The statistic used on the school reports is the notorious "Colorado Growth Model". To derive this number, the District creates hundreds of peer groups of students with similar test histories. A student who got a score of 304 on the 7th grade reading WASL in 2007 and a score of 297 on the 8th grade reading WASL in 2008 is put into a pool with all of the other students who got similar scores on those tests in those years. For simplicity's sake let's say that there were 99 of them. When these students take the 10th grade reading HSPE in 2010, their scores are ranked. Those with the lowest 33 scores are marked as not making gains and those with the top 66 scores are marked as making gains. Do this for every student in the district - put them in a peer group by test history and rank their scores on the next test. The bottom third are marked as not making gains (whether their score actually rose or not) and the top two-thirds are marked as making gains (whether their score actually rose or not). Then reckon the percentage of students at each school who are marked as making gains and that's the figure reported on the School Report.
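Here is a minimal sketch of that procedure as I understand it from the District's description - the student data, the field names, and the exact bottom-third cutoff are my assumptions, not the District's actual code:

```python
from collections import defaultdict
import random

def growth_flags(students):
    """Flag each student as 'making gains' (True) or not (False), following
    the peer-group ranking described above: group by prior test history,
    rank on the next test, and mark the bottom third as not making gains."""
    peers = defaultdict(list)
    for s in students:
        peers[s["history"]].append(s)      # peer group = same prior score history

    flags = {}
    for group in peers.values():
        ranked = sorted(group, key=lambda s: s["new_score"])
        cutoff = len(ranked) // 3          # bottom third of the peer group
        for i, s in enumerate(ranked):
            flags[s["id"]] = i >= cutoff   # top two-thirds are "making gains"
    return flags

# Hypothetical peer group: 99 students with the same 7th/8th grade history
# (304 then 297), now ranked on their 10th grade HSPE reading score.
random.seed(1)
students = [{"id": i, "history": (304, 297),
             "new_score": random.randint(250, 400)} for i in range(99)]

flags = growth_flags(students)
print(sum(flags.values()), "of", len(flags), "flagged as making gains")  # 66 of 99
```

By construction, each 99-student peer group ends up with exactly 66 students flagged as making gains, no matter what happened to their actual scores.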

This number, of course, not only doesn't mean what the District claims it means, it doesn't mean anything at all.

This method of identifying students making gains is guaranteed to conclude that two-thirds of the students are making gains on the state tests - whether they are or not - and that one-third of students are not making gains on state tests - whether they are or not. The District will always be able to claim that they are doing a good job, since two-thirds of students are making progress on state tests. But they will also be able to demand more funding, because a third of students are not making gains on state tests. In the original versions of the explanation of this statistic, the District wrote that they want all students to make gains on state tests even though their method of counting pre-determined that 33% would not. How honest is that?
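To make the built-in split concrete, here is a tiny sketch - invented scores, same bottom-third rule as above - in which every single student scores lower than before, yet two-thirds are still reported as making gains:

```python
# Every student in a 99-member peer group scores LOWER than on the prior test,
# yet the bottom-third/top-two-thirds rule still reports 66 as "making gains".
prior_score = 300
new_scores = [prior_score - drop for drop in range(1, 100)]   # all 99 scores fell

ranked = sorted(new_scores)
cutoff = len(ranked) // 3              # bottom third marked "not making gains"
making_gains = ranked[cutoff:]         # the remaining two-thirds

print(len(making_gains), "of", len(ranked),
      "'made gains' even though every score dropped")   # 66 of 99
```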

The statistic is flawed in a number of other ways that I've already written about (relative vs. absolute, doesn't distinguish between high performers and low performers, etc.). The District was confronted with these flaws. Did they admit the idiocy of the statistic and back away from it? Heck, no! They doubled down on it, saying that it was their measure of choice and the best one available, and re-doubled their commitment to it.

I would point out that in December the District promised to promptly update the FAQ on School Reports and make some other changes to the way they reported and explained this statistic to make its meaning more clear (but not change the statistic). The District has not kept that commitment. The Board, other than Director Smith-Blum, has shown no interest in following up on the commitment.

The statistic is clearly false and misleading. If that statement is the lie, could the other one be true? Could test scores really be at the root of Mr. Floe's dismissal?

How bad are the student academic outcomes at Ingraham? Are they really that much worse than other schools? Take a look at the District Summary and skip ahead to page 8.

The HSPE pass rates for each high school and for the average of all the high schools show Ingraham to be number 8 of 12 schools for Math HSPE pass rate, higher than Rainier Beach, Cleveland, West Seattle, and Franklin. Ingraham is 9 of 12 for Reading HSPE pass rate, higher than Rainier Beach, Cleveland, and Franklin. Ingraham is near the bottom for the writing HSPE, only out-scoring Sealth. Ingraham is 7th in Verbal and Writing SATs, and 5th in the Math SAT. It is a middle-of-the-pack school.

Given that Ingraham has more FRE students than the District average, more ELL students than the District average, more single-parent students than the District average, more students with IEPs than the District average, and fewer advanced learning students than the District average, these outcomes are not extraordinary.

A review of the school's performance relative to the District and the state can be seen at the OSPI's web site. Ingraham has slightly underperformed the district on the HSPE on just about every demographic slice: all students, African-American students, Hispanic students, Limited English, and Low Income.

No one is suggesting that Ingraham is a top-performing school by any measure (other than US News and World Report). But Ingraham isn't an embarrassment either. It is not as far behind as some other schools. Let's remember that not only are Rainier Beach and Cleveland further behind Ingraham in a lot of these measures, but they are further behind AFTER getting all of that District-level support that Ingraham didn't get.

I think it bears stating that high school HSPE pass rates are determined more by the nine years of schooling that came before high school than the two years of high school before the test. Holding the high school responsible for student outcomes is like holding the shipping clerk responsible for the product design.

Let's also remember that these test scores have been out for almost a year. Why defer taking action on them until now? That's unclear.

Ingraham's scores aren't good, but they aren't dismal either. There are schools with much worse scores (Rainier Beach, Aki Kurose, McClure, Hamilton, Dunlap, Highland Park, Muir, Northgate, etc.) but I don't hear that any other principals have been fired, have you? In light of the uneven treatment, we can only conclude that the test scores are not really the reason for Mr. Floe's dismissal. So that is also a lie.

Telling lies is no way to build trust.

Comments

Anonymous said…
Charlie,

The district doesn't really care about growth of students. They may use growth statistics to justify their pet programs from time to time, though. However, what they hold schools accountable to is the absolute pass rate. Schools get no credit for growth when ranked. If you don't make the expected pass rates - not growth rates - you get the hammer. It is unreasonable to assume that schools with high levels of underprepared students can magically move a student from 5th grade skills to high school grade level skills in one year. Yet that is the ultimate mandate. Ms. Miracle of the Immaculate Schools of New Orleans might have gotten a bee in her bonnet about this lack of absolute rather than relative success. She is essentially a rookie teacher who, IMHO, doesn't know education; since the only tool she has in her toolbox is a hammer, all the problems start to look like nails.

-Curious
Anonymous said…
Is there any reporting of MAP scores?

reporting, such as:

how many 9th graders are at which grade level of skill in each area tested?

if they have 200 or 400 9th graders, and 80 or 120 are at 6th grade math at the beginning of the year, how far along are they at the end of the year?

compared to the other high schools - how many 9th graders were at 9th grade level, at 8th grade level, at 7th grade level, at 6th grade level?

what was the MEDIAN improvement for each group of kids per grade level?

what was the distribution of improvement for each group of kids per grade level?

for the district, and by each school?
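A rough sketch of the kind of per-grade-level report being asked for - the records and school names below are made up for illustration; the real numbers would have to come from the district's MAP files:

```python
import statistics
from collections import defaultdict

# Made-up records: (school, grade level of skill at fall testing, fall RIT, spring RIT)
records = [
    ("School A", 6, 215, 224), ("School A", 6, 212, 219), ("School A", 9, 240, 247),
    ("School B", 6, 214, 226), ("School B", 9, 241, 252), ("School B", 9, 238, 244),
]

gains = defaultdict(list)
for school, skill_level, fall, spring in records:
    gains[(school, skill_level)].append(spring - fall)

for (school, skill_level), g in sorted(gains.items()):
    print(f"{school}, entered at grade-{skill_level} skill: "
          f"n={len(g)}, median improvement={statistics.median(g)} RIT points")
```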

Useful Analysis Is Confusing!
Anonymous said…
I think 'inconsistent' may have been a less incendiary term for this headline and might have encouraged more people to participate in the thread.

"Thinking About the Issue"
dan dempsey said…
I put a couple of comments around #33 under the section "PR Fiasco" about test scores and demographics.

The classic example is known as "Simpson's Paradox".

Here is an example. Let us suppose that the scores for each subgroup of students go up.... guess what? It is still possible for the scores of "ALL Students" to go down.

In Seattle Grade 10 math, the White students' pass rate was above 65% and the Black students' pass rate was below 15%. Consider a school - School A - whose students are all classified by OSPI as either Black or White, and suppose that in 2010 both groups were at the district mean: Whites passing at 68% and Blacks passing at 13%, with 70% of the students being White. For the Spring 2011 testing, suppose the White students' pass rate rose to 75% and the Black students' pass rate rose to 25% - a gain of 7 points for White students and 12 points for Black students. What happens at School A to the pass rate for all students?

It all depends on the change in population demographics. Suppose a new student assignment plan changed School A's demographics to 50% White and 50% Black. Here is what happens to the rate for all students:

.70 x 68 + .30 x 13 = 47.6 + 3.9 = 51.5 (2010)

.50 x 75 + .50 x 25 = 37.5 + 12.5 = 50.0 (2011)

Thus we see the pass rate for "All Students" drop from 51.5% to 50.0%, even though the White pass rate went up by 7 points and the Black pass rate nearly doubled, rising 12 points to 25%.
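For anyone who wants to check the arithmetic, here is a small sketch using the hypothetical School A numbers above (it is just a plain weighted average, nothing more):

```python
def overall_pass_rate(groups):
    """Weighted 'All Students' pass rate from (population share, pass rate) pairs."""
    return sum(share * rate for share, rate in groups)

# Hypothetical School A figures from the example above.
year_2010 = [(0.70, 68), (0.30, 13)]   # 70% White passing at 68%, 30% Black at 13%
year_2011 = [(0.50, 75), (0.50, 25)]   # both subgroup rates rose; demographics shifted

print(round(overall_pass_rate(year_2010), 1))   # 51.5
print(round(overall_pass_rate(year_2011), 1))   # 50.0 - lower, despite subgroup gains
```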

This is the type of distortion we find at Ingraham, because the Free and Reduced Lunch population - a low-scoring group - increased by 13%+.

The White population declined and the Black population increased.

Perhaps Ms. Sara Morris needs to send out an email clarifying that the data she sent reveals a lot about the change in population demographics at Ingraham and little if anything about the Principal.

Also, the HSPE math was, like the WASL, not much of a math test, because it did not test the 2008 Math Standards ... we will get a lot better information this year with the End of Course Algebra Test.

-----------

What a continuing leadership fiasco.... Amazing!! Dr. Enfield is proving to be Dr. Goodloe-Johnson's equal.
dan dempsey said…
Dear Useful Analysis Is Confusing!,

Here is what I know from an analysis, done in 2010, of Spring NWEA/MAP test results (RIT scores) from kindergarten through grade 9.

The Non-Free & Reduced Meals group (NP)
The Free & Reduced Meals group (FRM)

The mean RIT score in math for NP students at grade 1 is not reached by FRM students until one school year later. This gap continues to widen until the mean score for NP students at grade 5 is not reached by FRM students until four years later, in grade 9.

That four-year gap is incredible ... and it is a telling reflection of the District's complete failure to provide effective interventions for students, so it certainly may explain a lot about HS graduation rates below 75%.

In 2010, 41% of SPS students were FRM. So in the Spring of 2010 a group making up about 40% of the 9th grade students had, on average, the math skills of 5th grade non-poverty kids..... and yet the SPS wanted no math classes at the high school level below "Discovering Algebra".
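Here is a minimal sketch of how that grade-equivalent lag can be computed from mean RIT scores by grade. The score table is invented just to illustrate the pattern described above (a one-year lag at grade 1 widening to a four-year lag at grade 5); these are not the District's actual 2010 numbers:

```python
# Invented mean math RIT scores by grade, shaped to match the pattern described
# above (these are NOT the District's 2010 figures).
np_means  = {1: 180, 2: 190, 3: 199, 4: 206, 5: 212, 6: 217, 7: 221, 8: 224, 9: 227}
frm_means = {1: 171, 2: 181, 3: 189, 4: 196, 5: 202, 6: 206, 7: 209, 8: 211, 9: 213}

def lag_in_years(np_grade):
    """Extra years before the FRM mean first reaches the NP mean at np_grade."""
    target = np_means[np_grade]
    for grade in sorted(frm_means):
        if grade > np_grade and frm_means[grade] >= target:
            return grade - np_grade
    return None   # not reached within the grades we have data for

for grade in (1, 3, 5):
    print(f"NP grade-{grade} mean is reached by FRM students "
          f"{lag_in_years(grade)} year(s) later")
```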

The scores on OSPI's End of Course assessment in Algebra 1 should be interesting this year, Spring 2011.

Again, the proposed D43 Promotion/Non-promotion policy does not even mention interventions.
dan dempsey said…
"Dr. Enfield is a Liar" ....

There are numerous examples of District deception that occurred under CAO Enfield. The School Report Cards were but one example. Either Ingraham is near the bottom academically or it is slightly above the middle.... it cannot be in both places. Yet it has been reported in both.

Thinking about the New Tech Network $800,000 contract ... would you prefer "participant in felony forgery"?
sharpeas said…
There's now a Petition to reinstate Martin Floe. Just follow the link and add your signature. Feel free to forward this to anyone else who may wish to show their support.

http://www.petitionbuzz.com/petitions/rehiremartinfloe

Sharon Dickinson
Charlie Mas said…
Dr. Enfield now says that the test scores are NOT the reason that Mr. Floe was fired, so the "inconsistency" has been removed.
Charlie Mas said…
A little side piece about name-calling.

What, you might ask, is the difference between the statement "Dr. Enfield told a lie." and the statement "Dr. Enfield is a liar."?

The difference is stark. One impugns her actions, the other impugns her character. The first says that Dr. Enfield did a bad thing; the second says that Dr. Enfield is a bad person.

We are all sinners, but we retain the possibility of redemption on our crawl out of the muck and towards salvation. All of us, that is, except bad people. They don't want to be redeemed. They wallow in their filth. Some of them even writhe sensuously in it. They revel in it and wish to drag us down into it with them.

To call someone a liar puts them on the other side of that line. It puts them among the damned.

Dr. Enfield tells lies. She has been caught in a number of them. Lies about the annual approval of schools, about the math curriculum, about earned autonomy, about waivers, about fidelity of implementation, about high school credit for classes taken in middle school, and more. But does that make her a liar?

It only does when she makes a deliberate and conscious choice to deceive and misinform. I believe she qualifies.
Mrmr said…
Workers in the district office refer to Enfield as the "Sarah Palin" of the district. What this means, I really don't know... But it is telling.
Jan said…
Mrmr: Brr. Ominous, that.
