Yet Another Deceptive Statistic

Let me just tell you upfront: this deception is MUCH WORSE than the falsehood about "only 17% of graduates meet college entrance requirements". Much worse.

There are two numbers on the school reports that do not mean what you think they mean. They do not mean what the District says they mean. They are COMPLETELY misleading. The more I think about them, the more convinced I become that they don't mean anything at all. Yet they are two of the most important numbers on the report, and they determine half of the school's overall grade.

On the school report, the numbers are labelled "Students making gains on the state reading test" and "Students making gains on the state math test".

Look at the report for Beacon Hill Elementary. You will see that it says that 69% of students made gains on the state reading test and 67% of students made gains on the state math test. That sounds pretty good, doesn't it?

What do you think it means?

I'll tell you what I thought it meant. I thought it meant that 69% of the students at Beacon Hill got a higher raw score on the MSP this year than they got last year. If a student scored 410 last year and 420 this year, then that student counted as one who made a gain on the state test. Is that what you thought the number meant? What else could it mean? Isn't that the percentage of students making gains on the state test? Well, that is NOT what the number means.

In fact, if you look further down the page, you will see that students in every category passed the state reading test at lower rates than last year.

How can it be that the pass rates dropped when 69% of students made a gain?

And here's something strange: the district averages are 66% every year, in every category, at every grade. That's kinda weird, isn't it?

The answer to both mysteries lies in the derivation of the number labelled "Students making gains on the state reading test" and what - if anything - it really means.

WARNING - HEAVY MATH TALK AHEAD.
Honestly, it's probably enough for you to know that the schools' numbers just don't mean what they appear to mean and that the District number is rigged to always come out to 66%. If you need to know more, try this sentence for coherence:

The student gain number represents the percentage of students who out-performed the 33rd percentile of other students in the district who scored the same as they did in the previous year.

Here's how it works.
If a student scored 410 last year, then their score this year is ranked against all of the other scores from students who scored 410 last year. If that ranking is better than the bottom third, then that student counts as having made a gain. The student might actually have scored worse than they did last year, but so long as they finish ahead of the bottom third of the students who got the same score last year, they count as having made a gain.
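To make this concrete, here is a little sketch of the calculation in Python. It's my own reconstruction of the rule as described, not the District's actual code - the real Colorado Growth Model builds its peer groups from a fuller test history - but the mechanics are the same:

```python
# A sketch of the "making gains" rule as described above -- my own
# reconstruction, not the District's code. Peer groups here are keyed
# on a single prior-year score.
from bisect import bisect_left
from collections import defaultdict

def gain_flags(students):
    """students: list of (last_year_score, this_year_score) pairs.
    Returns a parallel list of booleans: True = counted as a 'gain'."""
    # Group this year's scores by last year's score, then sort each group.
    peers = defaultdict(list)
    for last, this in students:
        peers[last].append(this)
    for group in peers.values():
        group.sort()

    flags = []
    for last, this in students:
        group = peers[last]
        # Percentile rank within the peer group: the fraction of peers
        # this student outscored this year.
        rank = bisect_left(group, this) / len(group)
        # Counted as a "gain" if above the 33rd percentile of peers --
        # even if this year's score is LOWER than last year's.
        flags.append(rank > 0.33)
    return flags

# Every one of these students scored lower than their 410 of last year,
# yet the top two-thirds of the peer group still count as "making gains":
students = [(410, 400), (410, 395), (410, 390), (410, 385),
            (410, 380), (410, 375)]
print(gain_flags(students))   # [True, True, True, True, False, False]
```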

That's incredibly deceptive.

Here are some interesting consequences of using this as the measure for growth:

1) Since 66% of the students will ALWAYS finish above the bottom third in any ranking, the District average will ALWAYS work out to 66%. That means the superintendent will always get to make it appear as if, year after year, 66% of the students in the district made gains on the state tests. Even in years such as this one, when pass rates were down, it will appear that 66% of the students across all grades and subjects made gains on the state tests.

2) The choice of the 33rd percentile as the bar is completely arbitrary. The District could have chosen the 50th percentile, in which case it would appear that only 50% of district students made gains on the state tests each year. Of course they could have chosen the 17th percentile, in which case it would always appear as if 83% of the students made gains on the state tests.

3) This is a zero-sum game. If one school has 70% of its students beat the 33rd percentile, then some other school will have only 62% of its students beat it. Across the district, 33% of the students will always fail to beat the 33rd percentile. A gain for one school comes at the expense of others. They cannot all do well.

4) In a year like this, when test scores are down across the District, the score at the 33rd percentile will itself be lower than last year's. That means that some portion of the students who are counted as having made a gain on the state test will actually have gotten a lower score than in the previous year. The simulation below shows consequences 1 and 4 in action.
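Here's that simulation, reusing the gain_flags sketch from above; the score model is invented purely for illustration. Every group's scores fall on average, yet about two-thirds of students are still counted as making gains, and in this setup more than half of those "gainers" actually scored lower than the year before.

```python
# A quick simulation of consequences 1 and 4 (reuses gain_flags from
# the sketch above). The score model is invented for illustration.
import random
random.seed(1)

students = []
for _ in range(5_000):
    last = random.choice([380, 390, 400, 410, 420])
    this = last + random.gauss(-8, 15)   # a bad year: scores fall on average
    students.append((last, this))

flags = gain_flags(students)
gainers = [pair for pair, flag in zip(students, flags) if flag]
worse = sum(this < last for last, this in gainers)

print(f"counted as making gains: {sum(flags) / len(flags):.0%}")   # ~67%
print(f"'gainers' who scored LOWER than last year: {worse / len(gainers):.0%}")
# In this setup, over half of the "gainers" actually went backwards.
```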

Here's what's driving me crazy: why couldn't the District have simply reported the percentage of students who got a higher score than in the previous year?

To their credit, the District does disclose the meaning of this statistic. Here's how they explain it:
% of students at or above the 33rd percentile of growth from year to year on the state test using the Colorado Growth Model. The Colorado growth model estimates a "growth percentile" for each student with at least two years of test data by creating a peer group of all students in the same grade who had a similar test history, and then rank this group of students by their test scores in the current year.
The District says that this number is important because "All students should be making progress every year and be on their way to meeting or exceeding standards." All students SHOULD be making progress, but this calculation guarantees that 33% of them will not be counted as making progress - whether they do or not - and that 66% will be counted as making progress - whether they do or not.

Is anyone really okay with this?

Let's remember that school segmentation - whether a school is Level 1, Level 2, or Level 3 - is based, in part, on this "growth" measure. The students at Beacon Hill Elementary don't know it, but they are in a race with every other student who got the same score as them on last year's test. And if they don't finish ahead of the bottom third, their school will drop down in the segmentation and be subject to more District-level control and interventions.

Comments

beansa said…
OMG, somebody hand me a shovel!
Jan said…
What this suggests to me is the same conclusion that some of the District's other actions suggest -- actual student achievement, or progress, or learning -- however you want to title it, is becoming irrelevant. ALL of the testing, and all of the overemphasis on data, is just a method of generating numbers that will allow them to "reform" schools, move "nonperforming" teachers out in favor of TfA, charters, etc., and enter into one high-priced contract after another with private entities who will "fix" the fake problems that the fake data suggest exist.
This is totally "grading on a curve," which is useful for ranking schools/teachers/etc. under the new plan -- but is useless in terms of providing objective measures of whether students are learning. Kids could all be scoring in the 90th percentile -- and yet, 33 percent of them would be "not making progress," which would subject their teachers to heavy handed district management and possible dismissal, and their schools to closure.

We are no longer running schools; we are running teacher/school evaluation systems -- with learning as a second tier byproduct.

We need to ask the Board to stop this! And parents need to boycott the MAP/MSP, and starve this beast of the data it needs to implement this system. They can only do this if our children generate the data they need to make this happen.
dan dempsey said…
Wowzers!! Just as I finished my comment on DeBell and wandered off into trying to figure out how the District spends $1122 on Central Admin and does such a pitiful job of producing much in the way of positive results, Charlie comes to the rescue.

Now can someone explain how, at middle schools, the percentage of 8th graders going into high school "prepared for High School Math" can be so large that many of the students who are unable to score above "far below basic" on the MSP math are considered ready for high school math?

Looks like more creative accounting. It certainly takes time to generate data.... and if the data is useless to teachers how is that going to improve the instruction or the situation?

Just when I think it cannot possibly get any more strange, the SPS continues to surprise me.

This is the crew that is going to use data to rate teachers ... what a first class nightmare.

Let us take another no confidence vote ... seems like 100% NO CONFIDENCE is now in reach.
Anonymous said…
Well, so much for those shiny new scorecards.

You have done Meg Diaz proud.

-skeptical-
dan dempsey said…
Jan you hit it.

The kids are required in order to get the funding that runs the machine. The running of the machine has little to do with providing students and families with a positive service.

I contend the district could be run like Auburn and $30 million moved from Central Administration to the schools. Of course a lot of the complete nonsense coming from SPS Central could no longer be afforded.

Upper administration would see RIFs galore.

==============
Remember DeBell agonizing over extending Goodloe-Johnson's contract for an additional third year? Fired with cause is looking mighty fine at this point.

The 17% and the NTN 3-12-2010 Action Report forgery situation should soon have MGJ on the road if anyone cared to act.
dan dempsey said…
Crappy Chart Thursday ... oh yeah.
dan dempsey said…
Full disclosure should have included:

"All students should be making progress every year and be on their way to meeting or exceeding standards."

But we have rigged this shell game so that it does not matter whether students in the district are learning anything or not. 66% will be making growth and 33% will not and 1% will be lost in the shuffling of paper.
dan dempsey said…
Hey Charlie,

Do not stop at reading and math for all ...

Try this as well:

English Language Learners making significant gains on the state English proficiency test
% of English Language Learner students with two consecutive years of state test results making gains on the state WASL/MSP assessments. Gains are calculated by the % of students at or above the 33rd percentile of growth from year to year on the state test using the Colorado Growth Model. The Colorado growth model estimates a "growth percentile" for each student with at least two years of test data by creating a peer group of all students in the same grade who had a similar test history, and then rank this group of students by their test scores in the current year.

English language proficiency is an important skill that our schools should be supporting in all students.
=================

So important to the district that we have already decided to rig the game so that 66% of these students will appear to be making gains.

=================
Seattle Change minus State change
Reading grade 10
Limited English = -9.30%

Reading grade 8
Limited English = -10.90%

Reading grade 7
Limited English = -2.10%

Reading grade 6
Limited English = -7.50%

Reading grade 5
Limited English = +0.50%

Reading grade 4
Limited English = -2.40%

Reading grade 3
Limited English = -4.60%


===============

On to Limited English and Math


Seattle Change minus State change
Math grade 10
Limited English = -5.40%

Math grade 8
Limited English = 0.00%

Math grade 7
Limited English = +3.90%

Math grade 6
Limited English = -1.70%

Math grade 5
Limited English = +3.80%

Math grade 4
Limited English = -3.20%

Math grade 3
Limited English = -5.30%

===============


Writing grade 10
Limited English = -4.10%


Writing grade 7
Limited English = -14.00%

Writing grade 4
Limited English = -7.50%

=================

The problem is that the goals MGJ wrote into the original strategic plan were never realistically achievable. As a result, we see this completely new bogus metric for measuring.

When Limited English speaking students were supposed to be making gains, but the groups instead went backwards, a new metric for measurement was introduced. Students making gains.... how brilliant is this? The answer is always around 66%, no matter what really happened.

=========
And Harium will say progress is being made and this needs to run the full 5 years before an accurate appraisal can be made. We only have the first two years of Strategic plan results.
dan dempsey said…
Consider this:

2010 Seattle Public Schools District Scorecard:
What the Data Shows and How the District is Responding

Background

Seattle Public Schools’ annual District Scorecard provides a snapshot of district-wide performance, showing both where academic growth has been made as well as where that rate of growth puts the district in terms of meeting its five-year goals set forth in the strategic plan, Excellence for All.

Seattle Public Schools is committed to raising achievement for all students, and blah, blah, blah.

The District Scorecard includes district-wide academic data, from test scores to graduation rates, and key operational data showing performance and efficiency for services such as transportation and maintenance that directly support schools. The district uses data from the District Scorecard to guide decisions about how to increase academic achievement and close achievement gaps.

{[Oh really like say for the Limited English speaking students??? How is that working out?]}

Changing a large urban school system takes time. During the first two years of Excellence for All, the district has focused on improving the systems it uses to support and prepare students for success.

{[Is there "Any evidence" that this focus on improving systems to prepare students is working or just horribly expensive???]}

WOW deceptive stats are going to do a lot aren't they?

So who is going to fire the superintendent with cause? or must we endure three more years of total nonsense?
dan dempsey said…
Oh wow I had no idea but....

The executive management team reviews strategic plan project issues twice a month, actively working with project managers to form corrective action plans for projects that are behind schedule.

----
Areas for Improvement

High School Performance in Reading & Math Went Down

* 10th grade state reading and math performance were down despite flat trends at the state level.
o This indicates the need to offer rigorous courses to all students in core subjects.

----
Holy Hanna, I thought it meant the SPS blew $1.2 million on crappy math materials and how to use them for high school.

The predictions made from observing Bethel's lack of success with Discovering are right on track.

So let me get this straight: the District bought $800,000 worth of math books and spent $400,000 on math professional development .... and yet "the need to offer rigorous courses" was not met. What happened that the $1.2 million missed the mark?

-------
Is it possible Judge Julie Spector was correct? That the Board failed to use all the evidence and that no reasonable person upon evaluation of all the evidence could have reached the approval decision.

Maybe we need judges to run for school board as we seem to be short on Directors that use evidence.

Note: Spector asked the Board to reconsider using all the evidence. Clearly MGJ, Carr, Sundquist, Maier, and Martin-Morris see little need to use all the evidence even when a Superior Court Judge rules that they should use all the evidence.
Anonymous said…
dan- ready for h.s. math is passing 8th grade math w/ a 70% or better. (on district page where the report cards are linked)

so, as long as my students do better than the bottom 33% of kids who are like them, with similar test history... crazy.
why can't they celebrate ALL gainers, not just the top 66%? i can see how the method gives a chance for recognition of 'non gainers,' but when (if) our kids are all rockin', 33% would still be considered failures. that seems like a serious flaw that will hurt kids.
ttln
grousefinder said…
Charlie: This is..."driving [you] crazy: why couldn't the District have simply reported the percentage of students who got a higher score than in the previous year?"

Disclaimer: I am just explaining this phenomenon, not supporting the spurious data claims created by Brad Bernatek.

Here is the answer. It has to do with the idiosyncrasies of test writing from year to year. The answer is two-fold. I will use science as an example (though it really does not apply to the 3-5 data they are using to demonstrate "growth").

The last science 5th grade MSP showed an 11% drop statewide in raw scores. Recall that all tests were truncated per the wishes of Randy Dorn. The test was, therefore, higher stakes on a question-by-question basis. Missing 1 point on an extended answer had a greater consequence to the overall score than in previous years. So even a strong science student might see a drop in score should they be tested two years in a row. This rationale applies to reading, writing and math. The shorter the test, the higher the stakes per question. However, this was a one-time scenario (maybe).

Second, each year MSP test writers get together in various places to come up with test questions that best align with the standards. So the 4th grade team in year X may consist of a bunch of hard-driving old-school educators that like tough questions. Meanwhile in year X the 5th grade test writing team is a bunch of constructivist educators that wish to make all test takers feel good. This group will write an easy test. In year Y the roles reverse. The 4th grade team are softies and the 5th grade team are toughies. What we have then is a test that varies in its rigor from year to year. Remember, the rigor (or difficulty) of a question is subjective based on the whims of the test writers.

So you can't ensure that test writers will all have similar goals from year to year, and thus you can't demonstrate growth in reading and math based on the change in pass rates within a cohort from year X to year Y.

The only real measure that is a valid indicator of student success is the number of students making Level 3 or 4 on the MSP. On the MAP it's the number of students at or above a RIT score that is correlated to the State Standards. This latter correlation does not exist. In fact hundreds of teachers have requested it from the District, but have been refused. Instead, students are evaluated based on what is called a "growth index." This little gem has nothing to do with standards alignment and is useless to teachers for whom standards are the standard by which students are evaluated.

Just a side bar note here: On the MAP, schools with high performing students will eventually begin to demonstrate a low growth index because teachers generally teach to a fixed standard. So, if you are doing your job, students will grow as they are supposed to and their MAP scores will show small gains. To demonstrate remarkable growth a teacher must teach roughly two grades worth of curriculum per year. I have talked to the MAP wonks about this oddity. The response is always the same, "We'll get back to you on that."
ParentofThree said…
"Is that what you thought the number meant?
Why, yes I did think that is what the number meant.

"Is anyone really okay with this?"
Nope, not OK, but honestly I am an astute parent who knows that these reports (and the hype around them) are BS.

And honestly I am amazed that so many people out there still buy into all the falsehoods.

For more BS, read the "2009 Strategic Plan, Report to the Community." There MGJ cites a $50 million savings over five years for closing schools and a $2 million savings in transportation (yet that report has yet to see daylight!)


Lessons learned?
Charlie Mas said…
I agree with grousefinder.

The statistic I would like to see on this line is the number of students who went from a 1 to a 2, from a 2 to a 3, and from a 3 to a 4. I would also like to see the number of students who went from a 4 to a 3, from a 3 to a 2, and from a 2 to a 1.
Charlie Mas said…
By the way, take a look at this school report for Martin Luther King Jr Elementary School (formerly Brighton).

You will see that Students making gains on the state math test was 30%.

That means that 70% of the students at MLK were in the bottom 33% for test score growth among their test score peers.

Do you have any idea how horrible that is? Do you have any idea how improbable it is for a school to have over two-thirds of the students performing in the bottom one-third?
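To put a rough number on it: if each student independently had the district-wide two-in-three chance of clearing the bar, the odds of a whole school landing at 30% are essentially zero. A back-of-the-envelope check (the enrollment figure is a made-up example):

```python
# Back-of-the-envelope: the probability that 30% or fewer of a school's
# students clear a bar that 67% of students clear district-wide, if
# students were independent draws. School size is a made-up example.
from math import comb

n = 200                 # hypothetical number of tested students
p = 0.67                # district-wide chance of clearing the bar
k_max = int(0.30 * n)   # 30% of the school or fewer

prob = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k_max + 1))
print(f"P(school lands at 30% or below by chance) = {prob:.1e}")
# Astronomically small -- which is the point: a result like this is a
# signal about the school, not random noise.
```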

This is not the teachers or the students. This is the principal.

Take a look at the changes in the math test pass rates at this school. They are all down double digits. The pass rate for Black students was cut in half, falling from 39% to 20%. The pass rate in math for ELL students fell by two-thirds from 47% passing to 16% passing.

Something is going horribly wrong at this school.

When you read the school's plan for improvement on page two, they don't have any. They are going to keep doing what they've been doing. Here's what it says:
"We use the 'Everyday Math' program, which provides students with instruction that emphasizes real‐world situations. Lessons include time for whole group instruction, small group, partner and individual activities. In partnership with the Explorations in Math program, we are creating a culture where teachers discuss math with other teachers and plan lessons together."

Chilling.
seattle said…
Charlie, where would the public find factual data on the student test scores district wide, and their drop this year?

And here is some interesting info on tests being tied to graduation, from Randy Dorn (column in the Times, 11/29/10):

"My math-science proposal will require students through the class of 2014 to pass only one end-of-course exam in math, and it will delay the science requirement until 2017"

The current requirement is that all students in the classes of 2013 and beyond have to pass EOC exams in both Algebra and Geometry to graduate, as well as an EOC in science.

Are we finally admitting that our kids are still grossly unprepared to pass end of course exams in math and science? And that many students will not graduate due to failing these exams?

Where is all this data?

And, I have a question. What about the HSPE? Do students have to pass the HSPE to graduate? Or are the EOCs replacing the HSPE requirement? I have a HS student and have no idea what is or is not required to graduate anymore. My head is spinning.
ParentofThree said…
Wait, wasn't Brighton/MLK the school where Beverly Raines was principal for 30 years and then moved against her will? Do you think that maybe lack of leadership has something to do with how this school looks on paper for this reporting cycle?
hschinske said…
The statistic I would like to see on this line is the number of students who went from a 1 to a 2, from a 2 to a 3, and from a 3 to a 4. I would also like to see the number of students who went from a 4 to a 3, from a 3 to a 2, and from a 2 to a 1.

I would add "or higher" and "or lower" as appropriate (kids must occasionally go from a 2 to a 4 or possibly even a 3 to a 1), but basically, what Charlie said. I don't think fluctuations in the three-digit scores (which incidentally aren't raw scores, but that's a nitpick) are meaningful.

I would really mostly be interested in changes in the pass rate, as I consider the state tests (apart from MAP) to be essentially pass/fail.

Helen Schinske
dan dempsey said…
Holy Bizarre SPS....

grousefinder:

This latter correlation does not exist. In fact hundreds of teachers have requested it from the District, but have been refused. Instead, students are evaluated based on what is called a "growth index." This little gem has nothing to do with standards alignment and is useless to teachers for whom standards are the standard by which students are evaluated.

Mas on Brighton (MLK Jr):

In partnership with the Explorations in Math program, we are creating a culture where teachers discuss math with other teachers and plan lessons together.
-----
Is there any evidence this culture is having a positive effect on students?

The culture has all the right words to make UW Math Education Project devotees really happy but more baloney is no substitute for results. UW CoE is likely "Overdosed on Bliss" while reading these cultural fairy-tales.

----
Want better results?

Yah sure. Well then:
Hire someone in a leadership capacity who has a clue. There is scant evidence that the current Central Administrative direction is based on evidence of having a clue about much of anything. Worse, they are spending big money to create data reports that are essentially meaningless, so they will never get a clue.

Saved $50 million closing schools and $2 million on transportation. Well that is just peachy ... Please show me the data on which these claims are based that the SAO would find valid.

We can save $30 million annually by eliminating those that build excessively high houses made of cards.

And the Board kept on Rubber-Stamping... ad infinitum, while listening to a tune heard only by them, as teachers and parents scratched their heads wondering how lifelong learning had become such an insurmountable challenge for at least four school directors.
Anonymous said…
Here's what's happening in my child's classroom with MAP and math:

In what appears to be an attempt to show growth for already high scoring kids, the teacher is "cramming" for the MAP test. The required curriculum is taking a back seat while random math topics are covered superficially.

It's absolutely maddening. And wrong.
hschinske said…
Just a side bar note here: On the MAP, schools with high performing students will eventually begin to demonstrate a low growth index because teachers generally teach to a fixed standard. So, if you are doing your job, students will grow as they are supposed to and their MAP scores will show small gains. To demonstrate remarkable growth a teacher must teach roughly two grades worth of curriculum per year.

You say that as though it were a bad thing. What is wrong with expecting a student to learn a year's worth of new material in a year, rather than coasting? I thought that was supposed to be the kind of thing that MAP was meant to bring to light.

Now, the lack of support for teaching anything but the grade-level standard -- THAT is a problem. The lack of support in using MAP data to show whether students met specific standards -- THAT is a problem. The apparent inaccuracy of the MAP in measuring achievement -- THAT is a problem. Expecting all students, including those working ahead of grade level, to show that they've learned something this year -- not a problem.

Helen Schinske
ParentofThree said…
MAP testing: Can we start a thread on opting out? I plan to opt all my children out and would like to know how others are handling it.
dan dempsey said…
seattle said...
Charlie, where would the public find factual data on the student test scores district wide, and their drop this year?

Right Here.

Use the "Select a category of students" menu to change lots of stuff.

I have a spreadsheet of change from 2009 to 2010 that factors in the State change with the Seattle Change to get a more realistic view.
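The arithmetic is nothing fancy - Seattle's year-over-year change minus the State's - so a statewide dip (say, from a harder test) doesn't get read as a Seattle-only failure. A sketch with placeholder numbers, not actual pass rates:

```python
# The arithmetic behind a "Seattle Change minus State change" figure:
# the change in Seattle's pass rate minus the change in the state's.
# The numbers below are placeholders, not actual 2009/2010 pass rates.
def relative_change(seattle_2009, seattle_2010, state_2009, state_2010):
    return (seattle_2010 - seattle_2009) - (state_2010 - state_2009)

print(f"{relative_change(55.0, 48.0, 60.0, 57.7):+.2f}%")   # -4.70%
```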

If you would like it write:
dempsey_dan@yahoo.com
dan dempsey said…
Check this out in the Tacoma News Tribune.

Tacoma teachers say performance evaluations must be fair

Tacoma teachers want their superintendent and union president to know that if they're going to be judged on how well students perform on state and national tests, they want to ensure that the evaluation process is fair.

"Test scores are a start," said Billy Harris, who teaches math at Giaudrone Middle School.

But he also urged the school district to consider using a panel of evaluators that might include students, other teachers and principals. He said classroom observation by administrators should count too.

Harris' remarks reflected the opinions of other teachers who spoke Tuesday at a forum held at Foss High School. The forum, which drew more than 30 educators who addressed comments to Superintendent Art Jarvis and Tacoma Education Association President Andy Coons, was the first of a dozen designed to gather ideas on how to improve teacher quality.

How about Central Administration quality...
How about instructional materials quality ...
How about the quality of education research used by District decision makers...
How about the need for practice to Master content...
How about the lack of content in many programs...
How about the teacher preparation programs at UW etc.
How about Teach for America....

How about someone spending $37.40 + tax on Hattie's "Visible Learning"? Read all six reviews at Amazon.

AND THEN doing something that will positively impact student learning rather than having
11 more meetings designed to gather ideas on how to improve teacher quality.

Hey, after a dozen meetings gathering ideas... maybe they could read some peer-reviewed research and gather some data as part of a thoughtfully designed program to improve student learning outcomes. Then perhaps it could be seen whether the Central Administration is even headed in an appropriate direction.
SP said…
This comment has been removed by the author.
Maureen said…
This comment has been removed by the author.
dan dempsey said…
Nicely done .....

So are things any better at WSHS since Ramona P. left her desk at the JSCEE?
dan dempsey said…
It appears I just responded to a comment deleted by the Author .... punked by the author.
SP said…
For historical perspective here's an interesting look at how Ramona Pierson, head of the REA before Bernatek, was using data to support district positions:

A regression controlling for the demographic differences allows us to look at the program of West Seattle specifically and compare the pure program without the noise of "difference caused by cohort differences" from being added in.

For instance, the impact of poverty for a school with a 30% poverty rate may actually be higher than a school with a higher poverty rate of 42% because poverty impacts people differently. Therefore we controlled for the differences statistically by controlling the differences. For example, if a school of 30% poverty rate has kids who are living in homeless shelters and has refugee kids...the impact of poverty may be felt more for these kids as it effects academic achievement than it might for schools with 42% poverty with students who are living in a home with a family that live below the poverty line....therefore, we controlled for these differences in demographics by pulling them out from all schools and doing a pure comparison of academic achievement.

Therefore, we used a regression...controlled for the differences in cohort and evaluated the program against other programs...thus, the statement, when all things are equal, the WSHS program is in the top three....

That means by equalizing the demographic differences in students and the impact of the demographic differences, so we can look at the Program in WSHS...they are doing well.

One more try...I was a foster kid growing up...no mom and no dad and lived in lots of homes and I had food only once a day so I was usually 20% underweight for a girl my age all through High school...would it be fair to compare my score on the SAT against a kid who lives in a home with a mom and dad which is stable, and provides good food for all three meals? No...you would want to compare me against my equal and the kid with the normal family against their equal...in order to have a real look at how well my high school prepared me for taking the SAT. Therefore a regression controlling for demographic differences does its best to compare apples to apples and oranges to oranges... So the comparison is honest and fair.
Ramona Pierson
-----------------------

(note: This "top three" rating from the REA completely contradicts an independent study released at the same time stating that, when adjusted for the standard poverty measure (F&RL), WSHS had the lowest WASL pass rates, both for all three subjects and for math alone, compared to all 10 comprehensive Seattle high schools.)
TechyMom said…
Are those percentiles for Seattle only? Or, are they state-wide (MSP) and national (MAP) percentiles? If they're city-only percentiles, then, yes, it's a zero-sum game and not a good way to manage schools and teachers.

If it's state or nationally normed, I don't think it's unreasonable to expect Seattle students to have growth that is not in the bottom third of the state or country. While "is the score higher this year than last" is a compellingly simple measurement, the percentiles might help answer the question "how much higher is enough?" and to deal with situations where all scores were down, for example if there was a problem with the test.

You might still expect to see around 66% meeting the goal and 33% not. That would make Seattle typical of the state or nationally-normed group. However, given the resources we have here, and some of the problems we have less of than other districts (particularly nationally), it doesn't seem unreasonable to expect Seattle to be better than the state or national average. If we're not, it might mean there's a problem. My bet would be on curriculum rather than "teacher quality," but some serious analysis would need to be done before we'd know.
Maureen said…
Sorry for pulling my post, Dan! I just read deeper into the Colorado Growth Model FAQs and I want to do a rewrite! Colorado seems to use a low/medium/high growth system that may have determined SPS's use of 33%, as if it were something meaningful (which it may be, to some extent, if you are comparing to state-level results, which SPS is not).
Lori said…
Hmmm, I posted a comment at 937AM, which showed up in my RSS feed but is not here. Weird. I may try to repost it, so my apologies if something goes crazy and it shows up multiple times.
Dorothy Neville said…
Lori, it was in my RSS feed as well. I am here actually to thank you for it.

Seattle Parent, the quote from Ramona P is startling. How the heck would the district accurately assess whether students are in shelters or simply living a hair under the poverty line?

The state DOES do a regression analysis of schools that are doing better or worse than expected based on demographics, but we ignore that.

Remember, Brad Bernatek has a European History undergraduate degree and a couple of MBAs. He has never shown himself to be all that adroit with statistics and analysis.
Lori said…
alright, trying again, divided into two posts:

In trying to understand this better, I read through information on the Colorado DOE website, specifically this FAQ page about their growth modeling: http://www.schoolview.org/GMFAQ.asp

While I agree that the way SPS has presented the numbers is highly misleading, I actually do sort of understand the theoretical framework behind the growth model. Yes, SPS should have explained it more clearly, but are we surprised? They can't even make effective Powerpoint slides with their data. I seriously wonder what the qualifications are of the SPS employees who do data analysis and number crunching.

Anyway... if I'm understanding this correctly, the way these data would be used in practice is exactly as Charlie describes about MLK. Yes, every year, there will be one-third of children who are considered not to have made *growth* relative to their peers. In an ideal world where every child had the same home environment and access to a great school, these children would be evenly distributed throughout the district, and each school would have exactly 66-67% of children above that threshold and 33-34% below (plus or minus based on standard deviation/SEM/CI).

cont'd on next post
Lori said…
con't 2nd part

So, what the district is really looking at is those schools that have a much higher percent or a much lower percent of growth, because this suggests that the schools are not *equal* in some important way that is not yet known. MLK has a disproportionate amount of children in the critical area of the ranking system. But what does that mean and what do you do about it? Is this the fault of the teachers, the principal, the parents, the community? In order to develop an effective intervention, you need to know where the problem lies. But SPS will use this information to replace the principal (apparently already happened) and the teaching staff because they assume that is what caused this discrepancy.

I can see some value in this metric (although I don't necessarily agree it should be used to evalute teachers), but it shouldn't be viewed in isolation. It is indeed possible for all children to actually get a lower score in the 2nd year, but the growth model doesn't account for whether the score change is positive or negative. This, and other analyses, are critical to consider as well.
hschinske said…
The district does track homeless students, but otherwise the only broad categories they keep track of are free lunch, reduced lunch, and non-FRL. I'm not sure they have the capacity to analyze any further, and I don't really see why they should do so, at a district level.

Of course at a building level someone should figure out what's going on with a kid experiencing anything like Ramona Pierson apparently did. But I would think mighty poorly of any teacher or administrator who was satisfied with just knowing HOW MANY such students there were and how to compare their educational data -- surely the point would be to FEED them, or better yet, get them into non-abusive foster care? (Failing to feed foster children is surely both abusive and fraudulent, considering that foster parents are paid to do so.) Saying "We'll spot you X number of points on the SATs" is hardly the same thing.

Helen Schinske
hschinske said…
In what appears to be an attempt to show growth for already high scoring kids, the teacher is "cramming" for the MAP test. The required curriculum is taking a back seat while random math topics are covered superficially.

How would you distinguish between the situation you describe, and effective, appropriate enrichment and differentiation for students who have already mastered some or all of the standards for the year? What would the teacher need to do differently to accomplish the latter rather than the former?

Helen Schinske
Anonymous said…
Charlie, the stats you request are the ones I keep for myself, personally, and my department, when I have time to do all of the data entry, since I don't have access to databases with the records. I am given paper copies, sometimes, when I request them. I then look for causal correlations like "intervention class" or "high absenteeism" or "parent refused services." As is the case when dealing with human beings, what may cause one student to make gains/lose ground does not always have the same effect on others with the same or similar experiences. There are always outliers. However, trends are noticeable, and analysis of the MSP data in this way helps to examine the systems in place at the school for their effectiveness and/or impact on student learning. From there, data-based decision making can occur - expand reading intervention, change the math intervention, etc.

What bothers me about the 33% low growth indicator is that the measure is forever changing. If this is how I am to be evaluated, I feel that it is only fair for me to know what the target is before I am judged against it. To do otherwise seems like a set up. I, for one, don't appreciate being blind-sided by negative feedback.

On the other hand, it might be interesting for teachers with high achievers to see if they are on the "low growth" end of the spectrum with their students vs. the "high growth" end. Will that even be discernible with the AHG demographic? How much do they move on the MSP (because they are tested on the standards for their grade level, I assume their scores are close to the ceiling)? Perhaps the model will provide interesting information on high achievers and their teachers.

For the most part, I vote "hate it."
-ttln
dan dempsey said…
I just find it amazing that this much effort goes into number crunching and yet the math instructional materials and practices get a complete pass. So how is readers and writers workshop coming along?

Really, is there a reason not to change the way of doing business in a general way that would improve the situation substantially for most teachers and students? .... Why all the expenditure of time and resources to figure out how to meet ridiculously optimistic goals written in the Strategic Plan through some bizarre calculation that has zero to do with reality?
dan dempsey said…
How about this for a statistic....

When entering a National Park, I often see Smokey the Bear poised beside the Fire Danger Board and an arrow pointing to low, medium, high, extremely high.

Perhaps we could get a similar sign for the JSCEE.

Confidence Level in the Superintendent:

high, medium, low, exceptionally low, catastrophically low

But apparently MGJ is performing at the median level for all current Superintendents employed in the Seattle Schools. If we only compare growth among cohorts of similar Seattle Students, it seems appropriate that the Superintendent be evaluated with her cohort of 1.
Chris S. said…
Wait, we have standardized tests but they are not standard enough from year to year to use the data without engineering it into some completely predictable number?

When you change the test, you have to accept you can't compare apples to oranges. If you don't change the test, if you can't assume some consistency, why are we bothering anyway?
Anonymous said…
Helen - With the district curriculum as a base, enrichment or differentiation would probably look more like an extension of what is being taught. More depth vs acceleration.

"Cramming" would look more random, be done by the whole class, and be totally out of place with the prescribed sequence of topics.
JvA said…
Maureen -- Thanks for the link to the FAQ. The 33rd percentile really is arbitrary, isn't it?

From what I interpret the FAQ to say, it's not that 33 or higher is GOOD, but that anything under 35 is BAD. It then seems to say that if you want to pick a number to investigate growth patterns for benchmarking purposes, that number should be 50 (not 33).

What is considered low growth?

As defined by Colorado State Board of Education rule, a student growth percentile for a single child that falls below the 35th percentile reflects low growth. ...

What is considered typical growth?

... As defined by Colorado State Board of Education rule, a student growth percentile for a single child that falls within the 35th-65th percentile range reflects Typical Growth. When referring to median Growth Percentiles, such as for a school or demographic group, the Colorado Department of Education (CDE) considers a median of 50 to be typical growth for school or group. The statewide median growth percentile in each subject and grade is the 50th percentile. When examining medians for schools, grades, subjects or groups, it is useful to look for differences from 50 when investigating growth. These data are particularly useful for benchmarking purposes and to understand how other schools or grades are doing in addressing problems in the educational system, such as the frequently observed achievement gap between poor and non-poor students. ...
Anonymous said…
I just added this conversation as a link to my post The Battle for Seattle
Charlie Mas said…
I wrote to the Board about this bogus statistic. I haven't heard back from any of them, but I did get a VERY weird email.

It was my message sent back to me - as if a reply - from Ron English with this addendum from him at the top:
"Legally we can and should proceed. However politically we should do this in an orderly manner without the appearance of rushing."

That's not only weird; it's creepy. What is it that they want to move forward with, that might not seem entirely legal to do, and would require political finesse? Clearly it is to cause harm to someone - and I'm the only someone around. I'm not getting a good feeling about this.

I'll acknowledge that I have been a bigger pain in the ass than usual of late.

I successfully bullied them into granting my daughter high school credit for her middle school French.

I stirred up a hornet's nest with the annual approval of schools thing.

I filed ethics complaints against the superintendent.

And now I called bullshit on the "Students making gains on state tests" statistic.

It's been a busy month for me.

I'm not quite done. I'm going to apply for a job working as support staff for the Board. Hee hee.

I'm SUCH a prick!

So if they are planning some sort of retribution for me, I guess I've earned it.
Josh Hayes said…
@Charlie: if the doorbell rings, have someone else answer it. Just in case.

@Helen and anonymous: At AS1 this year we've taken about ten of our 8th graders who've already mastered at least some of the 8th grade curriculum yet to come, and moved them to a volunteer-taught classroom that's moving at about 2x to 3x the speed in the pacing guide. I hope to -- err, I mean, the volunteer hopes to get through all of the 8th grade curriculum and most if not all of 9th grade as well, this year. Is that cramming? Or enrichment? Or what?

And finally, Ramona's "explanation" of the data analysis serves only to prove to me she hasn't the faintest damn idea what "doing a regression" means. I'm a statistician, and this kind of thing just drives me bonkers.
dan dempsey said…
Hey Charlie,

You have not filed any recalls

You have not filed any legal appeals

Yes you did say that some directors need to be removed ASAP.

But is not this how Republics operate? Time for the powers that be to just "Chill Out".

There is no mandate for an Oligarchy. Thus they need to back off.

MAS for MANAGER !!! GO MAS !!

Ron English ... hummm
Charlie Mas said…
A guy I know recently told me that Don Nielsen is still controlling the board and the District and that Ron English is his representative on the inside.

He made Ron English out to be a pretty sinister character.
Charlie, I think you are the spoon that is stirring a pot that is ready to boil over.

When I write my thread on the Executive committee meeting, you'll know why. Today I promise.
Chris S. said…
Charlie, who is Don Nielsen?
Chris S. said…
Oh, Google says he stepped down from school board in 2001. Just in time for me not to remember him.

It's not clear he has the typical reform agenda this board seems to support, though.
Dorothy Neville said…
Chris, Nielsen is business oriented though. He writes regularly for Crosscut or somewhere about school issues, so you can see he is still interested and at least peripherally involved. I think Charlie's rumor is fascinating, in a very rumorish sort of way.
Charlie Mas said…
I'm going to copy an anonymous comment (and then delete it since anonymous comments are not allowed):

"Anonymous said...
i highly suspect the response was a misplaced response and not meant for Charlie at all-which is creepier. but, it could hint at a Christmas gift... one we all have asked Santa for...
-please let it be termination w/ cause.
"

I suppose that this is a possible interpretation.

If the Board wanted to fire the superintendent for cause they certainly could.

Actually, any one of these acts would suffice:

* The policy violations with the credit card

* The retirement party

* The conflict of interest with the NWEA

* The failure to disclose the potential conflict with the Alliance

* The failure to recuse herself from dealings with the Alliance

* Her failure to take a specific action that the Board voted to direct her to take: to review and recommend revisions for Policy D12.00. The Board voted unanimously on January 29, 2009 directing her to do this and she never did it. Don McAdams told me that this alone is a firing offense.

* Her failure to provide the annual reports required by Policy.

* Her numerous policy violations.

* Her inability/unwillingness to enforce policy.

* Her multiple efforts to mislead the Board (17%, students making gains on state tests, Policy C54.00)

* Her multiple failures to keep commitments to students and families: the SBOC building, the APP split, the NOVA building, Special Education.

* Her general incompetence with implementations - botched capacity management project with multiple openings and closings, botched NSAP with Garfield and West Seattle overcrowded, botched curriculum alignment, botched special education model, actually every single strategic plan initiative botched in the implementation and now even admitting that a third of them are behind schedule.

* Her miserable demeanor
Dorothy Neville said…
Do not let this be lost from a previous thread. According to the list of contracts for last year, the district gave the Alliance $130K for community engagement work. So it's not just a potential conflict of interest, it is a REAL conflict of interest.

And do not miss Mirmac's smoking gun, an email that shows MGJ asking her SPS assistant to print some documents regarding her NWEA work. Using district personnel and resources to perform her NWEA board work smells pretty bad.
Charlie Mas said…
I want to say that I think that ttln expresses a very healthy perspective on using data.

This sort of data is best used primarily to raise questions rather than to provide answers. It can also be used, thoughtfully and cautiously, to test hypotheses.

But let's not confuse statistics with reality and let's not confuse correlation with causality.
ParentofThree said…
"Legally we can and should proceed. However politically we should do this in an orderly manner without the appearance of rushing."

I wonder if English intended to forward this on to somebody else, but messed up and sent it to you instead. It is really weird.

Did you reply back?
Charlie Mas said…
I didn't answer it because I'm positive that the message from Ron English was not meant for me.

More than that, it clearly comes after a lot of other discussion that I didn't see.

Maybe it has nothing to do with me or my message. Maybe he just accidentally typed his response to email A onto email B.
Anonymous said…
can you say "oops"?

-ttln
Don Nielsen is a dangerous character. He has money and power and is very much big business.

He is a gentleman (we've had several discussions) and he and his wife have been very generous in donations to SPS.

But as someone who is involved in education issues, I would watch him carefully.
Anonymous said…
From a 2008 Harvard Business School of Puget Sound roundtable with MG-J, Patrick D'Amelio (Executive Director, Alliance for Education), and Kimberly Mitchell (The Bill and Melinda Gates Foundation and former TFA):

Don Nielsen (HBS ’63)

Former Seattle School Board President and Chairman, Teach First

Formerly an elected member and President of the Seattle School Board, Don brings both passion for and knowledge of K-12 education. Prior to being elected to the Seattle School Board, Don had a successful career in business. He co-founded Hazleton Corporation which became the world's largest contract biological and chemical research company. Under Don's leadership, Hazleton grew dramatically, was listed on the New York Stock Exchange and then purchased by Corning, Inc. Don currently serves on the boards of many community organizations, including Seattle Pacific University, the Talaris Institute, the Seattle Foundation, the Alliance for Education and is the Chairman of the National Eating Disorders Association. Don is also a member of the Board of Advisors for the Harvard Business School, and both the University of Washington Business School and Education School. He has a BA from the University of Washington and an MBA from the Harvard Business School.
dan dempsey said…
Where Charlie left off...

To continue on 12/2/10 at 10:35 AM

She is the Board Secretary. She had the Board approve an NTN draft contract on 2-3-10 that did not match the action report prepared by her.

In spite of requests the approved contract was never shown to the public. Appeal filed 3-5-10 by Joy Anderson et al. District is late in providing transcript of evidence and it is not certified to be correct.

3-12-10 MGJ and S. Enfield produce an action report which satisfies the definition of forgery. The action report claimed to be based on a memo sent to the school board, but it was not. It was based on a kinder, gentler earlier draft version that was never sent to the Board.

4-7-10 Board approved NTN contract

5-7-10 NTN appeal by Anderson et al., pro se.

District violates RCW 28A.645.020 by failing to provide an evidence filing that is certified to be correct.

In fact it is not correct as the "draft memo" used to construct the 3-12-10 Action report is included but the actual memo sent to the School Board is not.

If the Board wished to fire Superintendent Goodloe-Johnson with cause ..... there is an enormous amount of material to use in dismissing her with cause.

So what is the obstacle to firing Dr. Goodloe-Johnson?

It sure is not three years of great results.
Anonymous said…
Yep. That's what I've been hearing for years. Nielsen is still VERY involved, and while maybe not directly tied to national ed reform practices, guiding destructive policies here at home. He's no friend of teachers or students.

QA high school anyone, anyone?

--flyonthewall
