School Segmentation Explained
Seattle Public Schools has this School Segmentation thing. It's a bit unclear how this segmentation is done, so I'll direct your attention to the appropriate document and try to offer some of the missing transparency.
Here is how each school's Level is determined:
You'll notice that if the Absolute score is over 60, then the growth score is not a factor in determining the school's level. There are a number of schools with growth scores below 50 in Levels 3, 4, and 5, including thirteen in Levels 4 and 5.
The Absolute score is calculated like this:
Schools earn 100 points if they perform at or above the District-wide 2013 Goal
Schools earn zero points if they perform below the District-wide Floor
Schools earn some points if they perform between the Floor and 2013 Goal
The Floor for any metric is equal to the Year 1 (2008-09) district-wide 10th percentile. The floor will remain constant for all five years.
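The document gives the endpoints but not the formula for partial points. The simplest reading is a straight linear scale between the Floor and the 2013 Goal; here is a minimal sketch of that reading in Python (the function and the linear interpolation are my assumptions, not anything the District has published):

def absolute_points(school_value, floor, goal_2013):
    """Hypothetical absolute-score calculation for a single metric.

    Assumes (not stated in the District document) that partial credit
    is awarded linearly between the Floor and the 2013 Goal.
    - floor: district-wide 10th percentile for this metric in 2008-09
    - goal_2013: the District's 2013 goal for this metric
    """
    if school_value >= goal_2013:
        return 100.0                      # at or above the 2013 Goal
    if school_value <= floor:
        return 0.0                        # at or below the Floor
    # partial credit: position between Floor and Goal, scaled to 0-100
    return 100.0 * (school_value - floor) / (goal_2013 - floor)

# Example: a school at 72% proficiency, with a Floor of 40% and a 2013 Goal of 80%
print(absolute_points(72, 40, 80))  # 80.0 points on this metric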
There were nine schools, all elementary schools, that earned 100 points by meeting the District's 2013 goals: Wedgwood, Whittier, John Hay, Loyal Heights, John Stanford, Bryant, West Woodland, Laurelhurst, and View Ridge. You can take this to mean that the District's 2013 academic performance goals, set in 2008, were to have all of our schools performing as well as these nine academically. Every school earned at least some absolute points, which means every school out-performed the bottom 10% of schools from 2008-2009. The choice of the 10th percentile is not explained.
The Growth score is calculated like this:
Schools earn 100 points if 60% or more students met or exceeded typical growth
Schools earn zero points if less than 40% of students met or exceeded typical growth
Schools earn some points if between 40% and 60% of students met or exceeded typical growth
The Year-to-Year Growth Model measures whether students grew as much as others in the same cohort across WA State with similar scores in previous years.
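Again, only the endpoints are given. If the partial credit between the 40% and 60% cut points is linear - and that is my assumption, not the District's stated rule - the calculation might look like this:

def growth_points(pct_meeting_typical_growth):
    """Hypothetical growth-score calculation.

    pct_meeting_typical_growth: the percentage of a school's students
    who met or exceeded typical (median) growth for their score peers.
    Assumes linear partial credit between the 40% and 60% cut points.
    """
    if pct_meeting_typical_growth >= 60:
        return 100.0                      # 60% or more -> full points
    if pct_meeting_typical_growth < 40:
        return 0.0                        # under 40% -> no points
    # partial credit between 40% and 60%
    return 100.0 * (pct_meeting_typical_growth - 40) / 20

print(growth_points(62))  # 100.0
print(growth_points(47))  # 35.0
print(growth_points(38))  # 0.0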
Every school earned at least some growth points, yet Concord, based on its school report, does not appear to have qualified for any: they did not have 40% of their students in the reported categories meet or exceed the median growth for their test-score peers. I'm curious how that growth score was derived. Either way, using the 50th percentile as the benchmark is more honest and slightly less arbitrary than the previous practice of using the 33rd percentile.
The District is very pleased - and rightly so - that a number of schools have advanced in level, particularly over the past three years:
This year 16 schools advanced a level while only 7 retreated a level.
As with previous years, however, the Level 1 and Level 2 schools are in low-income communities and exclusively south of the Ship Canal:
Level 1 Schools:
West Seattle Elementary
Hawthorne Elementary
MLK Elementary
Emerson Elementary
Highland Park Elementary
Madrona K-8
Level 2 Schools:
Dunlap Elementary
Muir Elementary
The NOVA Project High School
Graham Hill Elementary
Sanislo Elementary
Rainier Beach High School
Concord Elementary
I'm particularly troubled to see our two elementary SIG schools, Hawthorne and West Seattle Elementary, on this list. While we can't expect their absolute scores to be much higher yet, the below-average growth is a source of concern. They did come close to the average, at 44% and 47%, but the investment was in growth, and they are only doing as well as other schools without that investment. I don't know how much attention is being paid to this.
This school segmentation is used to measure the district's academic performance in a two-dimensional way reflecting both achievement and growth. It is also used to determine each school's level of autonomy in allocating some of its budget. Schools in Levels 1 and 2 are told by the District how to spend some school improvement money, while schools in Levels 3, 4, and 5 are granted a broader menu of choices and, in some cases, can make choices off the menu. This is the "earned autonomy" that was bandied about a few years ago.
Comments
I haven't had time to do a full report about the State of the District speech, which included three schools and their scores. I will try to get that in this weekend.
Ben
I wonder, Charlie. One way to look at this might be to conclude that for certain low-performing schools, for whatever reasons, the extra investment is required just to keep those schools making comparable improvement -- that without it, they would have had LOWER growth. I am not saying this is the case. I am just saying -- these schools did not end up where they started simply by chance. And we always want (and rightly so) them to make exceptional progress, so they can "catch up" to schools doing better. Is it possible that there is a "success" story for the extra dollars in just having them (more or less) keep pace with the growth of others -- at least in the early years of growth? I am not saying this is the case; I really don't know. But I wonder.
Rachel
So each student in the state who scored a 250 last year forms a peer group and their current year performance is gauged relative to that peer group. The students who scored 254 are another peer group and their performance on the test this year is measured relative to that peer group.
So historically low scoring students are compared to other historically low scoring students and historically high scoring students are compared to other historically high scoring students. The comparisons are done regarding the students as individuals, not as a school.
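To make that peer-group comparison concrete, here is a rough sketch in Python. The data layout and the use of the median as the "typical growth" benchmark are my simplifications; the actual state growth model is more involved than this.

from statistics import median
from collections import defaultdict

def met_typical_growth(students):
    """students: list of dicts with 'prior_score' and 'current_score'.

    Groups students statewide by their prior-year score, finds the
    median current-year score within each peer group, and flags each
    student who met or exceeded that median. A simplified sketch of
    the peer-group idea, not the state's actual growth model.
    """
    # Build peer groups keyed by prior-year score (e.g. all the 250s)
    peers = defaultdict(list)
    for s in students:
        peers[s['prior_score']].append(s['current_score'])

    # Median current-year score for each peer group
    typical = {score: median(curr) for score, curr in peers.items()}

    # Each student is compared to their own peer group's median
    return [s['current_score'] >= typical[s['prior_score']] for s in students]

# Two peer groups: prior score 250 and prior score 254
students = [
    {'prior_score': 250, 'current_score': 262},
    {'prior_score': 250, 'current_score': 255},
    {'prior_score': 250, 'current_score': 258},
    {'prior_score': 254, 'current_score': 259},
    {'prior_score': 254, 'current_score': 266},
    {'prior_score': 254, 'current_score': 261},
]
print(met_typical_growth(students))
# [True, False, True, False, True, True]
# A school's growth percentage is then the share of its students flagged True.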
There is little evidence that Level 1 or Level 2 schools have diminished enrollment - except for Rainier Beach High School.
MAP is now primarily used as a measure of student academic growth for the purposes of teacher performance measurement. The District uses this as the primary rationale for why they cannot drop the test.
It may be that there are a few - very few - teachers who get value out of the test as a formative assessment. However, it is administered too infrequently to be of optimal value in that role and the results are indecipherable by the bulk of teachers. Even if the MAP were effective as a formative assessment (the original rationale for it) and even if the results were clear to teachers, there remains the question of whether teachers have the time, resources, and license to individualize instruction in response to the MAP results.
In coming years, MAP will be used as an assessment in the context of MTSS to determine if a student needs something other than the Tier 1 standard instruction. That's not here yet, and it's unclear how authentically MTSS will be implemented and how effective MAP will be as an assessment in that role.
-Fremont Mom