School Reports

The new school reports - for the school year 2010-2011 - have been released.

They can be found here.

This year's School Reports are different from last year's in only two ways. The changes are the two corrections that were promised in December of last year. They are long overdue, but better now than later still.

Last year, the measure of student academic growth was grossly mislabeled "Students making gains on the state reading test" when it was actually a derived relative measure. This year, student academic growth gets a label like "4th and 5th graders who met or exceeded typical growth on the state reading test" and it is a new measure. From the glossary:
This measure shows how students are growing compared to other students in the same grade, with the same test score throughout the state. It is the percentage of students whose growth was above average (50th to 99th percentile) on the state test from one year to the next. Students are compared to other students across the state that scored the same on the state test and were in the same grade level.
This is a more honest measure (and a more honest label) than the one used last year, but it is still a poor choice. It is still a relative measure when we should focus on absolute measures. It doesn't tell us if the school did well with students working below, at, or beyond grade level.
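
Read literally, the glossary definition seems to work like this (my own rough sketch, with made-up field names and data, not the district's or OSPI's actual computation): for each student, find every student statewide who was in the same grade and had the same prior-year score, and count the student as meeting or exceeding typical growth if his or her score gain is at or above the median gain of that peer group. The school's number is the share of its tested students for whom that is true.

# Rough sketch of the "met or exceeded typical growth" measure as I read the glossary.
# Field names and data layout are hypothetical; this is not the district's actual code.
from collections import defaultdict
from statistics import median

def percent_meeting_typical_growth(state_records, school_ids):
    # state_records: one dict per tested student statewide, with
    # 'id', 'grade', 'prior_score', and 'score' (this year's score).
    gains_by_peer_group = defaultdict(list)
    for r in state_records:
        gains_by_peer_group[(r['grade'], r['prior_score'])].append(r['score'] - r['prior_score'])

    school = [r for r in state_records if r['id'] in school_ids]
    met = sum(
        1 for r in school
        if r['score'] - r['prior_score']
           >= median(gains_by_peer_group[(r['grade'], r['prior_score'])])
    )
    return 100.0 * met / len(school)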

Also, they corrected the Student Demographic information by choosing to show the number of students enrolled in the APP/Spectrum category instead of the number eligible for those programs. So we have no sense of how many students at Bryant, for example, are Spectrum-eligible. This is not quite the fix that was promised last year - we were supposed to get a count of the advanced learners at each school.

I'll make reference to some funny statistical anomalies in the comments.

Comments

Charlie Mas said…
Here are some odd-looking statistics:

From NOVA High School:

First, the name of the school is "The NOVA Project", not NOVA High School.

Second, check this out:
Graduates prepared for a 4-year college (09-10): 25%

Graduates enrolling in higher education within 1 year (10-11): 74%


I guess about 50% of the graduates are going to college unprepared.

First‐time 9th graders earning sufficient credit (08-09): ###

What is that? 100%?

And a few unavailable statistics:

Students with fewer than 10 absences per year (NOVA doesn't take attendance)

Seattle Public Schools Segmentation Level: Not available. I don't understand that one. They have achievement and growth numbers.

Percent of test-takers passing a college-level test during HS (AP or IB): Data not available. NOVA doesn't offer AP or IB classes.

Average class size: - Kind of hard to say, I guess. A lot of the students are doing independent study. Is that a class size of 1?

The student and staff survey numbers at NOVA are much better than the district average. The family survey numbers are right on the district average.

92% of NOVA students with IEPs passed the state reading test.
Charlie Mas said…
Aki Kurose Middle School

Pass rates on the MSP are trending down or flat. Only science is clearly up. Math could be trending up, but it's hard to regard a 38% pass rate as a bright spot.

Here's the weird thing: the year-over-year change in the pass rate for the school is up 7 percentage points, from 31% to 38%. But look at the demographic breakdown. Every demographic category shows an increase of at least 9 percentage points on the math test, and some are up as much as 19 percentage points.

How is that possible?
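
One arithmetic answer, for what it's worth (a purely hypothetical illustration with made-up numbers, not the actual Aki Kurose breakdown): if the mix of test-takers shifts toward the lower-scoring groups, every group can gain more than the school as a whole.

# Hypothetical illustration: every subgroup gains more than the school overall
# when the mix of test-takers shifts toward the lower-scoring subgroup.
groups_2010 = {"A": (100, 0.50), "B": (100, 0.12)}  # (students tested, pass rate)
groups_2011 = {"A": (60, 0.60), "B": (140, 0.25)}   # both groups improve; B grows

def overall_rate(groups):
    tested = sum(n for n, _ in groups.values())
    passed = sum(n * rate for n, rate in groups.values())
    return passed / tested

print(f"{overall_rate(groups_2010):.1%}")  # 31.0% school-wide
print(f"{overall_rate(groups_2011):.1%}")  # 35.5% school-wide
# Group A is up 10 points and group B is up 13 points, yet the school is up
# only about 5 points, because more of the 2011 test-takers come from group B.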

Growth compared to test score peers is trending down. Other schools are doing a better job.

Attendance is clearly a problem. Fewer than half of the kids have fewer than 10 absences per year.

The demographic data suggests that there are 12 Spectrum students at Aki Kurose. What is their program like? Why did they have enough math MSP scores to report but not enough reading MSP scores to report?

The school is in step 5 for NCLB, which means it's supposed to be transformed. Right. There's nothing transformational happening there. It's just more of the same.
Charlie Mas said…
Rainier Beach High School

Wow. Kind of a disaster.

Test scores trending down.

Growth trending down and low.

Staff Climate Survey: "School Leadership" Category: 12%. Ouch.

Pass rate on reading test for ELL students down 47 percentage points, down 19 percentage points for FRE students, down 19 percentage points for Asian/Pacific Islander students, down 7 percentage points. It's a slaughter.

Attendance is very poor.

35 teachers for 425 students. You do the math.

Here's something really freaky. Check out the section titled: School Description and Plan to Achieve Goals. It says what K-5 teachers will do. Are there a lot of K-5 teachers at Rainier Beach High School?
anonymous said…
The math pass rate for the 2010/11 school year is higher, and in some cases significantly higher, than it was in 2009/10 - in every single SPS high school. That is great news, especially since the math pass rate had decreased in the majority of schools the year before.

Of course this year we used the EOC as the measurement tool and last year we used the HSPE - not sure if that has any significance?

I'm not quite sure how to interpret the progress. Is Discovering working? Is standardization working? Are the new state math standards better? Or does it have something to do with using different tests from 2009/10 to 2010/11?

Whatever it's due to, the progress is great news!

Murky Water
Jon said…
So, what's going on at Rainier Beach? Seems badly messed up. Can someone summarize the problem and possible solutions? Clearly class size isn't the issue.
Jan said…
Charlie -- it would appear to me that the plan to achieve goals for the reading section must be a complete error -- somehow, the paragraph for some elementary school was inserted in the RBHS document -- because the K-5 language is used throughout. No other reason I can think of makes sense.

But here is another oddity: For most of the math goals, their statements correlate to the charts and data. But they state: "For 9th-10th grade students we will increase the % of students meeting or exceeding typical growth on the math MAP assessment from 59% to 65%."
But 59% is not the school's current pass rate; it is the District average. The current 9th-10th grade rate at RBHS is 23%. And getting from 23% to 65% in math -- in one year -- well, that one seems out of reach.
Anonymous said…
The RBHS report triggers a couple of questions -

1. Who "produces" the reports?
2. Is the authority and/or responsibility for the reports site based or central admin based?
3. Who is responsible for approving the reports prior to publication?

Yikes.

Oompah
Apparently there is no single source of responsibility for the RBHS report, no one authority,
seattle citizen said…
Jon,
If I could take a stab at addressing "what's going on at Rainier Beach"...
From my perspective as an informed observer, I would first say that by many reports there is a reinvigorated student body and staff. I was in the building myself not a month ago, and saw students engaged, polite, eager... I also saw Director Patu there, and I know she visits often because that is what she does: be the face of the community in a community school, as Beach, properly, really is.

From there, we might ask, reinvigorated how?

A) lots of generational poverty in that demographic: 74% Free/reduced lunch, compared to, say, 19% at Roosevelt.

B) a recent history of turmoil in staffing - perhaps three years of it.

From these two points, we might ask: "how are they invigorated? Look at some of the test scores!"

Well, yes, there are some drops. But look at some of the other scores, and look at the battles they face. I would almost assume a fluctuating slate of test scores at Beach because they have a fluctuating, unstable population, student-wise.

Yes, there are still struggles there. There will be until poverty is eliminated. It's obviously not a staff issue, at a general level (of course Beach, like any school, has a variety of staff and a variety of skills to deploy); look at the relatively small class sizes. It's not funding (although we can all argue about the way funds are spent to support a 75% poverty student body...) It's not, I don't think, institutional racism at the school or district level - I know many of those teachers and many district staff, and they DO have high expectations and are not, as a rule, racist (tho' everyone is racist sometimes, of course.)

So "what's going on at Rainier Beach?"

Poverty, and also perhaps some language/culture barriers (Beach: 24% ELL; Roosevelt: 8%)

Given that the "data" is so hard to parse (test scores fluctuate, and we aren't able to line them up to individual students and add in all the individual factors that pertain), all we can go on is observation, sometimes, and my observation is that Beach has been kicked around, wrestles with poverty and language issues, but has recently shown great fortitude and resilience, and, yes, reinvigoration.

Go, RB!
Charlie Mas said…
What about those schools with leadership issues? What does the staff survey say?

Staff Climate Survey: "School Leadership" Category

District Average: 66%

Bryant: 34%
Ingraham: 79%
Lafayette: 67%
Lawton: 70%
Lowell: 36%
North Beach: 76%
Wedgwood: 48%
Charlie Mas said…
I saw some AMAZING high scores for leadership for a lot of principals. I saw rates in the 80+% range and some in the 90's including a 9%% and two 97% positive ratings for principals.

Some other schools got shockingly low leadership survey results from the staff.

Arbor Heights: 31%
Dearborn Park: 0 staff returned the survey.
Coe: 23%
Garfield: 0 staff returned the survey.
Hamilton: 30%
Montlake: 0 staff returned the survey.
TOPS: 23%
Anonymous said…
Not surprised by the low Bryant numbers. Parents have been expressing concern about leadership for some time. Not only is the number (34%) low, but it is also a significant drop from 61% last year. I think the Staff Climate numbers echo parent feelings, but it is disappointing to know that teachers feel this way, too.
Anonymous said…
TOPS reflects last year's leadership, given to us by Goodloe-Johnson. It's taken 2+ years for the staff and parents to get an extremely unfortunate situation resolved. But we did. (Thanks to the ever-wonderful Phil Brockman Ed Director.) We will recruit a new principal shortly and we, the school staff and parents, are taking the lead on shaping the posting and on the interviews. Huzzah. Then if the District can stop stop oh please stop with the curriculum standardization and let us do what we need to for our own kids, we can actually make progress.


TOPSIE
Steveroo said…
"It's taken 2+ years for the staff and parents to get an extremely unfortunate situation resolved. But we did. (Thanks to the ever-wonderful Phil Brockman Ed Director.)"

Topsie, Phil Brockman emphatically did not solve this serious and ongoing problem. Your 23% Leader is now busy 23-percenting our school, Lafayette Elementary!

I'm a Brockman fan but ten thumbs down to Phil Brockman for accomplishing this cowardly, crap-as-usual JSCEE-signature-dish move.
Braessae said…
Well, it sounds like TOPS has gotten something done on the leadership front. Here is hoping that Susan Enfield has her Ed Directors in all of those schools with low scores, figuring out what the problems are. Unfortunately, in some instances, the Ed Directors are part of the problem (at least one ex-staffer at Lowell is of that view). In that case, the Board had better be holding the Supe's feet to the fire to make the Ed Directors do the right thing! Too bad not every school gets Brockman as an Ed Director!
dan dempsey said…
Murky Water wrote:
That is great news, especially since the math pass rate had decreased in the majority of schools the year before.

NO it is not great news. The HSPE math was a one year deal just marking time until End of Course assessments showed up in 2011.

In 2010 the District HSPE Math scores were:
3.6% higher than the State average for all students
1.7% lower than the state average for Low Income
6.5% lower than the State average for Black students

By comparison with the state, the EoC algebra scores for all students are much worse than the state average.

For openers, for all students in grade 9 who took algebra:
Seattle Pass rate for all = 48.8% on EoC
State Pass rate for all = 53.7%

The district's all-student pass rate went from 3.6 points above the state on the 2010 HSPE ... to 4.9 points below on the 2011 EoC.

That is a drop of 8.5 percentage points.


Check this .pdf sheet for individual schools ... this is not a pretty picture. I suggest you use the low income numbers for each school to do a fair comparison.

Remember most high schools taught algebra as a one credit 150 hour class.

Cleveland gave algebra 2 credits and taught it for 255 hours.

-------------------
Algebra 1 EoC pass rates for low-income students in 4 school districts:

District : Low-Income pass rate : All-students pass rate
Seattle : 37.83% : 45.9%
Bethel : 35.28% : 39.72%
Clover Park : 56.35% : 59.95%
Spokane : 53.00% : 59.80%

Seattle and Bethel use Discovering
Clover Park and Spokane use HOLT
anonymous said…
Wow, Topsie, what a refreshing change! A couple of years ago MGJ appointed a principal to TOPS with zero input from the community. Today, Enfield is allowing the following:

"We will recruit a new principal shortly and we, the school staff and parents, are taking the lead on shaping the posting and on the interviews. Huzzah."

All of these seemingly small changes add up. Another Kudo to Enfield. I sure hope she becomes our new super.

signed,
re-do
anonymous said…
This comment has been removed by the author.
anonymous said…
I concede to you when it comes to math data Dan. I did not compare SPS to the state, nor did I break down the data by income level.

I simply looked at each school's 2010/11 report individually and compared the math ***pass*** rate in 2010/11 to the math ***pass*** rate in 2009/10, and found that every school across the board had raised its math pass rate (overall, alg and geo combined). And I saw that as an improvement.

It made me wonder if Discovering is working - I remember staff saying it took three years to get up and running and then we'd see scores improve?? I wonder if standardization is working? If the new state math standards had something to do with it? Or if changing from the HSPE to the EOC skewed the numbers?

Q. Does the entire state use the same math EOC exams? Or does each district create their own?

Murky Water
JvA said…
Have they plotted all the overall segmentation scores on a single chart yet? I noticed our Beacon Hill zone school, Maple, moved up from 3 to 5 this year, and I was wondering if any other South Seattle schools have moved up into that area, occupied only by North End schools last year.
Anonymous said…
On a related note, the updated BERC report on college attendance for WA state students came out this week, too.

http://www.collegetracking.com/reporting/Reports.aspx

I posted some highlights on the Tuesday open thread.

In addition to the college enrollment data point that's available on the school reports, these reports break down college attendance for each high school by gender, ethnicity, family income (FRL...but only available for 2009) and type of school (2- or 4-year).

They also provide information on college persistence (but see the cautionary note for the data for 2010 grads...it's best just to use 2009--which was recently updated-- and prior data for now).

-college tracker
Anonymous said…
I'll try the link again:

CollegeData

-college tracker
Anonymous said…
Steveroo, Braessae, Redo:

Clarification: Brockman didn't send the TOPS principal to Lafayette. He does not have that power. Only Enfield does.

Enfield has also shown a dismaying lack of follow up on the curriculum autonomy alternative schools were supposed to have.

I agree that Ed Directors are going to be make or break for many schools. And I double agree that many of them currently in place should not be. Enfield seems to have a blind spot about her own teaching and learning staff. Many need to go.

TOPSIE
Beacon Hill mom said…
South Sea - Mercer Middle also moved up to a 5! My daughter started there this year and has been really happy, and we've been really impressed by the teachers. This is a school with 72% poverty, although high family participation.
anonymous said…
I think Enfield inherited a mess, a really really big mess. With many emergent issues to tackle. I don't expect that she or anybody else could fix everything in the 7 months that she has been at the helm. Rather, what I look for is progress, vision, and the direction the district is moving in.

That she is allowing community input on the search for the new TOPS principal is a HUGE step in the right direction, and I'd think we could all agree on that.

Restoring autonomy for alt school curriculum is important too. Very important. My guess though is that she has much larger and more emergent issues to deal with first, and that she is prioritizing. I hope she gets to curriculum autonomy soon and finds an agreeable solution - I look forward to following this.

re-do
Anonymous said…
It will be interesting to see if Enfield even gets the chance to tackle the issues long term. Both Peaslee and McLaren campaigned on wanting a national search for a permanent super. It doesn't seem as though Betty's a fan. That leaves, if Peaslee is on the board, who as a swing vote? Smith-Blum? It will be interesting to see if she's going to be 'establishment' or 'challenger' in nature from here on out.

I am betting establishment.

-skeptical-
anonymous said…
Good point, skeptical. Yes, McLaren and Peaslee will both be in favor of a search, as will Patu. Moving forward, my guess is that the three of them will vote against most staff proposals as a knee-jerk reaction, and the board will have trouble coming to consensus. My only hope at this point is that this dynamic doesn't paralyze our district and halt progress.

We know what we got the last time we had a national search. MGJ. I just don't think we should roll the dice again.

watcher
Anonymous said…
The last search for Superintendent was flawed in that there was really no choice at the end. By default, MGJ was the only one left, and she wasn't chosen as the best of the available candidates.

-willing to roll the dice again
dan dempsey said…
re-do wrote:

" I don't expect that she or anybody else could fix everything in the 7 months that she has been at the helm."

These are school report cards. It is about academics. Enfield was the CAO before her 7-month interim-Superintendent gig. She has been at the academic helm a lot more than 7 months. She has shown no interest in fixing the sad math mess. Would not grant a Singapore waiver to Loyal Heights.

See my year-to-year SPS data analysis on the achievement gaps at grades 4 and 7 for reading, writing, and math; grades 5 and 8 for science; and grade 10 for HSPE reading, writing, and science.

The Black / White GAP from 2010 OSPI testing to 2011 OSPI testing is greater in 9 of 11 categories.

The Non Low Income / Low Income Gap is greater in 7 of 11 categories.

The Incumbents that recently ran each mentioned that Achievement Gaps are a TOP priority.

Enfield should be OUT.

Your previous comment reads like either:
A.. an SPS Press release
B.. a Seattle Times editorial.

Put some facts on the line to back up your "HOPES".

Enfield has repeatedly shown herself, as both CAO and Interim-Supt, to be unable to write Action Reports that (1) are accurate and (2) address legal requirements.

I can cite many instances of her action report meltdowns (here are a few):
(1)
$800,000 NTN Action Reports ... two of them; each was inaccurate. She took M-M and Sundquist with her to look at NTN Sacramento because it was a STEM school. {{Small glitch .. no research = NTN had 7 schools in CA and none were STEM}} (There is a lot more to this fiasco.)

(2)
TFA Action Reports ... every single one ignored the requirement for a careful review of Achievement Gaps .... needed to make the claim that "conditions warrant" the use of conditionally certified TFA corps members to close achievement gaps. ... Research has shown TFA use in areas with no shortage of fully qualified teachers to be counterproductive in almost every instance. TFA was a fiasco ... central admin was spouting complete BS.

(3)
Cleveland Waiver .... ignored the WAC requirements needed to get a waiver from the 150 hour / credit rule.

There are a lot more examples.

=========
Next item will be on how misleading the school report cards are in regard to OSPI math results on tests for High Schools.
anonymous said…
"Enfield should be OUT."

Yes of course she should be out Dan. As every super should be after about 6 months on the job. Right.

watcher
Steveroo said…
Community input is a fickle thing that is trotted out with the appropriate fanfare, except in some persistent, prevalent circumstances.

You noted that there was no community input when assigning Ms. 23% to TOPS. There was no community input when assigning this same principal to Lafayette. There will be no community input when assigning her to the inevitable next victim school.

The other tired old option is to create an unnecessary job at the JSCEE so that she can be kicked upstairs into it. I vote for creating a third Executive Director of Special Ed position for this purpose.
dan dempsey said…
About End of Course Math testing:
Using Chief Sealth as an example.
EoC #1 Algebra was given to all students that took an algebra class -- 208 students
... and to 198 students that were beyond Geometry.

The only reason that the District can report that the pass rate at Chief Sealth is 52.3% is because of the blending of upper division math students' results (on a first year Algebra test) into the first year EoC Algebra results.

The real number that should be used is pass rate 30.4% because that was for the Chief Sealth students that took the Algebra I math class in 2010-2011.

((The only reason that students beyond Geometry are taking the Algebra EoC is because it is supposedly going to be a graduation requirement... thus these scores are listed as make-up.))

Because of the botch job of trying to use one test for both traditional Algebra / Geometry students and for Integrated I / integrated II students ... the tests were modified and did not test all the standards for Algebra or Geometry.

The EoC algebra tested almost all the standards but left out quadratics (three standards) ... the rationale being that Integrated I students will not have had quadratics. ... The EoC #1 test is likely the BEST math test ever to come out of OSPI. ((Not only that, it is pretty good.))
Another watcher said…
Watcher, I give Dan a month before he starts up about a lawsuit against Marty and/or Peaslee (if she wins) too. This will happen the second either of them votes with the majority (i.e., in a way he doesn't like). To him this will be cause for removal. I won't bet you because it's a given. Look at the long string of lawsuits this guy has filed because they don't do things his way. I know he's gone all the way up to Randy Dorn with his suits. What's another board member?
dan dempsey said…
The EoC Geometry test is really weak. It was determined that Integrated I / Integrated II students would only have covered about half the Geometry standards by the end of year two. The EoC #2 Geometry only covers half the standards....

This is similar to a test of first semester geometry. It becomes a lot easier to teach to the test if that is the goal.

The only takers of EoC #2 were students actually in a Geometry class in 2010-2011 ... There were no geometry make-ups.

Note if we only look at Sealth "9th" graders that took Algebra the pass rate rises to 33.5% (slightly higher than for "all" takers of Algebra at 30.4%).

I firmly believe that the EoC #1 Algebra test results should be for students in an Algebra class in 2010-2011. To alter this result with advanced algebra and Pre-Calc students' results (on a first year algebra test) is very misleading. (Note: at Sealth the make-ups, 198 students at 75.8%, were almost as many as the Algebra I test takers, 208 students at 30.4%.)
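
The 52.3% blended figure I mentioned above is just a weighted average of those two groups, which is easy to check from the published counts and rates (the last digit differs slightly because the published rates are rounded):

# Checking the blended Chief Sealth EoC #1 rate from the counts quoted above.
algebra_takers = (208, 0.304)  # students who actually took Algebra I in 2010-2011
make_ups = (198, 0.758)        # upper-division students taking the EoC as a make-up

passed = algebra_takers[0] * algebra_takers[1] + make_ups[0] * make_ups[1]
tested = algebra_takers[0] + make_ups[0]
print(f"{passed / tested:.1%}")  # 52.5% -- essentially the reported 52.3% blend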

I think the fairest number to use to evaluate what is going on at individual high schools is the EoC #1 Algebra pass rate for Low Income students ... as then the schools are closer to a level playing field for comparison.

The attached sheet provides both the 9th grade "all students" pass rate as well as the 9th grade "Low-Income students" pass rate.

The first column of numbers is for the 9th grade "all students" pass rate

The second column of numbers is for the 9th grade "Low-Income students" pass rate.

Note the averages given for 10 Seattle high schools are from averaging the results for each of the 10 schools together. This would give a small school like RBHS the same weight as a big school like Garfield in that average.
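
To show what that weighting difference does, here is a two-school sketch with made-up numbers (presumably OSPI's district-level figure pools every student, while my averages count each school once):

# Hypothetical two-school example: averaging school rates vs. pooling all students.
schools = [("Small HS", 100, 0.30), ("Big HS", 400, 0.60)]  # (name, test-takers, pass rate)

per_school_avg = sum(rate for _, _, rate in schools) / len(schools)
pooled_avg = sum(n * rate for _, n, rate in schools) / sum(n for _, n, _ in schools)

print(f"{per_school_avg:.0%}")  # 45% -- each school counts equally
print(f"{pooled_avg:.0%}")      # 54% -- each student counts equally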

The 9th grade average for Seattle done by OSPI for "All" = 48.8% .. {{53.7%}}

for 9th grade Low-Income = 38.5% .. {{43.8%}}
for 9th grade Black students = 26.5% .. {{35.9%}}

{{ _ }} are the statewide averages.
Seattle "all" below the state by 4.9 percentage points
Seattle "L-I" below by 5.3 percentage points
Seattle "Black" below by 9.4 percentage points

and the American Indian students
Seattle = 25.0%
State = 38.1%

Gap = 13.1 percentage points .... shows what continuing to screw up funding and programs for native youth can do when coupled with Seattle's completely inadequate K-8 math program and "Discovering for High School".

I believe that Enfield once said something about staking her professional reputation on this Seattle math program.... THE MATH PROGRAM NEEDS CHANGING .... mathematically unsound just is not good enough.
anonymous said…
Alternative schools have fought the district for many years to be able to give input into their principal selection because they are not the same as traditional schools - each has its own unique pedagogical approach, and they want a principal who not only understands that approach but will support it. The same could be said for traditional schools, yet I have not heard traditional school communities fighting for more input.

re-do
Anonymous said…
side note:

Anyone concerned about 37 students in a class at McClure?

Queen Anne News article about school over-crowding:

http://www.queenannenews.com/

Comments to:
Jeff Bond, editor
qamagnews@nwlink.com

-JC.
mirmac1 said…
Okay, please give me a *&*&%#! break. I sit and listen to the Board Work Session on the District Scorecard last night. First, Mark Teoh, head honcho of Research, Evaluation, and Analysis, simply offers up a result like "Students graduating in 4 years or fewer up to 73%", then silence. No explanation, no research, no analysis. This guy should not be in this position. He fell into it after Bernatek was shown the door. Teoh is a "strategic thinking" guy, although that is not clearly evident.

The Board directors hear this and look around like, yeah, okay, so...? DeBell ventures to ask: do we have any idea why this went up? What's working? Then Enfield says, "Well, that's the perennial question. We don't know for sure but it could be..." and then rattles off some generic "strategies" like "quality of instruction" or the contributions of "community-school based partnerships" or "even though not all schools have counselors, maybe it's counseling."

Are we still flying blind here? WTH is the point of all these numbers if we have no clue why they are what they are? Very aggravating. I would say the district cannot crow about gains and achievements if it has NO CLUE what brought them about!
dan dempsey said…
Another Watcher incorrectly wrote:

Look at the long string of lawsuits this guy has because they don't do things his way. I know he's gone all the way up to Randy Dorn with his suits.

The lawsuits are because of violations of RCWs and WACs .... Do you believe that some law breakers should be exempt?

The recalls are for exactly what the WA Constitution specifies as grounds for recall.
See WA Constitution Article I sections 33 & 34.
Dorn recall hearing is on Monday at 9:30 in Pierce County superior court.
RECALL Filing HERE.
Are you against my constitutional right to recall? Are you also against freedom of speech and freedom of assembly?

What I am looking for is...... the use of practices known to work .... The SPS gets exactly the results that would be expected from the poor instructional materials and practices it selects and pushes. Most teachers are likely doing the best job possible under very sad leadership. --- Did you ever look at results from the UW's Math Education Project?

New Tech Network is an excellent example of past poor results being ignored at the time of selection.
(Actually, the results were inaccurate; most of the data in NTN Action Report #1 was wrong ... NTN #2 was a bit better but still had a lot of significant errors. The result was the 4-3 decision, courtesy of Carr, Maier, Martin-Morris, and Sundquist, to waste $800,000.)

Everyday Math and Discovering are another two fine examples of pathetic decision-making to produce pathetic results.

"To Improve a System requires the intelligent application of relevant data" -- Many of the SPS current results suck .... why is that?

I refer you to the growth of the Black/White achievement gap in 9 of 11 categories from 2010 to 2011.

NO I decided not to just sit back and watch the carnage ... I decided to do something during the last 5 years... granted the Board did not care.... but at least I did make an attempt.

Sorry over 10 years of meaningless blather from the Board and Central Admin about achievement gaps while doing nothing about it ... pains me. HOW ABOUT YOU?

The practices in use are often very poor for nearly every student. I chose Achievement Gaps as the District has stated they are a TOP priority .... I wonder how things would have been different if the GAPS were not TOP?

Check my previously posted OSPI score changes for Auburn at 52% Low Income vs. Seattle 43% low income for grades 3,4,5 from 2007 to 2011.

Seattle in 2009-2010 spent $11,848 per child
Auburn in 2009-2010 spent $9,250 per child

2,598 more dollars spent by Seattle for what?

CHECK THE RESULTS ....
Auburn 24 .... Seattle 0.

Sorry that is just not good enough performance for spending an average of $11,848 per child.
dan dempsey said…
Board meeting is now online ... watch the School Reports presentation.
mirmac1 said…
Another watcher,

Based on what I know of the new board members, they will not simply swallow whole certain staff's song and dance, will apply logic and research to decision-making, will follow the law and policy, and will support teachers and students first.

In light of this, the lawsuits won't be happening. Nice try.
anonymous said…
Right, mirmac, let's not celebrate that graduation rates have risen to 73%. What were we thinking.

It's actually hilarious that you are trashing A 6% CLIMB IN GRADUATION RATES because the district can't pinpoint the exact reason for the climb.

watcher
SeattleSped said…
" I vote for creating a third Executive Director of Special Ed position for this purpose."

Steveroo, Bite your tongue!
Dorothy Neville said…
There are some very odd definitions in the glossary. Some don't make any sense at all. Doesn't anyone with a basic understanding of arithmetic proofread these?

Plus, there are some figures on school reports that are misleading. Even the most stalwart supporter of AP classes, Jay Matthews of WaPo, states that research shows AP-for-all classes can still be high quality, PROVIDED THAT students are required to take the AP exam. If students are not required to take the exam, all bets are off on the quality of the course. Therefore things like AP LA for all and AP HG for all without kids taking the exam are meaningless. The school reports tout the percent of students taking an AP class, but then hurt their credibility by stating the percent of kids WHO TAKE AP EXAMS who score a passing grade. The statistic they should have is the percent of kids who take an "AP" class AND sit for the exam. That missing information is very important. Without it, the information given is misleading.
mirmac1 said…
Yeah, I'm confused about this metric...

What we measure:
5 year retention rate of teachers

How we measure it:
The number of teachers at the end of a school year who were employed at the District 5 years or more, divided by the total number of teachers at the end of a school year, based upon employee records maintained by the Human Resource department.

Why it is important:
"Studies show that one of the most important factor(s) in determining student performance is the quality of his or her teachers and that the number of years a teacher has taught contribute directly to student achievement"

How the heck does this jibe with hiring Teach for America short-timers? Not only does this muck up your results, but it APPARENTLY does not contribute directly to student achievement.
dan dempsey said…
A big question about graduation rates is how such rates are calculated. There are a large variety of ways to calculate this number. Few of them are an accurate measure of the cohort graduation rate.

It is good news that SPS grad rates are UP.

=====
From OSPI Auburn SD

Other Information
Unexcused Absence Rate (2009-10) 5,995 0.5%
Annual Dropout Rate (2008-09) 226 4.4%
Estimated Annual On-Time Graduation Rate (2008-09) 934 81.0%
Estimated Annual Extended Graduation Rate (2008-09) 993 86.2%
----

Other Information
Unexcused Absence Rate (2010-11) 5,667 0.4%
Annual Dropout Rate (2009-10) 166 3.3%
Estimated Annual On-Time Graduation Rate (2009-10) 984 85.7%
Estimated Annual Extended Graduation Rate (2009-10) 1,054 91.8%
Actual Adjusted On-Time Cohort Graduation Rate (Class of 2010) 74.1%
Actual Adjusted 5-year Cohort Extended Graduation Rate (Class of 2010) 85.7%
=======

Auburn went from an estimated on-time rate of 81% to 85.7%, but the actual cohort rate = 74.1% (11.6 percentage points lower than 85.7%).
=======


Last two OSPI results for Seattle

Other Information
Unexcused Absence Rate (2009-10) 27,315 0.6%
Annual Dropout Rate (2008-09) 909 7.1%
Estimated Annual On-Time Graduation Rate (2008-09) 2,409 70.1%
Estimated Annual Extended Graduation Rate (2008-09) 2,676 77.9%



Other Information
Unexcused Absence Rate (2010-11) 11,659 0.8%
Annual Dropout Rate (2009-10) 563 4.5%
Estimated Annual On-Time Graduation Rate (2009-10) 2,334 78.1%
Estimated Annual Extended Graduation Rate (2009-10) 2,620 87.7%
Actual Adjusted On-Time Cohort Graduation Rate (Class of 2010) 72.2%


NOTE the rise from 70.1% to 78.1% in the on-time graduation rate,

but the reported cohort graduation rate was 72.2%. That is only 5.9 percentage points lower than the 78.1%.

========

Every time I see graduation rate numbers in any report, I wonder how they were calculated.
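
For what it's worth, here is a rough sketch of why an "estimated" rate and an "actual adjusted cohort" rate can differ so much (my own reconstruction of the two general approaches, with made-up inputs; not OSPI's exact formulas):

# Rough sketch of two ways a graduation rate can be computed (made-up numbers).

# 1) Estimated on-time rate: compound one year's grade-level dropout rates.
def estimated_on_time_rate(dropout_rates_by_grade):
    rate = 1.0
    for d in dropout_rates_by_grade:  # e.g. grades 9-12
        rate *= (1.0 - d)
    return rate

# 2) Adjusted cohort rate: follow an actual class of first-time 9th graders
#    for four years, adjusting only for verified transfers in and out.
def adjusted_cohort_rate(first_time_9th, transfers_in, transfers_out, on_time_grads):
    cohort = first_time_9th + transfers_in - transfers_out
    return on_time_grads / cohort

print(f"{estimated_on_time_rate([0.03, 0.04, 0.05, 0.05]):.1%}")  # 84.0%
print(f"{adjusted_cohort_rate(1000, 80, 150, 690):.1%}")          # 74.2%
# The estimated figure assumes this year's dropout rates hold for a whole
# cohort; the cohort figure counts the students who actually graduated.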

========

NTN Sacramento used a bizarre formula to generate graduation rates above 90%.... while cohort rates were well below 50%.
mirmac1 said…
Yeah, watcher, I'm sure you're still celebrating the 44% rise in "Graduates prepared for a 4-year college". WooHoo!

The stated rationale for using MAP is "it lets us evaluate the effectiveness of various programs and strategies." Well, where's the evaluation? Do we just throw everything at the wall and see what sticks? That doesn't cost a lot of money.
Charlie Mas said…
Look around and you will find very, very little attribution analysis anywhere in K-12 education.

So, no, they can't say what caused the graduation rate to increase.

Here's the silver lining: they didn't pretend to attribute it to any given cause. That would have been a much greater error. I'm glad that they didn't commit it.

In the absence of any real attribution analysis, "We don't know what caused the increased graduation rate" is the best and most honest answer. I'm delighted that it is the one they gave.
mirmac1 said…
But Charlie. They couldn't even say which HSs showed the greatest gain. What was different at those HSs? With the limited resources at our disposal, we need to be a little more circumspect.
Charlie Mas said…
There are only about fourteen high school programs to check. It would only take a couple minutes to discover which ones saw a rise in the rates and which did not.

I will say that it was really labor intensive to pull data out of the school plans. To get the staff survey data I had to click on each and every report. Exhausting.
mirmac1 said…
Well, Charlie, as to be expected, the board got a "we'll get back to you on that." Whadya think the chances are it'll all be forgotten until next November?
Anonymous said…
Despite a 3rd revamping of the measures used for the score card's "Graduates prepared for a 4-year college" (with rates adjusted from the original 17% in 2008, to 46% last fall, then to 63% last spring), the district is still watering down the measure.

First, the district says they are using the current HECB (WA Higher Ed. Coordinating Board) standards, but last spring they removed one measure which is definitely a requirement for any 4-year college in WA: they removed "Students taking a college entrance test (SAT/ACT)"!

Interestingly, exactly one year ago at the Nov. 2010 district Scorecard board workshop, a slide was presented which showed that, of the 2009-10 students who met 7-9 of the 10 criteria (i.e., "near misses"), "not taking the ACT or SAT has the lowest percentage of students meeting that criteria" (shown at 43%).

So, remove that requirement and suddenly the students "prepared for a 4-year college" surges to 63%?

Another measure which was changed last spring was removing the earlier 2.0 GPA for core classes (actually a Seattle high school graduation requirement) and replacing it with the lesser HECB requirement of a 2.0 cumulative GPA based on ALL classwork. So...let's get this straight: with the new measure a student might be counted as "prepared for a 4-year college" and yet, if they didn't have a 2.0 core GPA, they could not graduate from a Seattle HS? You've got to be joking!

Finally, look at the goofy results this has possibly created in the School Reports. Take Nathan Hale for example, for 2009-10 ('10-'11 not posted yet).
Students taking college admissions test (SAT or ACT): 68%
Graduates prepared for a 4-year college: 78%
Really? Explain this, please? How do the extra 10% qualify for college admission when they haven't taken a required SAT/ACT test?

(there are 6 high schools with similar ooops listed for the past 3 years)

This is not a new problem that the current administration "inherited", it is a mess that they created and now support.

---Score cards & School Reports need to be meaningful & not misleading!
anonymous said…
"How do the extra 10% qualify for college admission when they haven't taken a required SAT/ACT test?"

Huh? Students can take college prep courses (3 years of science, 4 years of LA, 3 years of math, 2 years of world language, etc) and be 100% prepared to attend college, regardless of whether they take the SAT/ACT or not. The SAT does not determine whether a child is prepared for college, it is just an entrance exam used by SOME colleges. Not all colleges require the SAT/ACT for admissions, and in fact, there is a strong movement to get away from requiring these tests.

my2cents
Anonymous said…
my 2 cents-
According to SPS, the measure that SPS is using in the School Reports for "prepared for 4-year college" is "per Washington HECB" requirements for college entrance into a 4-year WA state college. Taking an SAT/ACT test is required, not optional, at a WA state 4-year.

I am fully aware that some private colleges no longer require those tests, but this is the measure HECB uses for all WA state 4-year colleges, and the one that SPS has chosen to use. They could have made up their own measure (they tried two previous times, and that went bust with the 17% and 46%).

---Score cards/School Reports need to be meaningful & not misleading
Jan said…
watcher -- it might have been nice if mirmac1 had pattered his/her little hands together for a minute first, in honor of the increase, before asking her questions. But I think you are being unfair and mirmac is right. If we are going to go the "data" route, it is critical that we understand what the numbers mean. Maybe only 3 schools improved, and they improved solely by revamping the way they count kids who have dropped out or left (I am quite certain this is not the case -- but hypothetically). That would mean very little in terms of teaching or learning -- and if they reverted to the old method next year, there would be a sudden plunge -- and that would be equally meaningless. Or maybe, increased targeted remediation being done in 5 schools has caused the difference -- and wouldn't it be great to know that so we could expand that effort to the other schools?

This stuff didn't much matter when MGJ was here, because she had no real interest in improving schools, just in restructuring them, results be damned. But if we are going to spend all this time and money generating numbers, is not mirmac correct in pointing out that now they need to aggressively pursue the data to see what we can learn from it?

Maybe this isn't Mr. Teoh's job. Maybe he IS just the guy who grinds out the numbers and pushes the clicker during the power point presentation -- but SOMEBODY (I am thinking Dr. Enfield, or the Exec Directors of the high schools) needs to be the guys to follow up.

If the ultimate answer is -- yes, we improved, but we can't tell why for sure -- then, that is the answer. But that isn't the answer until somebody has really looked.
Anonymous said…
So what was the point of MAP again? To help guide instruction? To see if kids are learning or not? That is why we are testing 2 or 3 x a year, right? That is why we are spending millions and using up facilities and instructional time for it? Here I was assuming that, with all the time they spent on TFAs and revamping attendance and newspaper editorial policies, they had a strategy in place to figure out what works and doesn't work in the classrooms. Gee, maybe if admin can focus on coming up with a coherent science C & I guideline from ES to HS, that might help as an example. They certainly have well informed and experienced people, like the district's own science teachers, to ask. How many years has this been going on? Priority?

exasperated
Jan said…
exasperated -- in my heart of hearts, I don't think that was ever the main purpose of MAP. I think the main purpose was always meant to be teacher evaluation/retention. Of course, we would never have agreed to pay that amount, waste that much of our kids' time, and waste so much library/computer center time just for a teacher evaluation number (especially since many people don't believe in evaluating teachers in that manner -- or think MAP is valid for that use) -- so it became all about "informing instruction," blah blah, but it hasn't been used much for that, as far as I know -- and then it became about a barrier/hurdle for access to advanced/accelerated learning -- but I am not aware of any research that validates it for that use either. It is really a teacher evaluation aid.
