School Reports
The new school reports - for the school year 2010-2011 - have been released.
They can be found here.
This year's School Reports are different from last year's in only two ways. The changes are the two corrections that were promised in December of last year. They are long overdue, but better now than later still.
Last year, the measure of student academic growth was grossly mislabeled "Students making gains on the state reading test" but was actually a derived relative measure. This year the student academic growth gets a label like "4th and 5th graders who met or exceeded typical growth on the state reading test" and it is a new measure. From the glossary:
This measure shows how students are growing compared to other students in the same grade, with the same test score, throughout the state. It is the percentage of students whose growth was above average (50th to 99th percentile) on the state test from one year to the next. Students are compared to other students across the state that scored the same on the state test and were in the same grade level.

This is a more honest measure (and a more honest label) than the one used last year, but it is still a poor choice. It is still a relative measure when we should focus on absolute measures. It doesn't tell us if the school did well with students working below, at, or beyond grade level.
Also, they corrected the Student Demographic information by choosing to show the number of students enrolled in APP/Spectrum category instead of the number eligible for those programs. So we have no sense how many students at Bryant, for example, are Spectrum-eligible. This is not quite the fix that was promised last year - we were supposed to get a count of the advanced learners at each school.
I'll make reference to some funny statistical anomalies in the comments.
Comments
From NOVA High School:
First, the name of the school is "The NOVA Project", not NOVA High School.
Second, check this out:
Graduates prepared for a 4-year college (09-10): 25%
Graduates enrolling in higher education within 1 year (10-11): 74%
I guess about 50% of the graduates are going to college unprepared.
First‐time 9th graders earning sufficient credit (08-09): ###
What is that? 100%?
And a few unavailable statistics:
Students with fewer than 10 absences per year (NOVA doesn't take attendance)
Seattle Public Schools Segmentation Level: Not available. I don't understand that one. They have achievement and growth numbers.
Percent of test-takers passing a college-level test during HS (AP or IB): Data not available. NOVA doesn't offer AP or IB classes.
Average class size: "-". Kind of hard to say, I guess. A lot of the students are doing independent study. Is that a class size of 1?
The student and staff survey numbers at NOVA are much better than the district average. The family survey numbers are right on the district average.
92% of NOVA students with IEPs passed the state reading test.
Pass rates on the MSP are trending down or flat. Only science is clearly up. Math could be trending up, but it's hard to regard a 38% pass rate as a bright spot.
Here's the weird thing: the year-over-year change in the pass rate for the school is up 7 points, from 31% to 38%. But look at the demographic breakdown. Every demographic category shows an increase of no less than 9 percentage points on the math test, and some of them increase as much as 19 percentage points.
How is that possible?
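One way this is arithmetically possible (the numbers below are invented for illustration, not taken from the actual report): if the enrollment mix shifts toward a lower-scoring subgroup between years, every subgroup's pass rate can rise 9 to 19 points while the school-wide rate rises only 7.

```python
# Hypothetical illustration (all numbers invented): each subgroup's pass
# rate rises, but the mix of students shifts toward the lower-scoring
# subgroup, so the school-wide gain is smaller than any subgroup's gain.

def pass_rate(passed, enrolled):
    """Pass rate as a percentage."""
    return 100.0 * passed / enrolled

# Year 1: two subgroups, 100 students each -> (passed, enrolled)
y1 = {"A": (55, 100), "B": (7, 100)}
# Year 2: same 200 students total, but the mix shifts toward group B
y2 = {"A": (39, 60), "B": (37, 140)}

for g in ("A", "B"):
    gain = pass_rate(*y2[g]) - pass_rate(*y1[g])
    print(f"group {g}: +{gain:.1f} points")   # A: +10.0, B: +19.4

def overall(yr):
    """School-wide pass rate across all subgroups."""
    return pass_rate(sum(p for p, _ in yr.values()),
                     sum(n for _, n in yr.values()))

print(f"school-wide: {overall(y1):.0f}% -> {overall(y2):.0f}%")  # 31% -> 38%
```

So a 7-point school-wide gain alongside 9-to-19-point subgroup gains doesn't require an error; a change in who is being tested can produce exactly that pattern.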
Growth compared to test score peers is trending down. Other schools are doing a better job.
Attendance is clearly a problem. Fewer than half of the kids have fewer than 10 absences per year.
The demographic data suggests that there are 12 Spectrum students at Aki Kurose. What is their program like? Why did they have enough math MSP scores to report but not enough reading MSP scores to report?
The school is in step 5 for NCLB, which means it's supposed to be transformed. Right. There's nothing transformational happening there. It's just more of the same.
Wow. Kind of a disaster.
Test scores trending down.
Growth trending down and low.
Staff Climate Survey, "School Leadership" category: 12%. Ouch.
Pass rate on reading test for ELL students down 47 percentage points, down 19 percentage points for FRE students, down 19 percentage points for Asian/Pacific Islander students, down 7 percentage points. It's a slaughter.
Attendance is very poor.
35 teachers for 425 students. You do the math.
Here's something really freaky. Check out the section titled: School Description and Plan to Achieve Goals. It says what K-5 teachers will do. Are there a lot of K-5 teachers at Rainier Beach High School?
Of course, this year we used the EOC as the measurement tool and last year we used the HSPE. Not sure if that has any significance.
I'm not quite sure how to interpret the progress. Is Discovering working? Is standardization working? Are the new state math standards better? Or does it have something to do with using different tests from 2009/10 to 2010/11?
Whatever it's due to, the progress is great news!
Murky Water
But here is another oddity: for most of the math goals, their statements correlate to the charts and data. But they state: "For 9th-10th grade students we will increase the % of students meeting or exceeding typical growth on the math MAP assessment from 59% to 65%."
But 59% is not the school's current pass rate; it is the District average. The current 9th-10th grade rate at RBHS is 23%. And getting from 23% to 65% in math -- in one year -- well, that one seems out of reach.
1. Who "produces" the reports?
2. Is the authority and/or responsibility for the reports site based or central admin based?
3. Who is responsible for approving the reports prior to publication?
Yikes.
Oompah
Apparently there is no single source of responsibility for the RBHS report, no one authority.
If I could take a stab at addressing "what's going on at Rainier Beach"...
From my perspective as an informed observer, I would first say that by many reports there is a reinvigorated student body and staff. I was in the building myself not a month ago, and saw students engaged, polite, eager... I also saw Director Patu there, and I know she visits often because that is what she does: be the face of the community in a community school, as Beach, properly, really is.
From there, we might ask, reinvigorated how?
A) lots of generational poverty in that demographic: 74% Free/reduced lunch, compared to, say, 19% at Roosevelt.
B) a recent history of turmoil in staffing - perhaps three years of it.
From these two points, we might ask: "How are they reinvigorated? Look at some of the test scores!"
Well, yes, there are some drops. But look at some of the other scores, and look at the battles they face. I would almost assume a fluctuating slate of test scores at Beach because they have a fluctuating, unstable population, student-wise.
Yes, there are still struggles there. There will be until poverty is eliminated. It's obviously not a staff issue, at a general level (of course Beach, like any school, has a variety of staff and a variety of skills to deploy); look at the relatively small class sizes. It's not funding (although we can all argue about the way funds are spent to support a 75% poverty student body...). It's not, I don't think, institutional racism at the school or district level - I know many of those teachers and many district staff, and they DO have high expectations and are not, as a rule, racist (tho' everyone is racist sometimes, of course).
So "what's going on at Rainier Beach?"
Poverty, and also perhaps some language/culture barriers (Beach: 24% ELL; Roosevelt: 8%)
Given that the "data" is so hard to parse (test scores fluctuate, and we aren't able to line them up to individual students and add in all the individual factors that pertain), all we can go on is observation, sometimes. My observation is that Beach has been kicked around, wrestles with poverty and language issues, but has recently shown great fortitude and resilience, and, yes, reinvigoration.
Go, RB!
Staff Climate Survey: "School Leadership" Category
District Average: 66%
Bryant: 34%
Ingraham: 79%
Lafayette: 67%
Lawton: 70%
Lowell: 36%
North Beach: 76%
Wedgwood: 48%
Some other schools got shockingly low leadership survey results from the staff.
Arbor Heights: 31%
Dearborn Park: 0 staff returned the survey.
Coe: 23%
Garfield: 0 staff returned the survey.
Hamilton: 30%
Montlake: 0 staff returned the survey.
TOPS: 23%
TOPSIE
Topsie, Phil Brockman emphatically did not solve this serious and ongoing problem. Your 23% Leader is now busy 23-percenting our school, Lafayette Elementary!
I'm a Brockman fan but ten thumbs down to Phil Brockman for accomplishing this cowardly, crap-as-usual JSCEE-signature-dish move.
That is great news, especially since the math pass rate had decreased in the majority of schools the year before.
NO it is not great news. The HSPE math was a one year deal just marking time until End of Course assessments showed up in 2011.
In 2010 the District HSPE Math scores were:
3.6% higher than the State average for all students
1.7% lower than the state average for Low Income
6.5% lower than the State average for Black students
By comparison, the EoC algebra scores are much worse than the state average for all students.
For openers, for all students in grade 9 that took algebra:
Seattle Pass rate for all = 48.8% on EoC
State Pass rate for all = 53.7%
District all-student pass rate went from 3.6% above the state on the HSPE in 2010 to 4.9% below the state on the EoC in 2011.
That is a swing of 8.5 percentage points.
Check this .pdf sheet for individual schools ... this is not a pretty picture. I suggest you use the low-income numbers for each school to do a fair comparison.
Remember, most high schools taught algebra as a one-credit, 150-hour class.
Cleveland gave algebra 2 credits and taught it for 255 hours.
-------------------
Algebra 1 EoC pass rates for low-income students in 4 school districts:
District ......: Low Income : All
Seattle .......: 37.83% : 45.90%
Bethel ........: 35.28% : 39.72%
Clover Park ...: 56.35% : 59.95%
Spokane .......: 53.00% : 59.80%
Seattle and Bethel use Discovering
Clover Park and Spokane use HOLT
"We will recruit a new principal shortly and we, the school staff and parents, are taking the lead on shaping the posting and on the interviews. Huzzah."
All of these seemingly small changes add up. Another kudos to Enfield. I sure hope she becomes our new super.
signed,
re-do
I simply looked at each school's 2010/11 report individually and compared the math **pass** rate in 2010/11 to the math **pass** rate in 2009/10, and found that every school across the board had raised its math pass rate (overall, alg and geo combined). And I saw that as an improvement.
It made me wonder if Discovering is working - I remember staff saying it took three years to get up and running and then we'd see scores improve? I wonder if standardization is working, if the new state math standards had something to do with it, or if changing from the HSPE to the EOC skewed the numbers.
Q. Does the entire state use the same math EOC exams? Or does each district create their own?
Murky Water
http://www.collegetracking.com/reporting/Reports.aspx
I posted some highlights on the Tuesday open thread.
In addition to the college enrollment data point that's available on the school reports, these reports break down college attendance for each high school by gender, ethnicity, family income (FRL... but only available for 2009), and type of school (2- or 4-year).
They also provide information on college persistence (but see the cautionary note for the data for 2010 grads...it's best just to use 2009--which was recently updated-- and prior data for now).
-college tracker
CollegeData
-college tracker
Clarification: Brockman didn't send the TOPS principal to Lafayette. He does not have that power. Only Enfield does.
Enfield has also shown a dismaying lack of follow up on the curriculum autonomy alternative schools were supposed to have.
I agree that Ed Directors are going to be make or break for many schools. And I double agree that many of them currently in place should not be. Enfield seems to have a blind spot about her own teaching and learning staff. Many need to go.
TOPSIE
That she is allowing community input on the search for the new TOPS principal is a HUGE step in the right direction, and I'd think we could all agree on that.
Restoring autonomy for alt school curriculum is important too. Very important. My guess though is that she has much larger and more emergent issues to deal with first, and that she is prioritizing. I hope she gets to curriculum autonomy soon and finds an agreeable solution - I look forward to following this.
re-do
I am betting establishment.
-skeptical-
We know what we got the last time we had a national search. MGJ. I just don't think we should roll the dice again.
watcher
-willing to roll the dice again
" I don't expect that she or anybody else could fix everything in the 7 months that she has been at the helm."
These are school report cards. It is about academics. Enfield was the CAO before her 7-month interim-Superintendent gig. She has been at the academic helm a lot more than 7 months. She has shown no interest in fixing the sad math mess. Would not grant a Singapore waiver to Loyal Heights.
See my year-to-year SPS data analysis on the achievement gaps at grades 4 and 7 for reading, writing, and math; grades 5 and 8 for science; and grade 10 for the HSPE reading, writing, and science.
The Black / White GAP from 2010 OSPI testing to 2011 OSPI testing is greater in 9 of 11 categories.
The Non Low Income / Low Income Gap is greater in 7 of 11 categories.
The Incumbents that recently ran each mentioned that Achievement Gaps are a TOP priority.
Enfield should be OUT.
Your previous comment reads like either:
A) an SPS press release, or
B) a Seattle Times editorial.
Put some facts on the line to back up your "HOPES".
Enfield has repeatedly shown herself, as both CAO and Interim-Supt, unable to write Action Reports that (1) are accurate and (2) address legal requirements.
I can cite many instances of her action report meltdowns (here are a few):
(1)
$800,000 NTN Action Reports ... two of them; each was inaccurate. She took M-M and Sundquist with her to look at NTN Sacramento because it was a STEM school. {{Small glitch .. no research = NTN had 7 schools in CA and none were STEM}} (There is a lot more to this fiasco.)
(2)
TFA action reports ... every single one ignored the requirement for a careful review of achievement gaps, which was needed to make the claim that "conditions warrant" the use of conditionally certified TFA corps members to close achievement gaps. Research has shown TFA use in areas with no shortage of fully qualified teachers to be counterproductive in almost every instance. TFA was a fiasco ... central admin was spouting complete BS.
(3)
Cleveland Waiver .... ignored the WAC requirements needed to get a waiver from the 150 hour / credit rule.
There are a lot more examples.
=========
Next item will be on how misleading the school report cards are in regard to OSPI math results on tests for High Schools.
Yes of course she should be out Dan. As every super should be after about 6 months on the job. Right.
watcher
You noted that there was no community input when assigning Ms. 23% to TOPS. There was no community input when assigning this same principal to Lafayette. There will be no community input when assigning her to the inevitable next victim school.
The other tired old option is to create an unnecessary job at the JSCEE so that she can be kicked upstairs into it. I vote for creating a third Executive Director of Special Ed position for this purpose.
Using Chief Sealth as an example.
EoC #1 Algebra was given to all students that took an algebra class -- 208 students
... and to 198 students that were beyond Geometry.
The only reason that the District can report that the pass rate at Chief Sealth is 52.3% is the blending of upper-division math students' results (on a first-year Algebra test) into the first-year EoC Algebra results.
The real number that should be used is a pass rate of 30.4%, because that was for the Chief Sealth students that took the Algebra I math class in 2010-2011.
((The only reason that students beyond Geometry are taking the Algebra EoC is because it is supposedly going to be a graduation requirement... thus these scores are listed as make-up.))
Because of the botch job of trying to use one test for both traditional Algebra / Geometry students and for Integrated I / integrated II students ... the tests were modified and did not test all the standards for Algebra or Geometry.
The EoC algebra tested almost all the standards but left out quadratics (three standards) ... the rationale being that Integrated I students will not have had quadratics. ... The EoC #1 test is likely the BEST math test ever to come out of OSPI. ((Not only that, it is pretty good.))
This is similar to a test of first semester geometry. It becomes a lot easier to teach to the test if that is the goal.
The only takers of EoC #2 were students actually in a Geometry class in 2010-2011 ... There were no geometry make-ups.
Note if we only look at Sealth "9th" graders that took Algebra the pass rate rises to 33.5% (slightly higher than for "all" takers of Algebra at 30.4%).
I firmly believe that the EoC #1 Algebra test results should be for students in an Algebra class in 2010-2011 ... to alter this result with advanced algebra and pre-calc students' results (on a first-year algebra test) is very misleading. (Note at Sealth the make-ups, 198 students at 75.8%, were almost as many as the Algebra I test takers, 208 students at 30.4%.)
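As a rough arithmetic check, using only the counts and rates quoted in this comment, the weighted blend of the two groups roughly reproduces the reported school-wide figure; the small difference is presumably rounding in the published per-group rates.

```python
# Rough reconstruction of the blended Chief Sealth pass rate from the
# counts and rates quoted above (not official OSPI arithmetic).

alg1_n, alg1_rate = 208, 0.304      # students in Algebra I in 2010-11
makeup_n, makeup_rate = 198, 0.758  # make-up takers beyond Geometry

# Enrollment-weighted average of the two groups' pass rates
blended = (alg1_n * alg1_rate + makeup_n * makeup_rate) / (alg1_n + makeup_n)
print(f"blended pass rate: {blended:.1%}")  # ~52.5%, near the reported 52.3%
```

The point stands either way: the 52.3% headline figure is dominated by students retaking a first-year test after completing much more advanced coursework.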
I think the fairest number to use to evaluate what is going on at individual high schools is the EoC #1 Algebra pass rate for Low Income students ... as then the schools are closer to a level playing field for comparison.
The attached sheet provides both the 9th grade "all students" pass rate as well as the 9th grade "Low-Income students" pass rate.
The first column of numbers is for the 9th grade "all students" pass rate
The second column of numbers is for the 9th grade "Low-Income students" pass rate.
Note the averages given for 10 Seattle high schools are from averaging the results for each of the 10 schools together. This would give a small school like RBHS the same weight as a big school like Garfield in that average.
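That weighting caveat can be illustrated with a hypothetical two-school example (enrollments and rates invented): the unweighted mean of school pass rates can sit far from the enrollment-weighted, district-wide rate.

```python
# Hypothetical sketch (numbers invented): an unweighted average of school
# pass rates gives a small school the same pull as a large one, so it can
# differ noticeably from the true district-wide rate.

schools = {                  # (pass rate, number of test takers)
    "Small HS": (0.20, 50),
    "Large HS": (0.55, 450),
}

unweighted = sum(r for r, _ in schools.values()) / len(schools)
weighted = (sum(r * n for r, n in schools.values())
            / sum(n for _, n in schools.values()))

print(f"unweighted mean: {unweighted:.1%}")  # 37.5%
print(f"district-wide:   {weighted:.1%}")    # 51.5%
```

So an average computed over 10 school-level rates is a different statistic from the OSPI district figures that follow, which are computed over students.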
The 9th grade average for Seattle done by OSPI for "All" = 48.8% .. {{{53.7%}}}
for 9th grade Low-Income = 38.5% ... {{43.8%}}
for 9th grade Black students = 26.5% ..{{35.9%}}
{{{ _ _ }}} are the statewide averages.
Seattle "all" below by 4.9%
Seattle "L-I" below by 5.3%
Seattle "Black" below by 9.4%
and the American Indian students
Seattle = 25.0%
State = 38.1%
Gap = 13.1 percentage points .... shows what continuing to screw up funding and programs for native youth can do when coupled with Seattle's completely inadequate K-8 math program and "Discovering for High School".
I believe that Enfield once said something about staking her professional reputation on this Seattle math program.... THE MATH PROGRAM NEEDS CHANGING .... mathematically unsound just is not good enough.
re-do
Anyone concerned about 37 students in a class at McClure?
Queen Anne News article about school over-crowding:
http://www.queenannenews.com/
Comments to:
Jeff Bond, editor
qamagnews@nwlink.com
-JC.
The Board directors hear this, look around like, yeah okay so...? DeBelle ventures to ask: do we have any idea why this went up? What's working? Then Enfield says, "Well, that's the perennial question. We don't know for sure but it could be..." and then rattles off some generic "strategies" like "quality of instruction" or the contributions of "community-school based partnerships" or "even though not all schools have counselors, maybe it's counseling."
Are we still flying blind here? WTH is the point of all these numbers if we have no clue why they are what they are? Very aggravating. I would say the district cannot crow about gains and achievements if it has NO CLUE what brought them about!
Look at the long string of lawsuits this guy has because they don't do things his way. I know he's gone all the way up to Randy Dorn with his suits.
The lawsuits are because of violations of RCWs and WACs .... Do you believe that some law breakers should be exempt?
The recalls are for exactly what the WA Constitution specifies as grounds for recall.
See WA Constitution Article I sections 33 & 34.
Dorn recall hearing is on Monday at 9:30 in Pierce County superior court.
RECALL Filing HERE.
Are you against my constitutional right to recall? Are you also against freedom of speech and freedom of assembly?
What I am looking for is...... the use of practices known to work .... The SPS gets exactly the results that would be expected from the poor instructional materials and practices it selects and pushes. Most teachers are doing likely the best job possible under very sad leadership. --- Did you ever look at results from the UW's Math Education Project?
New Tech Network is an excellent example of past poor results being ignored at the time of selection.
(Actually, the results were inaccurate: most of the data in NTN Action Report #1 was wrong ... NTN #2 was a bit better but still had a lot of significant errors. The result was that we got the 4-3 decision, courtesy of Carr, Maier, Martin-Morris, and Sundquist, to waste $800,000.)
Everyday Math and Discovering are another two fine examples of pathetic decision-making to produce pathetic results.
"To Improve a System requires the intelligent application of relevant data" -- Many of the SPS current results suck .... why is that?
I refer you to the growth in the Black/White achievement gaps in 9 of 11 categories from 2010 to 2011.
NO I decided not to just sit back and watch the carnage ... I decided to do something during the last 5 years... granted the Board did not care.... but at least I did make an attempt.
Sorry over 10 years of meaningless blather from the Board and Central Admin about achievement gaps while doing nothing about it ... pains me. HOW ABOUT YOU?
The practices in use are often very poor for nearly every student. I chose Achievement Gaps as the District has stated they are a TOP priority .... I wonder how things would have been different if the GAPS were not TOP?
Check my previously posted OSPI score changes for Auburn at 52% Low Income vs. Seattle 43% low income for grades 3,4,5 from 2007 to 2011.
Seattle in 2009-2010 spent $11,848 per child.
Auburn in 2009-2010 spent $9,250 per child.
That is $2,598 more spent by Seattle for what?
CHECK THE RESULTS ....
Auburn 24 .... Seattle 0.
Sorry that is just not good enough performance for spending an average of $11,848 per child.
Based on what I know of the new board members, they will not simply swallow whole certain staff's song and dance, will apply logic and research to decision-making, will follow the law and policy, and will support teachers and students first.
In light of this, the lawsuits won't be happening. Nice try.
It's actually hilarious that you are trashing A 6% CLIMB IN GRADUATION RATES because the district can't pinpoint the exact reason for the climb.
watcher
Steveroo, Bite your tongue!
Plus, there are some figures on the school reports that are misleading. Even the most stalwart supporter of AP classes, Jay Matthews of WaPo, states that research shows that AP-for-all classes still have quality, PROVIDED THAT students are required to take the AP exam. If students are not required to take the exam, all bets are off on the quality of the course. Therefore things like AP LA for all and AP HG for all without kids taking the exam are meaningless. The school reports tout the percent of students taking an AP class, but then hurt their credibility by stating the percent of kids WHO TAKE AP EXAMS who score a passing grade. The statistic they should have is the percent of kids who take an "AP" class AND sit for the exam. That missing information is very important. Without it, the information given is misleading.
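A small sketch with invented counts shows why reporting only the pass rate among exam takers can flatter a program when few of the AP students actually sit the exam.

```python
# Hypothetical counts (invented, not from any school report): the pass
# rate among exam takers looks strong, but it describes only a small,
# self-selected slice of the students enrolled in AP classes.

ap_students = 200   # students enrolled in an AP class
exam_takers = 40    # of those, students who sat the AP exam
exam_passers = 30   # of those, students scoring a passing grade

print(f"pass rate among exam takers:    {exam_passers / exam_takers:.0%}")  # 75%
print(f"AP students who sat the exam:   {exam_takers / ap_students:.0%}")   # 20%
print(f"AP students who passed an exam: {exam_passers / ap_students:.0%}")  # 15%
```

The headline "75% pass" figure and the "15% of AP students passed an exam" figure describe the same school; without the middle number, the first one is easy to misread.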
What we measure:
5 year retention rate of teachers
How we measure it:
The number of teachers at the end of a school year who were employed at the District 5 years or more, divided by the total number of teachers at the end of a school year, based upon employee records maintained by the Human Resources department.
Why it is important:
"Studies show that one of the most important factor(s) in determining student performance is the quality of his or her teachers and that the number of years a teacher has taught contribute directly to student achievement"
How the heck does this jibe with hiring Teach for America short-timers? Not only does this muck up your results, but it APPARENTLY does not contribute directly to student achievement.
It is good news that SPS grad rates are UP.
=====
From OSPI Auburn SD
Other Information (more info)
Unexcused Absence Rate (2009-10) 5,995 0.5%
Annual Dropout Rate (2008-09) 226 4.4%
Estimated Annual On-Time Graduation Rate (2008-09) 934 81.0%
Estimated Annual Extended Graduation Rate (2008-09) 993 86.2%
----
Other Information (more info)
Unexcused Absence Rate (2010-11) 5,667 0.4%
Annual Dropout Rate (2009-10) 166 3.3%
Estimated Annual On-Time Graduation Rate (2009-10) 984 85.7%
Estimated Annual Extended Graduation Rate (2009-10) 1,054 91.8%
Actual Adjusted On-Time Cohort Graduation Rate (Class of 2010) 74.1%
Actual Adjusted 5-year Cohort Extended Graduation Rate (Class of 2010) 85.7%
=======
Auburn went from an estimated on-time rate of 81% to 85.7%,
but the actual cohort rate = 74.1% (11.6 points lower than the 85.7%).
=======
Last two OSPI results for Seattle
Other Information (more info)
Unexcused Absence Rate (2009-10) 27,315 0.6%
Annual Dropout Rate (2008-09) 909 7.1%
Estimated Annual On-Time Graduation Rate (2008-09) 2,409 70.1%
Estimated Annual Extended Graduation Rate (2008-09) 2,676 77.9%
Other Information (more info)
Unexcused Absence Rate (2010-11) 11,659 0.8%
Annual Dropout Rate (2009-10) 563 4.5%
Estimated Annual On-Time Graduation Rate (2009-10) 2,334 78.1%
Estimated Annual Extended Graduation Rate (2009-10) 2,620 87.7%
Actual Adjusted On-Time Cohort Graduation Rate (Class of 2010) 72.2%
NOTE the rise from 70.1 to 78.1 for the on-time graduation rate,
but the reported cohort graduation rate was 72.2% --
that is only 5.9 points lower than the 78.1%.
========
Every time I see graduation rate numbers in any report, I wonder how they were calculated.
========
NTN Sacramento used a bizarre formula to generate graduation rates above 90%.... while cohort rates were well below 50%.
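For what it's worth, the gaps between the estimated and actual cohort figures quoted above work out as follows (simple subtraction of the OSPI numbers; the labels are taken from the data blocks earlier in this comment).

```python
# Gaps between OSPI's estimated annual on-time graduation rates (2009-10)
# and the actual adjusted on-time cohort rates (Class of 2010), using the
# figures quoted above.

rates = {                 # (estimated on-time %, actual cohort %)
    "Auburn":  (85.7, 74.1),
    "Seattle": (78.1, 72.2),
}

for district, (estimated, cohort) in rates.items():
    print(f"{district}: estimated {estimated}% vs cohort {cohort}% "
          f"(gap {estimated - cohort:.1f} points)")
# Auburn gap: 11.6 points; Seattle gap: 5.9 points
```

Whatever the estimation formula is, it clearly runs hot relative to tracking an actual cohort of students, and by a different amount in each district.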
The stated rationale for using MAP is "it lets us evaluate the effectiveness of various programs and strategies." Well, where's the evaluation? Do we just throw everything at the wall and see what sticks? That doesn't cost a lot of money.
So, no, they can't say what caused the graduation rate to increase.
Here's the silver lining: they didn't pretend to attribute it to any given cause. That would have been a much greater error. I'm glad that they didn't commit it.
In the absence of any real attribution analysis, "We don't know what caused the increased graduation rate" is the best and most honest answer. I'm delighted that it is the one they gave.
I will say that it was really labor intensive to pull data out of the school plans. To get the staff survey data I had to click on each and every report. Exhausting.
First, the district says they are using the current HECB (WA Higher Ed. Coordinating Board) standards, but last spring they removed one measure which is definitely a requirement for any 4-year college in WA: they removed "Students taking a college entrance test (SAT/ACT)"!
Interestingly, exactly one year ago at the Nov. 2010 district Scorecard board workshop, a slide was presented which showed that, of the 2009-10 students who met 7-9 of the 10 criteria (i.e., "near misses"), "not taking the ACT or SAT has the lowest percentage of students meeting that criteria" (shown at 43%).
So, remove that requirement and suddenly the students "prepared for a 4-year college" surges to 63%?
Another measure which was changed last spring was removing the earlier 2.0 GPA for core classes (actually a Seattle high school graduation requirement) and replacing it with the lesser HECB requirement of a 2.0 cumulative GPA based on ALL classwork. So... let's get this straight: with the new measure a student might be counted as "prepared for a 4-year college" and yet, if they didn't have a 2.0 core GPA, they could not graduate from a Seattle HS? You've got to be joking!
Finally, look at the goofy results this has possibly created in the School Reports. Take Nathan Hale, for example, for 2009-10 ('10-'11 not posted yet).
Students taking college admissions test (SAT or ACT): 68%
Graduates prepared for a 4-year college: 78%
Really? Explain this, please. How do the extra 10% qualify for college admission when they haven't taken a required SAT/ACT test?
(There are 6 high schools with similar oopses listed for the past 3 years.)
This is not a new problem that the current administration "inherited", it is a mess that they created and now support.
---Score cards & School Reports need to be meaningful & not misleading!
Huh? Students can take college prep courses (3 years of science, 4 years of LA, 3 years of math, 2 years of world language, etc) and be 100% prepared to attend college, regardless of whether they take the SAT/ACT or not. The SAT does not determine whether a child is prepared for college, it is just an entrance exam used by SOME colleges. Not all colleges require the SAT/ACT for admissions, and in fact, there is a strong movement to get away from requiring these tests.
my2cents
According to SPS, the measure that SPS is using in the School Reports for "prepared for 4-year college" are "per Washington HECB" requirements for college entrance into a 4-year WA state college. Taking a SAT/ACT test is required, not an option in a WA state 4-year.
I am fully aware that some private colleges no longer require those tests, but this is the measure HECB uses for all WA state 4-year colleges, and the one that SPS has chosen to use. They could have made up their own measure (they tried two previous times, and that went bust with the 17% and 46%).
---Score cards/School Reports need to be meaningful & not misleading
This stuff didn't much matter when MGJ was here, because she had no real interest in improving schools, just in restructuring them, results be damned. But if we are going to spend all this time and money generating numbers, is not mirmac correct in pointing out that now they need to aggressively pursue the data to see what we can learn from it?
Maybe this isn't Mr. Teoh's job. Maybe he IS just the guy who grinds out the numbers and pushes the clicker during the PowerPoint presentation -- but SOMEBODY (I am thinking Dr. Enfield, or the Exec Directors of the high schools) needs to be the one to follow up.
If the ultimate answer is -- yes, we improved, but we can't tell why for sure -- then, that is the answer. But that isn't the answer until somebody has really looked.
exasperated