What's happening with program evaluation?

Seattle Public Schools has a longstanding policy that requires annual evaluations of all academic programs. It's Policy 2090, and the need for these evaluations is beyond question.

Not only do annual program evaluations provide a critical check on the quality and efficacy of our academic programs, but they are also an integral element of the performance evaluations of program managers, a necessary element of the District's budgeting process, and the answer to the most fundamental question for district managers: are the various programs getting the job done?

There are really only two things wrong with the policy:
1. Despite the policy, the District does not conduct any program evaluations and never has.
2. Do I really have to list a second thing? Let's just list that first thing again but with greater outrage and disbelief: Despite a policy that clearly requires annual evaluations of the efficacy of every academic program, and despite the critical and fundamental role that such evaluations should play in every aspect of managing the district, Seattle Public Schools has never conducted any program evaluations ever.

Take a moment with this.

The District is spending enormous amounts of money expanding language immersion programs on the blind and unsubstantiated belief that they will... what? Improve academic outcomes for students, I presume. Were the goals of language immersion ever stated? Have the impacts of language immersion ever been measured?

A dozen schools dismantled their Spectrum programs with promises that they would be just as effective, without any data on how effective the programs were before the change and without any data on how effective they became after it. A Montessori program was blended with non-Montessori classrooms throughout a school because non-White families didn't choose Montessori in sufficient numbers. The District refuses to support IB programs at high schools without any data on the cost-effectiveness of those programs.

We have no measure of the effectiveness of our English Language Learner programs or our programs for students with disabilities (yes, they call Special Education a "service" in some places, but there are still lots of places where they call it a program). The District completely re-designed the way they talk about Special Education in the JSCEE; how has that worked for students in schools? We don't know, because there's no data.

I have to wonder how they conduct the program managers' performance evaluations without any data on the quality or efficacy of the programs they manage. Surely that would be the best measure of a manager's performance, wouldn't it?

Thankfully we now have a Board that is asking the superintendent to meet the requirements of the policy. Weird, right? So what is the superintendent's response? He suggests that we change the policy so it doesn't require the reports.

If you scroll nearly to the bottom of the agenda for the April 4 Curriculum and Instruction Policy Committee meeting, you will see the proposed amendments to the policy.

Here's the proposed revision:
"Seattle Public Schools requires efficiency and effectiveness in all facets of district operations, including its instructional programs. In order to achieve this goal, each year the Superintendent shall provide the Board the following:
A. Clear statement of goals and objectives for improving district instructional programs;
B. Summary of new and continuing investments in staffing and other resources to achieve the stated goals and objectives; and
C. Plan for evaluating the extent to which district goals and expectations are being met."
There are a lot of inaction verbs in there. Also, I notice it says "programs" - leaving out, of course, services, sites, classrooms, curricular foci, and any other nomenclature yet to be invented to evade regulation.

We need action steps for improving instructional programs, but instead we're going to get goals and objectives for improving them. By the way, what is the difference between a goal and an objective in this case?
We need an evaluation of the extent to which district goals are being met, not a plan for an evaluation.
As for the staffing and investments, I don't think the Board needs to see that. You can report that in the budget.

The proposed revision goes on to say:
"The Superintendent shall prepare an annual report which reflects the degree to which district goals and objectives related to the instructional program have been accomplished."
But I notice that the instructional programs that were plural in the first paragraph, where they get only lip service, become a single instructional program in the second paragraph, where an actual meaningful measure is required. They seem to be implying that the district has just one big instructional program, which might suggest that the District Dashboard answers the question. That's not what we're looking for here.

While the policy does say "Student data will be disaggregated by relevant and applicable programs or service areas for which data is available," that still doesn't get the job done. It means they will describe individual programs only if the data falls into their laps; they won't seek it out. So, for example, they won't measure the efficacy of IB, Spectrum, or ALOs, since student data isn't already reported separately for those programs. They won't be able to measure the efficacy of Highly Capable Services provided outside of HCC because the student counts will be too low to report.

Just to be clear, what we're looking for is a written report filled with data that describes the quality and efficacy of the District's many instructional programs, including:
- ALO and Spectrum (they're the same thing now, right?) - at the school level and the district level
- HCC - at the school level and the district level
- each and every special education program, including Access and Continuum - at the school level and the district level
- the general education program (separate from any other programs at the school) - at the school level and the district level
- language immersion
- IB
- the Bilingual Orientation Center
- International programs (whatever those are)
- plus HC services, ELL services, and SpEd services provided outside of programs

We want to know what works and what doesn't, and what is worth the investment and what isn't. We want to know which schools are actually providing services and which schools only claim to provide them. We want to know this, and it drives me crazy that the District has gone all of this time without bothering to collect that data because they don't want to know these things. It proves that the District has priorities other than what works for students, such as what satisfies the internal politics of the JSCEE.

Our smart and capable Board Directors, the ones who have pushed this issue this far, need to push it just a bit further. They have already seen how the Superintendent and his staff play games with the word "program". Don't give them another playground in this policy.

Comments

Anonymous said…
You could probably still get some data from before the Spectrum programs were dismantled and after, simply by using test data, such as End of Course Exam results and standardized test scores, that is already in the records for all the students in those programs. I think those records could be searched out and the data gathered if funding for this were provided. It would be interesting to see that data.
NW Mom
Anonymous said…
Probably, when it comes time to produce the report - Programs are quickly reclassified as Services or Schools. Voila, no report necessary! ;-)

-StepJ
I have to wonder what is going to get said at this week's Work Session where Highly Capable will be discussed.
3inSPS said…
NW Mom, I wonder if AP pass rates would be a good way to evaluate before and after for Spectrum and HCC or even APP. That would really show real numbers: % of participants and average scores for each group.

Charlie Mas said…
Judging from CSIPs, the measure of ALO Spectrum programs will be the percentage of participating students who score "exceeds" on the state tests.
Anonymous said…
The district selects students who score well on tests to participate in Advanced Learning. If those students then later score well on tests, that probably doesn't show whether the Advanced Learning programs have helped the students. Maybe the programs have helped them, but the test scores probably don't contain that information.

Irene
Charlie Mas said…
@Irene, I'm not going to dispute your logic. Believe it or not, however, there was a time when about 20% of the students in Leschi's Spectrum program didn't even pass the WASL. Continuing to earn a Level 4 score on the tests is pretty good evidence that the school is supporting learning beyond the grade-level Standard. I'm not sure what other evidence they could provide.

Of course, maybe they could offer something if there were academic Standards for Spectrum students, but those don't exist.
