Wednesday, May 11, 2016

Follow up Q & A from Highly Capable Oversight meeting

The Friday memo to the Board for May 6 includes answers to questions from the Board that could not be answered at the Curriculum and Instruction Management Oversight meeting. It makes for interesting (and odd) reading.

1. The first question concerns student performance and growth and how it is measured. The answer is the familiar Denver method, in which each student's score for this year is compared with this year's scores of other students who had the same score last year, and the results are ranked on a percentile basis. This is an entirely relative measure with no objective anchor. By this method, one-third of all students will be labeled slow growth, one-third average growth, and one-third high growth, regardless of whether the cohort as a whole experienced low, average, or high growth. In other words, if the median growth for the cohort was three grade levels in a single year, a student who advanced only two grade levels in that year would be labeled as having slow growth. Conversely, if the average growth for the cohort was negative, a student who merely maintained their level would be counted as high growth.
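A minimal sketch of that relative-growth logic, using made-up scores (the district's actual computation is more elaborate, but the point about thirds holds either way): every student below gained substantially, yet the lowest gainer still lands in the bottom third.

```python
def growth_percentiles(students):
    """students: list of (name, last_year_score, this_year_score).
    Rank each student's current score only against peers who had the
    same score last year, then convert that rank to a percentile."""
    by_prior = {}
    for name, prior, current in students:
        by_prior.setdefault(prior, []).append((name, current))
    result = {}
    for prior, group in by_prior.items():
        scores = sorted(c for _, c in group)
        n = len(scores)
        for name, current in group:
            # percent of same-prior-score peers at or below this score
            rank = sum(1 for s in scores if s <= current)
            result[name] = round(100 * rank / n)
    return result

# Hypothetical data: all three students gained a lot, but the method
# still labels "A" as the slowest third, because growth is purely relative.
kids = [("A", 400, 520), ("B", 400, 550), ("C", 400, 580)]
print(growth_percentiles(kids))  # {'A': 33, 'B': 67, 'C': 100}
```

Note that nothing in the computation ever asks whether any student actually advanced; only the ordering within the prior-score group matters.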

2. There were a lot of questions about the racial breakdown of students in HCC. People are really obsessed with this. There is no breakdown of students in HCC by socio-economic status. Why are people ignoring that?

3. The district misrepresented the review of APP done by the University of Virginia in 2007. The staff claim that the review recommended doing away with the self-contained model in grades K-8. This is not true. I have no idea why the staff would tell such a blatant and easily disproved lie.

6. The staff are asked to estimate the ratio of MTSS interventions for support vs the number for enrichment. The staff refused to make any such estimate and acknowledged that they have absolutely no means of assessing how much, if any, MTSS work is done or what is done.
"Curriculum, Assessment & Instruction teams do not approve and cannot require interventions at the building level. Our teams are able to offer suggested academic interventions and support/training for them, but their implementation is a building-level decision."
7. The staff acknowledges, again, that they have no data at all on the implementation of MTSS.

8. More of the same: "Our CAI staff work to provide curricular support to buildings but cannot create accountability structures." It's pretty clear that the staff have no real method of implementing MTSS, and no method of even knowing if it has been implemented. How, exactly, will they measure its progress or know when to declare it done?

9. The district wants to take credit for eliminating pre-qualifications for participation in eligibility evaluation for HCC and Advanced Learning, but they want to impose a pre-qualification: the Smarter Balanced Assessment. So... they're lying.

10. Why were Spectrum/ALO not discussed? "These were not part of the work session. This was a Curriculum, Assessment & Instruction and Highly Capable work session." So when WILL Spectrum and ALO be discussed? Oh, right. Never. Here's a funny thing: look at the next question and answer.

11. "For example, an issue we have had to address multiple times is the claim that “they [AL and district] have done away with Spectrum.” This is simply false. Schools are moving away from a self-contained model, but the Spectrum program very much exists." Waitaminute. I thought that this work session wasn't about Spectrum and ALO and they would not be discussed? Also, how does anyone at the district know that the Spectrum program exists since they don't assess for it?

12. Where is the curriculum for HCC? Nowhere. The staff try, again, to make contradictory statements. They want to say, on one hand, that they are working to align the curriculum among all of the sites and then, on the other hand, say that all of the schools have autonomy in the decision making process. This is a big, fat lie. Schools should not have any more autonomy in setting curriculum for HCC than they have when it comes to general education - none.


Anonymous said...

Also add to the discussion the SBAC test scores for HCC-enrolled students vs. those who are HCC qualified but not in HCC. The scores are aggregated for grades 4-8 and show higher ELA pass rates for the HCC-qualified-but-not-in-HCC group, though similar pass rates for math. There's no breakdown by grade or school, or indication of opt-out rates, so it's hard to make any sense of the data. HCC LA/SS classes are blended with Spectrum classes in middle school, at least at JAMS. Spectrum students (and single-subject-qualified students? It's really not clear how students are assigned) are in HCC classes. I am inclined to attribute score discrepancies to the lack of a challenging, defined curriculum, as the classes have varied wildly depending on the teacher. The actual curriculum continues to be a mystery.


Charlie Mas said...

There is no possible conclusion to draw from the SBAC data.
It could be that it shows that the programs are not effective.
It could be that it shows that grade-level assessments are not a good measure of advanced work.
It could be that students are self-selecting to stay in strong neighborhood programs and to leave weak ones.
It could be that this is just one year's assessment and data and it is a statistical outlier.
There are so many possible interpretations of the data that there is no possible interpretation of the data.

Ouch said...

Thanks, Charlie. It is all so mind-numbing.

Well, there is one conclusion: that >80% ELA figure sure looks bad for HC services.

Anonymous said...

Huh? How did they calculate growth when it was the first year of the test?
Are opt outs included in the totals used to calculate "Row N"?
Why didn't they include third grade data, and why didn't they at least break it out by elementary HCC (which is truly cohorted) versus MS (where the cohorting is fuzzier)? Also, I thought math in grades 6-8 wasn't even part of the cohort pathway? So they are breaking out the math scores according to who is in the LA pathway in middle school?


Anonymous said...

These test score differences go back to the WASL days. On the face of it in elementary the APP qualified kids who stay in their neighborhood schools like John Hay or Bryant are about the most homogeneous population you can think of. I see the well-behaved, and yes, often female, students stay. Parents are highly involved and taking on a lot of the academics. Boys who act out, kids who need more behavioral supports...they leave for the cohort.

No news

Anonymous said...

Holy Batman on that link from TP. And strong disagreement that no conclusions can be drawn from that data.

The data is absolutely stunning.

In both English/Language Arts and in Math, students who are HCC and Spectrum qualified did better in terms of cohort achievement (percent meeting highest SBAC standard) and cohort growth (year over year) when they remained in SPS general ed classrooms vs. moving into the HCC/Spectrum program.

In the English/Language Arts area, the numbers aren't even close. The HCC/Spectrum-qualified students in general ed are killing the scores of the self-contained students. And students don't 'forget' English/Language Arts skills over two years, which is the standard argument for why self-contained, accelerated HCC students may not score at the top of their game on grade-level math examinations.

The data shows that the current HCC model needs a reboot. We can disagree on what that looks like, as I am in agreement that next steps cannot be gleaned from that data. Maybe it is more training for the HCC teachers. Maybe it is a different curricular approach. Maybe it really is taking a look at the self-contained model. But the message from that data is something has to change.


Anonymous said...

Question: What is “out-of-sync information and messages from higher leadership and site-based school leaders hinders the quality of services” (identified as a threat in slide 8)? What, in specific, does this mean?

Our district’s culture of autonomy at the building level means that information and
messages coming from schools sometimes differ from our understandings within central
office. Our CAI staff work to provide curricular support to buildings but cannot create
accountability structures.

Um....what?? Can't or don't want to or what??? That is one of the more blatantly weird things I've seen from SPS admin in a while...


Greenwoody said...

Test scores are not an effective way to determine how well a program is meeting the needs of our kids. That's true for HCC as well. HCC has other issues to address, including diversity, but test scores really should not be a factor in this discussion.

Anonymous said...

@ Greenwoody. Really? Then why is SBAC used as a gateway to entering HCC? The program can't have it both ways. Either stop with SBAC as an achievement test qualification to HCC and also stop using SBAC as a measurement, or live with both.

This is not a comment on the students within the program. It is a comment on AL Administration or lack thereof.

I'd like to get rid of SBAC altogether. But if it's going to be used as a qualification for getting in, it stands to reason that results within the program will have that measuring stick.

It is an hour after I first looked at that data and I am still shaking my head. It is not good.


Anonymous said...

Shocked, did you ever see the WASL data? Have you been shaking your head for years at this point?

No news

Melissa Westbrook said...

So much that has happened over the last year makes me disappointed in Larry Nyland. He's a seasoned administrator (albeit in smaller districts), and I don't get how he doesn't see how bad this all looks (not to mention feels).

I keep waiting for the smart person (with a sheriff mentality) to come into SPS, look around deeply and say, "Wait, what? Nope, that's not how we're doing it."

There is so much that needs to change and we have the Board to do it but if the leadership isn't there, it's not going to happen.

Anonymous said...

The APP program evaluation report from 2007 shows the same trend in standardized grade-level test scores.

On pg. 17 they state that students who were "APP eligible but not attending Lowell or Washington [the only APP sites at the time] had significantly higher reading WASL scores than the APP group who attended Lowell and Washington. There was no significant difference between groups on the math scale scores."...not too different from the SBAC scores reported in the recent Friday Memo. It goes on to stress the need for a curricular framework should additional self-contained sites be created, especially given the issue of "highly-variable teachers."


Anonymous said...

Dear No News: No, we were not in the SPS system during the WASL. That was before the MSP, wasn't it? How many years ago was that? I see the poster above me shows the same trend in 2007, so was it before that?

That this underperformance has been noted for at least a decade is even more shocking. After that 2007 report, was the issue ever addressed by staff? I don't think so, but I don't know so. Has the issue ever been daylighted in the APP now HCC community of parents? Again, I don't think so but I don't know so.

This district is so messed up in so many ways. Now I am adding to the list "SPS HCC - Where talented students go to underperform." Maybe we don't want to push the students on the low end of the achievement gap into HCC. I do not believe that but what an irony.


Anonymous said...

Opt outs are counted as Zero scores and are counted towards the overall percentages.


statistician said...

Shocked, I don't necessarily disagree with your outrage over these numbers as presented, but I do want to point out a couple things that might make you feel better. I have a kid in our neighborhood school and one in the APP cohort. I know many APP kids who left the neighborhood school for the cohort and many who stayed. The ones who stayed were by and large girls without any sped needs (that I noticed which doesn't mean they didn't). The kids who left (I know, small sample size) were mostly boys who did have some sort of special needs...being on the spectrum, ADHD, whatever..

Most folks can make data say whatever they want it to. Without a much further breakdown of these numbers, we can't really get too upset by it. For example:

How many kids opted out? I opted my kid in the cohort out, but not at our neighborhood school. So, the cohort kid had a score of 0 that is included in these results. That kid is excelling in every way. My other kid has scores included in the numbers.

What is the breakdown of girls/boys? My girl child tests MUCH better than my boy child since he couldn't care less. She cares and tries.

What is the breakdown of my original point, how many kids need some sort of support?

Our local tutoring center is packed to the gills daily with kids from our neighborhood school, but I don't see the cohort kids there. Are parents who stay local supplementing more than the cohort parents? No idea, but that's not answered.

Someone else pointed this out, but what schools are the APP kids staying at instead of leaving for the cohort and what are the "average" test scores at those schools? Maybe they are almost in a cohort in their neighborhood school.

Lastly, I don't see my kid in the cohort being "taught to the test" as much as my kid in the neighborhood school is. The results could simply be some kids are only learning stuff that will help them on the test. Others in the cohort might be getting a different education since the teachers know they will do well enough on the tests for them not to get fired.

I'm not saying any of this is better or worse. I just feel strongly that we can make any numbers say what we want them to say. These sorts of presentations that say "gotcha" would never fly in my private sector job. I'd get eaten alive if I presented something and tried to make a correlation without significantly more data.

Anonymous said...

The table does not show a zero, or no score category, and the percentages reflect just the scores shown. Look at the count for HCC, in cohort, Y, for example.

Level 1 - 0
Level 2 - 11
Level 3 - 97
Level 4 - 1409

They add to 1517, and 1409/1517 = 92.9%, as shown in the table. If opt outs were included, the percentage would be even lower.
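The commenter's arithmetic checks out, and a quick sketch shows the effect of folding opt-outs into the denominator (the 50 opt-outs below are hypothetical; the district hasn't published that number):

```python
# Level counts for HCC, in cohort, Y, as read from the published table.
counts = {"Level 1": 0, "Level 2": 11, "Level 3": 97, "Level 4": 1409}

total = sum(counts.values())
print(total)  # 1517

# Level 4 share with the percentages reflecting only the scores shown.
print(round(100 * counts["Level 4"] / total, 1))  # 92.9

# Hypothetical: if, say, 50 opt-outs were counted in the total, the
# Level 4 percentage would drop below the published figure.
print(round(100 * counts["Level 4"] / (total + 50), 1))  # 89.9
```

In other words, whether opt-outs are in or out of the denominator changes the headline percentage by several points, which is exactly why the missing opt-out counts matter for interpreting the table.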


Anonymous said...
This comment has been removed by a blog administrator.
Watching said...

Are SBAC scores really being used for HCC placement? Are there SBAC implications for students leaving middle school and entering high school?

Melissa Westbrook said...

One data point on a new test? I'm not sure I'd get that excited over it.

Anonymous said...
This comment has been removed by a blog administrator.
Anonymous said...

"2. There were a lot of questions about the racial breakdown of students in HCC. People are really obsessed with this. There is no breakdown of students in HCC by socio-economic status. Why are people ignoring that?"

People might be "obsessed" because of SPS's treatment of children of color. We pay strict attention to disparities because we know that SPS has a long way to go towards achieving equity for its students.

See other thread discussing Seattle's black-white achievement gap and Seattle coming in fifth out of 200 school districts.


Anonymous said...

What is the difference between HCC cohort students and HCC neighborhood students? Overwhelming difference - private testing. Neighborhood HCC students are those who don't want a segregated environment so badly that they will do anything to get it. Contrast that to those who are in the segregated program. 20-30% private testing in. Clearly, a higher caliber student who doesn't need so many exceptions to qualify for the program is evidenced by the higher test scores, over many years and tests. Funny how the HCC community screams about how badly they require segregation for academic achievement, yet the kids' results don't show it. It's about the segregation, not the results. Charlie always said "it's about the cohort", and he is right.


Anonymous said...


As always, you post inflammatory statements regarding HCC.

You said: "Neighborhood HCC students are those who don't want a segregated environment so badly that they will do anything to get it. Contrast that to those who are in the segregated program."

I am very interested and would like to see the source for your statements. Citation please? I would like to see documentation showing that privately tested kids are NOT in neighborhood schools.


Anonymous said...
This comment has been removed by a blog administrator.
Melissa Westbrook said...

Reprinted for Anonymous - no anonymous comments, please.

"So what do we know about those 175 or so kids who are HC-qualified but not in the cohort, and who did so well on the SBA? Not much! Is their neighborhood school providing a rockin' curriculum for HC kids? Are their teachers awesome at differentiation? Are these kids even taking math and English at the neighborhood school, or are they perhaps homeschooled in some subjects? Are these kids receiving high levels of supplementation outside of school, to make up for an insufficiently challenging GE curriculum? We have no idea, so we can't interpret these data in any meaningful way.

My kid, for example, was very advanced in math. The neighborhood school couldn't meet his need--but neither could HCC, so we didn't move. We opted out of school-based math instruction altogether, doing it by independent study instead. But we needed to do an assessment (part of the homeschooling contract), so the school got "credit" for his score, which was the top score they'd seen. Other families are in the same boat.

Suggesting that these data demonstrate GE classes are working well--or better!--for HC kids is flawed logic."

"The fundamental problem is that the Advanced Learning office today is a testing service and a resource for teacher training, and when asked they say as much. People have raised a ton of very valid questions about the SBAC data. If there was truly an empowered Advanced Learning Office instead of an Advanced Learning Testing Office, the questions would have already been addressed because someone would be advocating for the program and showing the data for what it is, meaningless. Special Ed has an Executive Director. There’s a director for ELL and International Programs. Athletics has an Executive Director. But on the district’s org chart (see pg. 52 of 2015-2016 budget doc), Advanced Learning doesn’t even rate its own box."

Charlie and I have said this last point for years, thanks for saying it again.

Ouch said...

And, Reader, I would also wager that many of the NE schools are as racially and SES segregated as Cascadia is. Last time I looked at those numbers, many of the non-cohort kids were going to those very same schools. Pretty prejudiced view of the world, I would say.

statistician said...

Reader, there are many kids in the cohort who privately tested in, but many who stayed at our neighborhood school did, as well. They all wanted the golden ticket away from Whitman and into Hamilton for MS.

Facts Matter said...


Whitman has one of the best advanced math teachers in the city. All of her students go into high school honor math classes and they are two years ahead of their peers.

Anonymous said...

1. I am curious about Melissa's comments about being disappointed in Nyland. Is she willing to be more detailed about that? Does she mean in regards to advanced placement or other areas?

2. I agree that the SBA is a new test and that a report without much analysis is of limited value. But. If other readers say this pattern of higher test achievement of APP students in general education v self-contained has persisted for years, across multiple tests, and across different total sizes of self-contained cohorts, then yes. Yes it is of significance.

3. Does this data indeed exist? Or is it just hearsay?

4. I hope this new issue does not overshadow the issue of lack of some minority and possibly lack of some socio-economic tiers of students in APP. Here we also need more data to understand the issue and come up with achievable fixes.

5. The lack of current public data and analysis on these points is p-poor given the $$$$$$ diverted to the "data warehouse" and "data analysis" initiatives and away from our classrooms these past 10 years.


Anonymous said...

Saw this post earlier and thought it apropos,

" Here's the publication link again:

Here is the author's conclusion:

"Advocates of full inclusion and those who struggle for appropriate education for students identified as gifted must not become entrenched enemies. There is little that is incompatible in the vision of both groups: schools that teach, challenge, and honor children for who they are. If we must settle for classrooms as they are now organized and staffed, curriculum as it is currently defined, and teaching strategies limited to lecture and whole-group instruction, then it is no wonder that advocates for gifted students see the students' removal and segregation as the only viable solution.

If, however, we can envision new models of school organization, curriculum, and pedagogy, then we can embrace within that vision classrooms that meet the needs of all students, including those identified as gifted. If we can see our goal as not just saving those students for whom educational marginality or failure is considered intolerable, but as assuring educational success for all children, then there are other possibilities. And, if we can look at aspects of the current system that are not working for students labeled as gifted as barometers of an unsuccessful system rather than as justification for removing students to a better subsystem, then we can work together toward far-reaching, comprehensive school reform for all students."

I'd guess SPS is moving in this direction.


Albert Swenson

Chris S. said...

I wholeheartedly endorse the idea that data is insufficient for any conclusions. At the very least, to determine if presence in the cohort really has any correlation with test scores (and correlation is not causation) you'd have to adjust for any other differences in the cohort-choosers vs. non-choosers, like, as others have mentioned gender, IEP/504s. There are probably a ton of unmeasurable factors too. Also, I'd want to separate recent cohort entrants from long-time cohort members; recent entrants are undergoing a transition and may have not received much education at all in the prior year (anecdotal evidence.)

I also want to point out that SBAC state tests are grade-level and HCC is supposedly working ahead. So a) the SBAC content may not be a recent curricular focus for the cohort and may in fact have been something they skipped b) there may be differing approaches to test prep in the cohort versus the neighborhood school.

Anonymous said...

-yawn. Gee. The "data" is simply fact. Why and when do people privately test into HCC? When they fail the district's required entrance tests. After that happens, parents appeal with private testing. If you don't plan to enroll in an HCC school, there's no need to privately test into it. (duh) Therefore, students in HCC schools include those who privately tested in.

So. The HCC "identified" students remaining in local schools are the ones who DON'T NEED an appeal. (duh)

One would expect (and the data reveal) that HCC students in HCC schools (including the private-tester/appealers) perform less well than the HCC-qualified students who stayed local, none of whom needed an appeal. You don't need an appeal if you are staying local and aren't enrolling in an HCC school. The same pattern shows up in the 2007 audit. Any reasonable person could predict that those who need an appeal to get into HCC are likely to be those who don't do as well further on down the road.

Let me guess - the new excuse for segregation.... "Testing doesn't matter - except for the testing to get my kid into the program. That's the only valid testing. The tests to stay in, or prove that the kid benefits from the program more than regular ed - irrelevant."


Chris S. said...

I am admittedly ambivalent and not all that well informed about self-contained HCC, but I do have a pretty strong opinion about using data to create bulls**t.

Only slightly relevant but totally fun:

This memo doesn't even count as p-hacking because there is no real statistical significance involved, no attempt to even address variability and measurement error. It might be worth pointing out that the non-cohort group is pretty small and therefore their numbers would be expected to vary more over the years, but that's about all I can say.

Anonymous said...


Ok. Go for it. I am as stupid as you imply in your post. Treat me as such and prove your claim. I want to see numbers and a link to where you got them.

For us morons out here, prove this: "The HCC "identified" students remaining in local schools are the ones who DON'T NEED an appeal. (duh)"

While you're at it, I want proof of this: "...{Privately tested} are likely to be those who don't do as well further on down the road."

Pulling garbage out of thin air and sending it through your fingers as you type, does not make it fact. Duh.

I await your proof, but I will not be holding my breath.


Anonymous said...

We could argue that the data has no meaning. Or we could argue the HCC emperor has no clothes.
Why are parents who fight so hard to get kids into the program so reluctant to examine the program? What is the program, really, other than messed up? What curricular advantage does the program really offer to "HCC-identified kids" other than, as Charlie always says, the cohort itself? As the hamburger commercial used to ask, "Where's the Beef?"


Anonymous said...

I am really excited about efforts to examine the program. I think the LA curriculum in particular is weak (maybe this shows that? But maybe not), and there needs to be more differentiation within the program. But I am disappointed that they are using a grade-level test, and a bad one at that, and not making any effort at all to mitigate things like: non-cohort students may supplement at a much higher rate (could they survey those families? About a lot of things?), may have reasons for self-selection, may be at particular schools with good programs for advanced learners, etc. I think they have been doing some data collection on appeals, and I truly think that's great.

If the people doing this data exploration were people I trusted to be interested in the needs of advanced learners and in Seattle having a strong program for them (self-contained or otherwise), I would trust 1) that they would find better data and 2) that they weren't just looking for anything and everything to possibly justify splitting it up (and almost certainly NOT replacing it with anything in home schools; if we split it up, all I expect is one size fits all for everyone). I have things I would like to improve about the program, but it's miles better than the one-size-fits-all approach was for my kid; two years advanced is a closer fit. As it stands, Blanford is as hostile a director towards advanced learners as we have had (and Michael Tolley is no better), and this data is clearly so cruddy. The only reason I can think of for them using it is that they found some kind of "gotcha!" with the ELA data. I would love to be proven wrong, but I have been around the district for a long time now, and I am running out of faith that I will be surprised by staff this way.

Can't speak for everyone, but that is where I am coming from, Wendy. It's a lack of faith in the district, not so much blind faith in the program, if that makes sense.


Anonymous said...

Aghast said: "But. If other readers say this pattern of higher test achievement of APP students in general education v self-contained has persisted for years, across multiple tests, and across different total sizes of self-contained cohorts, then yes. Yes it is of significance."

What significance? Can you explain what it means? What exactly can you glean from these data about the educational experience of those in GE vs. self-contained?

The scores don't tell you anything if you don't also know what the kids in each setting are getting in--and out--of school.

Context Matters

Melissa Westbrook said...

Aghast, I would like a superintendent who will come into this district and right the operations side of it.

We almost had that with Dr. Goodloe-Johnson. She DID order audits into various departments including HCC (but only for HCC kids, not Spectrum or ALOs.)

I need to get a new link for the Moss Adams report but that is what I want. I want an objective, outside look at the operations of this district and that the Board hands it to the Superintendent and says, "Go forth and heal this district."

Operations limps along, both with facilities and academics.

Work session after work session from various departments and nearly ALL of them state as "risks" that data cannot be properly accessed and systems cannot communicate with each other. And, worst of all, we have people doing a lot of manual work.

I have heard this for 15 years. We spend and spend on technology and I remember we were promised - for various BEX/BTA spending - that "this time we'll get it right." And it's not and it's costing time and money this district doesn't have.

Charles Wright knew this. I honestly believe he did try. We DO have some smart people in SPS and yet somehow, it still doesn't work.

And, as the Moss Adams report so brilliantly pointed out, if you do not change the culture of a bureaucracy, you change nothing.

JSCEE needs its own personal earthquake and superintendent after superintendent, I hold my breath and hope. It never happens.

"Why are parents who fight so hard to get kids into the program so reluctant to examine the program?"

Parents would LOVE to examine all the Advanced Learning programs. We've had two taskforces and yet, the power remains with the district so what parents think doesn't seem to matter. But many parents don't want to rock the boat for fear that they will lose the one thing they do have - a cohort. They don't have a curriculum, only a few trained teachers and their class sizes are the same as any others. So it's the cohort.

Reader, I ask you again to watch your tone. Please don't imply other people are Homer Simpson.

Anonymous said...

Oh, reader. Here we go again. You said: "Neighborhood HCC students are those who don't want a segregated environment so badly that they will do anything to get it. Contrast that to those who are in the segregated program."

I call BS on that. Wanting a program so badly that you're willing to pay for private testing is probably due to ACADEMIC NEEDS, not a wish for segregation. Really. Get over it.

And maybe 20-30% are privately testing in as you suggest, who knows. I haven't seen those data though, have you? (If so, please post!) But even if that's the case, it doesn't tell you much. There's no evidence that the privately tested kids are the ones scoring lower on the SBA. And even if there were, maybe those kids are cognitively gifted and really need the program, but don't always do so well on tests for whatever reason (e.g., special needs). Are you really suggesting that SBA results are the best long-term measure of whether or not HCC is the most appropriate placement for any given kid?

Evidence Please

Anonymous said...

@Reader, with the change in AL policy, students can qualify and remain at their neighborhood school until joining the cohort in middle school. Previously, students needed to retest if they qualified but chose to remain at their neighborhood school and wanted to join the cohort at a later time. I suspect more families are going this route. The one thing that hasn't changed is needing to be in the HCC cohort in 8th grade in order to get assignment to Garfield (of course, that may be changing soon as well).

"What is the program, really, other than messed up?"
Exactly. I would welcome an examination of the program from an academic perspective, starting with some of the issues brought forth with the 2007 report - teacher training, curriculum framework, etc. I continue to be amazed by how much energy is focused on who gets into the program, while the program itself languishes.


Anonymous said...

Here's an interesting article about various "exam schools" across the country that, among other things, looks at whether they are "effective".

The article also mentions how CollegeBoard is distorting high school curricula (IMO, purely in the pursuit of making money).


Anonymous said...

Evidence Please - The rate of private-tester-inners for HCC, as published by the district, was posted on this blog just a few months ago. Yes, I saw it. It was something pretty high, like 30%. That seems to validate my experience anecdotally. We all know this to be true - a HUGE share of entrance to HCC is based on appeal and private testing. The fact that HCC students staying in general ed perform better than those in segregated ed was noted in the APP audit as far back as 2007. I have no reason to doubt the audit. They didn't have an axe to grind. That's the evidence.

I simply gave what I believe to be a plausible explanation for the performance discrepancy (meaning: lack of significant measurable academic gain for those in HCC-segregation, and lower performance in some areas) - HCC students remaining in general ed are MORE highly capable to start with than those in the segregated schools. Another plausible explanation - segregated education doesn't really produce better results. HCC parents seem to have problems with either reason. They do have an axe to grind.


Jon said...

Wow, these APP/HCC posts sure attract a lot of comments.

The venom for HCC seems odd to me. Here we have a successful program that both parents and students like and that doesn't cost more money.

What exactly is accomplished by destroying this? Is this all about "fixing" the achievement gap by no longer educating the top achievers, which reduces the spread? That's a pretty cynical view, that it's okay to hurt children as long as you can falsify your metrics.

Anonymous said...

@Reader. I am a teacher. Every year I have kids who appeal, are HC qualified and choose to stay at the neighborhood school. It's not that simple.

Anonymous said...

If memory serves me, the appeal numbers were showing the percent of appeals that were successful (for both Spectrum and HCC?), not the percent of HCC students that qualified through appeals. Different things.

Anonymous said...

Reader is not interested in facts. In their world, this is proof - "that seems to validate my experience anecdotally." As we all know, anecdotes and facts are interchangeable.

I still want proof that privately tested kids do not do as well in HCC as kids who test in via the district. Can you please let me know when to expect it?

This is classic coming from you: "They do have an axe to grind." You come on to virtually every post about HCC (and you have for years and years) and make sweeping generalizations about how awful the program is. Talk about pot calling the kettle black.


Anonymous said...
This comment has been removed by a blog administrator.
Anonymous said...

Here's the APP/now HCC analysis that was done by the University of VA and posted at the APP blog. Not sure why it wasn't posted or linked on this blog by one of the numerous cross posters between this blog and the APP one.

Here's a link to the APP blog:

--about time

Anonymous said...

Now it's been double cross posted...the report link was already posted 5/11@11:34am above.

-double yawn

Charlie Mas said...

It has been suggested that families who keep their HCC-eligible children in the neighborhood school have no motivation to appeal HCC eligibility decisions and, therefore, have children who were eligible without an appeal.

I don't see the logic. Whatever motivated these families to have their child tested in the first place would provide the same motivation to appeal an adverse decision. If they didn't want their child to qualify, then why have them tested at all?

Charlie Mas said...

It is correct that the grade-level state test pass rates for HC students outside the HC program have, historically, been comparable to or better than the pass rates for HC students inside the program. However, this data alone does not allow for a definitive conclusion about why those are the results. A number of plausible reasons can be proposed.
1. The program is ineffective.
2. The students in the program don't take the test as seriously as their peers in general education classes.
3. The students in general education classes get more test prep.
4. The students in general education classes get more grade level instruction so the test topics are fresher in their minds.
5. Changing schools temporarily disrupts students' education.
6. Some neighborhood schools do a fine job of educating highly capable students.
7. HC students outside the program are supplementing.

There are probably a number of other possible explanations and, of course, there is the possibility of a mix of these.

Given the ambiguity around the test results and their lack of any clear meaning, they should not be the focus of the discussion. The discussion should focus instead on the fact that the District is dismantling programs saying that MTSS will replace them, but the District is not, in fact, checking that MTSS even exists, let alone provides an effective replacement.

Anonymous said...

So, we need to support APP qualified kids AND bring up achievement for black kids. Then we can be found by researchers to have the highest achieving kids in the nation rather than the most racial disparity. Wouldn't being known for high achievement be better than aiming for mediocrity?

Why does this district keep trying to cut high achievers off at the knees? Let's meet all children where they are and help them all move forward. Is this really such a terrible goal?


Anonymous said...

For what it's worth...

We had our son tested privately & he qualified for HCC but we're keeping him at his current elementary (not neighborhood school).

Also, I used to be wary of private testing results too! But that was before we had our daughter tested, and she qualified for HCC as well. I think the test AL uses to identify kids misses many, many kids who should be in these programs. I say blame the testing, not the kids or their parents.

Mag mom

Anonymous said...

Most districts, like Northshore up in Bothell (which was discussed in a thread earlier this week), do not allow private testing. Northshore has a very easy-to-understand policy and implementation that also identifies single-subject gifted students.

here's their appeal procedure:

All we know is many, many HC kids do very well without the cohort.

That's all we can say without more data.

Yawning Two

Lynn said...

Here's a link to the results of a survey of Northshore parents of highly capable students:

There's a chart on page 33 which reports parents' satisfaction with services received in neighborhood schools vs locations with self-contained classrooms. The results are not surprising. Given the district's limited appeal process, we can assume there are many gifted students in neighborhood schools receiving no services at all. Oh - and guess what services are provided to highly capable students who choose to remain in their neighborhood middle schools? They are cluster-grouped in Challenge classrooms. (Which have been discontinued.)

Anonymous said...

All we know is many, many HC kids do very well without the cohort.

I'm not sure this is wholly accurate.

First, the data indicate that only about 175 HC kids grades 4-8 aren't in the cohort--so while I guess this could technically be considered "many, many kids," it's a small portion of those who qualify. Ninety percent of HC-qualified students in this grade range have opted INTO the cohort, presumably because these parents thought it would better meet their children's needs. Only 10% stay at the neighborhood school--and it's probably a small number of schools that retain many HC kids.

Second, while they may not be officially participating in "the HCC cohort", the kids who stay in their neighborhood school may still be part of an HC-qualified cohort. Families often stay where they are because the school does a good job of meeting the needs of these kids (because the school has a lot of them), or because their kid has finally found a couple good friends (often other HC-qualified kids). I didn't move my son in elementary because his best (practically only) friend, who was also academically gifted, wanted to stay at the neighborhood school due to family convenience. Our families both agreed to stay put, so they could be together. We figured they got a lot more out of the work they initiated together than they would from a slightly more advanced (but still insufficient) curriculum.

Context Matters

Anonymous said...

I have two HCC kids. Both were referred for SPS testing by teachers in our local school. The first one tested in on the second go; the second tested in on the first try. We also had the luxury of having them tested privately, for a couple of different reasons, the first being that we were new to the idea of APP, were not sure of the meaning of the test results, and wanted to be sure about acclimating to APP. Since we did it for the first, we also did it for the second. Every situation is different and decisions are different for every family.

We moved them both over, primarily because boredom was setting in, despite some great efforts by some great teachers in a wonderful neighborhood school. Since then, my children have absolutely flourished academically. It was the right move for them and for us as a family. I am thankful for the school and very much for the teachers - it is not just the cohort.

I'll also mention that we have not had our kids participate in the SBAC tests. Not this year, not last year. Look, I have all the data points I need from the teachers and the tests (some standardized, some not) they administer. Both of my kids took the practice tests, which they seemed to feel were pretty easy. Well, one of my kids was not able to take the practice test because the computer was not staying connected, so the practice test experience was looking over someone's shoulder… And it was confirmed that the kids did not seem to take it seriously because it was pretty easy. I don't want to start down the anti- or pro-SBAC topic. But I think it is wholly unfair that the SBAC was positioned as a gate this year for HCC. And even as the district backed down on that requirement, they didn't exactly publicize it, and many schools are using that Refusal Form that still threatens no access to programs, etc. The AL group is little but a testing team, and yes, they might work very hard, but as someone mentioned in a previous post, most of that work is misplaced.

Testing is important to a degree, but... curriculum and aligned pathways are what's needed. And I do agree that we should be meeting all kids at their level. I am pretty tired of the misplaced notion by SPS that mediocrity is acceptable. SPS seems to be in it for the adults, not the kids, and mediocrity hurts everyone - low, middle, and high - and provides no incentive for kids to be the inquisitive, wonderful, and imaginative kids that we all want to see grow up and flourish. And by the way, Reader, I have yet to meet the parent that you so frequently describe in your posts. And yes, I do get out much!
-SPS Tired

Anonymous said...

There are 1,000 HC students who are not in the cohort in SPS, about 20% of the total HC population.

The survey of Northshore parents was not done by the district, and it was not limited in how many times a family could respond, or even to parents of students in the program.

"This survey was conducted by the HiCap PARENT Advisory Board, and is NOT sponsored in any way by the Northshore School District. We will use the results to guide our parent advocacy efforts next year, and will share aggregate/anonymized results with the district as well.
Please answer this survey with one particular child in mind. If you have multiple children who have the need for "highly capable" services, please take the survey again for each child. This survey is anonymous."

Really Yawning

Anonymous said...

@SPS Tired: thank you for the positive HCC testimonial. I'm sure there are hundreds out there who would write a positive testimonial for the great experience their child is having in the HCC cohort, but they're busy leading happy and productive lives. We appreciate the time you took to represent.

@reader: Your generalizations and accusations about the HCC cohort and the families and students involved are entirely too negative, and I find them to be inaccurate. Calling people prejudiced for attending homogeneous north Seattle schools is out of line.

We live in north Seattle, loved our neighborhood school, but moved to Cascadia when our child suggested the idea to us. We had him tested through SPS and he easily qualified the first time through, scoring 99% in at least four areas. His grandmother is 1/2 Black and Native American, his other grandparents grew up dirt poor in Highland Park and Tacoma, attending Sealth and Wilson high schools, and his parents worked their fingers to the bone from very young ages, paid their way through graduate school, and were proud to buy their first little fixer-upper before turning 30. Little did they know they would be squeezing a growing family into this little home, but moving to a big house in the burbs was not appealing.

To now be accused of being racist, elitist people because we live in a little old home in north Seattle with a kiddo in the HCC cohort strikes me as absurd. We love having the environment-saving option of biking to work, and enjoy having such close proximity to restaurants and GREAT schools. Our child was doing very well in our neighborhood school, and is now flourishing at Cascadia -- we were not desperate to move to the cohort, quite the opposite in fact. We are not prejudiced for living where we live, and though I have no idea who you are, I would venture to guess we are much more accepting, open-minded, and kind than you imagine yourself to be with your "HCC is elitist and racist" comments.

-bait taken

Ouch said...

Well said bait!

Anonymous said...

Keep in mind that when you look at numbers for who is in the cohort versus who is not, that the numbers are not static.

Many of those students will eventually end up in the cohort. Just because their parents chose to leave them in a neighborhood school this year does not mean they will next year.

Many, many years ago, when I was in elementary school, I tested just one or two percentiles short of the APP cut-off (or whatever it was called then). I was Horizon/Spectrum qualified, and it was self-contained but a one-hour bus ride across town to our "cluster" Spectrum site.

Each year, a couple kids from my elementary school would slip away for Spectrum. By the time I wrapped up fourth grade, my teacher really pushed my parents to send me down there as well, as I was consistently ahead of the class in my neighborhood school. When I finally went, for fifth grade, it was incredible to see how many kids had trickled down there for third and fourth grade. They were all kids I knew from K-2, but after that point they had shifted schools and programs, a couple of kids at a time.


Anonymous said...

"Many of those students will eventually end up in the cohort."



Anonymous said...

As President Obama said about the claims by his FBI director that videos of police abuse were fueling a crime wave:

"We do have to stick with the facts. What we can't do is cherry-pick data or use anecdotal evidence to drive policy or to feed political agendas."

The School Board is going to have to deal with the HCC issue soon and I hope they use facts not emotion.


Anonymous said...


Since we don't have information on individual kids, we can't know the specifics. What we do know, however, is that HCC enrollment gets bigger at each grade as you go up (until 7th grade, when people have mostly already moved to middle school). As their kids get older, more and more parents choose the program over the neighborhood school. Just look at the enrollment data for proof of this.


Anonymous said...

You can't really take any shocking message from the fact that cohorted kids are scoring lower than non-cohorted kids. Non-cohorted kids are at schools that spend the three weeks before the test in focused test prep. Cohort schools carry on curriculum as usual, with no special prep and no emphasis on the alleged importance of the test. Why should they? They've already shown that they are high testers or they wouldn't be in the program. When you're already in the program, WHO CARES about the stupid test?

open ears

Anonymous said...

Apologies, but can someone please link to the most recent HCC enrollment data that show total numbers by grade, at HCC vs non-HCC school, etc.?

Maybe we could have a "Data and Reports" thread, where people can just post links to frequently requested info?


Maureen said...

Someone who actually understands data analysis could separate out many of the confounding effects that have been mentioned here. The simplest version might be to collect time-series data on students who eventually tested into HCC and see if their score differentials were positive, negative, or unchanged after they transitioned to the cohort (I forget what that design is called, and I'm not willing to look it up - basically a before-and-after comparison)(*). You could also look for significant differences between kids who appealed and those who didn't. The data is there. SPS chooses to look at it in a superficial way, or maybe they don't have the expertise (it's not super hard - maybe a 200- or 300-level stats or econometrics course). Probably ten people who read this blog weekly could run the analysis. I could have done it fifteen or twenty years ago, but not anymore!

Of course, that assumes that the test scores mean anything at all (other than that kids who are good at tests and whose parents let them take tests score high on tests.)

(*)It's obviously somewhat more complicated than I am implying here in three sentences without the ability to type Greek letters!
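For what it's worth, the before-and-after comparison described above can be sketched in a few lines. This is only an illustration: the records, field meanings, and numbers below are invented, and a real analysis would need the district's actual student-level data (plus all the caveats in the footnote about confounders).

```python
from statistics import mean

# Hypothetical records: (student_id, year, score_differential, in_cohort).
# "score_differential" stands in for a student's score relative to the
# grade-level average; every value here is made up for illustration.
records = [
    ("s1", 2013, 1.2, False), ("s1", 2014, 1.1, False), ("s1", 2015, 1.4, True),
    ("s2", 2013, 0.9, False), ("s2", 2014, 1.0, True),  ("s2", 2015, 1.1, True),
    ("s3", 2013, 1.5, False), ("s3", 2014, 1.6, False), ("s3", 2015, 1.5, False),
]

def before_after_change(records):
    """Average, over students who transitioned into the cohort, of
    (mean differential after joining) - (mean differential before)."""
    by_student = {}
    for sid, year, diff, cohort in records:
        by_student.setdefault(sid, []).append((year, diff, cohort))
    changes = []
    for rows in by_student.values():
        rows.sort()  # chronological order
        before = [d for _, d, c in rows if not c]
        after = [d for _, d, c in rows if c]
        if before and after:  # skip students who never transitioned
            changes.append(mean(after) - mean(before))
    return mean(changes) if changes else None

print(round(before_after_change(records), 3))  # prints 0.2
```

A positive result would suggest students gained ground after joining the cohort, a negative one the reverse; with real data you would still want significance tests and controls before reading anything into it.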

Anonymous said...

As reader notes, "HCC students staying in general ed perform better than those in segregated ed was noted in the APP audit as far back as 2007." But "perform better" only refers to how they show on the state test, about which no one cares at the HCC school. It says nothing about how they perform at school in areas the school actually cares about.

open ears

Anonymous said...

I'm probably the odd duck here, but I think it's good news that APP students are doing well in neighborhood schools. That's a positive note. It's good to know some students and parents do have good local choices. I'm not sure how meaningful the comparison is without more info and analysis. It's not like the cohort groups are doing poorly or even average.

I don't want to take it away from these non-cohort students by saying they did well because of test prep (if it were all test prep, then the non-APP kids would benefit too), or that they didn't buy into rejecting the standardized tests and thus did better because they were more motivated to do well on them. There are parents, APP and non-APP, who rejected standardized testing on principle and excused their children from taking it. Parents from many groups, including SPED, supplement outside school. So what? I guess I'm not ready to dive off into an empty pool over this.


Anonymous said...

Are there other data on students that SPS tracks? Are there comparisons for cohort/non-cohort students on attendance, graduation, admission to four-year college, etc.?


Anonymous said...

@ LisaG, what's the point in comparing cohort vs. non-cohort students until there's also data on why people choose the cohort or not, and what they get in terms of educational experience in either case. In the absence of such data, what would the results of the comparisons you suggest even mean?

Context Matters

Anonymous said...

Context Matters says (I think) that any comparison of outcomes of OptionA/OptionB is meaningless unless we have an explanation for all the inputs.

Maybe this is just a reading comprehension problem on my part, but I don't get it. If instead of similar graduation rates between OptionA/OptionB, the data showed that one option had a 50% graduation rate and the other option had a 100% graduation rate, would you seriously say the comparison didn't mean anything? Wouldn't you want to try to figure out whether something in the option programs might be affecting the graduation rate?

The AAP recommends later start times for middle schools and high schools because studies showed later start times gave adolescents more sleep time. And more sleep time led to fewer car accidents, less depression, better grades, etc. The AAP did NOT say the results of the comparisons don't mean anything because we don't know why schools chose different start times.


Charlie Mas said...

Again, the test data is completely inconclusive.

Here's what is conclusive: the District Staff are telling us that they are replacing Spectrum with MTSS, but they are also telling us that they have no way of implementing MTSS, measuring the use of MTSS, determining the efficacy of MTSS, or holding schools accountable for implementing MTSS. In other words, MTSS exists only in the JSCEE, not in the schools. They are swapping something real for something hypothetical.

Anonymous said...

Melissa & Charlie,

If we feel a member(s) of the Board is using the data in a selective manner to achieve their political agenda, what's the most effective way to make our voice heard?

Thank you for all the hard work you do to keep us aware and informed.

- Concerned

Anonymous said...

If writing is a part of the ELA results, then there is a simple answer to why HCC kids are not doing better: writing ability is developmental and not a function of other higher academic abilities. For example, a 1st grader could be reading and comprehending at a 3rd or even 7th grade level, but still only writing at a 1st grade level. That's completely to be expected from gifted kids.
It was eye-opening when my son's Spectrum teacher explained that to me many years ago.


Anonymous said...

Those ELA scores are not just writing. They are reading comprehension and writing.

I'll repeat some of the posters above. If this is a one-year anomaly on a new test, then so be it. If there is a trend of ELA scores within the self-contained cohort being noticeably lower than those of HCCers staying in their home schools, I want to see that data and I want that portion of HCC to be examined and improved. I see no need for excuses, especially having spent time in the classrooms. The test results mirror my impression: weak ELA. I expect my kids to come out of self-contained HCC doing more than reading big books. I want them to comprehend the themes of the books and I want them to write well. If there is data anywhere that says this is happening for the cohort, great. Let's see it. Otherwise, fix the curriculum and upgrade the teaching.

No excuses

Melissa Westbrook said...

Concerned, interesting question.

I would first ask the Board director if he/she is also aware of evidence A, B, C on the given topic. Perhaps he or she didn't get enough information (or didn't do enough research.)

If he or she doesn't respond or brushes it off, then you might sign up to testify on the subject and point out that the Board (without naming names) doesn't seem to be aware of evidence A, B, C and that you worry that other city leaders or community members are not being truly informed of all the data.

You could let me or Charlie know more details and we could look into it.

(I'll be honest; I can only think of one person on the Board who seems to have or support a political agenda but I could be wrong.)

Anonymous said...

I agree with @no excuses. In middle school HCC LA/SS, my child seemed to do more drawings and projects than writing, and there was very little writing instruction or whole class reading of challenging texts. I sometimes think classes got away with doing limited teaching because the students were already working above grade level. There were a handful of teachers that truly pushed students, but overall, we were underwhelmed with what were supposed to be "significantly" advanced classes (original district wording). Is this the same in gen ed classes? I don't know, as I don't have a comparison, but let's hope the data leads to T&L looking at LA instruction district wide, and not just saying there's no need for advanced classes.


Anonymous said...

@ Lisa G,

I don't mean to imply that there's no point in doing such comparisons, if one were then willing to go to the next level and try to figure out what the differences mean. However, I don't have the sense that SPS staff want to do that. I think the general sense is that they are content to draw conclusions from these superficial data. When the data come as an attachment to a Friday memo that ignores a laundry list of never-implemented recommendations from the 2007 evaluation, and instead chooses to "spin" a recommendation against a single (or two regional) self-contained 1-8 arrangement (elementary PLUS middle school) into a recommendation to eliminate self-containment altogether, it's hard to believe that staff want to dig deeper and figure out why the scores of the small percentage of HC kids who remain in their neighborhood school are a little better than those in the cohort. And if they had any curiosity about it, you'd think they'd at least break things down by middle school and elementary as a first step instead of presenting data like this.

As presented, these data don't tell us anything aside from what we already knew--that it would be good to dig deeper.

Context Matters

Anonymous said...

And if they had any curiosity about it, you'd think they'd at least break things down by middle school and elementary as a first step instead of presenting data like this.

Yep. Let's hope members of the Board ask for a more detailed summary of the data. You know you've been in this district too long when you can't take info at face value...

-what's next?