Washington State SBAC 2016 Scores Released

OSPI released the SBAC scores today. Scores increased, but then again, you have to see where they started.

The high school scores for LA went from 26% to 76%, which is quite a wild swing, but the scores for math are dismal.


Tenth and eleventh graders again opted out in large enough numbers to push participation below the required 95% rate (92.9% and 88.1%, respectively).

Seattle Schools' scores were mostly stagnant by grade level, although the high school scores for LA went from about 10% last year to almost 80% this year. Third grade math seems to show the most growth. I await the district's analysis.

Percent of students proficient, 2015-16

Grade   ELA 2015   ELA 2016   ELA Diff.   Math 2015   Math 2016   Math Diff.
3       51.7       54.3       2.6         56.4        58.9        2.5
4       54.1       57.0       2.9         53.7        55.4        1.7
5       57.1       60.1       3.0         47.7        49.2        1.5
6       53.5       56.5       3.0         45.2        48.0        2.8
7       56.2       58.5       2.3         47.6        49.8        2.2
8       56.4       59.7       3.3         45.8        47.8        2.0

For the Class of 2017, three out of four students (75.5 percent) are college and career ready in ELA as they enter their senior year, compared to one out of four students (26.1 percent) of the Class of 2016 as they entered their senior year. The 75.5 percent includes students who met as 10th graders and those who met as 11th graders. In math, the proficiency rate for the Class of 2017 is 21.8 percent, compared to 13.7 percent of the Class of 2016.

Students in the Class of 2018 – 10th graders in Spring 2016 – also performed well on the Smarter Balanced ELA test. A total of 70.8 percent who tested met the college- and career-ready standard in ELA and 55.0 percent met the same standard in math.

Participation

Schools tested 97-98 percent of their students in grades 3-8, with no more than 3 percent of students in any single grade refusing to take the tests. For 11th graders, the refusal rate was larger. Including students who passed the test as 10th graders in 2015, 11th grade participation in the ELA test was 88.1 percent and 61.4 percent in the math test. By comparison, the participation rates in 2015 were 53.3 percent for ELA and 49.6 percent for math.


Annual Assessment State Score Release

Of note, "other states have not released their SBAC scores yet." How is this helpful to anyone?

Page 11 of this document has a pixie-stick-like graph, the likes of which I have never seen before.

Comments

Anonymous said…
The high school scores for LA went from 26% to 76%, which is quite a wild swing, but the scores for math are dismal.

I suspect the large increase in percent proficient is because more students took the test. If you don't take it, you're not considered proficient. That likely explains a lot of the math vs. LA disparity, too, since more HS students took the LA portion. There were probably some who didn't need it since they'd already passed the EOCs.

HF
Anonymous said…
On the HCC blog (which is off limits to posting now, at least for me, because I eschew social media and tracking) the moderator states, "The previous version of this data was used by Shauna Heath in board meetings last year to suggest that the cohort model was ineffective."

Would you or anyone else mind explaining to me how these data speak to the cohort model? Or whether any conclusions can be drawn from these data to set district policy?
Thanks for any insight.

-SPSParent
Anonymous said…
SPSParent, if HCC self-contained students do not significantly outperform HCC-qualified students who stay in home schools on the test data, then the case can be made that HCC students do just as well without self-containment. Self-containment is supposed to provide better academic results, not provide a social privilege or a peer group for its own sake. Failure to outperform neighborhood HCC-qualified students is the case that has been made in the past.

History
Lynn said…
I'd like to see where you read that the point of the cohort is to increase student scores on grade level standardized tests. I've looked but can't find it on the SPS and OSPI websites.

The district policy states: The framework for such programs or services will encompass, but is not limited to, the following objectives:
A. Expansion of students’ academic and intellectual skills in every year of education;
B. Stimulation of students’ intellectual curiosity, independence and responsibility;
C. Development of students’ social and emotional wellbeing; and
D. Development of students’ originality and creativity.
Anonymous said…
The district uses standardized tests to comply with OSPI's program efficacy assessment component, which is measured by student achievement. From OSPI:

"Program Evaluation, Review & Monitoring
Districts are required to set program goals, establish a plan for how the district will evaluate the program goals, and measure student achievement outcomes.
WAC 392-170-030 Substance of annual school district plan -
The school district's annual plan shall contain the following:

(3) A description of the highly capable program goals;
(4) A description of the services the highly capable program will offer;
(5) A description of the instructional program the highly capable program will provide;
(7) A description of how the highly capable program will be evaluated that includes information on how the district's highly capable program goals and student achievement outcomes will be measured;

FWIW
Anonymous said…
Must every thread be turned into an attack on APP/HCC, @FWIW (aka @enough already, or @about time??)?

The aggregated data (based on grade level assessments) comparing HCC in cohort vs HCC qualified/not in cohort was questionable in its presentation - the data was not broken down by school or grade cohort, elementary was lumped with middle school, and it was based on what, one year of data with a new assessment? Were the differences in scores possibly attributable to lack of appropriate curriculum? Or something else? In what grades and at what schools were differences most pronounced? Is the data consistent from year to year? So little was presented... it's difficult to speculate, and it would be irresponsible to make programming changes based on the data as presented. The data seemed like a good starting point for more analysis, and nothing more.

The district could use other means of evaluation, including out of level testing, district based assessments, parent/student surveys, etc., but simply chooses to use grade level state assessments, which are readily available, with no added cost. Shouldn't grade level tests be the first, but not only, means of comparison? If students in the cohort aren't performing as well as or better than similarly identified students, there is reason for more evaluation.

-abc
Anonymous said…
SBA, for whatever reason, is used as an entrance criterion for advanced learning. Therefore, it is reasonable to use it for further testing too. It's not perfect, but it is pretty clear that the cohort hasn't all topped it out, and it costs nothing extra. But you're right. The district isn't clear about it. We can't really see how the performance is measured without having the same data. But, is there really a reason to doubt them? Given the size of HCC now, it's predictable that its performance would start to move towards the norm. The standard for keeping the program, especially at that size, should be that it provides significant academic gain over staying in a regular school. Otherwise, why keep it?

Personally, I think the program is weakened, and performance in the self-contained model is lower, because it has such a high private test-in rate. It's pretty obvious that most private testers enter the program. Most district testers remain in regular schools.

History
Anonymous said…
The program has been weakened by years of neglect, lack of an appropriate curriculum, principals and administrators (and teachers) who don't support it, and so on. Teaching and Learning intentionally eliminated what advanced LA/SS curriculum existed for HCC by aligning it to grade level standards. If you provide a cohort, but not an appropriately advanced curriculum, what do you expect for student outcomes? If you eliminated private testing, which I don't think is the cause of program weakening, would the program miraculously become stronger? Doubtful.

According to data from AL (a few years back, when retesting was required), most students who qualified chose to enroll and not remain at their neighborhood school. That may have changed somewhat, as students who qualify now maintain their eligibility if they stay in their neighborhood school. Many families may choose to remain in their neighborhood elementary and make the switch at middle school. If students are opting out of the cohort, I would guess they are most likely at an option school or a high-performing neighborhood school.

-abc
I will have a separate thread soon about AL. I have heard from at least one director and I am surprised at her view of AL. I want to check with other directors to make sure I understand where the Board stands on this issue.

To note, I will moderate comments on that thread.
Anonymous said…
...back to SBAC.

When looking at scores, keep in mind the opt outs are averaged in, unless you look at the tables of grade level results where "Meeting standard excluding No score" is reported. Pass rates are several percentage points higher once you take into account No scores. District wide, "No score" rates gradually rose by each grade level, from around 3% in Gr3 to around 7% in Gr8. Thornton Creek has an especially high opt out rate - as high as 52%. When you exclude the No scores, TC pass rates almost double.
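As a rough sketch of that adjustment (the enrollment and pass counts below are hypothetical, not actual OSPI or Thornton Creek figures), here is the arithmetic of counting versus excluding No scores in a pass rate:

```python
# Illustrative arithmetic only: how excluding "No score" (opt-out) students
# changes a reported pass rate. The counts below are hypothetical.

def pass_rates(passed, tested, no_score):
    """Return (rate counting no-scores as not proficient, rate excluding them), as percents."""
    including = passed / (tested + no_score) * 100  # opt-outs stay in the denominator
    excluding = passed / tested * 100               # opt-outs dropped from the denominator
    return including, excluding

# A hypothetical school where roughly half the students opted out.
passed, tested, no_score = 38, 48, 52
incl, excl = pass_rates(passed, tested, no_score)
print(f"Including No scores: {incl:.1f}%")  # 38.0%
print(f"Excluding No scores: {excl:.1f}%")  # 79.2% -- nearly double
```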

The tests also have less of a ceiling this year (was the previous version not an adaptive test, while this past year's administration was?). You can see the difference by selecting "scale" when viewing results.

-onward
Anonymous said…
I checked the scale scores for our high-performing elementary versus Cascadia (the north-end cohort) at my child's grade, and the Cascadia scores ARE significantly higher at the upper end of the distribution (presumably where the advanced learners' scores lie).

They should disaggregate the data by grade before making any claims. You cannot make any judgments on elementary self contained when it is lumped in with the middle school mess.

NP
Anonymous said…
@Melissa -

The 'pixie stick' lines on page 11 are trend lines with only two data points. Once a 3rd data point is added, the lines will look more like a typical data trend line.

data geek
Anonymous said…
"The critical mistake here is to assume that ability is fixed, not constantly
developing. It ignores the fact that to maintain a particular rank, a child must not only get better each year but must improve at the same rate as others who had the same initial score. Using status scores such as percentile ranks (or derivatives such as IQs) masks this year-to-year growth. If the same dimension were labeled "language development" rather than "giftedness," then we would expect to find some whose development was unusual at one point in time but not unusual at another."

From David Lohman

FWIW
Carol Simmons said…
"Also, among eighth-grade students, the gaps in performance among racial and socio-economic groups were evident: 78%of Asian American students and 66 % of white students passed the language arts test. About 42% of Black, Hispanic and Native Hawaiian/Pacific Islander students, and 35% of American Indian/Alaska Native students."

Seattle Times, August 17, 2016, "State schools chief 'ecstatic' over better student test results"

Disproportionality continues.
Anonymous said…
And did the Seattle Times actually fail to report performance gaps in regards to students with disabilities? Oh that's right, I forgot: we're supposed to take those for granted. It's the disability, stupid.

Clumsy me
Anonymous said…
Do people here actually expect equal outcomes, that Seattle will be the one district that magically overcomes poverty or learning disabilities? The goal is to progress each child as far as possible. That doesn't mean they are all the same. Similarly, it's not about gaps, which everyone knows exist; it's whether they are getting larger or smaller. Sometimes the various comments remind me of Madame Defarge knitting in the corner and plotting to drown the streets with everyone else's blood.

-Arghh
Anonymous said…
If schools, on average, improved in all grades in both ELA and Math, both district wide and state wide, how much of that "improvement" was a function of the cut scores?

Mercer Middle School (69% FRL) still seems strong, comparatively. Mercer outperformed Jane Addams (29% FRL) for 8th grade. Aki Kurose had significant gains for Gr8 ELA and Math.

Gr8 Math meeting standard (excluding No scores):
Aki Kurose 57.8%
Denny 59.5%
Eckstein 75.3%
Hamilton 85.8%
JAMS 68.9%
Madison 59.2%
Mercer 72.2%
Washington 64.0%
Whitman 63.2%

Gr8 ELA meeting standard (excluding No scores):
Aki Kurose 57.8%
Denny 68.5%
Eckstein 87.6%
Hamilton 84.6%
JAMS 67.0%
Madison 66.3%
Mercer 68.7%
Washington 69.7%
Whitman 67.9%

-onward
Lynn said…
That's interesting data. Here are the math passing rates for low income 8th grade students (number of low income students in parentheses):

Aki 55% (90)
Denny 53% (105)
Eckstein 34% (11)
Hamilton 56% (14)
JAMS 31% (15)
Madison 40% (25)
Mercer 64% (145)
Washington 43% (69)
Whitman 25% (20)

I'm not sure this is actually useful information when some schools have less than 30 low income students per grade. (Though that's news to me.)
Lynn said…
Oops - I missed McClure 38% (14)

Here is the number of low income 8th grade students in each region:

NE 26
NW 34
Central 83
SE 235
SW 130

This does not include the 88 students attending K-8 schools.
Lynn said…
My count of low income students is off - those are the numbers who were proficient in math rather than total students.
Anonymous said…
Lynn, how do you explain those discrepant levels of achievement at schools, given that you have continually stated that students from "poor families" are unable to attain high levels of achievement due to cognitive deficiencies related to poverty?

FWIW
Anonymous said…
Mercer is 69% FRL, but also 43% Asian. Mercer also was (still is?) using a different math program.

2015-16 Gr8 Math meeting standard by OSPI category
(%age pt. increase/decrease from 2014-15)

Asian 89.6% (+6.2)
Black/African American 45.3% (+6.3)
Hispanic/Latino 59.1% (+9.1)
White 85.7% (+3.1)
Limited English 52.5% (+14.8)
Low income 64.1% (+0.9)

-onward
