Peters and Patu Offer Motion to Suspend Smarter Balanced Assessment

Directors Patu and Peters are offering a motion at tomorrow night's Board meeting to suspend Smarter Balanced testing. I support this motion, if only for discussion purposes.

I absolutely concur. On a technical level, I do not believe most third-graders are ready for this kind of testing on a computer. It demands a degree of motor skill AND computer skill that many third-graders are unlikely to have, and even more so if their household does not have a computer.

The motion is tentative because it needs four votes to go forward to committee and then come back to the full Board for a vote.

PLEASE write to the Board and ask that they vote for this motion, if ONLY to have a real discussion. Please put SBAC in your subject line.

 schoolboard@seattleschools.org

OR
sharon.peaslee@seattleschools.org
sherry.carr@seattleschools.org
betty.patu@seattleschools.org
stephan.blanford@seattleschools.org
harium.martin-morris@seattleschools.org
marty.mclaren@seattleschools.org
sue.peters@seattleschools.org
 
(If no SB test, then the district can just use the MSP and HSPE tests.)
 
About computer versus paper and pencil SB testing (from SBAC's FAQs page):
Smarter Balanced will make a paper-and-pencil version of the summative assessment available during a three-year transition period. Both the paper-and-pencil and computer adaptive tests will follow the same test blueprint—meaning the same content areas and skills will be assessed. Smarter Balanced will conduct research to ensure that results are comparable across the two modes of assessment.
It almost doesn't matter because the district is saying they are going all-computer except for students with IEPs (and even then they say those students "should continue to have that option," not "will have that option").

It's important to look at the supporting data in the motion: the field test data show that many, many students will do poorly on this test (as they have in other states; see New York). The district, in its explanation to parents about SB, dances around this issue, but parents really need to know that the failure rate may be as high as 30-40%.

There has never been a real discussion in Seattle about this test (or even Common Core).

 Better late than never.  

Comments

Anonymous said…
Some anti-SBAC / pro-Hale(?) thing got killed at the Seattle Education Association Board of Directors meeting last night by Jonathan Knapp.

Given how the union which takes our dues does:

NOT have a clear, open process for New Business Items (NBIs) that is accessible to members, (see prior obstruction by "leadership")

NOT have a defined, reasonable comment period on NBIs that is open to members, such as a week or two, (see prior obstruction by "leadership")

NOT have a robust online means for members to debate NBIs that are being proposed, (see rule by competing back-door, back-room cliques)

Who knows exactly what was proposed, who supported it and why, and who opposed it and why ---

other than word-of-mouth tales of how, once again, Jonathan-on-the-side-of-Gate$ Knapp shut down attempts at member involvement in this disaster known as SBAC.

CliquesRule
CowardlyLionWizardOz said…
SEA President Jonathan Knapp does not have the courage to lead. If I recall, Garfield led the MAP boycott. When the tide was turning... Knapp showed up for support.

Kudos to Director Peters and Director Patu for their courage to begin this conversation!!
Anonymous said…
I too would like to see this resolution move forward in order to have the discussion. I think this discussion has been lacking (except for this blog, of course).

Unfortunately, though, the district can't simply revert back to the MSP. The MSP no longer exists. The testing contractor that developed and administered the MSP no longer has a contract with the state. It wouldn't be possible to administer the MSP this spring, for sure.

If SPS passes this resolution, then, it would essentially be declaring that it would not administer a state or federal accountability assessment.

--- swk

Anonymous said…
--swk

I intellectually agree with you. But emotionally, I say "who really cares." We can say we are sorry and we will do better next year.

-fed up
Anonymous said…
nice to see swk jump right in with 1 of those Big Lie gems "accountability assessment"

you're on a Gate$ payroll to advance their talking points?

CliqueRule

Anonymous said…
CliqueRule, I actually read the resolution. Did you?

I took the language regarding the MSP and the state and federal accountability measures from the resolution.

--- swk
Anonymous said…


IF

you L I K E what these two Directors are doing but, not being on the Board, cannot vote for this amendment directly, then heck: show them your unflinching support by opting out your student(s)!!!


Have their back - they have yours!

Do what serves your student best.


Having all third graders, about 5,000 8- and 9-year-olds, sitting at a computer for hours and hours: what is that going to prove, and why would we do this to them?!

It's nuts. Let the teachers teach, the principals monitor their buildings, and the students learn and grow. If the grown-ups all do their jobs, then there is no reason for any young child to be subjected to this tricky Smarter Balanced test. Parents, your children need you to really think deeply about what you believe is in their best interest, and act accordingly.

I am not anti-testing, but I am anti-groupthink, and this Smarter Balanced fiasco has mushroomed out of control (too much, too often, too intense, and too pointless).

Get informed, think about the needs of your child, think about what school could be/should be like, then act accordingly. But act FAST, because the testing has begun.

Newly opting-out
This movement is spreading rapidly across the country right now. New Mexico students are getting the most attention today, but protests and opt-outs are taking place in large numbers in Colorado, Pennsylvania, New York, New Jersey, and elsewhere: http://news.yahoo.com/mexico-students-planning-walkouts-over-tests-152312919.html

Even Florida's Republican governor, Rick Scott, an extremely pro-testing politician, suspended the PARCC test for 11th graders - the same thing Nathan Hale voted to do last week with SBAC.

I support Peters and Patu on this and hope the other board members will follow suit.
Conversation Starts said…
Until now, we have not had leaders bring up the difficult questions!!

This is very exciting.
chunga said…
swk - MSP is still used. Refer to http://www.k12.wa.us/assessment/StateTesting/MSP.aspx
Anonymous said…
If any of you had bothered, before the month of testing began, to look at the sample tests and the way 3rd, 4th, and 5th graders are supposed to do math on a computer, you would have been shouting long before now. It's so stupid. Most students that age will have no idea how to handle the computer input. How can any result be valid if the kid has no expertise with some silly form of computer input? Especially kids with no computers at home.

SBAC unbeliever
Anonymous said…
Swk, do you have any idea the resources burned in testing???? Do you know that high schools have a FULL-TIME person on staff... a test coordinator? Middle schools essentially have one too, either the counselor or the vice principal. We lament the lack of counselors. Why? So we could use them for testing? Computer resources? Gone. Basically 1/3 of instructional time is now spent testing. After spending one month doing winter Amplify, some schools are even administering the "Amplify check point." Evidently, teachers have forgotten how to do anything else. That's 4 years less instruction over a public school career, and 4 years less instruction than students enrolled in private schools. Public schools have abdicated their role in "closing the achievement gap" when they embrace testing to the extent they have. Testing doesn't make students better at anything. And really, it's Public Testing, not public school. Why do we put this on our kids? We don't test any other groups this way. Maybe we should start testing everyone and tattoo the scores on people's foreheads.

Empl
Anonymous said…
chunga, since all we're talking about here is English/language arts and math, I will reiterate that the ELA and math MSP no longer exists.

Go back and read the link you posted again.

--- swk
Anonymous said…
Has any parent received Amplify test results (MAP results are posted to the Source)? How are schools actually using the data? Is it just considered practice for SBAC?

-questions
Anonymous said…
Robert, Florida doesn't use the PARCC tests. Curious about your statement, then.

--- swk
SWK, you know the answer but I'll let others know.

Florida had signed up for PARCC but backed out, created its own standards, and bought another test. But Miami-Dade, Broward, and other large districts had to suspend testing because of technical problems. From Alberto Carvalho, superintendent in Miami-Dade:

"In light of the fact that the FLDOE has not yet provided districts with assurances on the stability of the testing environment for the computer-based FSA ELA/Writing assessment, M-DCPS will suspend computer-based testing in grades 8 , 9, and 10 again for tomorrow, Wednesday, March 4, 2015."

"With many questions unanswered and with many doubts looming, state writing assessment begins today. Of particular concern, beyond the reliability and validity issues, is the fact that many students with limited exposure to technology will be assessed on a computer for the first time. Was there consideration given to the real possibility that technical skill will influence content mastery results? Nothing short of a reasonable accountability transition, which must include treating this year as a baseline development year, makes sense. Getting it right must trump getting it done."

Today, Wednesday the 4th, it is being reported that half the districts in Florida have had problems and have suspended testing AGAIN. 3rd and 4th graders have been taking a paper-based exam, and now grades 4-7 will start next week.

So really, this massive change in testing is using students as guinea pigs and has now opened itself up to major questions about the validity of the testing.

At the least, SPS should use paper tests for 3rd graders. That's the most fair thing to do for kids that young.
Po3 said…
Current pass rate for 10th grade Reading with the HSPE is 75%. Anticipated pass rate for the SBAC test is less than 40%.

So what does this mean for these students, who were likely to meet their graduation requirement with the HSPE and now won't?
What is SPS's plan? Lower the SBAC passing score so that 75% pass the test? Or...?

Seems like a good talking point for the board to discuss. Seems like a great FAQ for the district website.
StringCheese said…
SBAC Unbeliever is spot-on. I have never even considered opting my child out of testing before. I have no real issue with the basic idea of achievement tests. HOWEVER, the SBAC is a TERRIBLE test. Period.

Links to the SBAC website where anyone can sign in as a guest and take the tests have been posted here on multiple occasions. Here it is again:

https://login4.cloud1.tds.airast.org/student/V42/Pages/LoginShell.aspx?c=SBAC_PT

Not only would the instructions given on the ELA performance task never make it past any decent editor, but many of the questions themselves seem designed to confuse. Is this really the best way to approach learning for an 8-year-old?

Don't feel like taking the test? You can go ahead and read the insane instructions here:

http://www.wweek.com/portland/blog-32408-are_you_smarter_than_a_third_grader_smarter_balanced_practice_test_flummoxes_teachers_and_parents.html

I will be writing the School Board. I will be attending the Seattle Opt Out meetings. I will be opting my child OUT of this ridiculous test that was clearly made for the benefit of the testing companies and not for the benefit of the students taking the stupid thing.
Anonymous said…
For the Peters-Patu amendment to pass, MARTY AND SHARON have to vote YES. The other 3 will NOT, under any circumstances, vote for this.

So, be strategic. Reach out to Marty and Sharon.

Both of them can be highly independent thinkers, and very supportive of kids and what is directly in the children's best interests.

Reach out to Marty. Email Sharon. Ask them to vote yes.

Having children sit for an experimental, hours-long exam only to be told they failed is a waste of resources and time, and a demoralizing, toxic experience for children. It is the definition of pointlessness.

Marty & Sharon
Anonymous said…
Melissa, that's right. And here's some additional information --- the testing company hired by Florida to provide their new online tests is the same one WA hired to provide its new tests.

Let me be clear --- the online testing system that Florida is currently using is the same online testing system that our students will be using starting next week.

I'm not suggesting that WA will experience the same technical problems as Florida. I'm just putting out some information.

--- swk
Anonymous said…
Next week? Students are taking the tests as early as next week? They will have covered only a portion of the standards. MSP testing was traditionally done in the 2nd week after Spring Break, usually late April, not early March.

The testing schedule for our school has not been announced. Will parents get a heads up?

crazy&crazier
Anonymous said…
crazy, the state testing window is March 10 through June 15. There will be schools in the state that start testing next week.

--- swk
Linh-Co said…
I've heard the SBAC is adaptive. Can someone tell me how an at-grade-level test can be adaptive?

Anonymous said…
Linh-Co, I’m going to assume that your question is real and not rhetorical.

For any test, an item bank (or database of test questions) is created. Each item in that bank is tagged with at least three pieces of information: the content area of that item (English/language arts or math, in the case of SBAC), the grade level of that item, and the difficulty of that item (ranging essentially from easy to difficult).

When a test is developed --- let's say a 4th grade math test --- test developers will have at their disposal all of the 4th grade math items in that item bank, ranging from easy to difficult. If the test developers were creating a "fixed form" test --- like the MSP --- they would focus their attention on the items tagged at the middle range of difficulty, to best indicate scores of "proficiency." In other words, because determining a proficient score is the most important decision, most of the test questions would be clustered around proficiency (the middle range).

And because tests are finite, they are limited in the number of items that can go on any given test. That means there will be fewer test items clustered around the low and high ends of difficulty. Developers cluster items around proficiency because they want to be as precise as possible about a student's proficiency score; in other words, they want to eliminate as much measurement error around the critical determination of proficiency as possible.

Scores at the lowest and highest ends of any given test are therefore less precise, with higher degrees of measurement error, than the scores of students who achieved proficiency or just missed it, for no other reason than that there are fewer items at those ends of the test. To that end, much more confidence can be placed in a proficient score than in one of "below basic" or "advanced."

Now when test developers are creating a computer adaptive test, they still use the same item bank as for a fixed form test. However, they are not limited to clustering items at the middle range of proficiency; they still have at their disposal the full range of difficulty of 4th grade math items. All of the items on that adaptive 4th grade math test are 4th grade math items.

All students begin with the same set of questions, which they answer correctly or incorrectly. The online test system then presents each student with different sets of questions according to how the student responded to that initial set. Students who answer correctly will begin to get more difficult questions until they exhaust the grade-level item bank and finish the test. Students who answer incorrectly will get easier questions, and the process works the same way. With an adaptive test, much more confidence is attained in student scores at the lower and higher ends of difficulty. This 4th grade adaptive test is still at grade level, but we can have as much confidence in an "advanced" score as we do in a "proficient" score. The same can't be said of a fixed form test. A computer adaptive test provides much more precise scores at all ranges of a test than a fixed form test. But it is still "an at-grade-level test."

--- swk
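
To make the selection mechanics concrete, here is a minimal sketch of the adaptive loop described above, assuming a simple up/down rule: correct answers raise the difficulty of the next item, incorrect answers lower it, and the pool never leaves the grade-level bank. Real engines such as SBAC's use item response theory to choose items and estimate ability, so the names, five-point difficulty scale, and logic here are purely illustrative.

import random

def simulate_adaptive_test(item_bank, answer_fn, num_items=20):
    """item_bank: dict {difficulty 1..5: list of items}, all at one grade level.
    answer_fn(item) returns True if the student answers the item correctly."""
    difficulty = 3  # every student starts with the same mid-range set
    log = []
    for _ in range(num_items):
        item = random.choice(item_bank[difficulty])
        correct = answer_fn(item)
        log.append((item, difficulty, correct))
        # adapt: harder after a correct answer, easier after an incorrect one
        difficulty = min(5, difficulty + 1) if correct else max(1, difficulty - 1)
    return log

# Toy usage: a hypothetical student who reliably handles items up to difficulty 4.
bank = {d: [f"grade4-math-item-{d}-{i}" for i in range(10)] for d in range(1, 6)}
results = simulate_adaptive_test(bank, lambda item: int(item.split("-")[3]) <= 4)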
Anonymous said…
Everett has "practiced" taking sample SBAC tests online with their 10th and 11th grade students. They had ridiculously frustrating, time-consuming problems with the technology (kids getting kicked off the program, logging back in, teacher approval of their log in, rinse and repeat all day long). The tech people stated that they were "working with the vendor to resolve the issues."
This was last week. Hopefully, they can be resolved soon.
- here's hopin'

Anonymous said…
The following link to SBAC sample items had been posted previously:

http://www.hawaiipublicschools.org/TeachingAndLearning/Testing/StateAssessment/Pages/Smarter-Balanced-Practice-Questions.aspx

It is useful in that you can look at sample questions by grade and content (math or ELA), with scoring guidelines. An example of an item with multiple answers, and what looks like an all or nothing score, is in the second problem of the 7th grade math link. Students must select expressions that are equivalent to 2(2x+1). The answer is YYNYY (4 expressions are equivalent and 1 is not), and the item is worth 1 point total. It is considered medium difficulty.

The next item has four answers, and full credit (2 points) is given only if all four answers are correct. 0 points are given if 2 out of 4 answers are wrong.

Another problem is of the open-response/justify-your-answer variety (see the circle-in-a-square item). The example of a top-score response requires a lot of typing... are students given scratch paper to do their work, and then expected to type their work in detail, or do they have to do all work via computer from the outset (no scratch paper)?

[I had the same question about adaptive testing - thanks for the clear explanation, swk]

-parent
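
As a concrete illustration of the all-or-nothing scoring the 7th grade example implies, here is a tiny sketch; the key "YYNYY" comes from the sample item above, while the function name and point values are illustrative assumptions, not SBAC's actual scoring engine.

def score_multi_answer_item(responses, key="YYNYY", points=1):
    """responses and key are strings of Y/N judgments, one per expression.
    The item's single point is earned only if every judgment matches the key."""
    return points if responses == key else 0

score_multi_answer_item("YYNYY")  # 1 point: all five judgments correct
score_multi_answer_item("YYNYN")  # 0 points: one wrong judgment forfeits all credit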
The MAP test was adaptive too; however, this led to the main problem my son had with MAP: you cannot 'pass' on a question and go back later. He found this 'fail now if you cannot answer immediately or run out of time' approach deeply problematic.
Po3 said…
I see that SBAC is designed for grades 3-8 and 11.

But SPS is requiring 10th graders to take and pass the SBAC test to fulfill their Reading requirement for graduation.

So, 10th graders have to pass a test designed for 11th grade. How in the world did that ever get approved?
Anonymous said…
My plea to the SPS Board this a.m.

Sent: Wednesday, March 4, 2015 11:11:19 AM
Subject: Motion to introduce SBAC Resolution 03-04-15



Thank you for your service.


Please vote to introduce the SBAC and Testing resolution to the floor this evening.

Whether or not you support the ultimate resolution, a discussion would be healthy, transparent, and an educational opportunity no matter your politics (and it is political, very much so, especially the transparency/discussion part).

My reasons are:

Loss of education time

Loss of access to libraries throughout the district

Loss of social-emotional strides for the 60% of children who will be labeled as "failures" according to current forecasts

Loss of integrity in evaluation systems, both of teachers and of schools (Common Core/SBAC was not psychometrically designed for these uses)

Inappropriate demands on SBAC computer skills for youngsters and for those without computers in the home. We talk an awful lot about equity and the fact that a great many of our children/students are not lucky enough to be born with privileges (country of origin, skin color, two parents with educations and good jobs) and the resources to compete. This is one of those insidious assumptions that harm our kids, and it is especially ironic given the lock-up of libraries and computer labs because of testing!

Recognition of the current and oncoming onslaught of opting out by parents, teachers, and ultimately districts in response to the Gates-driven corporate reform agenda: Common Core/PARCC and SBAC.

An old WA state senator, Gene Lux from the 37th, used to say in the Legislature (with perfect timing and a big, beautiful baritone voice), when banks and insurance companies fought: "When alligators and crocodiles fight, the people always win." When both the left and the right ends of the political spectrum agree on something, perhaps something is indeed wrong.

I would so much rather spend our time (and money) dealing with MTSS, PD for teachers (including for differentiation, with more and more "blended" programs), focusing on social-emotional learning, adopting well-overdue curricula, buying textbooks, addressing equity, and the like.

Thank you for listening.

Respectfully,


Leslie Harris
Parent, Community Member, Taxpayer
tele: 206.579.2842
email: harrislsh@comcast.net
Linh-Co said…
Thanks for the response SWK. It was a real question.
Anonymous said…
Harium won't be voting for resolution. Here is his response to my email:

"Thank you very much for your email and for expressing your concerns around the smarter balance assessment. I will not be supporting that resolution I believe that it is ill-conceived ill-timed and will hurt the very children that we hare trying to help. This resolution has the potential of us losing over $20 million dollars that helps our struggling students and our special education students. In addition we would be in violation of state law.

Regards,

Harium Martin-Morris
School Board Director, District III"

~ SBAC Worried
Let's face it, Carr, Martin-Morris, and Blanford will vote no; Peaslee and McLaren are the only two who may hop on board.
Gads said…
Washington State has a history of signing onto failed education reforms such as charter schools.

Then again, we're also talking about corporations wanting access to the public trough.

Here is what Linda Darling-Hammond had to say today:

Test-based accountability produced no gains from 2000-2012 on PISA. Time for a new approach? https://t.co/OolPNYl3s3 pic.twitter.com/cBPBdkJURL

— LindaDarling-Hammond (@LDH_ed) July 2, 2014

One thing to consider (and I'm sure any incumbents running in the fall are thinking about it) is what happens if the testing goes wrong.

Meaning, huge numbers of kids fail. The technology goes down. There's confusion. There are large numbers of refusals.

The question voters can ask is: why didn't you even allow a discussion on this issue when you knew, from media reports, of issues nationwide?

Because, on the one hand, you can go the Martin-Morris route. I'm not sure that all his claims are true, but he can say that's what he thought/was told.

On the other hand, if a director votes against this measure, it certainly doesn't look like they were even willing to give it a full airing. That's not much transparency.
Supporting Hale said…
The resolution is an introduction item ONLY and provides an opportunity for public discussion regarding concerns.

Directors owe the public this conversation-especially the folks at Nathan Hale.
Anonymous said…
Loss of Funds

The claim that schools will lose funding is true only in a limited way.

For states without a waiver: Under NCLB, a school that fails to make AYP for two years must set aside up to 15% of its Title I federal funds to use for transporting volunteer students to a non-failing school. Because nearly all schools are now ‘failing,’ the transfer issue has become irrelevant. If a school does not make AYP for three years, it must put aside up to 15% of its Title I funds for ‘supplemental educational services’ (‘tutoring’). The 15% funds are not available for regular school use. Districts, however, are eligible to run their own SES programs.

In a state with a waiver, a “priority” school must set aside 5-15% of its federal Title I and II funding to use in state-approved programs in the school. The money is not ‘lost.’ It generally may be used for various school improvement efforts.
-fairtest.org
Anonymous said…
This book is timely and sounds worthwhile: The Test: Why Our Schools Are Obsessed with Standardized Testing, But You Don't Have to Be, by Anya Kamenetz.

http://www.nytimes.com/2015/02/08/books/review/the-test-by-anya-kamenetz.html

bookish
Po3 said…
"ill-conceived ill-timed"

Please run, Melissa.
Anonymous said…
Concerned, I can't speak specifically to NWEA's or SBAC's test engines, but a computer adaptive test should allow a student to skip a question. The test should not count that skipped question as an incorrect response but rather provide another question in the same difficulty range as the skipped question.

The student would eventually need to return and answer the question, though. A skipped response would only be counted as incorrect once the student ends the test.

--- swk
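
A minimal sketch of the end-of-test accounting described here, assuming, as swk does, that a skipped item is replaced by one of similar difficulty and counts as incorrect only if it is still unanswered when the student ends the test; the function and data shapes are hypothetical, not NWEA's or SBAC's actual behavior.

def finalize_score(answered, skipped_unanswered):
    """answered: list of (item, answered_correctly) pairs.
    Items skipped and never revisited count as incorrect at test end."""
    correct = sum(1 for _, ok in answered if ok)
    total = len(answered) + len(skipped_unanswered)  # skips count against the student
    return correct, total

finalize_score([("q1", True), ("q2", False), ("q3", True)], ["q4"])  # returns (2, 4)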
That seems a reasonable compromise, swk, but MAP/NWEA does not support this. I will ask the district about SBAC.

As to re-election, I would guess some are not standing again, so they may not care what constituents think. Others, like Carr, never seemed to care in the first place.
Anonymous said…
bookish, I went and saw Anya Kamenetz speak at Town Hall not too long ago on her book.

She began the evening by telling a story about how she wrote a different book (about Montessori, Waldorf, etc.), but her agent told her he probably couldn't get it published. She then decided instead to write a book about testing because that would probably get published and maybe sell.

She spent 18 months "researching" the complicated issue of testing and wrote and published her book. I was underwhelmed by her to say the least.

--- swk
Anonymous said…
I guess I'm stumped on why stepping back for a minute and discussing the issue would be tantamount to losing $20 million, as Dir. Martin-Morris suggests. That seems like the "all or nothing" thinking that usually gets SPS into trouble.

Discussing the matter is a good thing, the prudent thing, and just plain good management...

reader47
seattle citizen said…
swk - my understanding of adaptive tests is that if a student "skips" a question (clicking on "skip"), you are correct that they get another, similar question that tests the same skill at the same level, and then the test moves on. But I believe it is incorrect to say that the student must go back (somehow...?) and answer the question they skipped. The skipped question was replaced with a similar question, AND the program took their answer and then "adapted," making the next question easier or harder, on down the line of questions. Going back to the skipped question would put that question out of order, or degree, of difficulty.

What I wonder about adaptive tests that assess a range of different skills is this: on the first skill tested, do they use a couple of questions and answers to determine the level of THAT skill, and then go on to test OTHER skills using questions at the level determined by the answers around the first skill? Or, for each DIFFERENT skill tested, does the program reset to, say, mid-level and start over determining the level of THAT skill? In other words, does the test adapt all the way through to determine an overall level, or does it adapt only within each set of questions around each skill, then reset for the next skill?
Linh-Co said…
No surprise, but nevertheless disappointed with Marty's vote.
Anonymous said…
swk,

I attended the (unpaid) two-hour SBAC class for teachers, given by Sean Cook (?), SBAC manager for SPS. I specifically asked if each test assessed only the standards for its grade level (e.g., does the fifth grade math SBA assess only fifth-grade math standards?), and Sean replied that, because each test is adaptive, each test can assess standards from up to two grade levels above and two grade levels below. This surprised me, in part because the MSP could assess ONLY the standards for the grade level it assessed. With the adaptive SBA, it seems that students are essentially penalized for not having been taught standards above their grade level, which most students aren't. Also, this seems to contradict your comment that "All of the items on that adaptive 4th grade math test are 4th grade math items," no? Am I misunderstanding something? I appreciate your careful understanding of all things SBA.

--TeacherMom
Linh-Co said…
I understand adaptive tests, but what isn't clear to me is how cut scores apply to the SBAC to determine proficiency. On the MSP, proficiency levels were determined based on the number of correct answers. If the test is adaptive and there is a range of question difficulty, wouldn't the kids who are seeing more difficult questions get fewer of them correct?
Anonymous said…
SBAC provides scale scores, similar to NWEA's MAP. A FAQ sheet on SBAC adaptive testing:

http://www.smarterbalanced.org/wordpress/wp-content/uploads/2014/10/SmarterBalanced-Adaptive-Software.pdf

It explains that 2/3 of the test is at grade level and then "the question pool is expanded to include (as needed) questions either from below (or above) the student's grade level."

-parent
Anonymous said…
TeacherMom, Linh-Co, et al, you are stretching the limits of my knowledge of test construction and psychometrics.

When a student takes a standardized test, there are generally three different kinds of scores generated: (1) a raw score --- the number of questions answered correctly out of the number of questions on the test, (2) a scale score --- a relatively arbitrary set of scores corresponding to a range of raw scores and the difficulty of the range of questions answered (e.g., the HSPE has 400 as the minimum proficiency scale score), and (3) a performance/achievement level score (e.g., the HSPE/SBAC has 4 performance/achievement levels --- Level 1 – Below Basic, Level 2 – Basic, Level 3 – Proficient, and Level 4 – Advanced). The individual student score reports that parents received on state tests don't usually include scale scores because, essentially, those are not as useful to indicate a student's performance. I will come around to that, hopefully.

So, when a student takes a computer adaptive test (CAT), the test adapts to the student according to how the student answers questions. I think we're all straight on that. But let's get back to TeacherMom's and Linh-Co's questions about the potential of students answering the first set of questions correctly and thus receiving more difficult questions. If the student answers those questions correctly, the student will receive progressively more difficult questions throughout the test, possibly including questions above the student's actual grade level. [Sorry I didn't acknowledge this possibility above. I almost did, but it seemed too confusing and I cut it from my explanation.] The student may start answering these more difficult questions incorrectly. If so, the test will adapt again and give easier questions. But the student is not "penalized" for answering more difficult questions incorrectly, because the raw score is fairly useless at this point. What we're really trying to find out is the performance/achievement level of this student.

Let's take a 4th grade student who answered correctly the initial set of grade-level questions aligned to proficiency, then proceeded to answer progressively more difficult questions correctly, but then started to answer some 5th grade questions incorrectly. We would conclude from the results that this student is at Level 4 – Advanced, due to how the student performed on the range of difficult questions. And this student would also be in the upper range of scale scores. But this student might have had a raw score of 25 out of 30 questions. That doesn't tell us much. Hypothetically, another 4th grade student could have a raw score of 25 out of 30 questions, but the majority of those questions could have been easier questions, including even some 3rd grade questions. That raw score then tells us very little.

--- swk
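
A toy illustration of the distinction drawn here between raw scores, scale scores, and performance levels: two students with identical raw scores land at different levels because the scale score reflects the difficulty of the items they saw. The scale values and cut scores below are invented for the example and do not match the real SBAC scale.

# Invented cut scores: each pair is (exclusive upper bound, level label).
CUT_SCORES = [(2500, "Level 1 - Below Basic"),
              (2550, "Level 2 - Basic"),
              (2600, "Level 3 - Proficient"),
              (9999, "Level 4 - Advanced")]

def performance_level(scale_score):
    """Map a scale score to a performance/achievement level."""
    for upper_bound, label in CUT_SCORES:
        if scale_score < upper_bound:
            return label
    return CUT_SCORES[-1][1]

# Same raw score, different item difficulty, different reported level.
student_a = {"raw": 25, "scale": 2620}   # answered mostly harder items correctly
student_b = {"raw": 25, "scale": 2480}   # answered mostly easier items correctly
performance_level(student_a["scale"])    # "Level 4 - Advanced"
performance_level(student_b["scale"])    # "Level 1 - Below Basic"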
Anonymous said…
The individual student score reports that parents received on state tests don’t usually include scale scores because, essentially, those are not as useful to indicate a student’s performance. I will come around to that, hopefully.

The MSP and EOC reports to parents, on the Source, include both scale scores and proficiency levels. On request to OSPI, a parent can get more detailed reports that show how many questions were missed.

The NWEA MAP provides a scale score plus a percentile (no proficiency levels, as MAP is not used for state accountability purposes). Percentile conversion charts are available to compare scale scores across grade levels.

The sample SBAC reports for parents indicate a scale score and performance level, very similar to what parents receive with the MSP and EOC.

-parent
Anonymous said…
Sorry, parent, that was a typo. I meant to state that the individual score reports don't "usually include raw scores because, essentially, those are not as useful to indicate a student’s performance."

My mistake. And I re-read my comments several times before I posted.

--- swk
seattle citizen said…
swk, a variety of skills are tested on the SBAC. If the first questions relate to, say, "character development," and the student knows this well and moves up far above level (and the questions get harder), and then the test starts testing another skill, say, "metaphor," will the test system still be set at the higher level and thus start asking questions at that higher level, but in the new skill?

That seems wonky. How many questions would a student get about "character" and then "metaphor"? If it starts high on metaphor and the student tanks the question, how many chances do they get on easier metaphor questions? And does the system then start them low on the next skill, say, "irony"?

I don't get how this effectively tests each skill if it starts questioning one skill at a high level and the next skill at a low level. There wouldn't seem to be time to give a series of questions on each skill in order to determine a level for each. It seems like it would test all the skills together and merely guesstimate an approximate level at the end (wandering up and down depending on student answers until the end, where it spits out a general level).

Will parents/guardians and educators have access to data that shows student proficiency in each skill (or knowledge area) tested?

My understanding is that no one can see the actual test the student took or the answers the student gave. THAT would be helpful in trying to figure out what a student knows about each skill.
Anonymous said…
Well, seattle citizen, you've stretched my knowledge of the SBAC tests beyond what I can comfortably discuss. I really don't know with any confidence the answers to your specific questions.

--- swk
