Peters and Patu Offer Motion to Suspend Smarter Balanced Assessment
Directors Patu and Peters are offering a motion tomorrow night at their Board meeting to suspend Smarter Balanced testing. I support this motion if only for discussion purposes.
I absolutely concur. On a technical level, I do not believe most third-graders are ready for this kind of testing on a computer. It demands a degree of motor skill AND computer skills that many third-graders are unlikely to have. And, if your household does not have a computer, even more so.
*PLEASE, write to the Board and ask that they vote for this motion
if ONLY to have a real discussion. Please put SBAC in your subject
line.
schoolboard@seattleschools.org
OR
sharon.peaslee@seattleschools.org
sherry.carr@seattleschools.org
betty.patu@seattleschools.org
stephan.blanford@seattleschools.org
harium.martin-morris@seattleschools.org
marty.mclaren@seattleschools.org
sue.peters@seattleschools.org
(If no SB test, then the district can just use the MSP and HSPE tests.)
About computer versus paper and pencil SB testing (from SBAC's FAQs page):
It almost doesn't matter, because the district is saying they are going all-computer except for students with IEPs (and even then they say those students "should continue to have that option," not "will have that option"). From SBAC:

"Smarter Balanced will make a paper-and-pencil version of the summative assessment available during a three-year transition period. Both the paper-and-pencil and computer adaptive tests will follow the same test blueprint—meaning the same content areas and skills will be assessed. Smarter Balanced will conduct research to ensure that results are comparable across the two modes of assessment."
There has never been a real discussion in Seattle about this test (or even Common Core).
Better late than never.
Comments
Given how the union which takes our dues does:
NOT have a clear, open process for New Business Items (NBIs) that is accessible to members (see prior obstruction by "leadership"),
NOT have a defined, reasonable comment period on NBIs open to members, such as a week or two (see prior obstruction by "leadership"),
NOT have a robust online means for members to debate NBIs being proposed (see rule by competing back-door, back-room cliques),
Who knows exactly what was proposed, who supported it and why, and who opposed it and why ---
other than word of mouth tales of how, once again, Jonathan-on-the-side-of-Gate$ Knapp shut down attempts at member involvement in this disaster known as SBAC.
CliquesRule
Kudos to Director Peters and Director Patu for their courage to begin this conversation!!
Unfortunately, though, the district can't simply revert to the MSP. The MSP no longer exists. The testing contractor that developed and administered the MSP no longer has a contract with the state. It wouldn't be possible to administer the MSP this spring, for sure.
If SPS passes this resolution, then it would essentially be declaring that it would not administer a state or federal accountability assessment.
--- swk
I intellectually agree with you. But emotionally, I say "who really cares." We can say we are sorry and we will do better next year.
-fed up
you're on a Gate$ payroll to advance their talking points?
CliqueRule
I took the language regarding the MSP and the state and federal accountability measures from the resolution.
--- swk
IF
You L I K E what these two Directors are doing, but, not being on the Board, cannot vote for this amendment directly, then heck -- show them your unflinching support by opting out your student(s)!!!
Have their back - they have yours!
Do what serves your student best.
Having all third graders, about 5,000 8- and 9-year-olds, sitting at a computer for hours and hours -- what is that going to prove -- and why would we do this to them?!
It's nuts. Let the teachers teach, and the principals monitor their buildings, and the students learn and grow. If the grown-ups all do their jobs, then there is no reason for any young child to be subjected to this Smarter Balanced tricky test. Parents, your children need you to really think deeply about what you think is in their best interest, and act accordingly.
I am not anti-testing, but, I am anti-group think, and this Smarter Balance fiasco has mushroomed out of control (too much, too often, too intense, and too pointless).
Get informed, think about the needs of your child, think about what School could be/should be like, then act accordingly. But after FAST - because the testing has begun.
Newly opting-out
Even Florida's Republican governor, Rick Scott, an extremely pro-testing politician, suspended the PARCC test for 11th graders - the same thing Nathan Hale voted to do last week with SBAC.
I support Peters and Patu on this and hope the other board members will follow suit.
This is very exciting.
SBAC unbeliever
... test coordinator. Middle schools essentially have one too, either the counselor or the vice principal. We lament the lack of counselors -- why? So we could use them for testing? Computer resources? Gone. Basically 1/3 of instructional time is now spent testing. After spending one month doing winter Amplify, some schools are even administering the "Amplify check point." Evidently, teachers have forgotten how to do anything else. That's 4 years less instruction over a public school career, and 4 years less instruction than students enrolled in private schools. Public schools have abdicated their role in "closing the achievement gap" when they embrace testing to the extent they have. Testing doesn't make students better at anything. And really, it's Public Testing, not public school. Why do we put this on our kids? We don't test any other groups this way. Maybe we should start testing everyone, and tattoo the scores on people's foreheads.
Empl
Go back and read the link you posted again.
--- swk
-questions
--- swk
Florida had been signed up for PARCC but backed out, created its own standards, and bought another test. But in Miami-Dade, Broward, and other large districts, they had to suspend testing because of technical problems. From Alberto Carvalho, superintendent in Miami-Dade:
"In light of the fact that the FLDOE has not yet provided districts with assurances on the stability of the testing environment for the computer-based FSA ELA/Writing assessment, M-DCPS will suspend computer-based testing in grades 8, 9, and 10 again for tomorrow, Wednesday, March 4, 2015."
"With many questions unanswered and with many doubts looming, state writing assessment begins today. Of particular concern, beyond the reliability and validity issues, is the fact that many students with limited exposure to technology will be assessed on a computer for the first time. Was there consideration given to the real possibility that technical skill will influence content mastery results? Nothing short of a reasonable accountability transition, which must include treating this year as a baseline development year, makes sense. Getting it right must trump getting it done."
Today, Wednesday the 4th, it is being reported that half the districts in Florida have problems and they have suspended testing AGAIN. 3rd and 4th graders have been taking a paper-based exam and now grades 4-7 will start next week.
So really, this massive change in testing is using students as guinea pigs and has now opened itself up to major questions about the validity of the testing.
At the least, SPS should use paper tests for 3rd graders. That's the most fair thing to do for kids that young.
So what does this mean for these students, who were likely to meet their graduation requirement with the HSPE and now won't?
What is SPS plan? Lower the SBAC pass rate so that 75% pass the test? Or...?
Seems like a good talking point for the board to discuss. Seems like a great FAQ for the district website.
Links to the SBAC website where anyone can sign in as a guest and take the tests have been posted here on multiple occasions. Here it is again:
https://login4.cloud1.tds.airast.org/student/V42/Pages/LoginShell.aspx?c=SBAC_PT
Not only would the instructions given on the ELA performance task never make it past any decent editor, but many of the questions themselves are designed to confuse. Is this really the best way to approach learning for an 8-year-old?
Don't feel like taking the test? You can go ahead and read the insane instructions here:
http://www.wweek.com/portland/blog-32408-are_you_smarter_than_a_third_grader_smarter_balanced_practice_test_flummoxes_teachers_and_parents.html
I will be writing the School Board. I will be attending the Seattle Opt Out meetings. I will be opting my child OUT of this ridiculous test that was clearly made for the benefit of the testing companies and not for the benefit of the students taking the stupid thing.
So, be strategic. Reach out to Marty and Sharon.
Both of them can be highly independent thinkers, and very supportive of kids and what is directly in the children's best interests.
Reach out to Marty. Email Sharon. Ask them to vote yes.
Having children sit for an experimental hours-long exam only to be told they failed is a waste of resources and time, and a demoralizing, toxic experience for children. It is the definition of pointlessness.
Marty & Sharon
Let me be clear --- the online testing system that Florida is currently using is the same online testing system that our students will be using starting next week.
I'm not suggesting that WA will experience the same technical problems as Florida. I'm just putting out some information.
--- swk
The testing schedule for our school has not been announced. Will parents get a heads up?
crazy&crazier
--- swk
For any test, an item bank (or database of test questions) is created. Each item in that bank is tagged with at least three pieces of information: the content area of that item (English/language arts or math, in the case of SBAC), the grade level of that item, and the difficulty of that item (ranging essentially from easy to difficult).
When a test is developed --- let's say a 4th grade math test --- test developers will have at their disposal all of the 4th grade math items, ranging from easy to difficult, in that item bank. If the test developers were creating a "fixed form" test --- like the MSP --- they would focus their attention on those items tagged at the middle range of difficulty, to best indicate scores of "proficiency." In other words, because deciding a proficient score is the most important, most of the test questions would be clustered around proficiency (or the middle range).

And because tests are finite, they are limited in the number of items that can go on any given test. That means there will be fewer test items clustered around the low and high ends of difficulty. Items cluster around proficiency because test developers want to be as precise about a student's proficiency score as possible --- in other words, to eliminate as much measurement error around the critical determination of proficiency as possible. Scores that students receive at the lowest and highest ends of any given test are then less precise, or have higher degrees of measurement error, than the scores of students who achieved proficiency or just missed it, for no other reason than that there are fewer items at those ends of the test. To that end, much more confidence can be stated about a proficient score than about one of "below basic" or "advanced."
Now, when test developers are creating a computer adaptive test, they still use the same item bank as for a fixed form test. However, they are not limited to clustering items at the middle range, or proficiency. Test developers still have at their disposal the full range of difficulty of 4th grade math items, and all of the items on that adaptive 4th grade math test are 4th grade math items.

All students begin with the same set of questions, which they answer correctly or incorrectly. The online test system then begins to present students with different sets of questions according to how they responded to this initial set. Students who answer correctly will begin to get more difficult questions until they exhaust the grade-level item bank and finish their test. Students who answer incorrectly will get easier questions, and the process works the same way. With an adaptive test, much more confidence in student scores at the lower and higher ends of difficulty is attained. This 4th grade adaptive test is still at grade level; however, we can have as much confidence in an "advanced" score as we do in a "proficient" score. The same can't be said about a fixed form test. A computer adaptive test provides much more precise scores at all ranges of a test than a fixed form test. But it is still an at-grade-level test.
--- swk
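The item-selection logic swk describes can be sketched as a short simulation. To be clear, everything here is invented for illustration -- the item bank, the single-integer difficulty tags, and the one-step up/down rule are all made-up assumptions; real adaptive engines like SBAC's use item response theory, not this crude rule.

```python
import random

# Hypothetical item bank: 4th-grade math items tagged with a difficulty
# from 1 (easy) to 9 (hard). Real banks tag items with IRT parameters,
# not a single integer -- this is purely illustrative.
ITEM_BANK = {d: [f"g4-math-d{d}-q{i}" for i in range(20)] for d in range(1, 10)}

def run_adaptive_test(answers_correctly, num_items=10):
    """Present items one at a time, stepping up in difficulty after a
    correct answer and down after an incorrect one (a crude step rule,
    standing in for the real adaptive engine)."""
    difficulty = 5                    # everyone starts at the middle range
    administered = []
    for _ in range(num_items):
        item = random.choice(ITEM_BANK[difficulty])
        correct = answers_correctly(item, difficulty)
        administered.append((item, difficulty, correct))
        if correct:
            difficulty = min(9, difficulty + 1)   # harder next item
        else:
            difficulty = max(1, difficulty - 1)   # easier next item
    return administered

# Simulate a student who can handle items up to difficulty 7: the test
# climbs from the middle, then oscillates around that student's level.
history = run_adaptive_test(lambda item, d: d <= 7)
```

Run against different simulated students, and the administered difficulties settle around each student's level -- which is the mechanism behind swk's point that an adaptive test can place scores at the extremes more precisely than a fixed form.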
This was last week. Hopefully, they can be resolved soon.
- here's hopin'
http://www.hawaiipublicschools.org/TeachingAndLearning/Testing/StateAssessment/Pages/Smarter-Balanced-Practice-Questions.aspx
It is useful in that you can look at sample questions by grade and content (math or ELA), with scoring guidelines. An example of an item with multiple answers, and what looks like an all or nothing score, is in the second problem of the 7th grade math link. Students must select expressions that are equivalent to 2(2x+1). The answer is YYNYY (4 expressions are equivalent and 1 is not), and the item is worth 1 point total. It is considered medium difficulty.
The next item has four answers, and full credit (2 points) is given only if all four answers are correct. 0 points are given if 2 out of 4 answers are wrong.
Another problem is of the open response/justify-your-answer variety (see the circle-in-a-square item). The example of a top-score response requires a lot of typing... are students given scratch paper to do their work and then expected to type it up in detail, or do they have to do all the work via computer from the outset (no scratch paper)?
[I had the same question about adaptive testing - thanks for the clear explanation, swk]
-parent
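The all-or-nothing scoring -parent describes is easy to express as a rule. The function below is a guess at the rule as described in the comment, not SBAC's published rubric:

```python
def score_multi_select(student_answers, answer_key, points=1):
    """All-or-nothing scoring, as described for the 2(2x+1) item:
    full credit only when every Yes/No selection matches the key."""
    return points if student_answers == answer_key else 0

# The 7th grade math item above: five expressions, key Y-Y-N-Y-Y
# (4 equivalent to 2(2x+1), 1 not), worth 1 point total.
key = ["Y", "Y", "N", "Y", "Y"]

score_multi_select(["Y", "Y", "N", "Y", "Y"], key)  # all five correct: 1
score_multi_select(["Y", "Y", "Y", "Y", "Y"], key)  # one wrong: 0
```

Under this rule a student who correctly classifies four of the five expressions earns exactly the same score as one who gets all five wrong, which is the concern raised above.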
But SPS is requiring 10th graders to take and pass the SBAC test to fulfill their Reading requirement for graduation.
So, 10th graders have to pass a test designed for 11th grade. How in the world did that ever get approved?
Sent: Wednesday, March 4, 2015 11:11:19 AM
Subject: Motion to introduce SBAC Resolution 03-04-15
Thank you for your service.
Please vote to introduce the SBAC and Testing resolution to the floor this evening.
Whether or not you support the ultimate resolution - a discussion would be healthy and transparent and an educational opportunity no matter your politics (and it is political - very - especially the transparency/discussion part).
My reasons are:
Loss of education time
Loss of access to libraries throughout the district
Loss of social-emotional strides for the 60% of children who will be labeled as "failures," according to current forecasts
Loss of integrity to evaluation systems - both of teachers and schools (Common Core/SBAC was not psychometrically designed for these uses)
Inappropriate computer-skill requirements on the SBAC for youngsters and for those without computers in the home (we talk an awful lot about equity, and about the fact that a great many of our children/students are not lucky enough to be born with privileges, e.g., country of origin, skin color, two parents with educations and good jobs, and the resources to compete); this is one of those insidious assumptions that harm our kids, and it is especially ironic given the lock-up of libraries and computer labs -- because of testing!
Recognition of the current and oncoming onslaught of opting out by parents, teachers, and ultimately districts in response to the Gates-driven corporate reform agenda: Common Core/PARCC and SBAC.
An old WA state senator, Gene Lux from the 37th, used to say in the Legislature (with perfect timing and a big, beautiful baritone voice), when banks and insurance companies fought: "when alligators and crocodiles fight - the people always win." When both the left and the right of the political spectrum agree on something, perhaps something is, indeed, wrong.
Would so much rather spend our time (and money) dealing with MTSS, PD for teachers including for differentiation with more and more "blended" programs and focusing on social-emotional, adopting well overdue curricula, buying textbooks, addressing equity, and the like.
Thank you for listening.
Respectfully,
Leslie Harris
Parent, Communitymember, Taxpayer
tele: 206.579.2842
email: harrislsh@comcast.net
"Thank you very much for your email and for expressing your concerns around the smarter balance assessment. I will not be supporting that resolution I believe that it is ill-conceived ill-timed and will hurt the very children that we hare trying to help. This resolution has the potential of us losing over $20 million dollars that helps our struggling students and our special education students. In addition we would be in violation of state law.
Regards,
Harium Martin-Morris
School Board Director, District III"
~ SBAC Worried
Then again, we're also talking about corporations wanting access to the public trough.
Here is what Linda Darling-Hammond had to say today:
Test-based accountability produced no gains from 2000-2012 on PISA. Time for a new approach? https://t.co/OolPNYl3s3 pic.twitter.com/cBPBdkJURL
— LindaDarling-Hammond (@LDH_ed) July 2, 2014
Meaning, huge numbers of kids fail. The technology goes down. There's confusion. There are large numbers of refusals.
The question voters can ask is, why didn't you even allow a discussion on this issue even when you knew - from media reports - of issues nationwide?
Because, on the one hand, you can go the Martin-Morris route. I'm not sure that all his claims are true but he can say that's what he thought/was told.
On the other hand, if a director votes against this measure, it certainly doesn't look like they even were willing to give it a full airing. That's not much transparency.
Directors owe the public this conversation-especially the folks at Nathan Hale.
The claim that schools will lose funding is true only in a limited way.
For states without a waiver: Under NCLB, a school that fails to make AYP for two years must set aside up to 15% of its Title I federal funds to use for transporting volunteer students to a non-failing school. Because nearly all schools are now ‘failing,’ the transfer issue has become irrelevant. If a school does not make AYP for three years, it must put aside up to 15% of its Title I funds for ‘supplemental educational services’ (‘tutoring’). The 15% funds are not available for regular school use. Districts, however, are eligible to run their own SES programs.
In a state with a waiver, a “priority” school must set aside 5-15% of its federal Title I and II funding to use in state-approved programs in the school. The money is not ‘lost.’ It generally may be used for various school improvement efforts.
-fairtest.org
http://www.nytimes.com/2015/02/08/books/review/the-test-by-anya-kamenetz.html
bookish
Please run Melissa.
The student would eventually need to return and answer the question, though. A skipped response would only be counted as incorrect once the student ends the test.
--- swk
As to re-election, I would guess some are not standing again so they may not care what constituents think. Others like Carr never seemed to have started to care.
She began the evening by telling a story about how she wrote a different book (about Montessori, Waldorf, etc.), but her agent told her he probably couldn't get it published. She then decided instead to write a book about testing, because that would probably get published and maybe sell.
She spent 18 months "researching" the complicated issue of testing and wrote and published her book. I was underwhelmed by her to say the least.
--- swk
Discussing the matter is a good thing - the prudent thing, and just plain good management...
reader47
What I wonder about adaptive tests that cover a range of different skills is whether, on the first skill tested, they use a couple of questions and answers to determine the level of THAT skill, and then go on to test OTHER skills using questions at the level determined by the answers around the first skill. Or, for each DIFFERENT skill tested, does the program reset to, say, mid-level and start over determining the level of THAT skill? In other words, does the test adapt all the way through to determine an overall level, or does it adapt only within each set of questions around each skill, then reset for the next skill?
I attended the (unpaid) two-hour SBAC class for teachers, given by Sean Cook (?), SBAC manager for SPS. I specifically asked if each test assessed only the standards for its grade level (e.g., does the fifth grade math SBA assess only fifth-grade math standards?), and Sean replied that, because each test is adaptive, each test can assess standards from up to two grade levels above and two grade levels below. This surprised me, in part because the MSP could assess ONLY the standards for the grade level it assessed. With the adaptive SBA, it seems that students are essentially penalized for not having been taught standards above their grade level, which most students aren't. Also, this seems to contradict your comment that "All of the items on that adaptive 4th grade math test are 4th grade math items," no? Am I misunderstanding something? I appreciate your careful understanding of all things SBA.
--TeacherMom
http://www.smarterbalanced.org/wordpress/wp-content/uploads/2014/10/SmarterBalanced-Adaptive-Software.pdf
It explains that 2/3 of the test is at grade level, and then "the question pool is expanded to include (as needed) questions either from below (or above) the student’s grade level."
-parent
When a student takes a standardized test, there are generally three different kinds of scores generated: (1) a raw score --- the number of questions answered correctly out of the number of questions on the test, (2) a scale score --- a relatively arbitrary set of scores corresponding to a range of raw scores and the difficulty of the range of questions answered (e.g., the HSPE has 400 as the minimum proficiency scale score), and (3) a performance/achievement level score (e.g., the HSPE/SBAC has 4 performance/achievement levels --- Level 1 – Below Basic, Level 2 – Basic, Level 3 – Proficient, and Level 4 – Advanced). The individual student score reports that parents received on state tests don’t usually include scale scores because, essentially, those are not as useful to indicate a student’s performance. I will come around to that, hopefully.
So, when a student takes a computer adaptive test (CAT), the test adapts to the student according to how the student answers questions. I think we’re all straight on that. But let’s get back to TeacherMom’s and Linh-Co’s questions about the potential of students answering the first set of questions correctly and thus receiving more difficult questions. If the student answers those questions correctly, the student will receive progressively more difficult questions throughout the test, possibly leading to questions above the student’s actual grade level. [Sorry I didn’t acknowledge this possibility above. I almost did, but it seemed too confusing and I cut it out of my explanation.] The student may start answering these more difficult questions incorrectly. If so, the test will adapt again and give easier questions. But the student is not “penalized” for answering more difficult questions incorrectly. This is because the raw score is fairly useless at this point. What we’re really trying to find out is the performance/achievement level of this student.

Let’s take a 4th grade student who answered correctly the initial set of grade-level questions aligned to proficiency, then proceeded to answer progressively more difficult questions correctly, but then started to answer some 5th grade questions incorrectly. We would conclude from the results that this student is at Level 4 – Advanced, due to how the student performed on the range of difficult questions. And this student would also be in the upper range of scale scores. But this student might have had a raw score of 25 out of 30 questions. That doesn’t tell us much. Hypothetically, another 4th grade student could have a raw score of 25 out of 30 questions, but the majority of those questions could have been easier questions, including even some 3rd grade questions. That raw score then tells us very little.
--- swk
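swk's point -- that two students with identical raw scores can land at very different achievement levels -- can be shown with a toy calculation. The difficulty weighting and the cutoffs below are completely made up for illustration; real scale scores come from item response theory models, not a weighted sum.

```python
def toy_scale_score(responses):
    """responses: list of (difficulty, correct) pairs.
    Weight each correct answer by its difficulty -- a crude, invented
    stand-in for how harder items contribute more evidence of ability."""
    return sum(d for d, correct in responses if correct)

def toy_level(scale, cutoffs=(10, 20, 30)):
    """Map a scale score to one of four achievement levels
    using made-up cutoff scores."""
    for level, cut in enumerate(cutoffs, start=1):
        if scale < cut:
            return level
    return 4

# Two students, each with the same raw score: 5 correct out of 6.
advanced = [(5, True), (6, True), (7, True), (8, True), (9, True), (9, False)]
basic    = [(1, True), (2, True), (2, True), (3, True), (3, True), (4, False)]

toy_scale_score(advanced)  # 35 -> level 4
toy_scale_score(basic)     # 11 -> level 2
```

Both simulated students answered 5 of 6 items correctly, but the difficulty of the items they saw, not the count, is what separates "advanced" from "basic" -- which is why the raw score alone tells us very little on an adaptive test.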
The MSP and EOC reports to parents, on the Source, include both scale scores and proficiency levels. On request to OSPI, a parent can get more detailed reports that show how many questions were missed.
The NWEA MAP provides a scale score plus a percentile (no proficiency levels, as MAP is not used for state accountability purposes). Percentile conversion charts are available to compare scale scores across grade levels.
The sample SBAC reports for parents indicate a scale score and performance level, very similar to what parents receive with the MSP and EOC.
-parent
My mistake. And I re-read my comments several times before I posted.
--- swk
That seems wonky. How many questions would a student get about "character" and then "metaphor"? If it starts high on metaphor and the student tanks the question, how many chances do they get on easier metaphor questions, and does the system then start them low on the next skill, say, "irony"?
I don't get how this effectively tests each skill if it starts questioning on one skill at a high level and starts testing the next skill at a low level. It doesn't seem there would be time to give a series of questions on each skill in order to determine the level on each skill. It seems like it will be testing all skills together and merely guesstimating an approximate level at the end of the test (wandering up and down, depending on student answers, until the end, where it spits out a general level).
Will parent/guardians and educators have access to data that shows student proficiency in each skill (or knowledge) tested?
My understanding is that no one can see the actual test the student took, or the answers the student gave. THAT would be helpful in trying to figure out what the student knows about each skill.
--- swk