Common Core Testing Promoters Circling the Wagons

Seattle Schools has now finished just one part of the state testing cycle.  According to the schedule at OSPI, the 3rd grade reading is done.  That leaves plenty more to do before the June 15th cutoff date.  (I'm thinking the district will be done a lot sooner than that.)

CPPS and the Equity in Education Coalition are having an "informative community conversation on the Common Core State Standards and the Smarter Balanced Assessments" on April 27th at the African American Museum.  It includes Eric Anderson, the head of assessment for SPS.  The panel also includes two people from OSPI, a teacher from Kent and someone from the Office of Education.

But this appears to be by invitation only, sorry.

The invitation ends by saying:

We hope this will be an ongoing dialogue and the first of many community conversations on Common Core and the Smarter Balanced Assessments.

It's great that CPPS and the Equity in Education Coalition are trying to create an opportunity for dialogue around this issue.  I wish our own district had wanted to do this in any meaningful way.

I mean, a roll-out of new standards AND new assessments over two-plus years, and there hasn't been much in the way of community discussion (not to mention information) on Common Core and its assessments?  Why not?

As well, today Bill Gates weighed in on the latest drag on Common Core.  And boy, did he huff and puff out some doozy quotes.  From the Huffington Post:

According to prepared remarks provided to The Huffington Post, Gates told educators at the National Board for Professional Teaching Standards' Teaching and Learning Conference that the Common Core is the key to creativity for teachers.

He then argued:

Consistency of the Common Core across states, Gates argued, is a key ingredient in its potential success.  

So standards are created at one end and assessments at the other, and somewhere in between those two is "the key to creativity"?  I would think it would narrow what teachers can do.

And how do you keep fidelity - across 50 states - to Common Core standards and yet encourage creativity?

What's fascinating in his remarks is that he keeps circling back to teachers.  Over and over, almost as if he trusted and believed in them.  

The final laugher?

"Maybe we can't answer every tweet or post, but the authoritative voice on this is teachers," Gates said.

This from the guy who paid for Common Core standards to be created in the first place?  They were not even written by teachers (oh right, teachers gave "input").

The Times has an article today about how opting out might affect teacher evaluations and school scores.  They say juniors are "required" to take the test and that, of course, is not true.

I have several questions in to OSPI to clarify what score is given to a student who opts out.  I have been told by several sources that there is a "zero" score and a "test refusal" score and that the latter is not recorded as a zero.  I am also asking for clarification on whether any federal dollars are at risk if a school scores lower because of opt-outs.

I end by noting that, as we all know, the Times will very soon have to have an editorial lambasting parents over opting out.  In 5, 4, 3, 2, 1...

Comments

Po3 said…
Is 3rd grade testing really complete? Would love to hear from parents in the trenches if their 3rd grader is done. And how long it took them to complete the test.
Anonymous said…
My 3rd grader spent 2 half days on Language Arts and 2 half days on Math but it did not take her the entire 2 1/2 hour time slot each day. I think she took most of the time in math both days and finished with some time left with LA. She said she has one more piece of testing the Thursday after break, but the testing schedule her teacher sent us is buried away in paperwork and I don't feel like digging it out to see what it is for.

NE Mom of 3
Hmmm.. said…
Directors Peters and Patu attempted to begin a public conversation regarding SBAC and vulnerable populations, but Directors McLaren, Blanford, Carr and Martin-Morris shut down the conversation.

CPPS bills itself as a community organization and has gotten into the business of promoting charter schools. The lack of transparency is more than disturbing. Why is OSPI going to be at the meeting?
Anonymous said…
Don't forget that Peaslee emailed in her red herring issues re: the Peters/Patu resolution, claiming she wanted it to be "broader" and go beyond SBAC (essentially providing McLaren her fig leaf to fret, ponder, and then stonewall the resolution in committee).

barf
Anonymous said…
Melissa, I'll let OSPI provide the official answer but I'll provide one here in regard to how opt outs (i.e., refusals) are counted in case OSPI neglects to respond or is delayed in responding.

When a student opts out/refuses to take the assessment(s), that refusal is noted in the data file submitted to the district and to OSPI. However, when calculating proficiency rates of a school and district, that refusal is counted as a zero. In other words, all students (except those exempted from testing due to illness, etc.) are counted in the denominator of the proficiency percentage. And since opt outs don't have a score, they add nothing to the numerator.

Now, if a teacher's evaluation is based on the performance of students in her/his class(es), the same would be true. All students in that teacher's class(es) are counted in the denominator (except those exempted) and the refusals are counted as zeros in the numerator.

What I don't know is how the calculations are done if there is a significant number/percentage of opt outs.

So the district and the school should be able to calculate proficiency percentage with all students in the denominator AND with all opt outs stripped from the calculation. They have the data files to run both calculations.

--- swk
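
To make swk's math concrete, here is a minimal sketch of the two calculations described above. The counts are hypothetical, for illustration only - not actual SPS or OSPI numbers:

# Hypothetical counts for illustration only -- not actual SPS or OSPI data.
met_standard = 60     # students who tested and met standard
tested_not_met = 25   # students who tested but did not meet standard
refusals = 15         # opt outs/refusals (no score)
# Exempted students (illness, etc.) are excluded from both calculations.

# As swk describes it: refusals stay in the denominator but add
# nothing to the numerator, so they drag the proficiency rate down.
rate_with_refusals = met_standard / (met_standard + tested_not_met + refusals)

# The same cohort with the opt outs stripped from the calculation.
rate_without_refusals = met_standard / (met_standard + tested_not_met)

print(f"{rate_with_refusals:.1%} vs. {rate_without_refusals:.1%}")  # 60.0% vs. 70.6%

In this toy example, 15 refusals knock more than ten points off the reported rate - which is exactly why swk says the district should run both calculations.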
Desegregate Data said…
Clearly, SPS and OSPI simply need to create a different category for opt outs. We are, after all, into desegregated data...:). Failure to desegregate this information is simply an attempt at coercion.

Barf is correct. Peaslee is not a member of the C and I Committee, but she sent a letter to the C and I Committee to shoot down a good resolution. That board members did not support a resolution asking to delay linking SBAC to AYP is outrageously foolish.

The district's handling of SBAC has been pathetic, at best. There was never any intention of informing families that their children are being used to set cut scores and baseline numbers.
seattle citizen said…
swk tells us that proficiency rates (building, district) AND, where applicable, teacher evaluations using test scores, are calculated by including zeros for those who opt out. How on EARTH would anyone in their right mind think this is rational, statistically valid, or fair to teachers, schools, and districts? Why should they get dinged if some students don't take the test? Worse, important decisions regarding funding, etc., are being made based on these scores, and with a bunch of zeroes factored in, the results are obviously not indicative or representative of any sort of proficiency; they are totally skewed.
What a rabbit hole. Who thinks this s*** up?
Anonymous said…
Uhhh. People like swk.

Reader
Anonymous said…
My 10th grader has spent 4 hours on the SBAC, and says about another 2-3 hours are needed to wrap it up. Only one student finished out of 30+ students taking the test.

SBAC Parent
Anonymous said…
Pay attention, people, there will be a quiz at the end of the period.

The test being given to 11th graders is the "Smarter Balanced Assessment". Its acronym is SBA. (Note the absence of a C in that acronym...)

It was developed by the "Smarter Balanced Assessment Consortium", a group of states of which Washington is a member. That entity's acronym is SBAC.

Calling the test the SBAC is every bit as wrong as calling the US Government Form 1040 tax form you file every year your "IRS", or calling any particular school building in the city the "Seattle School District".

As I promised...here's your quiz:

The test currently being taken by 11th graders in Washington was developed by the ____, and is called the ___.

Acronym Police
"Failure to desegregate this information is simply an attempt at coercion."

Yup.
Anonymous said…
Sadly, most of the opt out students are from privileged or educated families, especially at the high school level. ELL, special education and poor students are the ones taking the test. No one honestly advocates for them, although some grandstanding educators hold press conferences claiming to represent the disenfranchised, when in reality these children are being used as a political football. There are no winners here. SBA proponents did not calculate the enormous pitfalls and demands this test places on school buildings. Opponents of SBA are somewhat reactive and have not thought out the consequences. Teachers are woefully split and borderline insubordinate; not smart for SEA, considering this is a contract negotiation year. Also, high schools are more oppositional than K-8. SBA may not be the best assessment and may someday wither on the vine. But without an assessment to measure progress, schools will become islands unto themselves, hiding disaggregated student data on reading and math.
Anonymous said…
Desegregate, the cut scores for grades 3-8 and 11 for the SBAC assessments have already been set. SBAC and the states used the field test results to set the cut scores.

The State Board of Education will use the results of this year's 10th graders taking the 11th grade SBAC assessments to set the high school graduation cut scores. I don't believe there are many opt outs in this group.

Also, I assume you meant disaggregate data. Desegregate has a totally different meaning.

--- swk
Desegregate Data said…
I am sorry. swk is correct - I meant disaggregate. That said, SBAC will further segregate our schools.

Nyland and the district should lead the effort to disaggregate data related to opt-outs. I doubt this will happen - they appear to be blind sheep that follow the powers that be. The same goes for OSPI and SBE.

Clearly, failure to create an opt-out category is an attempt to force SBAC on students, schools and districts. One could argue that failure to create a separate opt-out category will promote the notion that public schools are failing and prompt privatization etc.
Anonymous said…
I would bet that the zeroes are counted because they don't want the schools to encourage students who might score low to opt out. A zero is worse than a low score.

HP
I'll reprint for Anonymous (no anonymous comments):
"Sadly, most of the opt out students are from privileged or educated families, especially at the high school level. ELL, special education and poor students are the ones taking the test. No one honestly advocates for them, although some grandstanding educators hold press conferences claiming to represent the disenfranchised, when in reality these children are being used as a political football. There are no winners here. SBA proponents did not calculate the enormous pitfalls and demands this test places on school buildings. Opponents of SBA are somewhat reactive and have not thought out the consequences. Teachers are woefully split and borderline insubordinate; not smart for SEA, considering this is a contract negotiation year. Also, high schools are more oppositional than K-8. SBA may not be the best assessment and may someday wither on the vine. But without an assessment to measure progress, schools will become islands unto themselves, hiding disaggregated student data on reading and math."

Again (and again), I have found very few people, including myself, who say no assessments. No one is saying that.

But you seem to be saying, better a less-than-useful one than none. Nope.

I will wait for OSPI's reply, but yes, it is quite punitive to penalize a school for decisions made by parents.

That's another irritating meme - that it's all teacher-driven. It is not.
Anonymous said…
I would like to clarify my response above. Students who have opted out/refused to participate in the assessment(s) do not actually have zeros entered in the data file. Technically, they are counted as not meeting standard.

When calculating AYP, actual student scores are not entered. Only whether the student met standard/proficiency or not is entered in the calculation. The numerator of the proficiency percentage calculation includes the number of students who met standard, and the denominator is all students who should have been assessed. Students who opted out/refused are not included in the numerator (since they didn't meet standard) but are included in the denominator (since they should have been assessed/were not exempted). Thus, refusals count against a school's and district's AYP proficiency rates.

All of what I'm sharing can be found on pages 6-8 of the OSPI AYP FAQ at:

Adequate Yearly Progress Questions and Answers July 2014

--- swk
Anonymous said…
From an OSPI webinar:

Q: Does a refusal to test count against a district and school?
A: Yes, both in terms of participation and proficiency.

For teacher evaluations, I was under the impression opt outs would not be counted against a teacher, but perhaps that is a question for the district, not OSPI.

opt out?
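
Note that, per that webinar answer, a refusal counts against a school twice - once in participation and once in proficiency. Here is a minimal sketch with made-up numbers (the 95% figure is the federal AYP participation target):

# Hypothetical school of 200 students expected to test -- illustration only.
expected = 200
refusals = 20
tested = expected - refusals
met_standard = 110

participation = tested / expected      # 90.0% -- below the 95% AYP participation target
proficiency = met_standard / expected  # 55.0% -- refusals counted as not meeting standard

print(f"participation {participation:.1%}, proficiency {proficiency:.1%}")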
Anonymous said…
I decided to opt out my children from this assessment, but after reading this thread I'm starting to doubt myself. I really don't want to punish my children's school.

-Doubting
Doubting, I can understand your concern.

But, in the history of this country, we have only seen change when large numbers of people rise up. It is the same in this case.

If you believe that SBAC is a good test, have your child take it.

If you believe spending a large number of non-teaching hours on a single test - a data point that will overshadow all other data points on your child - is worthwhile, have your child take the test.

If you believe that the classroom's money and time are better spent this way than any other, have your child take the test.

But understand, if parents speak - loudly and throughout this country - things will change.
Anonymous said…
Thanks, Melissa.

-Doubting
Anonymous said…
Melissa, you wrote, "If you believe spending a large number of non-teaching hours on a single test - a data point that will overshadow all other data points on your child - is worthwhile, have your child take the test."

This is a highly debatable point you make. It is highly doubtful that anyone really cares, except colleges/universities, about your child's individual test scores. The only people who really care about any individual child's performance besides their parents/family and their teachers are colleges/universities, college scholarship organizations, and car insurance companies. And these organizations care most about grades/GPA.

I can't think of a single instance in which a standardized state test score would "overshadow all other data points on your child."

Frankly, this sounds like the kind of scare tactic of which you accuse school administrators, education reformers, etc., and seems outside the realm of information that you reasonably provide.

--- swk
Anonymous said…
swk, AYP is based on a single test score. No other data points need apply. They don't matter. That's exactly what "overshadowed" means. DFER's claim is that SBAC is actually such an important measure... that your house will lose value if this single measure isn't ascertained. And guess what: schools are now impoverished in terms of time spent, curriculum neglected, and computer infrastructure hijacked. Bill Gates' kids get to go to Lakeside, and no way do they spend their time on this.

Reality Check
Anonymous said…
I don't know exactly how many hours are spent on standardized test prep, but it's a lot. For our 3rd grade class it looked something like this:

-Interim benchmark assessments: 6 Amplify tests (3 for ELA and 3 for Math), approximately 12 hours (here I am assuming that all 6 tests happened; we didn't get any info about testing because our class had a sub)
-Amplify worksheets during class (time taken unknown)
-Summative assessment: SBA, approximately 8 hours
-SBA practice test, approximately 2 hours
-Typing practice: 30 mins once a week (on most weeks during the school year, unless the lab was used for testing). Hopefully this ends after 3rd grade.

That's just an absurd amount of time lost to Standardized Testing. The loss of instruction time will hurt all children.

-nh
Anonymous said…

Here is a comment from a teacher, Ursula Wolfe-Rocca, regarding the 11th grade ELA practice SBA. It's a pretty serious flaw in the test if teachers cannot figure out the answers. The full comment can be found at
http://dianeravitch.net/2015/04/15/whats-wrong-with-the-parcc-test-a-reader-explains/



Three weeks ago, a group of roughly 10 high school educators, mostly English and Social Studies teachers, spent a couple of hours taking the English Language Arts SBAC practice test for juniors. As a professional who has taught reading, writing and critical thinking skills to juniors in high school for over 14 years, I found this test more than problematic on multiple levels. First, the Smarter Balanced marketing materials celebrate that these tests hold students to higher standards because they are no longer multiple choice. However, many of the questions that masquerade as open-ended are in fact multiple choice. For example, a number of questions called for students to read a text and highlight sentences that demonstrate a particular argument or meaning. Ostensibly, these were open-ended reading comprehension questions; however, when I was asked to highlight a passage from the Life of Pi that demonstrated the main character’s concern for the tiger, the passage I wanted to choose was unavailable! It turns out only some of the sentences in this “open-ended” exercise are clickable, which means the questions are multiple choice after all (to say nothing of the fact that a college-educated 40-year-old who has read the book thought the best answer was one that was not possible to select)!

More upsetting to me was another question, again about the Life of Pi, where students were asked to sum up the main emotions of the main character. At the outset of this group of questions, the student is asked to read a very long passage; for this specific question they are asked to read just one paragraph from that longer section. The question is insanely misleading because if you have read the longer passage, ALL the answers regarding the shorter passage are true. The whole point is that the character is experiencing a mixture of emotions, some of which are contradictory. None of the teachers in the room could figure out what the correct answer was since a nuanced reading of the text would make all of them possible. What did the test makers intend? Great question! We never knew, because the practice tests do not allow you to see correct answers, nor do they provide explanations of strategies students might consider: yet another flaw.

My fellow teachers and I created a list of the additional defects we saw with the practice test and it is long. I know I cannot detail them all here. But surely, if teachers were consulted, in legitimate numbers, many of these same flaws would have been identified.

-nh
Anonymous said…

Regarding the amount of time spent on Standardized Test Prep:

I forgot that there was also some class time devoted to test prep just for the SBA writing section, but I don't know how much time that took.

-nh
Anonymous said…

From the Seattle Opt Out FB page:

Jesse Hagopian has an update on SBA stories around Seattle.

http://iamaneducator.com/2015/04/18/common-core-testing-meltdown-in-seattle-teachers-speak-out-on-technological-breakdowns-and-the-loss-of-class-time/


-nh
Me? And a scare tactic?

Well, if state standardized testing were not:

- sucking up time from real teaching and learning
- taking up space in computer labs/libraries
- going to be used to evaluate teachers
- THE major "college/career-ready" data point

maybe you would be right about a scare tactic.

I didn't create all those events and realities; ed reform did.
Anonymous said…
Melissa, many parents rightfully turn to you on this blog for sound advice and good information. In this case, I think you gave bad information.

And despite Reader's misinterpretation of the point of my comment, AYP has little to no effect on individual students. Therefore, the use of that student's individual test score does NOT overshadow every other data point on that student. Grades and attendance are WAY more important than state test scores.

And I would disagree that SBAC assessment scores are THE major "college and career ready" data point. GPA is still way more important. It's not even close.

--- swk
I note you do not disagree with my facts about this testing.
Again, it is not me who created this situation. It is not me who has to bear the brunt of it.

Don't shoot the messenger.

I DO consider it sound advice - at this point - for parents to opt their students out of this test (except at the 10th grade).
Anonymous said…
swk says: I can't think of a single instance in which a standardized state test score would "overshadow all other data points on your child."

Really, swk? I "misinterpret" your meaning??? Then you are pretty inarticulate. AYP is at least one "single instance" in which standardized testing from SBAC overshadows all other data points.

Given the punitive nature of AYP, and the fact that the test is designed for many/most students to fail - that seems like a pretty big deal. A pretty large "overshadowing". In the case of AYP, funding to schools is further drained to go to "tutoring" companies - who have no track record in reducing any achievement gap. Pearson is probably hoping to scoop up that business too. And, if we are to believe DFER, our home values are all on the line - due to a single test, the SBA. (There we go, another "instance" of a single data point.) And that test will "prove" to us that our kids are really getting a crappy education after all. DFER would have us believe that parents are just being duped into believing in education - and they need the SBA to set them straight.

Absurd.

Reader
Anonymous said…
Reader, please feel free to argue against a point that I was not trying to make.

My point is this: The state test score overshadows no data point ON YOUR CHILD.

I'm talking about how a state test score affects an individual child (or doesn't rather), not how a state test score affects AYP or any other factor.

Again, you misinterpret my meaning.

--- swk
Anonymous said…
If a kid fails to graduate because of this one data point --- that failure is the "overshadowing" of all other information. If students then have to spend their precious time in high school fooling with a COE (collection of evidence) or an LDA (locally determined assessment)... that is a direct impact ON AN INDIVIDUAL CHILD based on a single data point, overshadowing all others.

And, if school resources are further drained on these activities, like the COE and LDA, based on "using the SBAC as a single data point," then many students are damaged by the huge waste of resources that this SBAC effort has become.

Reader
Anonymous said…
swk: I am troubled. Often, I find that your posts help me to avoid (or correct) either hyperbole and exaggeration, or the habit of using one person's post to argue something different that is more interesting/compelling to ME -- thereby changing the subject (often unintentionally, I just find my own arguments SO compelling :>))!

But your rebuttal to Melissa (and the ongoing conversation with Reader) seems otherwise. At first, I just thought perhaps it was what I have described above. But my thoughts ran along Reader's lines. Wait! At least with 10th graders, we are talking about the ability to graduate, and the dissemination of the scores to colleges, who will use them in determining whether remedial coursework needs to be taken, etc. Which in turn leads to possible financial repercussions (and may imperil the ability to pursue certain majors), admittance or not into college honors programs, etc., etc. (And we have seen before with MAP how easily this District starts to later use standardized test scores for invalid purposes, like admission to APP, etc.) So why, I wondered, would you quibble with Melissa's characterization of this exam score as one that would overshadow all other data points?

For starters, the "only colleges/universities" thing seems odd. Why is that not a HUGE deal to the many many kids who want to go to college (and who have in years past struggled over how to achieve good SAT or ACT scores for that very reason)?

I also think that the characterization of this test score vis a vis a GPA as each being "one data point" is misleading. A GPA is comprised of many many smaller data points, collected over a four year period, with far less opacity, much more fairness in terms of the child knowing what is expected and how to turn hard work into good grades, and (in general -- though not each case) a much greater ability by a child to challenge built in error (by talking to teachers about how items were graded, performing extra credit work, etc.)

Set against all of that input by kids and the corresponding evaluative output by teachers over a four-year period, you seem to suggest that an equivalent data point is a test, the questions on which are secret and are never divulged to those who take the test, their parents, or the colleges/universities that will sort kids based on it. Teachers are forbidden, often on pain of termination or other discipline, from discussing it among themselves or with parents or community members (though those who have risked it believe it to be deeply flawed -- to the point of invalidity for ANY legitimate purpose). Students are threatened with discipline (and evidently monitored without their knowledge or consent) so that they cannot discuss test questions, even after the test is over.

(And then, there are the other collateral issues -- e.g., the test at issue is terribly expensive, both in dollars and time, which would be a problem, even if it were a GREAT test -- which it evidently is not).

Out of all this, you choose to debate whether Melissa is using scare tactics in stating that the test will "overshadow all other data points on your child." I don't get it. To me, this has the feel of being a deliberate attempt to focus on one tree, so as NOT to have to see the entire forest -- to a degree that seems unlike you.

You are the ultimate arbiter of "your meaning." So if you want to call people on misinterpretation, it is ultimately your call. But in the context of the larger argument -- what is your point? Reader (and Melissa) seem so clearly right here in arguing the harm to students of performing poorly on a test that we know very little about, that has been riddled with technical problems, that many teachers feel is substantively problematic (bad/ambiguous questions, incorrect reading levels, etc.), and that is designed for a high rate of "failure" (however defined), and you seem to be arguing small niceties for reasons I totally cannot understand.

Jan
Anonymous said…
Jan, thank you for your post and questions to me.

Let's go back to the beginning. A parent earlier in this thread showed some doubt over her/his decision to opt her/his child out of the SBAC assessments. In advising this parent to not doubt the choice to opt out, Melissa described the test as "a data point that will overshadow all other data points on your child."

I have to admit that alarm bells went off for me. I took Melissa's meaning to be that the SBAC assessment scores would negate all of a child's hard work, grades, etc. In other words, if this parent allowed her/his child to take the SBAC assessments, then all other performance measures would be for naught. Melissa responded to me but didn't disagree with my assumptions regarding her statement. She seemed only to make further arguments that the SBAC assessments are terrible on the whole.

As for how colleges/universities choose to use SBAC assessment scores, all they have stated is that they will use "passing" scores to allow students to avoid remedial courses. There is nothing in any statement of theirs that the inverse is true --- that "failing" scores will place students in remedial courses. In other words, if a student shows up at a college/university with Level 3 or 4 scores, that student could proceed directly to college-level, credit-bearing courses. However, if the student shows up with Level 1 or 2 scores, they would still have the option of taking the college's placement test(s) to place into college-level courses. This is especially true given that that student would have had at least another year of high school coursework prior to arrival at the college. The college couldn't/shouldn't place that student directly into remedial coursework based on a test taken at the end of the 11th grade. Therefore, there is no certainty that the SBAC assessments would have any negative financial repercussions or any other negative effects on this student's college pursuits. The SBAC assessment scores in this matter would not overshadow any other data point.

As for my arguments about the GPA, I think you and I are saying the same thing. For the reasons you stated, the GPA is a far better indicator of student performance than any single test score. That's why I argued that Melissa's statement was erroneous. A single test score could not possibly negate the power and usefulness of the GPA.

As for your argument that the test could not/should not be used for legitimate purposes because teachers do not receive prior access to the test questions, that does not make sense. Because these tests are high-stakes --- I'm stating a fact, not stating support for the high stakes --- it's not appropriate for teachers to have access to the questions. The SBAC assessments are designed to assess student knowledge of the Common Core State Standards. If teachers teach the CCSS well, they should know what is on the SBAC assessments. And the SBAC assessments have been shown to have "internal validity" --- meaning they are valid for the purposes for which they are designed.

Finally, and related to that last point, I will state again that the SBAC assessments were NOT designed "for a high rate of failure." That is a common error often shared here on this blog and elsewhere. It is not a norm-referenced assessment; therefore, it is not designed to elicit any specific range of scores. It is a criterion-referenced assessment and is designed to determine student proficiency in the CCSS. More specifically, it is designed to determine student college and career readiness. Based on the field test results, SBAC estimated that 60-70% of students would not be college and career ready as determined by the CCSS and the SBAC assessments.

--- swk
Anonymous said…

Yes, now the Standards (Common Core) and Assessments (Amplify and SBA) line up. All that's left is the curriculum that teachers present in the classroom, and I worry that teachers will be under pressure to teach to the test (which is now the same as teaching to the Standards).

Do the principals receive a bonus based on these test scores?

I liked what Lisa Hansel wrote on the core knowledge blog regarding the reauthorization of ESEA:

What Darling-Hammond and Hill should have written is this: Because cognitive science shows that broad knowledge is essential to meet technology, economic, and citizenship demands, an accountability system must encourage a content-specific, well-rounded curriculum that inspires high performance and continuous improvement by testing what has been taught and thus providing data that teachers can actually use to inform instruction.


http://blog.coreknowledge.org/2015/04/07/no-progress-on-accountability-no-hope-for-equity/

-nh
