Open Thread

Here's an open thread for anything on your mind.

(Please note: if you want advice or have a concern to express, that's fine. We are all glad to help or offer guidance if we can. But if you are going to write about something factual, please state where you found your information. It's important that this blog remain a place where we base non-opinion statements on factual information and cite our sources. Naturally, you can disagree with someone's take on what data means, but I just want to make sure that non-opinion statements get backed up. Thanks.)

Comments

Cailean said…
Some parents at our school are rallying the troops to email the directors and ask them to take the idea of cutting all elementary school counselors next year off the table. Have I missed this being discussed here? It is such a major decision that will impact many kids and families!!
MJ said…
How do I actually create a post to put on this community blog? The MAP test scores are posted on the NWEA website and are public record. Half of all Seattle School District students failed the MAP this year. It is a seriously flawed test and, I learned from this blog, our superintendent is on the board of NWEA, which creates the test. That seems like a conflict of interest. This is not how we are going to guarantee quality schools for all of our children through the promise of this test, which, according to the research (I will re-post the university that did this research after I google where I found that), is not even correlated to state standards. The test is flawed.
MJ said…
Dr. Merwin of Heritage University did research to show that the MAP test is not correlated to the state standards assessment. Why can't we at least find a test that fits the state standards? And why do we have a math curriculum (Everyday Math) that was ranked 13th in alignment with our state standards? If each district is expected to follow the state standards, how can this district justify not following them?
MJ said…
This comment has been removed by the author.
wsnorth said…
What does an elementary school counselor do, and how many of them are there? I've had 5 kids in SPS elementary school at various times, and have never even heard of this position until now. Sadly, it is the unsung heroes who are sometimes easiest to cut.
seattle said…
College and career counselors have been cut from SPS high schools, and they are desperately needed. Now nobody is doing their job and kids are suffering for it. Does this district really want its college-bound rates to drop? Most at-risk kids, and kids who are on the fence about going to college, won't pursue it without some help and guidance from a counselor. Ugh
Dorothy Neville said…
Elementary school counselors:

Frankly, the counselor involved in my son's kindergarten SIT was worse than useless; she was damaging. But some people are just like that.

Lowell didn't use to have a counselor. It also (in APP) didn't use to have many kids with IEPs. Twice exceptional? No such thing. I know several people who were simply told APP couldn't accommodate them and they needed to transfer. Some teachers accommodated some needs, but it was hit or miss.

They got a counselor the year my son was in fourth grade. Now, at the very end of third grade he had burst into tears and admitted that he was being picked on. This was the day before the last day in June. I spoke with his teacher and her reply was, "Oh, that's been going on for months." She did say she would make sure all the kids involved ended up in different fourth grade classrooms. But had she intervened or spoken with me or with the principal? Nope.

Well, second day of fourth grade I met with the counselor because day one hadn't been good. She was in her office eating a late lunch while looking over a yearbook. Seems she walked the playground at lunch and witnessed/intervened in an incident and was trying to identify the kids involved. Guess what? Yup, it was an incident involving my kid. What a difference having a counselor made. She and the fourth grade teacher actively monitored things, intervened and changed the dynamic.

The next year was Julie B's first as principal. Between getting a counselor, a younger and more activist principal, some necessary retirements, and younger teachers, I do believe that having SITs and dealing with IEPs and twice-exceptional kids in APP is now more the norm.

I cannot understand losing either the elementary school counselors or high school career counselors. Not when central administration is still so big.
hschinske said…
MJ, how did you access that data? Thanks.

Helen Schinske
MJ said…
Talk to the principal of your child's school about how half of the district's MAP scores dropped from Fall to Winter. They should give you a username and password so you have access to the NWEA site. All of the principals in the district are aware that half the scores dropped, and the superintendent even debated with the principals at one of their cluster meetings whether or not to send the MAP scores home. Ask your principal point blank what percentage of scores actually went up, not just in your child's school but in the entire district. If they tell you they don't have that information, ask them to get it. This is the kind of information that is obviously very embarrassing for the entire district, but without this open communication and sharing of data between parents, teachers, the school board, and principals, how can the balance of power be checked and absurd tests like the MAP thrown out? And why is NWEA unable to give any information on what specifically they are testing? How do we know this is a valid test? It is expensive to administer, and the district needs to let us know exactly how much is going into NWEA's pocket and whether the superintendent gets any incentive to use this test, given she is on the NWEA board. We see our child's scores, and without knowing that half of the district's children have failed this test, we are dismayed that they are doing so poorly. Is this fair to us as parents, and to our children, who think they have failed a test without knowing that fifty percent of others have also failed?
Okay MJ, I think the MAP has only been given once so far (yes? no?) so half of Seattle students, across the board, failed it? Could you clarify what you are saying?

Yes, Dr. Goodloe-Johnson is on an NWEA board (non-paid) and she has publicly acknowledged this. The Board doesn't seem to think it's an issue. Her relationships with several institutions do seem problematic to many of us.

I think the elementary counselors do what you might think, which is to try to help kids who have issues that teachers can't solve. I don't think all elementaries have them (it's probably a budget choice by principals), but many do because of the number of high-needs kids they have.

I concur about the career center counselors for high schools. It is absolutely ridiculous to expect kids to apply to college, fulfill the community service requirement, or find an internship or summer job totally on their own. Many kids are first-generation college students and absolutely need the guidance. But we can have 28 FTE in the BEX program and 100+ academic coaches with no discussion of whether that money is worth it.
Lori said…
MJ wrote: "Half of all Seattle School District students failed the MAP this year."

How exactly does one "fail" the MAP? There are no passing grades, just RIT scores that can be converted into percentiles. By definition, about half the children in any grade should have a RIT score below the 50th percentile, assuming Seattle public school children are representative of the population upon which the RIT scores are normed.

I really can't get too alarmed about this without more information.
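
A quick illustration of Lori's point about norms: a percentile comes from comparing a RIT score against a norm group, so roughly half of any normed population lands below the median by construction. Here is a minimal sketch of that conversion; the norm mean and SD are hypothetical placeholders, not actual NWEA norms.

```python
# A minimal sketch, NOT actual NWEA norms: converting a RIT score to a
# percentile, assuming the norm group for a grade is roughly normal.
from statistics import NormalDist

def rit_to_percentile(rit, norm_mean, norm_sd):
    """Percentile rank of a RIT score within a normal norm group."""
    return 100 * NormalDist(mu=norm_mean, sigma=norm_sd).cdf(rit)

# Hypothetical grade-2 reading norms: mean 175, SD 15.
print(rit_to_percentile(175, 175, 15))  # 50.0 -- the median, by construction
print(rit_to_percentile(190, 175, 15))  # ~84 -- one SD above the mean
```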
Mr. Edelman said…
MAP has been administered TWICE this year, so MJ is correct about that.

MJ, how do you know that the superintendent debated MAP results with principals at a cluster meeting?
Eric B said…
The MAP test has been given twice this year. It is scheduled to be given 3 times. MJ's assertion does not make sense. There is no passing or failing the MAP test. They _do_ release the "norms": the score that half the students in a particular grade achieve. It would make sense that half the scores would be lower than that. Unlike Lake Wobegon, half our students are below the median.
Lori said…
We talked at our school in early January about how we parents should not read too much into the Winter MAP scores. For any given child, there can be day-to-day variability. Maybe a child who did well in September wasn't feeling physically well the day she took the test in January. Might end up with a lower score just due to whatever life circumstance was happening that day.

Also, they started the MAP testing right after the return to school from winter break. That seems silly to me, to test the kids when they've just had 2 weeks off school and may still be riled up and not into their school routine yet. But for whatever reason, that's what they did.

MJ, I'd also like to know whether these supposed drops were statistically significant relative to baseline. My daughter's Math RIT went up 2 points while her Reading went down 1 point; however, both scores are right within the score range, which I believe is a confidence interval. All I can say is that my daughter's second test did not show significant improvement from baseline. I'm not at all concerned that she lost progress in Reading, however, even though the second RIT was indeed a lower number.

Our MAP report also shows that first graders in the district improved from a mean RIT of 164 in Math in the fall to 174 in Winter. Reading went from a mean of 162 to 170. Those are positive trends overall. So while it's very difficult to interpret exactly what any one child's scores "mean," it seems pretty clear that the district as a whole made progress, in keeping with what is seen in the nation at large (i.e., norm group averages).
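
Lori's mention of the score range is worth unpacking: if each RIT comes with a standard error, a fall-to-winter change is only meaningful when it exceeds the combined error of the two measurements. Here is a minimal sketch of that check, with hypothetical SEM values; NWEA reports an error band on score reports, but the numbers below are purely illustrative.

```python
# A minimal sketch with hypothetical SEM values: is a fall-to-winter RIT
# change bigger than the measurement error of the two tests combined?
import math

def change_is_significant(fall_rit, winter_rit, fall_sem, winter_sem, z=1.96):
    """True if the change exceeds a ~95% error band on the difference."""
    se_diff = math.sqrt(fall_sem ** 2 + winter_sem ** 2)
    return abs(winter_rit - fall_rit) > z * se_diff

# With an illustrative SEM of ~3 RIT points on each test, a 1-point
# drop is indistinguishable from noise, while a 12-point gain is not:
print(change_is_significant(180, 179, 3.0, 3.0))  # False
print(change_is_significant(180, 192, 3.0, 3.0))  # True
```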
hschinske said…
Dr. Merwin of Heritage University did research to show that the MAP test is not correlated to the state standards assessment.

Source, please? The only association I can find between Dr. Merwin of Heritage University and the MAP test is that she supervised two master's projects that had to do with attempting to increase MAP scores.

Helen Schinske
Kathy Barker said…
No college and career specialists = fewer college and job visits = more military visits
We sent this letter last week...so far, a nibble by Kay Smith-Blum, nothing else.

March 4, 2010

Dear Superintendent Goodloe-Johnson and Seattle Public Schools School Board members,

With the loss of the career specialists in the high schools, and the diminished economy, we have seen a strong increase in military recruiter presence in the schools. This is often in disregard of SPS policy, and it is in disregard of the unofficial SPS policy that all students deserve an equal chance.

We would like to meet with you, in committee or individually, to discuss this.

Briefly, we’ll mention some of our concerns:

Military recruiters are not following regulations in the schools. We have reports from schools that military recruiters are coming at unscheduled times, coming out of uniform, and not signing in or out.
Furthermore, military recruiters are pushing past the spirit of the regulations. For example, one of the military recruiters for one high school is also a football coach there.

Schools are not following regulations, largely because the counselors do not know the regulations. Recruiting rules are not posted, and counter-military recruiters are not invited. Without counter-military recruiters there, military recruiters are free to minimize the chances of deployment to Iraq or Afghanistan, to not explain the 8-year military contract, or to exaggerate the educational benefits.

SPS has some of the best policies in the country, but those rules are being circumvented. No Child Left Behind (NCLB) says that schools must give military recruiters access to students in schools and must hand over student contact information every year, though families can opt out of releasing home contact information.
SPS policies try to make sure access is no more than that allowed to college and job recruiters, and allow students and parents to opt out just once for the student's high school career. But it appears that the military keeps each year's list and uses it to contact even the students who have subsequently opted out. SPS policy needs to address this.

There are fewer college and job visits at the schools. Career centers are actually closed at some schools. With the continued and often increased presence of military recruiters, and the decreased exposure to other options, a segment of our students are being steered only to the military.
We hope that the College and Career Center Specialist positions can be funded. But the futures, no, the lives, of some of the students are at immediate risk, and we'd like to offer our help to curb aggressive military recruiting in the high schools.

Sincerely,

Kathy Barker
Parent of 3, 2 still in SPS
Garfield HS PTSA Board
Washington Truth in Recruiting
(206) 328-2804

Mike Dedrick
Veterans for Peace
Washington Truth in Recruiting
(206) 226-2435
SPS mom said…
This comment has been removed by the author.
Kathy, I am sorry to hear this is happening. Roosevelt has had a very active parent who keeps track of the military visits and comes that day to provide a counter-balance to the military visits. Let me just say I have never seen the military personnel aggressively court students. They come with their materials and giveaways (pencils, keychains) and only talk to students who approach them. The parent who speaks about options other than the military does the same thing. I think RHS only gets two visits a year.

I know it really varies from school to school as, obviously, the military knows which schools will likely yield more recruits. It is wrong and unfair to allow the military (or any other group) to operate outside SPS rules. I am surprised that a principal would allow this to happen.

Military service is a noble calling and I think we all respect those who serve. But there has to be total honesty about what life in the military is and isn't and any benefits (or lack thereof) involved.

FYI, you can't meet with the Board in total. That would make it a public meeting that they would have to open to everyone. The best thing to do is to set up separate meetings with each director or whichever ones you feel are key.
lendlees said…
Re: Admin budget cuts--Harium answered my question about why we are making cuts in schools while we have a 'bloated' administration:

"We made cuts last year in the total Central Office staff and are in the process of cutting Central Office staff again this year to the tune of $6 million dollars. This will put our staff at the lowest level in has been since 1997 when our enrollment had more that 2,000 more students that we have now."

Evidently $6 million = 73 jobs in central administration. Not sure who/where these jobs are coming from. My only fear is that they will be considered senior-level teachers who will displace some of the amazing new teachers in SPS.

(A bit off topic, but Melissa, who is the best person to contact about building maintenance issues? There are some issues coming to light at my child's school that I would like to make sure get known. Thanks.)
lizge said…
Does anyone know what the assignment policy is for non-entering grades? I.e., what if I have a child enrolling for 1st grade (who attended K at a non-public school)? Are those children guaranteed a spot at their attendance-area elementary school, or is that only the case for "entering" grades (K)?

My son is currently at a preschool which goes through kindergarten, and I am considering keeping him there for K in '10-'11, and then moving him to our attendance-area elementary for 1st grade (which is Coe) in '11-'12. However, I am concerned that there will not be room for him and he will be placed somewhere else.
Lendlees, you reminded me that I've got to try to get some of the handouts from the Board Work Session on the Budget posted. There was an eye-opening one on the number of positions at the central office over the last several years.

For building maintenance issues, I would contact Mark Pflueger, the head of Maintenance. I would cc whoever your Board member is just as a heads up. Board members should know when you make an attempt to contact the district with an issue. One, it makes the district staff a bit more accountable. Two, it allows Board members to track continuing issues about individual schools. (If they don't know what is happening on the ground at schools, they have no way to help.)
Kathy Barker said…
Melissa-
The parent who does the counter-military recruiting at Roosevelt is also a Veteran for Peace: Roosevelt has been lucky to have Dan, who, as a parent, a vet, and a smart and lovely guy, is there for the kids with every visit. (SPS policy of 2007 says that a counter-recruiter can be there whenever a military recruiter is.)

Not all of the schools are so lucky. There is a network of vets, parents, and peace activists who have struggled for years with district policy and with its implementation in schools. The Vets for Peace have stepped up mightily as parent volunteers leave the schools with their graduating students...but some schools now have no one. (Ingraham, anyone?)
Every year, a very small group of volunteers gets the opt-out forms given out at open houses and curriculum nights, often with the help of the PTA. It would be terrific if the PTA could take this on as a task; it does at Garfield, where we have a military recruiting monitor (me!).
If a college or job recruiter had a quota and were as aggressive as the military recruiters, principals would rise up more. Some in SPS have been very careful. But people are often afraid to question the military, and war.
We will try to talk individually to Board members, thanks. It was easier a few years ago, when we had some supportive board members and a general activist feeling among students. Lots of beaten-down people these days.... kb
grousefinder said…
MAP Results: The second round of MAP results were accompanied by a scoring system that denoted the level of growth students achieved vs. the expected level of growth for all students. This was done on a grade-by-grade basis at each school, then compared to the District as a whole. So if a school showed "below-average" growth for grade 5, it means that (according to NWEA) the 5th grade population for that school is not progressing according to the NWEA expectations (and ipso facto District expectations). If a school's 5th graders achieved "well-above-average" growth, it means that they are exceeding the expectations that NWEA recognizes for a given time period. These numbers were crunched by Brad Brenneke (sp?).

What MJ is saying is that SPS schools made "below average" growth too frequently between MAP #1 and #2. In fact many went backwards. This is what (I believe) she means by "failing."

Using these subjective indicators of "growth," it is not difficult to surmise that SPS schools, in fact, performed dismally on MAP #2. The District refused to release individual school scores (schools could see only their own) upon request by many teachers and principals. Thus, there is no way to compare a school like Alki against, say, Montlake. Without site scores across the district there is no way to compare schools of similar socio-economic status.

Having seen the MAP numbers personally for the District as a whole, it is clear that one of two problems exists. Either the test is terribly flawed, or the math curriculum K-12 is seriously misaligned with the State Standards (to which NWEA purports to align when designing Seattle's MAP).
lendlees said…
Thanks Melissa. I will be sure to cc the appropriate board member so maybe they can understand the magnitude of neglect at our school...
MJ said…
I have not finished reading all of the responses yet, but let me reply to the few I have just come across that pertain to my comment. Half of the students in the school district "failed" in the sense that their scores actually went down from Fall to Winter. In other words, they took the test in September and then took the test again shortly after mid-winter break, and instead of showing the expected growth (or any growth for that matter) their scores actually went down from when they took the test in September. This would seem to indicate that they have not only not learned for the half year they have been in school but have somehow regressed! This is NOT a Lake Wobegon story; this is a story about a seriously FLAWED test that shows half of the school district regressing and individual scores dropping. You tell me: a student enters third grade scoring at a third grade level, then 6 months later they score at a second grade level. Either the entire Seattle public schools somehow manages to make half of their students forget what they have learned in 6 months, and has the rather dubious distinction of un-teaching children so they drop grade levels from test to test, or THE TEST IS FLAWED!
MJ said…
Thank you, Grousefinder, whoever you are, for explaining to the bloggers exactly what I meant by "failing". You put it better than I did. What can we do to show the superintendent that this is a flawed test? I have actually been in a position to see some of the questions on the MAP test, and if only the bloggers here could see the test they would understand why I am so skeptical. For example, is it appropriate to expect a third grader to understand the negative quadrant of a scatter plot? Scatter plots and negative quadrants are not only not part of the third grade state standards, but can you imagine the frustration of a young person who believes they are "stupid" because they cannot answer that question? I'm not as radical as they are in Vancouver, B.C., in believing that tests should be done away with because they are a form of child abuse, but good grief, there is a human element here. This test ensures that each child will feel like a failure, as it takes them up to the level at which they do fail in order to assess their ability. My children HATE this test and have told me on several occasions at night that they feel stupid because they have failed the computer tests twice, and they are worried about their scores being in some record somewhere that is somehow important. Especially my third grade daughter, who already takes life too seriously as it is. These two tests have her so worried!
StepJ said…
lizge,

The answer to your question is not known yet.

Until the plan is fully implemented (projected to be 2015) the guarantee portion of admission will be determined by the Transition Plan. The Transition Plan for 2011-2012 enrollment will not be introduced (per the District's stated schedule) until December of this year, and then voted upon in January of 2011.

If the current Transition Plan rules are in effect for 2011-2012, then you would have guaranteed entry to your Attendance Area school, as you would be considered "new" to the District. Being new to the District gives you a guaranteed seat at a non-entry grade under the Transition Plan rules for 2010-2011.
MJ, I agree there are issues. But just because the scores went down doesn't mean students (or schools) did worse. I had a conversation with two teachers about education, and MAP came up. They mentioned the lowered scores and said that it was troubling, but that they didn't see it across the board with their own in-class measures.

There could be any number of factors as to why students would do worse, but neither parents nor the district can just say that, because of MAP scores, schools are failing.
seattle citizen said…
New York Times Sunday trifecta -
Three interesting articles on education, timely all:

In Harlem, Epicenter for Charter Schools, a Senator Wars Against Them (NY State Senator Bill Perkins)
http://www.nytimes.com/2010/03/07/nyregion/07perkins.html

A Wholesale School Shake-up Is Embraced by the President, and Divisions Follow (Rhode Island mass firing of teachers)
http://www.nytimes.com/2010/03/07/education/07educ.html?scp=1&sq=wholesale%20school%20shake-up&st=cse

Building a Better Teacher (Doug Lemov, charter operator, explores what makes a "quality" teacher)
http://www.nytimes.com/2010/03/07/magazine/07Teachers-t.html?scp=1&sq=Building%20a%20Better%20teacher&st=cse

WV says that when you're finished reading those, it'll be time for resses.
hschinske said…
"The 2008 NWEA RIT scale norms show negative growth models for the highest RIT scores (so some percentage of scores are expected to go down)."

And that makes perfect sense. Once you get past material that the child is currently being taught, you would expect to see some random variation from test to test, as the child gets different selections of out-of-level material that s/he may or may not know. You see that kind of thing happen with SAT scores all the time.

Helen Schinske
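
Helen's point about random variation at the top of the scale is essentially regression toward the mean, and it can be demonstrated directly. Below is a minimal simulation with illustrative numbers (not actual RIT parameters): even when no student's underlying ability changes at all, the top fall scorers come back lower on average in winter.

```python
# A minimal simulation of regression toward the mean. Numbers are
# illustrative, not actual RIT parameters: each student has a fixed
# "true" ability, and each test adds independent measurement noise.
import random

random.seed(1)
N = 10_000
ability = [random.gauss(200, 15) for _ in range(N)]
fall = [a + random.gauss(0, 5) for a in ability]
winter = [a + random.gauss(0, 5) for a in ability]

# Among the top 5% of fall scorers, the winter average is lower --
# even though nobody's underlying ability changed.
cutoff = sorted(fall)[int(0.95 * N)]
top = [(f, w) for f, w in zip(fall, winter) if f >= cutoff]
print(sum(f for f, _ in top) / len(top))  # fall mean of the top group
print(sum(w for _, w in top) / len(top))  # winter mean: reliably lower
```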
grousefinder said…
MJ...I, too, have seen the questions presented to SPS students on the MAP test, albeit at the 5th grade level. I can understand, and even accept, testing above grade-level (to a point). Many teachers who supplement with Singapore Math (or use it as a primary curriculum) know that it is designed to be one year above grade-level, and thus a MAP exam will demonstrate that those students have been exposed to higher-than-grade-level problem solving skills. However, when questions involving factorials and trig functions come up on a 5th grade test, I begin to wonder what the test is really aiming at.

There are schools in SPS that show well-above-average growth. That means they progress faster than other schools and have higher-level skills. If your school is not one of those with "above" or "well-above-average" growth, I would ask your PTA to look into funding an after-school program that supports above-grade-level skills. For example: you have a 3rd grader. Multi-digit addition is a core skill for your child. An after-school program that focuses on multi-digit multiplication would put your child in a higher MAP category on the final exam (MAP #3).

The MAP test is actually a very simplistic test of basic algorithmic skills and mundane problem solving. It is nothing like the rigor a student might find in Saxon or Singapore Math.

As to the Supt.: I am hoping that the District is cross-referencing the high-performing MAP schools with their respective math programs. We all know what the District-adopted curriculum is for K-12, but each school supplements with custom programs that build problem solving skills. I understand that some schools have waivers and run their own programs. These schools may illuminate our decision makers regarding which programs are most efficacious. I sincerely hope that the data-crunchers are looking deeper at the high-performing schools and asking why they did so well.
SPS mom said…
This comment has been removed by the author.
hschinske said…
However, when questions involving factorials and trig functions come up on a 5th grade test, I begin to wonder what the test is really aiming at.

The whole point of the MAP is that it isn't a fifth-grade test. It tests fifth-graders on factorials and trig functions if they get that far, and it regresses to subtraction without borrowing if they do worse and worse. I still don't see what's wrong with that. Seems to me exactly the kind of information that parents have been paying Sylvan and Johns Hopkins good money to get for years.

Helen Schinske
grousefinder said…
hschinske: This comment (if it were true) is exactly the opposite of the intention of the MAP according to the District coordinators: "The whole point of the MAP is that it isn't a fifth-grade test. It tests fifth-graders on factorials and trig functions if they get that far, and it regresses to subtraction without borrowing if they do worse and worse."

The MAP is a "benchmark assessment," [I am quoting SPS leads] and thus tests students at the levels at which they are expected to achieve at grade level. I know this for a fact because one of the MAP Coordinators employed by the District said it to a group of teachers involved in training for MAP applications. You cannot test outside grade level unless the exam is an aptitude test or IQ test. MAP is neither (according to the District). Remember, Edusoft did not test above grade level...ever. MAP cannot be one type of test for one group of children (aptitude) and another for a different group (benchmark assessment).

MAP is a "benchmark assessment," and therefore must test at grade level based on the Standards. If it is an aptitude test, then they must call it that and let parents know that their children are being evaluated by MAP based on their aptitude to do higher-level math than they see in class.

Testing above grade level with foreign material creates stresses in test takers which lead to panic, indifference, or futility. It is not what we teachers refer to as "best practice."
Lori said…
Grousefinder said "The MAP is a "benchmark assessment," [I am quoting SPS leads] and thus tests students at the levels at which they are expected to achieve at grade level."

This is not at all what we've been told verbally or in writing at our elementary school. My understanding of MAP is the same as Helen's: it's an adaptive test that identifies an instructional level, independent of the actual grade. The test starts with questions appropriate for the grade level and gets harder or easier depending on whether the child gets the question right or wrong.

As evidence, I can tell you that my 6 year old got multiplication and division questions on the MAP in September; these are clearly not grade level expectations!! No child entering first grade is expected to be doing division. But she got them anyway, precisely because she was doing well with the grade level expectations, and the test wanted to see how far she could go (ie, what is her instructional level, independent of grade?)
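
To make the adapt-up/adapt-down mechanics Lori describes concrete, here is a minimal sketch of that kind of loop: start at the grade-level difficulty and move up or down after each answer. This is a simplified staircase for illustration only, not NWEA's actual item-selection algorithm, and all the numbers are made up.

```python
# A minimal staircase sketch of an adaptive test: start at grade level,
# step harder after a right answer, easier after a wrong one. This is
# an illustration only, not NWEA's actual item-selection algorithm.
import random

def adaptive_session(student_level, start_level, n_items=25, step=5):
    """Return the difficulty the test settles on for one simulated student."""
    level = start_level
    for _ in range(n_items):
        # The further an item sits above the student's level, the less
        # likely a correct answer (50/50 when item level == student level).
        p_correct = 1 / (1 + 2 ** ((level - student_level) / step))
        level += step if random.random() < p_correct else -step
    return level

random.seed(0)
# A first grader working well ahead of grade keeps answering correctly,
# so the test drifts far above its grade-level starting point:
print(adaptive_session(student_level=200, start_level=170))
```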
MJ said…
NWEA is just another snake-oil salesman that has sold its expensive test to the school district promising results. How much does this test cost to begin with? Does anybody know? It is a little too easy for the district, which is already in the hole, to so casually spend our tax dollars on a test that has no research to back it up. And the fact that our superintendent is on the board of NWEA really takes the cake. But for those who are so blindly supporting a test they have never seen, I suggest you actually take a look at the test before you get behind it. At least the company that produced the WASL allowed you to see your child's test (I believe they were legally required to); this company is so veiled in secrecy that you will never be able to get hold of your child's test, only their mysterious RIT score. I guess that makes it even more magical and believable. You should be very suspicious of a company that refuses to release test items, refuses to publish the specific skills they are testing at each grade level, and cannot even explain the bizarre logic behind how they come up with their grade level designations. Can someone on here please tell me what NWEA's credentials are exactly?
Shannon said…
As a parent, I like the MAP test. It seems that some educators are concerned about it. I have heard various theories as to why that is the case.

I have had the test explained at our school and know it is not a benchmark assessment except to the extent that it is normed to the percentile. Thus, it measures how kids are doing relative to grade scores.
SPS mom said…
This comment has been removed by the author.
sixwrens said…
The MAP is a "personalized" benchmark, and the point is to show how an individual child is doing through a school year, and eventually over their academic career. If the test was a 5th grade test for 5th graders you'd miss info on the very high and very low performers. Personalization seems good to me.

As mentioned, of course some of the very high come down (and some of the very low go up). That's just reflecting variability in the measurement and in kids' performance over time.
grousefinder said…
Doesn't anyone find it odd that nobody can agree on what kind of test MAP is?

What is most disconcerting is that it seems different teachers have been given different information about what the test is.

Some say it's a "personalized" benchmark assessment, others profess it's an aptitude test, and still others have been told by the MAP leads that it is a "benchmark assessment."

Some questions to ponder:

Why can't you see your child's test?

Why are the test results given in Rasch units (RIT) instead of "student meets/does not meet" standard?

Why, in an optimal assessment, does NWEA want students to answer half the questions incorrectly? (Yes, this is true!)
Charlie Mas said…
First, I never heard anybody say that MAP was a benchmark assessment - whatever that is. I have always heard that it is a formative assessment and that it has two purposes.

1) To determine individual student strengths and weaknesses to inform instruction for that individual student. For example, if a student was weak in some part of the curriculum the student could get extra support in that area.

2) To identify trends of strength and weakness in classrooms to inform classroom instruction. For example, if a significant portion of a class were weak in some part of the curriculum, the class would review that area.

Second, for people to come around now and apologize for a drop in MAP scores as "naturally occurring variance" or "kids having a bad day" is to weaken the validity of the assessment. If the assessment is so imprecise that we expect the variance in any student's score to exceed the growth we are looking for, then the test is pretty useless for measuring that growth, isn't it?
grousefinder said…
Charlie...I am not saying that I agree with the statements that MAP is a "benchmark assessment." I am parroting what I was told in a training about how to use MAP data.

Personally, I believe the MAP is an aptitude test, given that it is designed to test students on material they have never seen, nor will see for several years (in math anyways). However, even that moniker breaks down when elementary students start seeing trig functions during the test.

NWEA has designed this test such that students will get 50% of the math questions correct and 50% incorrect (optimally). Out of 42 questions, a student will face 21 questions that he/she cannot answer. That is both bad practice and cruel.

The Edusoft test was a much better method of determining whether a student was meeting standard. It was dumped last year in favor of MAP. I recall there was a committee designing the Edusoft test at great expense. I remember that an outside consultant would come to District math meetings to observe and communicate with teachers as we discussed math issues. This consultant (wherever she was from) was really trying to create a quality product. When we started using her tests they were actually very good "snapshots" of student achievement. I hope Edusoft is resurrected!
hschinske said…
grousefinder wrote:

"Why can't you see your child's test?"

Partly test confidentiality, and partly that every test is a different event. Parents were not allowed to see the ITBS, either, and I don't think they were allowed to see some of the other tests commonly given (DRA, Gray Oral Reading Test). Certainly not the CogAT or Woodcock-Johnson.

"Why are the test results given in Rausch Units (RIT) instead of 'student meets/does not meet' standard?"

Because it is a norm-referenced test, like the ITBS (which also had Rasch units, though they may have been called something else), not a criterion-referenced test like the WASL.

"Why, in an optimal assessment, does NWEA want students to answer half the questions incorrectly? (Yes, this is true!)"

Because it is a test that attempts to establish a ceiling as well as a floor on what the child knows -- in other words, it's just as useful to know which areas the child has NOT mastered as to know which ones they HAVE. After all, knowing what they HAVE mastered only tells you what NOT to teach them, rather than what TO teach them. Also, it's not true, as far as I can make out, that the child is likely to get half the questions in a testing session incorrect. At the level at which the MAP finally places the student, it's estimated that the student will get about half the questions correct, but at any level LOWER than that they answered most questions correctly.

Again, I don't know whether the MAP is itself a well-designed test, but I think its stated aims are admirable.

mj said: "You should be very suspicious of a company that refuses to release test items, refuses to publish the specific skills they are testing at each grade level, and cannot even explain the bizarre logic behind how they come up with their grade level designations."

There are tons of sample questions at http://legacysupport.nwea.org/assessments/ritcharts.asp illustrating the skills they look for at each level. Again, this is more detail than I ever remember getting for the ITBS (which did have a fairly detailed report). I have seen no evidence of any bizarre logic -- their grade norming looks very like that of any other norm-referenced standardized test to me.

Helen Schinske
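
Helen's point that a student is only expected to miss about half the questions at the level where the test finally places them falls straight out of the Rasch model that RIT scores are based on. A minimal sketch of that logic (the scale constant below is illustrative, not NWEA's actual parameter):

```python
# A minimal sketch of the Rasch logic behind RIT scores: the chance of
# a correct answer depends on the gap between student ability and item
# difficulty. The scale constant here is illustrative.
import math

def p_correct(ability, difficulty, scale=10.0):
    """Rasch-style probability that the student answers the item correctly."""
    return 1 / (1 + math.exp(-(ability - difficulty) / scale))

ability = 210
for difficulty in (190, 200, 210, 220):
    print(difficulty, round(p_correct(ability, difficulty), 2))
# 190 0.88, 200 0.73, 210 0.5, 220 0.27 -- only items AT the student's
# final level are a coin flip; easier items are mostly answered correctly.
```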
MJ said…
I think the real question here--given the importance of voters for our school levies and to keep the school board in their elected positions--is whether the superintendent will realize just what a politically dangerous thing this MAP test is. If the Seattle Times should get hold of the fact that half of the school district "failed" this test, in that their scores went down instead of up, that would be more than a little embarrassing and make the district look incompetent. I, as a parent, would certainly consider private school for my child if only 50 percent of the entire district had improving scores. Let's face it, with WASL scores being published in the Times every year and everyone aware of the district's budget deficit, it is only a matter of time before this does make it into the paper: "Only Half of Seattle Public School Students Show Growth on District's Expensive Benchmark Test".
hschinske said…
NWEA has designed this test such that students will get 50% of the math questions correct and 50% incorrect (optimally). Out of 42 questions, a student will face 21 questions that he/she cannot answer. That is both bad practice and cruel.

As I said above, I'm not sure that's true. Even if it were, though, it's not that unusual for a norm-referenced assessment test to have a lot of questions that children can't answer. Getting a 50th percentile result (dead average for one's grade) on the ITBS certainly meant missing an awful lot of questions, even if not half. The MAP is designed to be more efficient in zeroing in on the student's level, thus spending less of their time on questions that are far too easy and less of their time on questions that are far too hard.

I also think it's odd to suggest that it's always bad practice or cruel for children to be presented with too many questions they can't answer. Surely if they're actually learning things in class, any time a new topic is introduced, at first they know very few of the answers? And are teachers never to give pretests?

I know many, many parents who actively seek out testing that will show their child's above-level achievement (and also below-level, if they want to show that the child has an area of weakness or a learning disability). If the MAP really does provide such above- and below-level information inexpensively, that seems to me like a boon to teachers and parents.

Helen Schinske
hschinske said…
I think the "benchmark" versus "formative" question may have to do with two slightly different uses of the word benchmark. The MAP is not a benchmark assessment in the sense of seeing only whether the student met specific grade-level goals. It is, however, meant to be useful in the sense of seeing where the student falls in relation to a whole range of national norms in different areas. Those norms are sometimes also called benchmarks.

The press release at http://www.nwea.org/about-nwea/news-and-events/nweas-measures-academic-progress-selected-state-approved-formative-assess states: "The term 'formative assessment' refers to interim assessments, benchmark assessments, or any other similar tools that are designed and used to gauge the academic progress of students throughout a school year."

Helen Schinske
Anonymous said…
SPS Mom, I have the same question that you do. If this is the test that is to be used to evaluate a teacher's performance from year to year based on "student growth", how exactly will that work?

As I have said before, what if a child DOES have a "bad day," is hungry or tired, or there are issues at home? Does the teacher get points off of their evaluation for that?

And there is still the issue that most students have figured out, based on my discussions with my students and reading other posts here, how to get through the test quickly if they want to: just answer two questions incorrectly and the test becomes easier and they are done.

From what I am reading here, this test should not be used to judge a teacher's "effectiveness" or that of the principal.

I believe that $4M was set aside in the levy to have this test implemented citywide.

I'd rather see the money go to hire more teachers and decrease classroom size, bring in some enrichment after-school programs to supplement what the teacher does not have time in class to emphasize because of class size, behavioral issues and varying levels of ability.
Anonymous said…
As a recommendation, if you have issues with the test, contact your school board representative as well as your principal and let them know. They do have the power to veto the test or to require the superintendent to re-examine it and ITS effectiveness.

Below is a link to contact information for all of the board members. If you send an e-mail to them, copy Pamela Oakes. She will print it out and put it into the board members' boxes to ensure that they read it.

http://www.seattleschools.org/area/board/contact.xml
hschinske said…
Really good explanation of summative, formative, and benchmark assessments here: http://www.ascd.org/publications/educational_leadership/dec07/vol65/num04/The_Best_Value_in_Formative_Assessment.aspx

The whole article is well worth reading, but the following seems particularly relevant to how the MAP results may well be used (or rather not used) in Seattle schools:

"Benchmark assessments, either purchased by the district from commercial vendors or developed locally, are generally meant to measure progress toward state or district content standards and to predict future performance on large-scale summative tests. A common misconception is that this level of assessment is automatically formative. Although such assessments are sometimes intended for formative use—that is, to guide further instruction for groups or individual students—teachers' and administrators' lack of understanding of how to use the results can derail this intention. The assessments will produce no formative benefits if teachers administer them, report the results, and then continue with instruction as previously planned—as can easily happen when teachers are expected to cover a hefty amount of content in a given time."

Helen Schinske
hschinske said…
These numbers were crunched by Brad Brenneke

Do you mean Brad Bernatek?

Helen Schinske
Anonymous said…
Just saw this article. The Race to the Top is getting close to home.

The Herald in Everett
"Low test scores mean Totem Middle School principal likely leaving"

http://www.heraldnet.com/article/20100307/NEWS01/703079904

This is the result of student test scores having an impact on teachers and principals.
SPS mom said…
This comment has been removed by the author.
hschinske said…
Even the 1985 policy is too vague -- "validity" is supposed to have a specific statistical meaning (which I will bet you the WASL doesn't fulfill). Doing without any such policy at all is just frightening.

Helen Schinske
Dorothy Neville said…
As I said before, this testing thing is supposed to be a no-brainer policy update, but there are a couple of significant issues. SPS mom is correct; we should be concerned. At the C&I meeting where this was discussed, DeBell liked the fact that the language would now allow (or at least make it clear) that test data could be used for teacher evaluations. And then there's Enfield's comment about how the assessment policy could push fidelity of implementation. That seems off the wall to me. But anything that makes staff feel positive about fidelity of implementation is worrisome.

Getting rid of the language or intent that testing can be part of evaluations isn't going to happen. But getting language added about indicators of appropriate assessments, and ensuring that assessments pass standards of reliability and validity, perhaps could. Does anyone on the board have a sophisticated understanding of statistics?
TechyMom said…
It's just one data point, but my child's MAP percentiles are very, very close to her percentiles on the private achievement testing we did 2 weeks later.

The private test was the WJ-III-B broad reading and broad math, done one-on-one with a psychologist. My understanding is that this test uses a similar approach of adding easier and harder questions based on previous answers.

Does anyone else have other standardized achievement test scores to compare MAP scores to?

If the self-administered computer test is showing similar results, that tells me that it might be a pretty good measure of achievement. It's surely less expensive than having every kid tested one-on-one.

Met standard/didn't meet standard is a waste of time, IMHO. I want to know where she's ahead, where she's behind, and by how much.
TechyMom said…
Dorothy,
I think KSB understands statistics. Wasn't there a 'scandal' about whether or not she had a college minor in statistics?
Shannon said…
My son tests at the same percentile in the private achievement test as he did in the Winter MAP tests.

It's not the same time period, but it indicates he's where we expect.
hschinske said…
The Woodcock-Johnson isn't an adaptive test in the same way that the MAP is, but it is an individual test that is not limited to grade-level material. However, it doesn't have a great many questions at each grade level, so it's not a diagnostic test in the same way -- it's meant more for establishing how unusual the child's level of achievement is. I would expect a child who scored very high on one to score high on the other, but not necessarily at exactly the same level.

Helen Schinske
grousefinder said…
hschinske...this may help alleviate your confusion about which type of test teachers have been told MAP represents in the broad spectrum of assessments we administer throughout the year.

This email came directly from a MAP lead to an elementary staff when the question was posed, "What type of test is the MAP?"

"1. MAP is a benchmark assessment measuring student skills in reading and math. The purposes of MAP are to inform instruction and monitor student progress."

I can see how some teachers would view MAP as formative, if they delve into the DesCartes tables that can be generated after each exam. But I know few teachers with the time to undertake that monumental task. Most teachers examine strand performance data and differentiate accordingly.

If the MAP has morphed into something other than a benchmark assessment since the aforementioned email of last month, we (classroom teachers) have not been advised of the change. So, we work with the information we have: MAP is a benchmark assessment designed to measure performance against an established standard (GLEs and/or PEs).
udubgrad said…
Parents must have access to all student records including WASL and ITBS if requested under FERPA.
"FERPA is a Federal law which affords parents the right to have access to their children's education records, the right to seek to have the records amended, and the right to have some control over the disclosure of information from the records. FERPA requires that a school comply with a parent's request for access to the student's records within 45 days of the receipt of a request." See www.parentempowermentnetwork.org
udubgrad said…
The need for parental access to student work under FERPA may be the reason MAP is not feasible for district use. Officials cannot use an assessment that is kept secret from parents.
ttln said…
Does the data you want exist? Does the program store a copy of the questions and answers for each test taker?
We get strand data which is tenuously tied to standards. I have found the statistical correlation between scores and 'the test formerly known as WASL' less than precise, either as meets standard (MS) or not.
Several other districts in the area use it. What do they think about it?
grousefinder said…
ttln...Each question has a code number on the screen. The test must be scored against an answer key correlating to the test. To access the questions, a request could be made for your child's MAP test if you wished, for example, to challenge the test's accuracy. It is every parent's right to view the exams given to their children, particularly if they are used for placement purposes.

On another topic...This is from another nearby school district's parent/teacher information PowerPoint (I found this fascinating): "Computerized Adaptive Assessment [MAP] • In an optimal assessment, a student answers approximately half the items correctly and half incorrectly • The final score is an estimate of the student’s achievement level."

With the old Edusoft exams the goal was 100% on all questions. Those questions were Standards-based; thus, a 4th grade student was tested with 4th grade material. There is a monumental difference in testing strategy between these two exams.
MJ said…
I just wish this district were more fiscally responsible. This is an expensive test that is totally useless. We as teachers are unable to figure out how to adjust our teaching to fit the results because the results are so vague. There is no specific set of key objectives to teach to if a child fails in one subject area. Number Sense covers a range of concepts, so to tell us that a child is weak in Number Sense tells us next to nothing. Which aspect of number sense--fractions? Which aspect of fractions--converting them to decimals, adding fractions? As if it were not enough that Everyday Math is an incredibly expensive and useless curriculum we also have to pay for, we get strapped with this pointless test. Okay, fine, some of the parents on this blog are happy to have their child's score graphed out and RITed for them. I, as a parent, have to wonder what kind of parent they are to be so gleeful about having their child so pointlessly labeled but hey, there you go, it takes all types. The elephant in the room is that this is a test we cannot afford. We cannot afford the cost and we cannot, given that half of the students are failing, afford the fact that it makes the entire district look incompetent when it is really the test that is at fault. And, for those parents who "like the test," let them be more fiscally responsible and pay for it. Our district, if you haven't noticed, is in debt and paying for these pointless assessments on credit.
hschinske said…
grousefinder, "formative" refers to what you USE the assessment data FOR, not what kind of test it is. There is no such thing as a specifically formative test. Benchmark tests, far from being the antithesis of formative assessments, are so often used for formative purposes that "benchmark" and "formative" have sometimes carelessly been used as SYNONYMS. See http://www.edweek.org/media/13testing.pdf

"Eduventures Inc. ... predicted that by 2006, what it called “the formative-assessment market”—using a term sometimes treated as a synonym for benchmark assessment—would generate $323 million in annual revenues for vendors."

The paragraph you quote reinforces the fact that MAP is intended to serve formative purposes -- what else does "inform instruction" mean?

In addition, "benchmark" merely implies the use of SOME permanent standard as a comparison, which need not be GLE-specific at all.

Helen Schinske
hschinske said…
The elephant in the room is that this is a test we cannot afford. We cannot afford the cost and we cannot, given that half of the students are failing, afford the fact that it makes the entire district look incompetent when it is really the test that is at fault.

MJ, even supposing for the sake of argument that you're right, wasn't all that far more true of the WASL? The best results in the world couldn't have informed instruction if you never *saw* them until the students under your hand were gone, and the MAP is vastly cheaper.

Helen Schinske
Dorothy Neville said…
wow.

Grousefinder, I am curious about that Edusoft test you speak about. I've never heard of it before. Can I ask what grades were using it? Was this district wide or just some schools?

The goal was 100% mastery. How did that work in practice? How good was the test for informing instruction? Could you, did you, differentiate instruction to fit the particular student? Since it was grade level standards, did you see WASL scores improving after implementing the Edusoft test to inform instruction better? What would you do with a student who early on got 100% of the questions correct? Did you have the time and resources to teach them further material or test them on out of level material to know what the appropriate level of instruction really was?

What about kids who got closer to 0% correct? Were the results useful? Would there be some kids for whom it would be useful to give a test a couple grade levels lower, so you could get a better grip on exactly where they were in the standards -- given that all you know is that they don't know anything about grade level standards?

I don't know anything about the MAP in practice. The goals for MAP sound pretty good to me though and seem like a tool that was missing for my son. 100% mastery on grade level assessments would have been a no-brainer for him and would just be rubbing salt into the wound if it didn't mean being identified as having mastered grade level and having the opportunity to be challenged. Same thing for a kid who got close to 0%. Without actual targeted intervention to raise achievement, taking such a test is meaningless.

The Edusoft tool has distinctly different sounding goals, but it sounds pretty good as well. Seems like it could be used to foster similar goals as MAP, if below grade level or above grade level assessments were administered as indicated.
hschinske said…
I had totally forgotten about Edusoft tests as well, but I looked them up and they were mentioned in the WMS newsletter for the period when my kids were there, so they must have taken them. Dorothy, your son probably had them at Eckstein.

Helen Schinske
Lori said…
MJ writes: Okay, fine, some of the parents on this blog are happy to have their child's score graphed out and RITed for them. I, as a parent, have to wonder what kind of parent they are to be so gleeful about having their child so pointlessly labeled but hey, there you go, it takes all types.

MJ, why such venom toward the other parents on this blog? We're all here because we care about kids and the state of our schools. Most of us try to have civil conversations and learn from each other, without resorting to denigration.

I'm sorry that your children hate doing the MAP test. If they feel like failures after it (which you wrote in an earlier post), perhaps no one has adequately explained the rationale for the test to them. My daughter enjoyed the test, even though she got questions wrong and saw things that she didn't understand. We explained to her the test would help the teacher know what to teach because it would identify things that she already knows, but just as importantly, things that she still needs to learn.

Our school is using the MAP data to inform instruction, exactly the point of the test. We've been told that there is a wealth of information available to the teachers, who can drill down on each student's results and learn more precisely about areas of strength and weakness. So although the report may just say "Number sense," a lot more information is available to the teacher.

In our case, my daughter scored about 2 grade levels ahead in the fall, a finding that was helpful to me because she was complaining about being bored and her homework appeared really easy, and I didn't know how seriously to take my concerns. So with the MAP results, we were able to better advocate for her and begin to explore whether her current school is a good fit. In the meantime, her teacher is using the information to inform instruction. For example, while the rest of the class continues with addition/subtraction exercises, my daughter and one other child are paired up to work on multiplication and division. Division was something she "learned about" thanks to the fall MAP test, and she wanted to know what it is and how to do it, and now she is learning it, at school, thanks to a skilled and dedicated teacher who wants to challenge each student in a heterogeneous class.

So, from my perspective as a parent, the MAP is a helpful tool. What you call "needless labeling," I call insight and actionable information that I can use to help my child have her unique needs met.
TechyMom said…
Thank you, Lori. You said that far better than I could have.
Shannon said…
Thank you, Lori. Very articulate, and exactly my feelings.

Regarding the point that, as a teacher, the RIT score and test results are not actionable: my experience with my son's scores was the opposite.

The test did not merely identify what he did not know; it showed his learning threshold. The results correlated with specific subject matter he should learn next, and they were very explicit: his fall scores said he should learn long division. He did, and his scores went up in winter. I am puzzled by your experience of the test and wonder if you understand it.

My child is not labelled by the test. He is who he is and scores from what he knows and can show he knows. He found the test inoffensive. The information helped us.
TechyMom said…
I remember seeing a link in the fall to a NWEA page that listed grade-level RIT scores, but I can't find it now (even after googling and clicking around on the NWEA site). Does anyone have that link handy? Thanks.
hschinske said…
TechyMom, you might be thinking of this collection of links: http://www.bismarck.k12.nd.us/district/data/newteacher/

Helen Schinske
hschinske said…
This comment has been removed by the author.
TechyMom said…
yes, that was it. thank you.
Eric B said…
I too found the MAP test very helpful. It gave me good information about what skills my daughter had, what she was ready to learn, and where there were holes in her skills. The Descartes information provided by her teacher allowed us to talk about her learning and future instruction in a concrete, specific way.
It seems like there is still a great deal of ignorance about the test. If you don't like it then don't use it, but don't let your clear prejudices and outright misunderstandings cause you to advocate for the removal of an assessment that can be used to influence instruction in a positive way. I don't think it is a good evaluation of either the entire student or instruction - it is one tool in a big toolkit that can be helpful to many.
grousefinder said…
Final word on MAP: Parents have not been informed, but I know, as a MAP lead, that the median RIT range delineating each grade level (by school or district-wide) is really a composite of multiple test-taker scores. Thus, if all children are performing at a low proficiency level, then the median will be lower for the entire testing population. The MAP test obfuscates the dismal academic performance district wide, because most students perform poorly in reading and math, thus shifting the Bell-Curve leftwards. What the test does do well is allow parents to feel good by exaggerating their child’s proficiencies in comparison to low-performing students. In part, this explains the MAP's popularity with the AP and Spectrum parents, as it serves as “evidence” of program success. However, the MAP, being adaptive, does not give an accurate picture of standards-based performance. But parents sure feel good when their children outscore the nationally-normed reference chart (another illusion). That chart factors in low-performing states like Texas.

So, if my classroom outscores your classroom, or our school outscores your school, or my kid outscores your kid, or your child appears to be three grades above their own, one must ask “compared to what?” The answer: compared to the blob, that hodgepodge of nebulous data with no points of correlation to where your children should be in a standards-based school system during any given year.

As I said earlier, EduSoft exams told parents exactly where their children were, based on what teachers are bound by contract to do: teach to the standards, differentiate for high performers, and remediate when required.
Anonymous said…
This comment has been removed by the author.
Anonymous said…
All I want to know is how a child will feel when taking one of these tests knowing that they might lose a valued teacher or principal or even their school if their test scores are not high enough, not up to a certain, perhaps even arbitrary, standard.

I am also curious about this reliance on tests to understand how your child is doing.

Between seeing how my daughter was doing with her homework, speaking to her teacher, and following her progress and her understanding of the subjects she was introduced to in school, I knew exactly how my child was doing.

Listening to some of you talk about these tests and the minuteness of parameters and percentages, I wonder exactly what you are talking about: your children, with their wonderment of learning, their curiosity and where it might take them, their ability to succeed in the way that they can, in a way that is unique to them? Or someone who is to be measured, tested, and retested to your satisfaction, to help you feel that somehow, and in some way, they are average or better than the child next to them?

Is this where we’ve come to in terms of education?

Is this what it’s all about now?
Anonymous said…
And Eric B is right. The MAP test, like the WASL, is not mandatory. Any parent can opt out of the test.

My issue with the test is the emphasis on the test, particularly when it is equated to the performance or "effectiveness" of a teacher, a principal or a school.

A principal is losing her job in Everett because of low WASL scores over the last three years. The school is in a depressed area and it has been a struggle to get parents and the community involved in the education of these children.

So, instead of the school being provided with funding to ensure additional support for the students, the principal will be fired. It was either that or have half of the staff fired. What an awful decision to have to make as a principal.

Now, how will those children feel next time they take a WASL or a MAP test? Will they feel the weight of the world on their shoulders?

But that's what the Race to the Top is all about: evaluations based on student test scores. If the students don't do well, someone loses their job, or a school closes and is "transformed". And in a profit-based, corporatized, and increasingly privatized society, that school will more than likely become a charter school.
Anonymous said…
And now for a little humor.

http://susanohanian.org/cartoon_fetch.php?id=539
hschinske said…
Dora Taylor writes

My issue with the test is the emphasis on the test, particularly when it is equated to the performance or "effectiveness" of a teacher, a principal or a school.

Again, it would be far worse to use the WASL for such purposes, as the WASL is not adapted for showing students' progress, and the only thing that seems to count is pass rates (which were never intended as data to judge individual students or teachers, only the broad effectiveness of schools and programs).

In theory (and we do NOT have enough data to know if this is a reasonable use, hence I echo Dora's concern to some extent), students who are being well taught can demonstrate their progress on the MAP, whether they're starting from remedial levels, average levels, or high levels. We would finally see teachers getting credit (again IN THEORY) for the students who start out well below level and make amazing strides in one year, even if they don't technically get up to grade level. In so far as students' test information is ever useful in evaluating teachers (and I know many people don't think it ever is), this is the kind that can be so used, in a way which could not be said of the WASL, the ITBS, or the Edusoft tests.
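
To make that contrast concrete, here's a toy example in Python with entirely invented numbers (the cutoff and the RIT-like scores are made up for illustration, not anyone's real data):

# Toy comparison (invented numbers): a pass rate only counts who cleared a
# fixed bar; a growth measure credits how far each student moved.

PASS_CUTOFF = 200                    # hypothetical "grade level" score
fall   = [150, 160, 185, 205, 220]   # hypothetical fall scores, one class
spring = [175, 184, 204, 212, 226]   # the same students in spring

pass_fall   = sum(s >= PASS_CUTOFF for s in fall) / len(fall)
pass_spring = sum(s >= PASS_CUTOFF for s in spring) / len(spring)
mean_growth = sum(sp - fa for fa, sp in zip(fall, spring)) / len(fall)

print(f"pass rate: {pass_fall:.0%} -> {pass_spring:.0%}")   # 40% -> 60%
print(f"mean growth: {mean_growth:.1f} points")             # 16.2 points

The two lowest students gain 25 and 24 points and still "fail," so the pass rate barely budges even though the teaching clearly worked; that is exactly the progress a growth measure captures and a pass rate hides.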

Will it work out well in practice? I doubt it, given the generally boneheaded way that this district has used test results in the past, and given the generally boneheaded way that NCLB has been playing out. But that doesn't mean that all tests are evil, or that they have no relevance to instruction. My own policy is going to be, as it always has been, to champion using test data intelligently or not at all.

I may mention that I have often posted on this blog about how useless average SAT scores are as a statement about a school's effectiveness. I happen to believe that the SAT is a very useful test for some applications, but I don't think that particular statistic has much meaning at all. Nor did I think Bob Vaughan was right in saying that the PSAT would be useful for tracking whether APP instruction was effective (ceiling effects, dude -- PSAT in 9th grade is too little, too late; incidentally the same appears to be true for the MAP in middle school from what I've seen). So it's not just that I'm all knee-jerk on recommending tests because my kids score high on them.

Helen Schinske
Lori said…
I continue to be dismayed by the characterizations being made about those of us who are generally supportive of the MAP as one tool, among many, to help teachers educate our children. I'd really like to understand some of the arguments that are being made, but when the posts devolve into ad hominem attacks, it's hard to have a two-way conversation.

Apparently, if I find the MAP helpful, I am some sort of robot parent who wants her child "measured, tested and retested" until I'm satisfied. Or, I'm a smug parent who wants to prove to the world that my child is superior to everyone else's children. Or perhaps I'm just too stupid to understand and interpret nationally normed data.

I'll be frank. I love numbers. I love data. I studied biostatistics in graduate school, but I cannot understand what Grousefinder is trying to say here: "the median RIT range delineating each grade level (by school or district-wide) is really a composite of multiple test-taker scores. Thus, if all children are performing at a low proficiency level, then the median will be lower for the entire testing population. The MAP test obfuscates the dismal academic performance district wide, because most students perform poorly in reading and math, thus shifting the Bell-Curve leftwards."

First, all the MAP data that I have seen (and I'm just a parent, not a teacher or otherwise involved) has been mean RIT scores, not median. But even if the data were presented as median scores, what does it mean to say that they are a composite of multiple test takers? Are you saying that they didn't take the actual median of the group (i.e., the score at which exactly 50% of the students scored lower and 50% higher)? How does the next sentence follow logically after that? Of course if all the test takers have low scores, the median score will also be "low." That's how medians work; they are a measure of central tendency. If the median score in SPS is truly as low as claimed, then how does the MAP obfuscate that finding? You just said that the median is low because everyone is doing terribly, but then say that somehow we are failing to see that fact. If the bell curve is truly shifted left relative to the nationally normed population, where is the obfuscation? That alone would tell us something.
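
To make my own point concrete with invented numbers: if district scores really are shifted left, the median reports that fact rather than concealing it. A quick Python sketch (all scores hypothetical):

# The median is just the middle value, so a low-performing population
# produces a visibly low median -- nothing is hidden.
import statistics

district_scores = [180, 185, 190, 192, 195, 198, 230]  # hypothetical RITs
national_median = 205                                   # hypothetical norm

print(statistics.median(district_scores))                    # 192
print(statistics.mean(district_scores))                      # ~195.7
print(statistics.median(district_scores) < national_median)  # True: the gap is visible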

Maybe folks are ready to move on from this conversation. Maybe we can't even have this conversation without access to the district's data. But I am certainly not willing to concede that my interest in and nascent support of the MAP is coming from ignorance or arrogance. The leadership at my school is excited about MAP, and I support them in attempting to find out whether and how it can be used to inform instruction.
Dorothy Neville said…
Lori, I am glad you asked that, because I haven't had statistics since college, so I figured I must be missing some higher-level understanding. I didn't get that paragraph at all either. I don't put it past the district to obfuscate data, but I really didn't get the argument posited here.

I also don't understand this Edusoft thing. Evidently it was a benchmark assessment done district-wide in elementary and middle school and was only recently replaced with MAP. Grousefinder says that parents got the results. I have never seen any such results, and I don't know if my son took the assessment. Do others here recall getting such results?
hschinske said…
Lori, I took that paragraph to mean simply that the MAP is a nationally norm-referenced test. Which, it should be needless to say, is NOT news to parents.

More info about Edusoft math tests here: http://www.k12.wa.us/RTI/AssessmentGuide/MathematicsTechnicalReport.pdf

which is linked from http://www.k12.wa.us/RTI/AssessmentGuide.aspx, which has some more interesting stuff.

"The Assess2Know Benchmark item bank lets educators construct tests that are aligned to the Washington Learning Standards.
The Assess2Know Benchmark item bank allows districts and schools to create interim assessment programs that follow their district pacing guide or curricula, while being able to assess student mastery of the State Standards tested on the state’s summative assessments. The Mathematics item bank contains items for grades 3–11 aligned to Washington Standards. Districts can use the Edusoft® Assessment Management System, an online test generator, with this item bank to select the standards they want to assess on a particular test. After the items have been selected and the order of the items determined, a PDF or Word document of the form is created so that the assessment can be printed. Alternatively, the assessment can be administered online."

According to http://www.seattleschools.org/area/board/08-09agendas/061709agenda/nweareport.pdf, "Another key consideration was the district’s experience building assessments from an item bank, which is the strategy used currently with Edusoft. This process is very time-consuming for instructional coaches as it requires annual review for validity and alignment to the curriculum."

Helen Schinske
SPS mom said…
This comment has been removed by the author.
Lori said…
Thanks, Helen and SPS Mom, your posts make a lot of sense to me and help clarify the issues.
hschinske said…
Some correction is done on the data to fit it to national norms: it's not just a question of who showed up for the test that year. See http://www.nwea.org/support/article/980, which states: "Status norms are from a stratified data set that mirrors the proportions of the national school age population in terms of ethnicity and school level socio-economic status at each grade level."
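
For anyone who wants the stratification idea spelled out, here is a rough Python sketch with made-up subgroups and proportions (nothing below is NWEA's actual data or method):

# Instead of averaging whoever happened to test, weight each subgroup's
# mean score by that subgroup's share of the national population.

sample_means    = {"group_a": 210, "group_b": 195}  # mean score per subgroup
sample_shares   = {"group_a": 0.7, "group_b": 0.3}  # who showed up to test
national_shares = {"group_a": 0.5, "group_b": 0.5}  # national proportions

raw        = sum(sample_means[g] * sample_shares[g] for g in sample_means)
stratified = sum(sample_means[g] * national_shares[g] for g in sample_means)

print(raw)         # 205.5, skewed by the makeup of the sample
print(stratified)  # 202.5, reweighted to mirror national proportions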

Helen Schinske
SPS mom said…
This comment has been removed by the author.
hschinske said…
I have my doubts about whether including more private school students would raise the norms *that* much. They're included in SAT and ACT norms, and those are low enough, goodness knows. In any case, lots and lots of norms are available. You can always compare students to the 60th or 70th percentile in their grade if you like, or to the 50th percentile a grade up, or whatever, if you think the regular norms are skewed low.
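
For instance, once you have a norms table, comparing against a tougher benchmark is a one-line lookup. A Python sketch (the norm values here are invented for illustration, not NWEA's published numbers):

NORMS = {  # grade -> {percentile: RIT score}, hypothetical values
    3: {50: 199, 60: 203, 70: 207},
    4: {50: 206, 60: 210, 70: 214},
}

student_rit, grade = 205, 3
print(student_rit >= NORMS[grade][70])      # at or above grade 3's 70th percentile?
print(student_rit >= NORMS[grade + 1][50])  # at or above grade 4's median?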

Helen Schinske
MJ said…
I can understand why we teachers on this blog are so impassioned about this test. We have seen how absurd the questions are--really, asking a third grader about the elements of a sonnet to test their reading comprehension, good grief! And you should know, I am a teacher and a parent. As a parent, I know just by asking my child questions and having them read to me where they are academically, so I couldn't care less one way or the other how my child scores. I am dubious that my two children are really two grade levels above where they are, but again, I couldn't really care less. As a teacher, I do care, because my professional evaluation is tied to this test and I am expected to teach the elements of the sonnet to third graders when most of them are just starting to understand what a simile is. So, my real question is, why are you parents so impassioned about MAP? You know what is at stake for the teachers on this blog and why we care so much. It is hard to teach third grade and be held accountable for high-school concepts (sonnets!). Please, just tell me, the parents on here, WHY DO YOU CARE SO MUCH ABOUT THIS TEST?! Why is it drawing so much interest? What promises and hopes are you hoping it will fulfill? I am asking sincerely and with no bitterness.
SPS mom said…
This comment has been removed by the author.
hschinske said…
MJ, a lot of it is reaction to the WASL. The MAP has been brought up over and over again in the last few years as a possible better, cheaper, more useful alternative, and a lot of us were gobsmacked (in a good way) that the state actually adopted it. I've repeated several times in this thread what I think the possible advantages are.

Incidentally, according to http://technology.usd259.org/resources/NWEA/documents/Vocabulary.pdf, the term "sonnet" isn't used until RIT levels 221-230, which for a third grader is well above the 99th percentile (and even at that level, remember that only half are expected to answer any particular question correctly). If you're seeing it at some much lower level, you might want to find out if there's a bug.

Helen Schinske
Grey said…
MJ,

Thank you for your question & I do see your concern about teacher evaluation.

Here is how the MAP test has helped my child. My child always struggled with writing. At every teacher conference I asked about the writing. Every year his teachers told me that he just needed to try harder, focus more, not be so lazy or sloppy. His test scores declined in every subject as the years went by. Then last year he almost failed the WASL. His teacher was not surprised. The school would not evaluate him because he was not actually failing. The teacher was surprised by MAP scores that showed my son in the 99th percentile in every category. The teacher thought the MAP scores were not accurate. But I was able to use the difference between the MAP scores & the WASL score to push for a SIT & push for testing for a learning disability. Further cognitive & achievement testing supported the MAP scores. A learning disability related to small motor skills was identified. I hope that next year’s teacher will use the MAP scores when determining instruction for my child, because I can demonstrate that they better represent my child’s academic abilities than the WASL. Without MAP he is just a dumb, lazy kid.
Lori said…
MJ, I've tried to be clear that I think MAP might be a helpful tool, among many others that a teacher has at her disposal. I am not "impassioned" about it, and I apologize if I've come across that way. All I know is that my daughter enjoyed doing it, my school administration is enthusiastic about it as a teaching tool, and I have anecdotal evidence that my daughter's teacher is using the results to aid in differentiation.

I'm on the record elsewhere on this blog as being against using standardized tests for teacher assessment. There are just too many variables that go into test results, many of which are outside the teacher's control. I'd like to see teacher performance somehow evaluated with process measures rather than outcome measures. Unfortunately, I don't know that the state of education research has made clear what classroom processes define an effective teacher. I did find the article in the NY Times linked in another thread very intriguing.

I also didn't think that using the MAP for teacher evaluations was a done deal. You said "As a teacher, I do care because my professional evaluation is tied to this test and I am expected to teach the elements of the sonnet to third graders when most of them are just starting to understand what a simile is." I'm sorry, but this is just not what I've been told by teachers at our school. They seem to think that the union will successfully fight against merit pay based on the MAP. They also are not at all worried about having to somehow "teach to the test" or in your example, teach sonnets to 3rd graders. What am I missing? Is this a done deal? Are they telling you at your school to teach to the test?

Finally, you also said "As a parent, I know just by asking my child questions and having them read to me where they are academically..." Well, maybe some of us aren't that astute. I'm not a professional educator and I've never worked with children. I have one child who is 6. I look at her work and think that's what all 6 year olds are doing. Through a combination of teacher conferences, spending some time in the classroom, and yes, the MAP results, I started to get a clearer picture of what her needs are and whether or not they are currently being met. I wish it were as simple as having her read to me, but without some frame of reference, I couldn't tell if she was on track, behind, or ahead.
Eric B said…
MJ- I think I already answered your question about how I find it useful. I am sorry if you think the questions are absurd, based on your extensive knowledge of the test which you have demonstrated in your many erroneous assertions. I saw many of the test questions while helping proctor the test (although NOT a complete or representative sample) and did not find them absurd.
Why is it that folks are so worried that the MAP test will become (or has become) the one and only thing that determines if a teacher or a student is successful (whatever that means)? We never saw that with the WASL, and we all know that there are many, many ways that both the teachers and the students in SPS are evaluated. How is it harmful to have one more way of looking at things? Especially since this is the one thing that can be tracked from year to year.
MJ said…
Thank you, this helps me to understand a parent's point of view. Now I know the reason for parent passion and see that it is justified. Yes, the WASL was biased against those who had trouble with writing. I am glad we are moving away from written tests, and the new WASL, the MSP, will have no writing other than the fourth grade Writing portion. And, from a parent's point of view, I have to be honest: at home, I could easily see that my son was good at math and weak in reading, and the MAP showed this, but his teacher evaluated him just the opposite, which, of course, made me question her method of evaluation. The MAP confirmed what I felt I had seen at home.

Can you see the frustration a teacher might have, though? We are expected to teach very specific grade level objectives mandated by the state. Last year, everyone in the district had to be on the exact same EveryDay Math lesson week by week. So, we teach exactly the grade level objectives we are required to, but then we are evaluated on whether a child will improve and master objectives well beyond our grade. Our evaluation, and the belief that we are not doing a good job teaching, is based on the expectation that each child must improve and that somehow, somewhere in the day, we will need to have taught sonnets, scatter plots, first, second, and third person voice, etc.! And to add to this frustration, we are not given released test items, we do not know what objectives will be tested, and we have no template to tell us what to teach so we know we are covering it. So, if an extremely bright child is unable to figure out what foreshadowing is and we have not, heaven forbid, taught it, then we are seen as bad teachers.

My classes have always scored in the 90th percentile on the WASL. I knew how to teach to the objectives that were being tested on the WASL. All of a sudden, only half of my class is improving, and I have to go from being perceived as a good teacher to one of these poor ones who should be sacked. And I love teaching: all of my class time is focused and not spent on behavior issues, I do all of the management things that were written about in that New York Times article, my students love coming to class, and there is passion and joy and what I thought was so much learning going on. Then to be hit with being in the category of those bad teachers who only have half their class improving--it is hard to stomach.
MJ said…
What does it mean, "find out if there's a bug"? How would I do that? There are students in my class that are scoring in the 220 range. I teach a three-four split. As far as my assertions being erroneous: I also help proctor this test and have now proctored all three of last year's tests (our school was one of the pilot schools) and the two we have had this year, and every question I have quoted as being on the test was on it. I have been honest about my role in this drama; I wonder if some of you blogging on here are being honest about your role. Some of you "parents" seem to have more knowledge about this test than a parent would. And if you are part of the adoption of this test and do have some influence in the district office, will you please just consider two things:
1. Add more reading to the reading comprehension questions. Having a child read no more than 4 sentences at a time is NOT reading; it will not accurately assess how they will read chapter books and non-fiction text, and it will not give a parent accurate data on their child's reading ability in school and college. Please consider having them read at least a two-paragraph selection, if not more.
2. Look at the math questions and the state standards for each grade level and have them more closely aligned.
Thank you.
Maggie Hooks said…
This comment has been removed by the author.
Eric B said…
MJ- to me you showed exactly how this test can be used positively. You and your child's teacher were not seeing the same thing, and the MAP test was a helpful tool in getting on the same page. I also see that you are receiving good information as a teacher. You are learning that many of your students are ready to go beyond the district- and state-mandated grade level objectives; perhaps now you can begin to think about how to address those needs and their readiness. Before the MAP, you may not have had that information - especially not in a way that you can use to defend your choice of material or curricula that meets the needs of the students in your class.
Your point about how to do this in the current era of standardization is spot-on. MAP is a tool that treats students individually and gives us information on them as individuals. It is designed to be used to tailor instruction to individuals - the antithesis of the standardized curriculum. How does my daughter's teacher tailor the math instruction to the individual students when he has to be on a particular lesson in the Everyday Math curriculum? I don't know the answer to that, but as someone who believes (without evidence) that instruction and curricula should be tailored to student needs, I hope that the MAP test will allow appropriate discussion of the role of standardized curricula in classrooms that are demonstrably diverse. And yes, I am just a parent - a parent who spent about 4 hours on the NWEA website when the MAP test was introduced. Before that I had basically never heard of it.
hschinske said…
What does it mean, "find out if there's a bug"? How would I do that? There are students in my class that are scoring in the 220 range. I teach a three-four split.

Okay, I thought you might mean that you were seeing that kind of question pop up for a child who was more in the middle of third-grade level questions, and according to the NWEA materials, that shouldn't be happening. If such a thing did happen, it might be due to miscoding or something, so it should be reported as a bug.
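
For anyone curious how that works mechanically, here is a bare-bones Python sketch of adaptive item selection (deliberately simplified, and emphatically not NWEA's actual algorithm): each answer nudges the ability estimate, and the next item is drawn from near that estimate, which is why a 225-level sonnet item should never surface for a student sitting at middling third-grade levels.

# Toy adaptive test (invented response model and numbers).
import math
import random

BANK = list(range(150, 261, 5))  # hypothetical item difficulties, RIT-like

def prob_correct(ability, difficulty):
    # simple logistic response model: ~50% odds when the item matches ability
    return 1 / (1 + math.exp((difficulty - ability) / 10))

def pick_item(estimate):
    # serve the item whose difficulty is closest to the current estimate
    return min(BANK, key=lambda d: abs(d - estimate))

def run_test(true_ability, estimate=200, step=8, rounds=15):
    for _ in range(rounds):
        item = pick_item(estimate)
        correct = random.random() < prob_correct(true_ability, item)
        estimate += step if correct else -step
        step = max(2, step - 1)  # take smaller steps as the test settles
    return estimate

# A student around 195 never climbs near a 225-level item; one at 230 does.
print(run_test(195), run_test(230))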

So you've got a lot of students who are capable of above-level work, according to the MAP. When you judge them against grade-level standards, they naturally look terrific. This is all a Good Thing in itself. The question their MAP scores raise is whether they need curricular accommodations, and how practical it would be for you to provide above-level resources and instruction.

You've probably heard the jargon phrase "zone of proximal development" -- anyway, for those who haven't, it's about the fact that people learn best when they're learning something that's somewhat new, but not so new that they can't make sense of it or put it into context. The MAP scores are one piece of data indicating that some of your students' zones of proximal development may be further out than grade-level materials can handle.

First off, is that true, based on other things you know about them? Second, if it is true (and it's almost bound to be true for some of these kids), what can you reasonably be expected to do about it? Is there anything in the current curriculum that you can compact or have them pretest out of, to allow time to work on something higher? Can students do independent reading that would expose them to higher-level concepts?

I've heard of a lot of teachers who were being expected to differentiate while at the same time expected to teach a highly structured curriculum with no room for changes. You're right, it isn't a sane expectation. One of the worst examples of such district doublethink I remember was at Whittier, when the Spectrum classes for years were expected to teach to math standards one year up -- using grade-level textbooks. Yeah, neat trick. (That changed in our last year there: they do have one-grade-up math the last I heard.)

I did see some tools out there for providing enrichment for students who score at various levels. See for example
http://www.sowashco.k12.mn.us/ro/pages/studentlinks/map/reading.htm

http://www.sowashco.k12.mn.us/ro/pages/studentlinks/map/

Those are obviously kind of canned stuff, but they might be useful in some contexts, dunno.

I don't think you should be expected to tailor curriculum to the MAP any more than high school teachers tailor their curriculum to the SAT or the ACT (which isn't a lot, except that they cover the same broad areas). But I do think some differentiation is reasonable to expect, especially if your class is advanced in general and therefore doesn't need the usual amount of review.

Finally, I for one do not work for the district and never have. My first professional job was as a librarian, and I am now an editor who also does research and fact-checking. Though I've been interested in the MAP for years, a surprising amount of what I've posted on this thread is stuff I didn't even know before looking it up to post here.

The whole reason I can stand to be an advocate in the Seattle schools at all is that I'm a data wonk who gets enjoyment out of putting all this stuff together. Otherwise it would just be far too depressing.

Helen Schinske
Dorothy Neville said…
What Helen said.

Additionally, just because a child gets a question about sonnets doesn't mean you must be teaching them that. What it means for reading comprehension and thinking skills is that you could encourage more complexity in the stories they read and in the writing they do. And when 3rd graders are scoring that high, you don't expect positive growth on the MAP every three months; there's too much variability. You should be protected from worrying about that.

As for math growth: well, as a former math teacher (not in Washington State), I feel your pain regarding the scripted fidelity of implementation. But perhaps the MAP can help? You probably have interested parents who can make sense of the scores and help advocate here. If you have some kids who are ahead (or behind) and not making progress on MAP, then that is a further argument against the standardization of the math curriculum calendar. What I mean is that this is the sort of data that shows you can be -- and should be -- teaching the kids, not teaching the pacing guide.
