MAP (Measures of Academic Progress)

A thread was requested on this subject. There are five previous threads on this subject to refer to as well.

(Sorry, I made a bad assumption that readers would know I meant the assessment tool and not new boundary maps. My apologies.)

Comments

zb said…
OK, you got me, what's a thread on "MAP"? Are we talking about the new boundary maps? or something completely different?
Robert said…
I think it is MAP testing.
gavroche said…
Thanks, Melissa.
ZB -- here's a brief description from http://seattle-ed.blogspot.com/:

[MAP tests are] the new computerized, standardized tests the district is administering this year to all students, from as young as kindergarten to grade 9.

MAP stands for "Measures of Academic Progress™" (yes, it is a trademarked product) and will be administered to the kids three times during the school year. The test can take as much as two hours each session, according to the district's official announcement letter (see: http://bryantschool.org/index.php?option=com_content&task=view&id=487&Itemid=343).


Here are all my questions on the subject:

I'd like to know who decided to buy the MAP(tm) test for SPS, when, and why.

Was there any public input on this district choice and expenditure? How much did it cost? Was this the best use of district money when there are so many other more immediate needs?

Are standardized, computerized tests appropriate for kids as young as 5 in kindergarten who aren't even reading yet? Have some kids indeed figured out how to outsmart the adaptive mechanism of the tests, thus skewing the results? Do we really want precious class time spent on even more testing?

What are the tests really going to be used for -- to evaluate kids or to evaluate teachers? What has the first round of tests shown so far? I heard from one teacher that no one seems to know how to make sense of the results. Is that true district-wide?

I and others would also like to know: why, in Sept. 2008, did Supt. Goodloe-Johnson join the board of directors of the Northwest Evaluation Association, which manufactures and sells the MAP(tm) test? Was this before or after the district decided to buy NWEA's product? Were any other companies or products considered? Did the Superintendent's position on the NWEA board influence the school district's decision to purchase that company's product? Does Supt. Goodloe-Johnson stand to profit financially from this association? Isn't this a conflict of interest?

(see:
http://www.nwea.org/about-nwea/our-leadership
http://www.nwea.org/about-nwea/faq/General%20Information#faq-1043
http://www.seattleschools.org/area/news/sbnews/nwea )
Anonymous said…
My 1st grader thought it was kind of fun to do. They did it in the beginning of the year, and the information was used in part to place children into appropriate reading groups based on skill. I am hoping that they use it to differentiate math teaching as well, but I have no evidence that that is happening yet for her as the homework she is getting is way beneath her skill set.

They are supposedly also using MAP results as part of gifted testing. My daughter took the cognitive abilities test up at school this past Saturday, and rather than bring her back for more testing of reading and math, the letters we got said they will use MAP results instead of standardized Woodcock-Johnson Achievement tests.

The staff at our school seems to like it. That said, I did talk to one mom whose older child had figured out that if he gave wrong answers, he'd assure himself easy questions, and some kids were discussing the "benefit" of doing this amongst themselves. So in some cases, MAP may not reflect the child's ability. I don't know what educators can do other than stress that children answer honestly, and if results are out of whack with what is known about a child, perhaps disregard the MAP results?
hschinske said…
It's not an SPS decision as far as I know -- isn't it statewide? Even before Dorn's decision to scrap the WASL, MAP was being piloted in a lot of districts in Washington, as early as 2006, according to http://www.leg.wa.gov/documents/legislature/ReportsToTheLegislature/waslfinalrpt_5bde0acd-a2f1-4f6f-9a5e-c36a600b2fb9.pdf

"In 2006, $250,000 was provided for grants to school districts to purchase diagnostic assessments (according to the statutory definition). Most school districts used the grants to purchase Measures of Academic Progress (MAP) from the Northwest Evaluation Association. In the 2007-09 biennium, $4.9 million was appropriated and about 30% of the funds went for MAP during the 2007-08 school year. Funds for the 2008-09 school year were redirected toward development of a statewide system of diagnostic assessments. There is still $3 million in the 2007-09 budget for diagnostic assessments that has not been allocated."

Helen Schinske
Unknown said…
At our school the MAP was used along with other assessments to do first grade math placement. I'm curious to see the MAP results at conference time.
hschinske said…
This post may be of interest: http://northbeachexcellence.blogspot.com/2009/10/request-your-students-map-scores.html

Helen Schinske
seattle citizen said…
I'm of two minds about MAP: On one hand, it's a standardized, computerized, systematized franchise... On the other hand, it can (and does) give a generalized way, common across schools, to look at the general level a student might be at. If used over time, and if students do their best on it, it could provide some measure of where a student needs further instruction (comprehension, vocabulary, etc.).

Of course, as with any single indicator, one would need to do follow-up classroom assessments to determine accuracy.

But it might be helpful over time to determine placement (eek, tracking!). But if a student is a "fourth-grade reader" in 10th grade, SHOULD they be in regular LA10? Or maybe in LA10, but with a reading class also?

If you don't know this in August, how do you schedule the student appropriately?

The argument would be that grades tell us this, but there ARE 10th graders with 4th-grade skills... are the grades accurate?

Additionally, MAP has a function they call "Descartes" which essentially brings up which skills a student might be lagging in given their score. This could help teachers "put a face" on the data, particularly if a teacher isn't a reading teacher and might not be regularly assessing comp, etc, or even math.

It's a snapshot, incomplete and perhaps too general (Descartes gives target skills for that particular score, not for that particular student, and its suggested areas of remediation could be off), but I think it might prove helpful.
SPS mom said…
This comment has been removed by the author.
Shannon said…
I really like the idea of it. My only concern is that the data be accurate -- you know, there's nothing more frustrating than classes set up to differentiate based on scores when you know your kid was hitting the same answer every time because they didn't want to do it or were after easy answers.

My son enjoyed it. He liked the lack of time pressure -- he felt pressured in some tests (spelling) this year because of his handwriting, and he liked the computer format. He freaked out a bit when the questions got too hard, and he came home asking "should I know the square root of 225? Like, HUH!?? And what about questions with letters in them and not numbers?? HUH?" But that's more about coming to this school from another one and being very sensitive about being left behind in what is being studied.

All in all I am keen to see what our teacher is doing with them in Third Grade and am optimistic that she is finding it useful data. It took her AGES to conduct reading evaluations on all 29 kids at the beginning of the year!!
jp70 said…
Shannon, my first grade son did the same thing. He came home after the test asking what is the thing with a line and a dot above and below the line (division) and then asked what 4 divided by 2 was. He was upset because he guessed 4 because he had never been exposed to division before (at least with the division sign). He is pretty good at math and is an overachiever so he was pretty upset at first. I'm curious if we'll get the results at the PT conference.
Syd said…
I received a letter today: according to my 9th grader's counselor, we should get my son's MAP results this month. He will be bringing them home.
Well, at our school the teachers have told me they're already finding that MAP is rather inaccurate. Plus, kids as young as 4th grade "bragged" that if they pretend not to know the answer, the questions get easier. Seems like a problem....
Joan NE said…
Here are a couple of things I know about MAP. It costs $13 per student per year, which covers 3 benchmark tests during the school year and 1 test in summer.

Jessica de Barros was put in charge of MAP (she was in charge when I inquired about it last March). She is a Broad trainee (she may be a Resident by now).

I saw it mentioned in the minutes of a recent Board meeting that the District has decided to drop the DRA (Developmental Reading Assessment) in favor of the MAP.

If you have seen an elementary school progress report for your student, then you know what a DRA assessment looks like. It is very informative. I strongly doubt that a computer-based test can be as good as a teacher-administered assessment. Does it save teacher time? I don't know.

The Principal at Lowell told us that they probably won't be sharing the MAP reading assessment results with parents, because it is not easy to interpret (at least, that is my recollection of what he said). I will miss the DRA. It is quite informative for parents.

At minimum, the District should have compared results from the DRA and the MAP in the pilot test. That comparison would show whether the MAP results are highly correlated with, and as informative as, the DRA. Did the Board ask whether this comparison was done? I didn't notice any mention of it in the Board minutes. Has the District done such a comparison? Who knows?
SPS mom said…
This comment has been removed by the author.
seattle citizen said…
I wonder how, and if, MAP will fit into Performance Management?

Will we find out Nov. 12 when PM is explained? Does anyone know what PM actually IS?
zb said…
"Plus, kids as young as 4th grade "bragged" that if they pretend not to know the answer, the questions get easier."

I find that amusing and interesting, and wonder how one deals with it. "Stair-casing" (i.e., making a task more difficult in order to find the threshold of performance) is a standard psychological/behavioral technique. Its benefit is that it allows a quicker assessment of the threshold (i.e., the point where one goes from being able to do the task to not being able to). But it's prone to exactly this kind of flaw (as well as to the possibility that accidental errors at critical trials send the search for the threshold into the wrong set of questions).
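The gaming behavior described above falls straight out of how a staircase procedure works. Here is a minimal, hypothetical sketch of a simple up/down staircase in Python -- the scale, step size, and trial count are invented for illustration, and MAP's real adaptive engine is more sophisticated than this:

```python
# A toy up/down staircase: difficulty rises after a correct answer and
# falls after an incorrect one, homing in on the level where the
# examinee starts to fail. (Illustrative only, NOT NWEA's algorithm.)

def staircase(answer_correctly, start=50, step=5, n_items=20):
    """Run n_items trials; answer_correctly(difficulty) -> bool."""
    difficulty = start
    for _ in range(n_items):
        if answer_correctly(difficulty):
            difficulty += step
        else:
            difficulty = max(difficulty - step, 0)  # floor at easiest item
    return difficulty

# An honest examinee with a true skill threshold of 70 answers correctly
# whenever an item is at or below that level; the staircase settles near 70.
honest_final = staircase(lambda d: d <= 70)

# An examinee who deliberately answers everything wrong drives the
# staircase straight to the floor -- the flaw the kids discovered.
gaming_final = staircase(lambda d: False)
```

With these toy parameters, the honest run oscillates around the true threshold, while the deliberate-wrong-answer run bottoms out at the easiest items and badly underestimates the student.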
old salt said…
I have friends who teach in other districts who really like the MAP assessments.

One middle school teacher described how she reviews the feedback with each student and uses it to set individual goals for the next quarter, then reviews the results at the end of the quarter. The implication was that her whole school does that.

She said that it helps her target holes in individual students' learning, especially below and above grade level, that she might miss when doing curriculum-based assessments.
Dorothy Neville said…
FERPA! There is no such thing as not sharing results with parents. FERPA! All you have to do is remind them of FERPA and you will get the results. You should get access to any analysis of the results that they get.

As for the DRA, well, at Lowell it was a joke because many of the first graders ceilinged it the first time around. And then, even though they had gotten straight 4s on the highest-level assessment in first grade, the second grade teacher HAD to spend the time redoing the assessment.

As for the validity of the DRA, I know of one teacher (not Lowell, a more challenged population) so dedicated and careful she recorded each child and then at home relistened so she could more carefully score. I know of a different school where Parents! were given a little training and they administered it.
mkd said…
This may be a bit off-topic, but I'm wondering if anyone else's child's grade was penalized (i.e., lowered) for missing six days of school because of illness. My son was exposed to whooping cough, and the doctor was adamant he stay home until all of the antibiotics were taken; these missed days were excused absences. All missed work was completed and turned in in a timely manner. The teacher is an excellent instructor and I would never ask, except I was very surprised.
Lynne Cohee said…
MKD, I'm curious, what's the level of school?
mkd said…
High school. Something else I read: grades at some schools don't get a "boost" for things other than classwork. One of my kids has "coupons" he can cash in at the end of the semester to add points to his grade. Not a problem for him; his grade is already higher than an A. Do all schools do this?
CCM said…
WMS (Washington Middle School) does have "homework passes" that the kids can either use each quarter (I think there are 1-3 in each class) or turn in at the end for extra credit (15 pts EC in Science, for example). The classes that have them for our kid are math and science. My son actually used a pass once in math, but received extra credit for not using the passes in science.

With the amount of daily homework that he has, and his extracurricular schedule -- which is not as busy as most -- I'm pretty OK with the practice. I'm not a huge believer in enormous amounts of homework anyway.
Lynne Cohee said…
My son's teacher at Roosevelt has a policy where the students get no credit for participation on the days they miss, even if the absence is excused. They then have opportunities for extra credit work later in the quarter (I think there were maybe three such days) to offset those missed points. The policy is clear -- it's in the syllabus.

To me, that penalizes the kids at a time when they have already come back to school not yet 100% and are scrambling to make up the actual homework assignments they missed, plus keep on top of new assignments. I exchanged emails with the principal about it and he suggested I take it up with the teacher, but the teacher is excellent in every other way, my son is doing well in the class, and he does NOT want me to contact the teacher, so I'm respecting my son's feelings on this one.

According to Brian Vance, he's working with the staff to better clarify the school's class absences policy, and the current policy is that students with excused absences should have the opportunity to make up the work missed during their absences. I don't think this teacher's policy is consistent with that, but I don't intend to follow up on it.

It's a bit surprising to me generally that the school hasn't delivered a stronger message encouraging students to stay home when they are sick, but maybe that's partially a reaction to past experience at Laurelhurst, where the school went through the tragedy of losing a little girl to flu a few years ago. They are very strong about telling families to keep their kids home.
ArchStanton said…
This comment has been removed by the author.
mkd said…
All of you are right; coupons, etc., are a positive way to reinforce esteem and good behavior. My complaint is that my 10th grader's grade was lowered quite a bit because he was out with the flu. Despite all missed work being completed in a timely manner, the grade went from a mid-A to a high-B. Please tell me I'm being ridiculous. She's a great teacher and he loves her class. I'm just being a mom. Isn't MAP testing like STAR testing in CA? In CA, though, I always received a copy of their results. Under the law, aren't we entitled to a copy of everything in our child's school records?
TechyMom said…
I'll ask at Lowell. I know the ALO teachers (or, at least my child's teacher) are using MAP to determine reading groups and such. Not sure how they can tell you what they're doing to differentiate without telling you how your kid did on the test?
SolvayGirl said…
I agree with mkd and others that children should not be penalized for being out sick—especially in this crazy flu season.

At parent meetings at my daughter's independent high school, the staff stressed repeatedly that they wanted us to err on the side of caution and keep our kids home when they had any flu symptoms whatsoever. Homework is posted online, and students have email addresses for all of their teachers, so they can stay on top of things. I was allowed to go to my daughter's locker on curriculum night to get the books she needed.

I don't think she had H1N1, but I was glad that kids who might have it would not be coming to school. Docking a student's grade for being out sick is too punitive, IMHO.
dj said…
To my understanding, we're getting MAP results at Thurgood Marshall at parent-teacher conferences the week of Thanksgiving.

Forgive me, however, for saying -- I am not exactly sitting here with bated breath waiting for the results.
Joan NE said…
Seattle Citizen - can you comment on the relative value of the DRA versus MAP for teachers, students, and parents? Are they complementary or redundant? Are they equally informative? Is the MAP cost effective?

I presume that the district prefers the MAP for at least two reasons in addition to the pedagogical benefits: it provides a business income opportunity for a public-private partner, and it satisfies the need for a quantitative score for each student for purposes of data-driven decision making. To my way of thinking, these are not good reasons for adopting the MAP. I hope the district had very good, appropriate reasons for dropping the DRA in favor of MAP.
Joan NE said…
Seattle Citizen asked what Performance Management is, whether MAP fits into it, and noted that it would be explained Nov. 12. I have not heard yet what was revealed on Nov. 12.

I googled [K-12 "performance management" definition] and found this highly relevant doc from the Aspen Institute: http://www.tqsource.org/whatworks/WWC08buildingCapacity/resources/K-12_HCM_Framework.pdf [5 pp.]

I skimmed it very quickly. Here are a couple of quotes: "creating a performance management system that recognizes high performing teachers requires rethinking teacher evaluation, compensation and nonmonetary rewards for performance, the career development opportunities for exemplary teachers, and the creation of a professional culture that celebrates excellence and continuous improvement."

Another quote from the same doc: "In education, to the maximum extent technically and practically feasible, evidence of impact on student learning should be the primary criterion of performance. At issue is what measures of student learning should be counted (e.g. value added measures based on standardized test scores, other student performance measures), what in addition to student achievement results should be included in the definition and measure of good performance (e.g. observable teacher behaviors, contributions to school improvement), and what levels of reliability and validity are necessary for making consequential decisions."

Does this speak to Sea Citizen's questions?
