Comments on Seattle Schools Community Forum: "Reflections on Standardized Testing Forum"

TeacherTeacher (Anonymous), 2011-12-01 12:06:

An interesting development at NWEA: does this change your opinion?
NWEA.org/mpg-partners

Anonymous, 2011-10-17 10:56:

This comment has been removed by a blog administrator.

StopTFA, 2011-04-11 19:12:

OMG! We ARE Red Clay district!

Po3, 2011-04-10 21:01:

The TFA contract I pulled up provides for the release of student data to TFA, and for TFA to disclose it to third parties, if I am reading the following correctly.

http://www.seattleschools.org/area/board/10-11agendas/110310agenda/tfacontract.pdf

Pursuant to its obligations under the Family Education Rights and Privacy Act ("FERPA"), Seattle Public Schools hereby acknowledges that in the course of providing on-going professional development services for the purposes of improving instruction, Seattle Public Schools may disclose to Teach For America student identifiable data from individual Teachers, pursuant to 34 CFR §99.31(a)(6)(i)(c).

iii. Teach For America shall use and maintain such data as provided in 34 CFR §99.31(a)(6). In accordance with 34 C.F.R. §99.33(b), Teach For America may re-disclose student identifiable information on behalf of Seattle Public Schools as part of Teach For America's service to Seattle Public Schools of providing on-going professional development services.

iv. In accordance with 34 CFR §99.31(a)(6), Teach For America may also disclose student identifiable information on behalf of Seattle Public Schools to additional parties, provided that Teach For America, in advance, provide to Seattle Public Schools the names of such parties and a brief description of such parties' legitimate interest in receiving such information.

chunga, 2011-04-10 19:30:

Joan, thanks for the clarification on the alignment studies that NWEA does (more info at http://www.nwea.org/sites/www.nwea.org/files/NWEA%20State%20Standards%20Alignment%20Study%20Methods%20_3_.pdf for those interested). I can see how MAP gives some clues about the level of difficulty of state standards, but level of difficulty doesn't seem like the same thing as level of quality; that is, harder does not necessarily mean better (see http://www.alfiekohn.org/teaching/edweek/chwb.htm).

Joan NE, 2011-04-10 17:58:

Here is the proper link:

http://www.erdc.wa.gov/datasharing/pdf/data_inventory.pdf

Joan NE, 2011-04-10 17:57:

I just checked to see whether MAP scores may be going into our state's LDS (longitudinal data system).

I found this document:

http://www.erdc.wa.gov/datasharing/pdf/data_inventory.pdf

I do not see any mention of MAP scores, only state assessment results.

At least one state (Delaware) wants MAP scores to be part of its LDS.

If you haven't heard of LDSs, they're something to get educated about. There are profound privacy issues associated with them. Gates money is a major impetus behind encouraging all states to adopt mutually compatible LDSs, to facilitate seamless national-scale integration of state LDS data.

Joan NE, 2011-04-10 17:32:

On the question of the meaning of correlations of RIT scores to percent chance of passing the MSP/HSPE:

The chart helps us understand that the correlation isn't very good. NWEA publishes the correlations.

The fact that some kids with high RIT scores fail the MSP/HSPE while some kids with low RIT scores pass tells us that the alignment isn't particularly good.

Take a group of kids with the same level of mastery of math standards, and you will see a fairly large variation in RIT scores. Take a group of kids with the same RIT score, and you will find a fairly large variation in level of mastery of state math standards. You will also find a wide distribution in this group's scores on the next benchmark.

This phenomenon is documented in the 2008 NWEA Complete Norms report (see Figure 6.1 therein) and is expected, given the statistics of the test. This is why parents and educators need to be cautioned against overinterpreting RIT scores.

Once a child has a history of several MAP scores without too much variation, observed growth compared to expected growth becomes more meaningful than the absolute level of the scores.

Because RIT scores vary significantly among students with the same level of achievement, it is inappropriate to use MAP data for instructional planning and for flexible grouping!

Joan NE, 2011-04-10 17:03:

Melissa,

I don't know. If NWEA is not getting SSNs and names, that is very good. In general, does NWEA try to get contracts that call for the district to pass along student names and SSNs? If so, that may well be a violation of FERPA.

Does SPS give MAP data to the state's CEDARS system? I wouldn't be surprised if it does. Something I could check into. That is what I was suggesting might be happening.

Melissa Westbrook, 2011-04-10 14:17:

Joan, I thought the issue of student identification got taken care of during the contract debate. I recall them saying that they would take that out. Is it still there?

Joan NE, 2011-04-10 11:19:

Chunga, because NWEA has done what it calls "alignment studies" for most states, it is possible to use the results of those studies to compare the difficulty of state assessments: if we place on a single chart, for a single grade level and for multiple states, the (say) 85% passage-rate curve, the relative position of these curves indicates the relative difficulty of the state assessments.

Incidentally, the MAP-STARR for reading, which I haven't posted yet, shows the passage-rate curves to be flat from grades 8-10. This means the state's grade 8 and grade 10 reading assessments have similar difficulty. Just another example of what you can learn from nationally normed data that is not subject to inflation and is not bounded.

Yes, we do have NAEP for detecting score inflation, so MAP is not needed for that, but MAP provides another means of detecting inflation. Aren't far more students tested by MAP each year than by NAEP? And aren't NAEP tests confined to some concept of grade-level standards? That is, aren't NAEP tests bounded, unlike MAP? If so, this gives MAP an advantage.

I have mixed feelings about MAP. I find high-quality data can be very useful for parents looking to prove to the district, state, or feds that certain policies and curricula are not in the best interest of children. For certain limited applications, MAP data can be considered high quality. But for these appropriate purposes we do not need 3x/yr scores.

I would rather see our district adopt a different benchmark system: one that provides high-quality, validly actionable, formative assessment data, accessible to parents and teachers, and that provides the aggregate data value that MAP provides. It would be good to have a product that, like MAP, has no floor or ceiling, and one that is valid for use with special education and ELL students.

I agree that the district should make it easy for parents to opt out, or should run an opt-in program.

Be forewarned: your child's MAP data is very probably going into a personally identified (via SSN) statewide longitudinal data system.

chunga, 2011-04-10 11:08:

Anonymous, I don't think you're splitting hairs. It is hardly clear that a correlation with MSP pass rate means much of anything. MSP is certainly derived from state standards, but is itself a limited, and I would argue superficial, sample of students' mastery of them.

chunga, 2011-04-10 10:43:

Joan, it's not clear whether MAP is a good proxy for checking state standards. Without some assessment of the quality of MAP, isn't it equally likely that MAP will correlate better with a "bad" state standard than with a "good" one?

And if our goal is to have some independent benchmark of state standards, don't we already have NAEP?

Joan NE, 2011-04-10 10:32:

I made a reference to a comment by chunga... here is that comment:

chunga has left a new comment on the post "Reflections on Standardized Testing Forum":

Being nationally normed says nothing about the value or validity of the MAP test. As a mostly multiple-choice test, MAP provides only a limited and relatively superficial view of student learning. Since it has no value as a formative assessment or as a tool for instructional planning, and is not suited for school or teacher assessment, the district really has no business spending millions of dollars on MAP, not to mention the time and facilities it takes up.

What is even more disturbing is that NWEA recommends teachers work with students to set target goals. Refer to this video to see how dystopian this can get: http://www.youtube.com/watch?v=MVLwu6uQK2I&feature=player_embedded.

If some parents really want to see such scores, then perhaps they could be given on an opt-in basis.

Joan NE, 2011-04-10 10:31:

I agree with chunga.

Due in part to national norming, MAP does provide some data value; for example, it can be used to detect score inflation in independent standardized tests (I cannot now locate the published example of this that I saw recently).

Another example: through nationally normed MAP data it is possible to compare the rigor of different states' high-stakes summative assessments. This is a proxy for comparing the rigor of different states' standards.

I agree with chunga that MAP ABSOLUTELY should NOT be used for instructional planning:

1. The use of MAP data for this purpose has not been validated.

2. The technical characteristics of the MAP/DesCartes products and the statistical noise in individual student data do not support this use of MAP data.

Our children are harmed when MAP is used for instructional planning.

chunga, 2011-04-09 22:38:

Being nationally normed says nothing about the value or validity of the MAP test. As a mostly multiple-choice test, MAP provides only a limited and relatively superficial view of student learning. Since it has no value as a formative assessment or as a tool for instructional planning, and is not suited for school or teacher assessment, the district really has no business spending millions of dollars on MAP, not to mention the time and facilities it takes up.

What is even more disturbing is that NWEA recommends teachers work with students to set target goals. Refer to this video to see how dystopian this can get: http://www.youtube.com/watch?v=MVLwu6uQK2I&feature=player_embedded.

If some parents really want to see such scores, then perhaps they could be given on an opt-in basis.

Joan NE, 2011-04-08 18:44:

Correlation between RIT and state standards? Anonymous' comment on this is correct.

I suggest, very roughly, that the band between the 75% and 85% curves represents students working toward good-or-better mastery of state standards. This is speculative; it is based on knowing the distribution of RIT scores of a class of tracked students who were studying 6th-grade standards.

If one accepts this suggestion, then you can use this chart to estimate how far behind grade level a kid is whose RIT scores track well below the 75% passage-rate curve. For example, a ninth grader at the 50th national percentile has perhaps mastered only 4th- or 5th-grade standards. Should this student be forced to study algebra in 9th grade?

We must accelerate the kids at low percentiles in the earliest grade possible, so that a much higher percentage of SPS kids are ready to study algebra by ninth grade and can succeed in it.

Anonymous, 2011-04-08 16:38:

"I'd also like to point out that Joan's charts visually support a correlation between MAP RIT scores and state standards"

I'm going to split hairs and say this isn't exactly true. The correlation is between MAP RIT scores and passage rates on state tests. The skill set necessary to get comparable scores may be similar, but it is not necessarily true that MAP is aligned to state standards.
My understanding is that the "correlation" simply provides a predictor of MSP pass rate based on RIT score. Who knows what state standards have been mastered.

Joan NE, 2011-04-08 15:17:

Chris, the MAP statistics are poor at the upper grades even at the one-year interval.

Table 4.3a of the NWEA Complete Norms report (2008) shows that the expected one-year growth (fall to fall) for 10th-grade math is 3.3 RIT points, and that the standard deviation (SD) of "individual test scores around growth trajectories" is 12 RIT points. Thus the 95% confidence interval for a 10th grader's growth is roughly [-21, +27] RIT points; nearly half of it lies in the negative-growth realm.

I received SPS data recently that shows the expected pattern: an increasing proportion of students showing negative growth as grade level increases over the Fall 2009-Winter 2010 benchmark interval, with roughly 40% of tested kids in grades 6-9 showing negative growth.

The situation for reading at the upper grades is even worse.

The best statistics for MAP (grades 2-11) are for math in grade 2, where the fall-to-fall (one-year) expected growth (EG) is 15 RIT points and the SD is 9. Grade 2 also has the best statistics for reading: mean fall-to-fall EG is 14, and the SD is 10.5.
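The negative-growth fractions discussed in this thread follow directly from the norms figures if you assume, as a rough model, that individual growth is normally distributed around the expected value. A minimal sketch of that check (the function name is mine; the EG/SD inputs are the 2008 norms figures cited above, and the normality assumption is mine, not NWEA's):

```python
from math import erf, sqrt

def frac_negative_growth(expected_growth: float, sd: float) -> float:
    """Expected fraction of students whose score drops between two
    benchmarks, assuming growth ~ Normal(expected_growth, sd)."""
    z = -expected_growth / sd
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))  # standard normal CDF at z

# Fall-to-fall figures from the 2008 NWEA Complete Norms report, as quoted:
print(frac_negative_growth(3.3, 12.0))   # 10th-grade math: ~0.39
print(frac_negative_growth(15.0, 9.0))   # 2nd-grade math: ~0.05
print(frac_negative_growth(14.0, 10.5))  # 2nd-grade reading: ~0.09
```

Under this model, about 39% of 10th graders would show a math score drop even over a full year, while fewer than 1 in 6 second graders would, which matches the proportions described in the comments.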
Thus, fewer than 1/6 of 2nd-grade kids should show negative growth on the reading or math MAP.

Joan NE, 2011-04-08 15:08:

This comment has been removed by the author.

Chris S., 2011-04-08 14:41:

I'd also like to point out that Joan's charts visually support a correlation between MAP RIT scores and state standards (and indeed the NWEA source study was designed to show "alignment" of MAP with WA state standards).

HOWEVER, Maureen made the very good point that both the MSP (state) and MAP are highly correlated with SES (socioeconomic status), so you can wonder what kind of correlation would be left if you could adjust for that. Also, that observation is probably the key to how MAP can be simultaneously aligned to many states' standards.

Chris S., 2011-04-08 14:37:

I would like to highlight this from Joan's post; it's worth reading twice:

"It is hard to understand what value 3x/yr MAP testing is providing in upper grades when the expected growth is so small compared to the uncertainty."

To say it another way: the change you see over a yearly testing interval might be large enough to be significant (i.e., believable, not attributable to chance), but over shorter intervals it almost certainly is not.

And Joan can correct me if I'm wrong, but the published uncertainty measure refers to the grades 3-8 MAP test. I don't believe any such number has been provided for MAP for Primary Grades, NWEA's K-2 product. I would assume it is higher, if only because of the greater confounding by computer familiarity among younger kids.

Chris S., 2011-04-08 14:36:

This comment has been removed by the author.

Joan NE, 2011-04-08 13:08:

I uploaded to Scribd a one-page flyer called "Parent's guide to interpreting their child's MAP scores":

http://www.scribd.com/doc/52608936

This is for parents who worry that MAP scores indicate lagging achievement (percentiles are dropping) or low-level achievement (percentiles are low). Please comment on this flyer; let me know if it is wrongheaded. If so, I will pull it.

One person complained that the NWEA norms chart has no value for educators. But if a teacher is inclined to accept the notion that kids working at grade level tend to plot between the 75% and 85% passage-rate curves, then perhaps a teacher can use the chart to estimate the levels of standards mastery represented in his or her classroom.

I don't know for certain yet, but I suspect that the median of SPS students is close to the NWEA national median. If correct, then 50% of SPS ninth graders are scoring at or below the NWEA national median, and this chart suggests that about half of SPS 9th graders have RIT scores indicating that they should be studying grade 5 standards or lower, rather than algebra.

The preceding interpretation of the MAP-STARR rests on two assumptions:

1. Students will have a difficult time mastering math standards if they did not achieve at least a moderately high level (say 75% or better) of mastery of the prior grade level's standards.

2. Having a math RIT score above the 75% passage-rate line is a good indication that the student is making good progress on grade-level-appropriate standards. This does not mean that plotting below this line is clear evidence of poor mastery. (Students who have very good guessing strategies on the MAP, or who are studying above grade level, will tend to plot above the 85% line.)

In light of this interpretation we can see how wrongheaded SPS policy on algebra enrollment is, i.e., that all kids take algebra by ninth grade at the latest and that algebra teachers do not teach remedial math. What happens is that high school science teachers use many weeks of their instructional time teaching remedial math so that students can do the math required for the science course.

Joan NE, 2011-04-08 11:44:

The NWEA norms on the MAP-STARR chart provide me, at least, with some very interesting insights into the MAP product.

The value of this chart increases if one adds the uncertainty interval that is typical of student RIT scores. The uncertainty attached to student scores is typically +/-6 RIT points, regardless of grade level. (The RIT score ranges shown on grade reports, typically +/-3 RIT points, cover only about a 68% interval, not the full 95% confidence interval.)

Parents should draw this uncertainty interval when they plot their child's data on the NWEA norms chart (or MAP-STARR); these intervals will help them avoid overinterpreting test-to-test variation in scores.

If you plot these vertical bars (+/-6 points) on one of the norm curves in the MAP-STARR chart, you can see that in the upper grades the expected growth between benchmarks becomes very small compared to the size of the confidence interval. What this means is that in the upper grades a very high proportion of students tested (on the order of 40%) will see a score drop over any two benchmarks (fall-to-winter, winter-to-spring).

It is hard to understand what value 3x/yr MAP testing provides in the upper grades when the expected growth is so small compared to the uncertainty. In the lower grades, the expected growth is much larger relative to the uncertainty; consequently, the data from elementary MAP tests carry far more information than the data from the upper-grade tests.

I have located a copy of an NWEA report that is not generally available to the public and that speaks volumes about the patterns of uncertainty in the NWEA test-score database. There are charts in this report showing the very high level of test-to-test score variability in the score trajectories of 300 randomly selected individual students.

Viewed on a coarse scale, MAP data has some value for some parents, at least in the lower grades, but it seems to me to have about zero value in the upper grades.
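One way to make the "growth vs. uncertainty" point in this last comment concrete: if the +/-6-point band is read as a 95% interval, the per-test standard error is about 3 RIT points, and the smallest score change that is distinguishable from measurement noise can be estimated as below. This is a sketch under that reading; the +/-6 figure comes from the comment above, while the 95% interpretation and the function are my own illustration, not an NWEA calculation:

```python
from math import sqrt

# Assumption: the +/-6 RIT band quoted above is a 95% interval,
# so the per-test standard error of measurement (SEM) is ~6/1.96 ≈ 3.
SEM = 6.0 / 1.96

def min_detectable_change(sem: float, z: float = 1.96) -> float:
    """Smallest score change distinguishable from noise at ~95% confidence,
    when two independent test scores each carry standard error `sem`.
    The SD of a difference of two such scores is sqrt(2) * sem."""
    return z * sqrt(2.0) * sem

print(min_detectable_change(SEM))  # ~8.5 RIT points
```

Under these assumptions, an individual student's score must move by roughly 8.5 RIT points before the change means anything, which dwarfs the ~3.3-point expected annual growth quoted for 10th-grade math and explains why benchmark-to-benchmark drops are so common in the upper grades.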