
Tuesday, September 25, 2007

WA State Students Static in NAEP Testing

According to an article in today's Times, Washington State students are in a holding pattern when it comes to progress on the National Assessment of Educational Progress (NAEP). From the article:

"But none of Washington's scores were significantly higher than they were in 2005, the last time the exams were given, according to the National Center for Education Statistics (NCES), the part of the U.S. Department of Education that administers the exam.

Washington's fourth-graders scored 224 out of a possible 500 in reading, compared with the national average of 220, and 265 in eighth-grade reading, compared with 261 nationally.

In math, Washington's fourth-graders scored 243, compared with 239 nationally; and eighth graders scored 285 to the nation's average of 280.

While Washington's scores didn't improve, 18 other states posted significant increases in fourth-grade reading, and six in eighth grade. And fourteen states and the District of Columbia posted gains in math in both grade levels. In Washington, the achievement gap between white and African American students widened."

About the NAEP:

"A representative sample of students takes the exams, which are given periodically in reading, math, science and a variety of other subjects. This year, a total of 702,000 fourth and eighth-graders from all 50 states plus the District of Columbia and Department of Defense schools were involved, and roughly 3,000 from Washington state. Participation used to be voluntary, but has been required since 2003 under the No Child Left Behind Act."

I wish we would drop all the state tests and use this one national test that would give a clear picture of how students are doing, across the nation and locally. NCLB requires it anyway; why not make it the test of choice? The money and time it would save would be tremendous and both could go towards helping students.

25 comments:

Anonymous said...

I have often wondered why each state is allowed to come up with its own test, i.e., the WASL. How can we truly compare our children to those across the nation when the tests differ from state to state? Why don't we have a national test so that we have a true comparison? Also, there is power in numbers: if everyone were taking the same test, perhaps it would be a stronger, more meaningful test than the WASL.

Dan Dempsey said...

Melissa,

To the best of my knowledge:

The NAEP was given to around 3,000 Washington students in grades 4 and 8.
The WASL was given to 74,000+ students in grade 4 and to 78,000+ in grade 8.

The NAEP is given to carefully selected representative samples of the student populations. It is not given to everyone at grades 4 and 8.

Dan

Anonymous said...

Mr Dempsey,

Do you know how students are selected to take the NAEP test? Are they carefully selected, and if so, how? Or, are they randomly selected? I'm very curious.

If you insinuate manipulation you have to back it up with facts.

Dan Dempsey said...

apples to apples,

You said: "more meaningful than the WASL."

In many respects it would be hard to be less meaningful than the WASL.

Consider grade 10 math, where at least 80% of the algebra-related questions were judged to be at the pre-algebra level.

From 1999 to 2005, Iowa reading test scores in WA were flat at grades 3, 6, and 9, while WASL reading scores rose rapidly at grades 4, 7, and 10.

Any nationally standardized, normed test would be stronger and more meaningful than the WASL; if it were not, the test makers would be out of business.

Dan

Dan Dempsey said...

Apples to apples,

I am not insinuating that there is any manipulation in the selection for the NAEP. I think NAEP test takers are chosen to be as representative a sample of the population as can be managed. The TIMSS is done the same way, I believe.

What I do believe is that in NAEP math the content emphasized has changed over time.

Dan

Anonymous said...

Dan,

You said the NAEP test was given to a "carefully selected representative sample" of the Seattle student population.

I would like to know how the district "carefully selects" a group of students to take this test? Since you made the statement, I assume you know the answer to this question.

respectfully,
apples to apples

Dan Dempsey said...

Apples to apples,

From the Times:
Not only was there no improvement in Washington's average scores, its African-American students also didn't rank as high as in the past. In 2003, for example, the average reading score for Washington's black fourth-graders was higher than the average for black fourth-graders in any other state.

This year, however, 14 other states posted scores that were as high or higher for that group of students.


Keep in mind that these results are coming from small samples of the student population. Could the sample picked be causing the differing results over time?

What I do know is that in SPS math there is an achievement gap for Black and Hispanic students that has grown significantly larger over the last decade. Granted, this is measured by the WASL at grades 4, 7, and 10.

This school district uses a defective definition of math:
Mathematics is the language and science of patterns and connections. Doing mathematics is an active process of constructing meaning through exploration and inquiry.

The Everyday Math adoption was intellectually fraudulent as Ms. Santorno forced this very poor curriculum on the SPS.

We now have an example of exactly what kind of leadership is in store for the future of SPS.

Autocratic Centralized Decision Making is here now. Look at the WSHS data that supports the 4-period day. Then look at the decision. Then look at the communications.

From Ms. Santorno's letter of Sept 21, 2007:
At the community forum, I indicated that there would be an appeal process. After further discussions, the Superintendent has determined that this is an administrative decision that is not subject to an appeal process. Every decision that we make needs to be made with a district-wide perspective in mind. We do not believe that the District is best served by having different schedule formats among our comprehensive high schools.

To improve a system requires the intelligent application of relevant data.

Complex systems will not operate with greatest efficiency when the decision making ignores the input from those closest to the day to day operations. Ms. Santorno's mandates that ignore the relevant data and insult the individuals closest to the actual educational process will not bring about the improvements needed.

Over the last decade the public has repeatedly asked for more resources in the classroom and smaller class sizes. The current SPS response is the exact opposite: more coaches, more administrators, fewer teachers.

Listen less and Mandate more.

If you need WSHS data then contact

dempsey_dan@yahoo.com

I need data, not more centralized mandates; unfortunately, the SPS provides only mandates, no data.

Currently the SPS is big on spending dollars for in-house Edu-Soft-designed testing: the exact opposite of what Melissa and most of us would like to see, which is economical testing that gives us a reality reference.

Fortunately, nine states are using end-of-course testing for many high school courses. Washington could be one someday.

Massachusetts has decided to pay to have the state tested and evaluated as if it were a nation in the next international TIMSS study. They seek increased contact with reality, unlike WA and SPS.

Dan

Dan Dempsey said...

Apples to apples,

As we look for truth, or if not truth at least sanity, a clarification is in order.

Dan did NOT say the NAEP test was given to a "carefully selected representative sample" of the Seattle student population.

Dan thinks that the NAEP test was given to a "carefully selected representative sample" of the Washington State student population.

The numbers I gave were Washington State counts of students tested at grades 4 and 8 on the WASL, not Seattle students.

I've got a book summarizing a decade of NAEP results, put out by the National Council of Teachers of Mathematics, in the trunk of my car. Perhaps I'd better go get it.

Dan

Charlie Mas said...

From Seattle Public Schools Policy C40.00, Testing Program:

"The purpose of diagnostic tests and/or periodic student assessments will be used to measure student mastery of the district's curriculum and local building effectiveness; standardized achievement tests will be used to measure the achievement of Seattle students against national norms."

The District has stopped administering standardized, norm-referenced tests, such as the ITBS. This oversight appears to be a violation of this Policy.

Anonymous said...

Dan,

You have provided a lot of information here. Thank you.

Can you now answer my question as to how Washington State "carefully selected" a group of students to be tested, as you stated in your earlier post? Please let me make my question very clear.

How did the district choose which students to test?

How were they "carefully selected by the district"?

Please just answer the question, or say you don't know.

When you make an accusatory statement, you should be able to back it up with facts, statistics, etc.


Thank you

Melissa Westbrook said...

What the article says is that a representative sample is used. I'd have to go look up exactly what that means. Apples to apples still has a valid point: why not give this test to ALL students if the goal of national testing is to see how students are doing? It could be a stronger, more meaningful test.

It would be interesting to find out what Massachusetts is doing that their scores keep rising.

Of course the goal of knowing how American students are doing is always, always overridden by this almost perverse need for "local" control of education. I don't want a one size fits all education program but this micromanaging of education likely hurts more students than it helps.

Roy Smith said...

The internal logic of standardized testing demands that every district and state use the same standardized test and give it to everybody. The NCLB testing requirements don't really make much sense when every state is allowed to create their own standards and tests.

I am not a big fan of standardized testing, but if we insist on doing it, we should at least do it in a way that makes some sort of sense. The WASL and NCLB in their current forms don't make sense.

Dan Dempsey said...

Dear Apples to apples,

I do not know how the selection was done but I shall try to find out.

Could you please tell me which statement you find accusatory?

Thanks,

Dan

Dan Dempsey said...

Charlie Mas said:
The District has stopped administering standardized, norm-referenced tests, such as the ITBS. This oversight appears to be a violation of this Policy.

What a kind word, "oversight."

Read Policies D44.00 and D45.00. I've been attempting to get the SPS to comply with its own policies. My lack of success would indicate that these are not oversights but failures to perform.

Dan

Anonymous said...

Dan,

I find the term "carefully selected," in the context that you used it, to be inflammatory, especially if you don't have any idea how the "carefully selected" students were chosen. Perhaps they were not carefully selected; perhaps it was random? Perhaps it was by parent request? Unless you have your facts, you can't just say random things; it leads to unfounded information being passed around and around.

Anonymous said...

apples to apples:

I expect that "carefully selected" in this context means that the test was administered to a subsample of the population chosen to reflect the demographic characteristics of the population as a whole (probably race, economics, and rural vs. urban). A randomly chosen sample might not, in any given year, have the same characteristics as the whole population. Sometimes in social-science research it makes sense to carefully select your target sample so it reflects your population--that doesn't mean the study is biased.

In this context, the use of "carefully selected" is not inflammatory--it's descriptive of the way the study was conducted.
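The selection approach described above is what statisticians call stratified sampling: divide the population into demographic strata and draw from each in proportion to its size, so even a small sample mirrors the whole. A minimal sketch of the idea, assuming invented strata and proportions (not NAEP's actual design or numbers):

```python
import random

random.seed(42)  # reproducible illustration

# Hypothetical strata and their (made-up) share of the student population
strata = {"urban": 0.55, "suburban": 0.30, "rural": 0.15}

def stratified_sample(students_by_stratum, total_n, proportions):
    """Draw a sample whose strata match the population proportions."""
    sample = []
    for stratum, share in proportions.items():
        k = round(total_n * share)  # seats allotted to this stratum
        sample.extend(random.sample(students_by_stratum[stratum], k))
    return sample

# Toy population: 1,000 students per stratum, labeled by stratum
population = {s: [f"{s}-{i}" for i in range(1000)] for s in strata}
sample = stratified_sample(population, 100, strata)
print(len(sample))  # 100 students: 55 urban, 30 suburban, 15 rural
```

Within each stratum the draw is still random; only the allocation across strata is fixed, which is why "carefully selected" and "random" are not contradictory here.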

Anonymous said...

Regarding the confusion (disagreement?) between Apples to Apples and Dan, it is my understanding that the districts do not choose the students to be tested. Statisticians from the NAEP program do that. I don't remember for sure, but I doubt that the districts are allowed to manipulate the selection. I did not read Dan's comment as insinuating that. I took his words "carefully selected" at face value, not as an ironic dig at the test.

Jay Mathews of the Washington Post has written about NAEP. Searching the archives of the Washington Post (and the NYT, now free!) would probably be fruitful.

WenG said...

Dan: Thanks for posting the SPS definition of math. I'm curious, where does it come from? (I’m guessing this is U of Chicago boilerplate, from the group that created Everyday Math.)

My definition of math is the language and science of quantity, and the relationship of quantities.

Last night I helped my 4th grader with EM homework. She was irritated with her worksheet. It didn't make sense to her. She's a good reader who normally comprehends what she reads, unless it's her EM book. I shared her irritation, because decoding the jargon of EM is all about finding a workaround in order to solve simple arithmetic problems.

Instead of using a grid for glorified finger counting, I committed the heresy of encouraging her to subtract, and she didn't "trade-first" (more EM jargon); she borrowed.

The thing that confused her most was the EM innovation of grouping numerical equivalencies together in a little box and calling them “names.” 12 is the name of a number, but equals are equal. Numerical equivalency defines what’s going on, and cultures throughout the world have a mutual understanding of equivalencies. Call them what they are. But what do I know, I’m not part of the Chicago brain trust that sold this curriculum.

Maybe EM is a back door into preparing kids for programming languages, where functions are often named for the expected result: a function might be named five because it computes 4 + 1 and returns 5. The name of the function matches the operation, or the result of the operation. And that's great, a whole new wrinkle on things, but before we venture into C++, let's nail down arithmetic.
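The "functions named for their expected result" analogy above can be made concrete with a toy sketch (the function name here is invented for illustration and isn't from EM or any real curriculum):

```python
# A function named for its expected result: the name "five"
# is also what running the operation produces.
def five():
    return 4 + 1

print(five())  # prints 5
```

In real programming, of course, names usually describe what a function does rather than its literal answer, which is part of why the analogy only goes so far.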

Dan Dempsey said...

Apples to Apples,

Maureen has it exactly right as to what "carefully selected" means, as opposed to random. To get a valid outcome from small samples, careful selection is required.

NAEP does the selection, just as TIMSS does the selecting. Both evaluate the demographics of the state, or nation and then go about selecting representative samples of that population to test.

Specifically how this was done for Washington and the NAEP - I have no clue.

Dan

Dan Dempsey said...

Weng,

I think the math definition comes from OSPI, as that is the organization that SPS usually copies.

Original thought in finding what works is not a strong point of SPS - of course you know that, as you wander through the mysterious adventure of EM.

SPS recently revised part of the math portion of its website - it was really comical reading what they thought EM does. I think they took it down.

Dan

Dan Dempsey said...

Weng,

Try this at OSPI:

http://www.k12.wa.us/CurriculumInstruct/Mathematics/default.aspx

That seems to be where a lot of this originates.

Dan

Anonymous said...

I am not a fan of our math curriculum either. In fact, I and both of my children have suffered the frustration that WenG and her child have. However, statistics like Washington State's performance on the NAEP give me a sense of confidence in the district. We are ahead of the national average in math and reading. This must mean that Washington is getting the job done, albeit in a way that is frustrating to those of us who are more comfortable with conventional math. If the NAEP test group was "carefully selected" and truly reflects our population, as Maureen suggests, then we should be comforted by the knowledge that we are ahead of the country.

Dan Dempsey said...

The NAEP is used for year to year comparisons within the same state.

The NAEP is NOT a valid instrument for evaluating the efficacy of education between states.

Washington has per capita the highest education level of all 50 states. Thanks to our employers importing lots of smart folks.

There is little to find encouraging in the NAEP stats if you know how to read stats.

No, our state survives despite the bizarre leadership of OSPI and the sheep who follow this defective shepherd.

Dan

Anonymous said...

"We are ahead of the national average in math and reading. This must mean that Washington is getting the job done."

I don't think that's necessarily true at all. I don't really care how Washington students *rank*. I care what they *know*.

I support the idea of criterion-referenced testing (even though I'm not a fan of the current WASL) because it never has made any sense to say that you wanted all students to be above X percentile, or worse, the *average* of all students' scores should be above X percentile. It makes *way* more sense to have minimum standards and be able to point to how many students met them.

Norm-referenced testing is by far the most appropriate tool for some purposes, but it stinks as a way to evaluate whether the vast majority of students are reaching appropriate levels.

I just wish education professionals had the sense to quit acting as if the choice was between using a hammer as a screwdriver, and using a screwdriver as a hammer -- or that those who fund education would admit that we need *both* screwdrivers and hammers to do a decent repair job.

Helen Schinske

Dan Dempsey said...

Correction.

Seattle has the highest percentage of college graduates of all major cities. Seattle is number 2 in post graduate degrees per capita.

I think the state is around number 10 in percentage of college degrees per capita.

If you want to go for small towns I believe that Los Alamos New Mexico has the highest percentage of Ph.Ds in the Universe.

Cheers,

Dan