More on what works in math and science from this NY Times article, "Guesses and Hype Give Way to Data in Study of Education." Want to be data-driven? Use real research.
But now, a little-known office in the Education Department is starting to get some real data, using a method that has transformed medicine: the randomized clinical trial, in which groups of subjects are randomly assigned to get either an experimental therapy, the standard therapy, a placebo or nothing.
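The logic of a randomized trial is simple enough to sketch in a few lines. Here's a minimal, hypothetical illustration (all numbers and the 5-point "boost" are invented, not from the studies discussed): students are randomly assigned to a treatment group that gets the new materials or a control group that doesn't, and the difference in average outcomes estimates the effect.

```python
import random
import statistics

random.seed(42)

# Random assignment is the key step: it makes the two groups
# comparable on average, so outcome differences reflect the treatment.
students = list(range(200))
random.shuffle(students)
treatment, control = students[:100], students[100:]

# Simulated test scores; we assume (purely for illustration)
# the new materials add ~5 points on average.
def score(boost):
    return random.gauss(70 + boost, 10)

treatment_scores = [score(5) for _ in treatment]
control_scores = [score(0) for _ in control]

effect = statistics.mean(treatment_scores) - statistics.mean(control_scores)
print(f"Estimated effect of new materials: {effect:.1f} points")
```

Real education trials add complications (clustering by classroom, attrition, pre-tests), but the core comparison is exactly this difference in group means.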
The findings could be transformative, researchers say. For example, one conclusion from the new research is that the choice of instructional materials — textbooks, curriculum guides, homework, quizzes — can affect achievement as profoundly as teachers themselves; a poor choice of materials is at least as bad as a terrible teacher, and a good choice can help offset a bad teacher’s deficiencies.
So far, the office — the Institute of Education Sciences — has supported 175 randomized studies. Some have already concluded; among the findings are that one popular math textbook was demonstrably superior to three competitors, and that a highly touted computer-aided math-instruction program had no effect on how much students learned.
Jon Baron, the president of the Coalition for Evidence-Based Policy, a nonprofit, nonpartisan organization, said the clearinghouse “shows why it is important to do rigorous evaluations.”
“Most programs claim to be evidence-based,” he said, but most have no good evidence that they work.
If Massachusetts were a country, its eighth graders would rank second in the world in science, behind only Singapore, according to Timss — the Trends in International Mathematics and Science Study, which surveys knowledge and skills of fourth and eighth graders around the world.
While Massachusetts has a richer and better-educated population than most states, it is not uniformly wealthy. The gains reflected improvement across the state, including poorer districts.
Not too shabby. It's a compelling story about a state that actually listened and didn't turn into a whirling dervish of ed reform.
The three core components were more money (mostly to the urban schools), ambitious academic standards and a high-stakes test that students had to pass before collecting their high school diplomas. All students were expected to learn algebra before high school.
Also noteworthy was what the reforms did not include. Parents were not offered vouchers for private schools. The state did not close poorly performing schools, eliminate tenure for teachers or add merit pay. The reforms did allow for some charter schools, but not many.
Then the state, by and large, stayed the course.
On tests administered by the federal Education Department, Massachusetts, which had been above average, rose to No. 1 among the 50 states in math.
What about the "math wars"?
The “math wars” erupted at the turn of the millennium, culminating in a sort of détente — constructivism was purged, but the new Massachusetts standards did not prescribe a new approach. They stated what students were to learn, but not how teachers were to teach. “What came out of it ended up being a good document, because it contained no pedagogy,” Dr. Kendall said.
That allowed teachers like Ms. Walsh to devise and improve their own approaches.
From Ed Week, a HUGELY important article about Common Core math assessments and math (hint: you'll need a calculator).
Although calculators have not figured prominently in discussions of the common-core math standards, it's likely the complementary tests will result in far greater uniformity in their use on state exams across states. And the rules emerging from the two state testing consortia are sure to influence regular classroom use of calculators, experts say.