Tuesday, April 14, 2009

Federal studies show most programs yield few results

'No Effects' Studies Raising Eyebrows
EdWeek.org
By Debra Viadero
March 31, 2009

...The studies are part of a new generation of so-called "scientifically based" research that was set in motion by the institute, the main research arm of the U.S. Department of Education, when it was created in 2002.

The body of research employs a study design called "randomized controlled trials," in which subjects are randomly assigned to either an experimental group or a business-as-usual group. Although rarely used in education before the wave of studies backed by the IES, such designs are widely considered to be the "gold standard" for determining whether an intervention works.
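The design is simple enough to picture in code. Below is a minimal sketch (in Python) of the random-assignment logic the article describes: students are shuffled into a treatment group and a business-as-usual control group, and the estimated effect is the difference in average outcomes. The sample size, scores, and 5-point effect here are invented purely for illustration; a real IES study would, of course, use actual student achievement data and formal statistical tests.

```python
import random
import statistics

# Minimal sketch of a randomized controlled trial analysis.
# All numbers below are made up for illustration only.

random.seed(42)

# Hypothetical pool of 200 students, identified by index.
students = list(range(200))

# Random assignment: shuffle, then split evenly between the
# experimental group and the "business as usual" control group.
random.shuffle(students)
treatment_group = students[:100]
control_group = students[100:]

def simulated_score(treated):
    """Simulated post-test achievement score (illustrative only)."""
    base = random.gauss(500, 50)   # baseline achievement, arbitrary scale
    effect = 5 if treated else 0   # assumed small program effect
    return base + effect

treatment_scores = [simulated_score(True) for _ in treatment_group]
control_scores = [simulated_score(False) for _ in control_group]

# The headline estimate: difference in mean outcomes between groups.
diff = statistics.mean(treatment_scores) - statistics.mean(control_scores)
print(f"Estimated program effect: {diff:.1f} scale-score points")
```

Because assignment is random, the two groups should look alike on average in everything except the program itself, so the difference in means can be read as the program's effect. A "no effects" finding means that difference was too small to distinguish from zero.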

Of the eight such studies released by the federal institute this academic year, six have produced mixed results pointing to few, or no, significant positive effects on student achievement.

They include studies on: school-based mentoring programs in elementary school; commercial software programs for teaching mathematics; various certification routes for teachers; teacher-induction programs; interventions for boosting literacy instruction for disadvantaged preschoolers and their families; and professional-development initiatives in reading.

In addition, the research agency’s final evaluation of the federal Reading First program, which uses a research design that differs slightly from the randomized controlled approach, found that the $6 billion federal reading program improved young children’s decoding skills, but failed to make dramatic differences in reading comprehension.

On the other hand, an ongoing study of “double dose” reading classes for struggling 9th grade readers is showing positive results. And a head-to-head comparison of four different elementary math curricula identified two philosophically different programs that gave 2nd graders an added boost in that subject over the standard curricula...
