Many educators and critics acknowledge that mathematics and science education are broken in the US, but what isn’t as clearly discussed is how, exactly. Now new data from the US Department of Education might jumpstart the conversation. A small office in the Education Department’s Washington headquarters is hoping to apply the same methodology that revolutionized medicine – the randomized clinical trial – to identify what works and what doesn’t when it comes to teaching STEM.
The New York Times’ Gina Kolata reports that the first set of results is being made public and already offers much material for education experts and advocates to digest. The report includes a surprising finding about STEM: instructional materials play almost as big a role in student success as instructional quality. So much so, it seems, that high-quality academic tools can more than offset the harm done by an incompetent teacher.
So far, the office — the Institute of Education Sciences — has supported 175 randomized studies. Some have already concluded; among the findings are that one popular math textbook was demonstrably superior to three competitors, and that a highly touted computer-aided math-instruction program had no effect on how much students learned.
Other studies are under way. Cognitive psychology researchers, for instance, are assessing an experimental math curriculum in Tampa, Fla.
The institute gives schools the data they need to start using methods that can improve learning. It has a What Works Clearinghouse — something like a mini Food and Drug Administration, but without enforcement power — that rates evidence behind various programs and textbooks, using the same sort of criteria researchers use to assess effectiveness of medical treatments.
So if all this information is now out there, why isn’t it being used? Largely because it hasn’t been sufficiently publicized: according to the Office of Management and Budget, only slightly more than 40% of schools are aware that the clearinghouse exists. Even those who know about it have reservations, specifically about how all this knowledge can be translated into the classroom to benefit students.
Nor is it clear that data from rigorous studies will translate into the real world. There can be many obstacles, says Anthony Kelly, a professor of educational psychology at George Mason University. Teachers may not follow the program as designed, for example.
“By all means, yes, we should do it,” he said. “But the issue is not to think that one method can answer all questions about education.”
In this regard, other countries are no further along than the United States, researchers say. They report that only Britain has begun to do the sort of randomized trials that are going on here, with the assistance of American researchers.