A new report from the Urban Institute offers an in-depth look at the variation in student achievement across the United States and at how much changes in state demographics affect performance on the NAEP.
The report, “Breaking the Curve: Promises and Pitfalls of Using NAEP Data to Assess the State Role in Student Achievement,” discusses the role states play in education policymaking, as state governments take on a larger part than local school districts. Current trends in Congress suggest that even more autonomy will be offered to state governments in the near future.
The National Assessment of Educational Progress (NAEP), administered every other year, measures student achievement state by state, and politicians often cite high or improving scores as support for their policies. However, researchers continue to stress that the scores only rarely provide credible evidence that a particular policy is working.
For the report, author Matthew Chingos compared the average NAEP performance of students in each state with that of similar students in other states. He also examined changes in each state’s performance over the past decade relative to what could be attributed to shifts in demographics.
According to the report, similar students vary greatly in their test performance depending on which state they live in. However, states that “break the curve,” or yield higher scores than their demographic peers, are not necessarily the states with the highest overall scores. In other words, comparisons of NAEP scores across states must take student populations into account.
The report found that between 2003 and 2013, NAEP scores increased in all 50 states by more than would be expected from demographic shifts alone. Chingos therefore argues that changes in student demographics should be considered when using trends over time to evaluate a state’s performance on the test.
Chingos said that student performance varies not only between states but also among students, classrooms, schools and districts within each state.
“In the words of education researcher Tom Loveless, ‘[A]nyone who follows NAEP scores knows that the difference between Massachusetts and Mississippi is quite large. What is often overlooked is that every state has a mini-Massachusetts and Mississippi contrast within its own borders.’”
The report concludes that NAEP scores have consistently been misused as evidence either that the country’s education system is in need of repair or that certain policies are working. The data, however, are limited and cannot account for every factor that affects student achievement.