Education Guide / Evidence
One of the best pieces of data for judging whether a particular program is working is also the most hazardous: test scores. At the simplest level, standardized tests offer a consistent way to compare the academic performance of schools and districts.
However, the reliability and quality of these tests are often questioned. Does a snapshot of student performance from a single day truly capture how well all students know the material, or does it only reflect that some are better test takers than others? Are test results affected by the extent to which different teachers prepare their students for the test using test-taking strategies and practice exams? And just as the students who take the test change each year, the tests themselves also often change, which can make long-term comparisons impossible. For instance, in the spring of 2015, more than 40 states administered brand-new exams aligned to the Common Core State Standards, whose results will not be comparable to those of past state tests.
The solution isn’t to ignore test scores, but to examine additional proof points: test scores plus, for example, attendance rates, parent survey results, a research study, interviews and observations.
“In education, you don’t always have the evidence that you like, and over-relying on some pieces of evidence can be problematic. In my story about parent involvement in Chicago’s Logan Square neighborhood, a lot of the indicators are qualitative. They didn’t have what you’d ideally want – data that attributes student performance increases to parents in the classroom. Test scores are going up, but we can’t claim a direct link. But in my view, the problem that the program is trying to solve is a lack of parent involvement in schools – and on that count, they’ve trained 1,800 parents in 20 years. The perspectives of teachers and principals I spoke with backed that up: the school had surveyed those people to see if the program was valuable, and the results were consistent.”