A First Step for Ed Reform: Getting the Metrics Right
Many programs targeting low-performing students rely on data that doesn’t reflect a real understanding of their impact on students’ lives.
I continue to believe that a high-quality education has the most potential to break the cycle of poverty and inequity in low-income families and communities. Many philanthropists and foundation and nonprofit leaders in the United States share this ambitious belief.
But is our commitment actually to mediocre outcomes for students, rather than transformative ones? That's what the data—or the lack of it—suggests. The data we use to select and fund programs to accelerate student achievement does not match the life-transforming outcomes invoked in our mission statements. Instead, we measure only the most rudimentary indicators and use these to inform our most critical strategic and funding decisions.
To determine a program’s level of success, we look to API scores, levels of proficiency in math and English (as measured by standardized tests), or high school graduation rates. In doing so, we default to what is easily measured, not what is meaningfully connected to student success in college and beyond. That is the real impact we are striving for, but the data we focus on doesn't get us there. In many schools and districts, high school graduation requirements are completely disconnected from college entrance requirements, yet we rely on the milestone of graduation as a meaningful indicator of a program’s success. Worse, many programs don't use reliable achievement data at all, relying instead on assessments created within the programs themselves or provided as part of a publisher’s curriculum—all of widely varying and often sub-standard quality.
What if we decided that we adults could do better for our students? I propose measuring nine indicators that, taken together, could provide a more holistic view of whether a program is on the right track. (Note: These indicators were developed with my colleague Laurie Olsen for the Sobrato Family Foundation’s new Education Fund, to better understand the progress of their grantees.)
- Matched, longitudinal data on student growth on measures of academic mastery and language proficiency. We need to see how individual students participating in a program are growing over time and in comparison to a matched group of peers who are not involved in the program, rather than relying on standardized testing at one moment in time during the year. The coming Common Core Standards promise to make this data even more robust, with new formative assessment systems in the works.
- Data on successful completion of major mid-point milestones relevant to age/grade on the path toward college and career readiness. Instead of waiting until the end to see whether students graduated from high school or made it into college, we need to look at indicators along the way, such as reading proficiency by third grade or completion of Algebra I in eighth grade.
- Data disaggregated by subgroups (such as race/ethnicity or English proficiency level) and by grade-range (K-3, 4-8, 9-12) so that we can understand whether a program is working for all students. Since the enactment of No Child Left Behind, this data is gathered and made available by law, but are we using it to really understand whether the most vulnerable students are benefiting from our programs?
- Data on specific measures appropriate to the target population(s). For English Language Learners, one of our most vulnerable populations, it is critical to track and measure growth on English language development tests with the ultimate goal of re-designating students as English proficient.
- Comparative school, district, and—where available—state data on demographically similar populations. The expense and difficulty of randomized controlled trials is often put forward as an excuse to do no comparison at all. But even small programs with limited resources can dig into publicly available data and make an objective comparison.
- Student voice data (especially for middle school and high school) for evidence of impact and student satisfaction. As an example, I am an advisor to YouthTruth, a national survey project that demonstrates the power and utility of getting feedback from the ultimate “customers” in the system—our students.
- Data on the increase of positive behaviors, such as consistent attendance, homework completion, and even persevering through challenging tasks.
- Data on the increase of 21st-century skills such as analytical thinking and problem solving, as well as character development in areas such as grit and resiliency.
- Data on changes in practices among families, teachers, community-based program staff, school leaders, and school systems that are aligned with evidence-based approaches for supporting the academic, language, and literacy achievement of underserved children.
- Add your favorite indicator here.
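The first indicator above—matched, longitudinal growth—can be illustrated with a small sketch. The scores and group sizes here are hypothetical and far too small for a real study; an actual analysis would match comparison students on demographics and prior achievement and use much larger samples. The point is simply the arithmetic: a program's estimated effect is its students' growth *beyond* what matched peers achieved without it.

```python
from statistics import mean

# Hypothetical (pre-test, post-test) scale scores for students in a
# program and for a demographically matched comparison group.
program_students = [(310, 355), (298, 340), (325, 360), (302, 338)]
matched_peers    = [(312, 330), (295, 318), (328, 345), (305, 322)]

def mean_growth(score_pairs):
    """Average per-student growth from pre-test to post-test."""
    return mean(post - pre for pre, post in score_pairs)

program_growth = mean_growth(program_students)  # growth of participants
matched_growth = mean_growth(matched_peers)     # growth of matched non-participants

# The estimated program effect is the difference in growth between
# participants and their matched peers, not a single point-in-time score.
effect = program_growth - matched_growth

print(f"Program growth:           {program_growth:.2f} points")
print(f"Matched-peer growth:      {matched_growth:.2f} points")
print(f"Estimated program effect: {effect:+.2f} points")
```

Note that a single end-of-year proficiency rate would hide this entirely: both groups may clear (or miss) a proficiency bar while growing at very different rates, which is exactly why the growth comparison matters.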
We could use a little courage here—and a strong dose of stamina. Education leaders and funders should start by focusing on the promising programs that we currently support. We should offer them the resources and accountability needed to strengthen and align their data to the coming Common Core Standards and other meaningful indicators that drive a more ambitious vision of student success.
These “getting ready” investments are ones that philanthropists often shy away from, preferring to fund the provision of services directly to students, teachers, and parents rather than strengthening the program models themselves. We will need to get much more comfortable with this tradeoff—between the relative safety of direct services and the ambition required to create new models—and with the greater success the latter will inevitably bring. Our kids need us to start looking toward the horizon and delivering on our intentions. This is the most effective way for education to stop the cycle of poverty and create new cycles of economic opportunity.
In upcoming posts, I will interview leaders of programs that are collecting more robust and meaningful indicators toward greater impact for students. Please let me know if you have some good examples to share.