Education

A First Step for Ed Reform: Getting the Metrics Right

Many programs targeting low-performing students rely on data that doesn’t indicate a real understanding of impact on students’ lives.

I continue to believe that a high-quality education has the greatest potential to break the cycle of poverty and inequity in low-income families and communities. Many philanthropists and foundation and nonprofit leaders in the United States share this ambitious belief.

But is our commitment to mediocre outcomes for students, rather than transformative ones? That's what the data—or the lack of it—suggests. The data we use to select and fund programs to accelerate student achievement does not match the life-transforming outcomes invoked in our mission statements. Instead, we measure only the most rudimentary indicators and use these to inform our most critical strategic and funding decisions.

To determine a program’s level of success, we look to API scores, levels of proficiency in math and English (as measured by standardized tests), or high school graduation rates. In doing so, we default to what is easily measured, not what is meaningfully connected to student success in college and beyond. That is the real impact we are striving for, but the data we focus on doesn't get us there. In many schools and districts, high school graduation requirements are completely disconnected from college entrance requirements, yet we rely on the milestone of graduation as a meaningful indicator of a program’s success. Many programs don't use reliable achievement data at all, relying instead on assessments created within the programs themselves or provided as part of a publisher’s curriculum—all of widely varying and often substandard quality.

What if we decided that we adults could do better for our students? I propose measuring nine indicators that, taken together, could provide a more holistic view of whether a program is on the right track. (Note: These indicators were developed with my colleague Laurie Olsen for the Sobrato Family Foundation’s new Education Fund, to better understand the progress of their grantees.)

  1. Matched, longitudinal data on student growth on measures of academic mastery and language proficiency. We need to see how individual students participating in a program are growing over time and in comparison to a matched group of peers who are not involved in the program, rather than relying on standardized testing at one moment in time during the year. The coming Common Core Standards promise to make this data even more robust, with new formative assessment systems in the works.
  2. Data on successful completion of major mid-point milestones relevant to age/grade on the path toward college and career readiness. Instead of waiting to the end to see if students graduated high school or made it into college, we need to look at indicators along the way, such as reading proficiency by third grade or completing algebra 1 in eighth grade.
  3. Data disaggregated by subgroups (such as race/ethnicity or English proficiency level) and by grade-range (K-3, 4-8, 9-12) so that we can understand if a program is working for all students. Since the enactment of No Child Left Behind, this data is gathered and made available by law, but are we using it to really understand whether the most vulnerable students are benefiting from our programs?
  4. Data on specific measures appropriate to the target population(s). For English Language Learners, one of our most vulnerable populations, it is critical to track and measure growth on English language development tests with the ultimate goal of re-designating students as English proficient.
  5. Comparative school, district, and—where available—state data on demographically similar populations. The expense and difficulty of randomized controlled trials is often put forward as an excuse to do no comparison at all. But even small programs with limited resources can dig into publicly available data and make an objective comparison.
  6. Student voice data (especially for middle school and high school) for evidence of impact and student satisfaction. As an example, I am an advisor to YouthTruth, a national survey project that demonstrates the power and utility of getting feedback from the ultimate “customers” in the system—our students.
  7. Data on the increase of positive behaviors, such as consistent attendance, homework completion, and even persevering through challenging tasks.
  8. Data on the increase of 21st-century skills such as analytical thinking and problem solving, as well as character development in areas such as grit and resiliency.
  9. Data on changes in practices among families, teachers, community-based program staff, school leaders, and school systems that are aligned with evidence-based approaches for supporting the academic, language, and literacy achievement of underserved children.
  10. Add your favorite indicator here.

We could use a little courage here—and a strong dose of stamina. Education leaders and funders should start by focusing on the promising programs that we currently support. We should offer them the resources and accountability needed to strengthen and align their data to the coming Common Core Standards and other meaningful indicators that drive a more ambitious vision of student success.

These “getting ready” investments are ones that philanthropists often shy away from, preferring to fund the provision of services directly to students, teachers, and parents instead of strengthening program models altogether. We will need to get much more comfortable with this tradeoff—between the relative safety of direct services and the ambition required to create new models—and with the greater success the latter will inevitably bring. Our kids need us to start looking toward the horizon and delivering on our intentions. This is the most effective way for education to stop the cycle of poverty and create new cycles of economic opportunity.

In upcoming posts, I will interview leaders of programs that are collecting more robust and meaningful indicators toward greater impact for students. Please let me know if you have some good examples to share.


COMMENTS

  • BY Ben Jakovljevic

    ON February 5, 2013 05:59 PM

    What factors into current programs not getting the data right or pursuing only some data? The indicators you suggest would certainly be helpful, but I wonder whether the task of gathering the data would be difficult for some of these programs. 

    That leads to the question: are there certain indicators that you think are more crucial than the others? As far as measurement goes, should we be starting totally new organizations that get the metrics right from the start, or some sort of higher accountability group that would track and measure programs and “certify” them when they achieve a deeper set of metrics?

    Great article!

  • BY Alexa Cortes Culwell

    ON February 5, 2013 08:03 PM

    The biggest obstacle facing organizations that deliver programs into the K-12 public ed system is funding to support ongoing research and development. Consider how different this dynamic is from the private sector. I live in Silicon Valley, also known for its robust biotech sector. A friend who is a genetic scientist has been inventing cancer drugs for 20 years that are just now getting approved by the FDA and becoming profitable. Investors stuck with them because the payoffs are big. But the nonprofit world is different. Funders stop investing in the research once the slightest evidence emerges, and then they focus on expanding the programs to more students (with no external review agency involved, such as the FDA). So we have a lot of promising programs out there that don’t have a robust enough evidence base. Starting new programs can be very risky; they take forever to scale, and they often fail. Taking promising programs funders have already identified as having strong leadership, promising metrics, and scale could be a lot cheaper and deliver results faster for students. That said, there is always space for funding start-ups in this space—it’s just not where I would begin.

    Finally, among the nine indicators, a program should prioritize those most applicable and important to it, depending on what the program is intended to do. Then it should pursue them relentlessly and convince funders to help pay for it. I promise to unearth some good examples for future posts. Stay tuned, and thanks for your comment.

  • BY Jared Stancombe

    ON February 6, 2013 11:41 AM

    Personally, I believe the first thing in reforming anything is clearly stating what the problem is. When it comes to education, the United States is falling behind, and fast. 

    We are clinging to a 300-year-old method of teaching children. We still put one teacher in front of a classroom, expect children to sit for 45 minutes paying constant attention to what one person is doing, and enforce disciplinary measures when they act out. They are expected to completely disconnect from everything, learn something that has no relation to their lives, and somehow succeed.

    Then, they go home, turn on the TV, get their laptop, and text their friends all at the same time. Schools are no longer the primary centers for learning for children, and they are being asked to focus in the classroom, when they live their lives completely connected to everyone they know at all times while having access to the summation of human knowledge on a 12” screen they can carry around in their backpack or on the 3” screens on their cell phones.

    When our schools start looking more like our malls and less like our prisons, and when teachers are respected as much as doctors and lawyers instead of consistently scoring in the bottom 25% of their college classes, maybe we can actually see “reform” being made.

    We can discuss monitoring and evaluation processes to encourage student success, but I really believe that is a smaller part of a larger problem in the U.S. education system: the very model we base our education system on is completely outdated and no longer prepares students for the challenges they will face in their future lives.

  • BY Alexa Cortes Culwell

    ON February 6, 2013 02:56 PM

    Jared, I couldn’t agree with you more in terms of how outmoded the system is. And I believe that metrics can be a powerful driver for transformation of public education, as what we decide to measure can radically impact what we teach and how we teach it. There is a lot of rhetoric right now about needing a new kind of system—yet most organizations I look at are still relying on the traditional metrics. If the system has to measure and be held accountable for the skills and characteristics most needed in this economy (e.g., teamwork, problem solving, resiliency), this could go a long way toward creating change in what gets taught and how it is taught. Thanks for responding.

  • BY Tricia Powe

    ON February 7, 2013 04:05 PM

    That was a nice article; however, instead of placing my favorite indicator at #10, I would place it at the top, since the rest depend upon its value and related strength. That said, considering my work and the strain I see in families daily, my focal point and indicator of choice is the social capital of family structure. Our current research survey is in its fourth year. We will report after seven years or 10,000 responses. Family-based social capital inherently, but not necessarily, leads to more opportunities for higher learning because resiliency is stronger and teachers have more time to wear their teaching hats, and this translates into the economic mobility we would like to see.

  • BY Alexa Cortes Culwell

    ON February 7, 2013 05:19 PM

    Tricia, I would love to know more about your research. Can we connect?

  • BY Derek Mitchell

    ON February 8, 2013 11:49 AM

    Alexa,
    I appreciate these indicators and have seen most of them pop up on preferred lists from time to time in these conversations. I don’t agree, though, that anything about the API is ‘easily measurable.’ Sure, it is a gross measure, but that is because it incorporates (far too much to my mind) many, many, many elements that are valuable individually, but nearly meaningless when whipped up in a smoothie along with lots of other elements. This is one of many examples of the resulting sum being less than the total of the individual parts.

    I worry also, though, that the major focus in your indicators is on outputs to the near exclusion of inputs, when the research is very clear that inputs are what matter most when talking about transformational outcomes for underserved kids of color. When an elementary school in Palo Alto spends 40% more per child than does one in nearby Ravenswood, and does so for decades, it is no surprise that the performance of kids for the most part matches that pattern. It is irrational to expect teachers and schools in Ravenswood to do more with less.

    The indicator on your list that I appreciate most is number 9. What a wonderful world we could have if we had a valid, reliable, and credible way of looking at adult practices across contexts and measuring improvements therein! Wow. With such a set of measures, the ability to improve all aspects of the enterprise becomes much more possible.

    Attainment data (particularly along the way to ultimate success) is a close second, though to truly accomplish that we would need individual student longitudinal data linked to a state ID. The longitudinal data conversation in California has been bogged down by politics (see the latest CA state student ID discussions).
    Thanks for a fun and interesting read.

    Derek

  • BY Alexa Cortes Culwell

    ON February 11, 2013 06:48 AM

    Thanks for weighing in, Derek. I think your comment about better funding the inputs is right on, so long as we establish robust measures to better understand why and how funding an input (a program practice or approach) produces the transformative outcome we seek for students.
