Five Ways Funders Can Replicate What Works
A look at how funders can support effective implementation of evidence-based programs.
There’s no question that innovation is a sexier funding opportunity than implementation. Which would you choose to support: funding the invention of the light bulb or funding Inspector 48 in the light bulb factory?
And yet effective implementation is critical to the social sector’s growing evidence-based movement. As researcher Dean Fixsen has noted, if evidence-based programs (EBPs) are like a life-saving serum, then implementation is like a syringe: You need both to work to see results. A program proven to work in one place won’t produce the same results in another if it is implemented poorly.
One solution to this problem is for funders to play a stronger role in the implementation process. If funders create both a higher demand for effective implementation and a stronger set of supports, they stand a far better chance of scaling EBPs while maintaining quality. A federally funded teen pregnancy prevention program shows promise as a model for how to do this.
In September 2010, the federal government awarded $75 million in competitive, five-year grants to 75 nonprofit and public agencies, in 37 states and the District of Columbia, to implement the Teen Pregnancy Prevention program. We surveyed these grantees (and heard back from a third), and then interviewed a dozen of them as well as half a dozen technical assistance providers. We also spoke with the federal officials sponsoring the program at the Office of Adolescent Health (OAH). By the end of our research, we were ready to advocate for more government funders and philanthropists to roll up their sleeves and approach implementation support with gusto. To those brave enough to accept this challenge, we offer five pieces of advice based on our work:
1. Choose grantees willing to focus on implementation.
Evidence-based programs, unlike many other programs, require diligent fidelity to a prescribed model. This can be a difficult cultural shift for practitioners who are used to having a lot of freedom in how they interact with their clients. “Some clinicians are like artists,” says Doug Kopp, CEO of Functional Family Therapy LLC. “They don’t want you to mess with their creative process.”
OAH made it clear when it issued the pregnancy prevention grants that it would require strict fidelity monitoring. Some grantees adjusted fairly easily, but others had to almost entirely switch out their practitioner workforce. “There was a joke in the agency that I was the hatchet around here because staff were coming and going,” says Lily Rivera, the Teen Pregnancy Prevention program lead at La Alianza Hispana. “But we had to get the right staff.”
2. Choose EBPs with implementation support services built in.
It’s important to evaluate not only the quality of the evidence behind an EBP, but also the quality of support services. For some EBPs, supports aren’t readily available. Others have a crack team of technical assistance experts who have designed training, coaching, and shared data systems for all organizations implementing the program. Funders interested in underwriting implementation should prioritize scaling programs with strong supports already built in. They may also benefit from forging relationships with the developers of the EBPs they wish to scale.
3. Anchor success in a close partnership.
OAH identified a project lead at each grantee site and matched that person with a program officer. Assigning clear roles increases accountability, but it is critical that the project lead is not managed as a risk; instead, the lead should be supported as a partner. “The OAH project officers are not just monitoring grantees for compliance,” says Amy Margolis, director of the Division of Program Development and Operations at OAH. “They are helping the grantees continuously enhance their programs.”
Linda Rogers, project director with the Iredell-Statesville School District in North Carolina, one of several school district grantees, told us that the “level of support we get from OAH has been incredible.” Many of the grantees we interviewed concurred.
4. Plan for learning.
The Teen Pregnancy Prevention program provided grantees with a year to assess needs, select programs, plan, hire staff, participate in trainings, pilot the intervention, and troubleshoot problems that showed up in the pilot. OAH was therefore making a trade-off between quantity—forgoing many tens of thousands of young people who could have been reached in that year—and the quality of the interventions over the full five-year grant period.
Claire Wyneken of the Wyman National Network emphasized the importance of the piloting phase. “If you’re new to an EBP, you have to do a pilot—especially piloting actual implementation,” she said. “Get staff acclimated to the program and all the logistics related to it. Work through local considerations, partner buy-in, and any bugs in deploying the program. Just because something is an EBP, you can’t just open a box and go.”
5. Fund amply—you get what you pay for.
Leading field research indicates that organizations implementing EBPs typically need three types of funding: start-up or planning money as discussed above; infrastructure funding to provide training and coaching to frontline staff, and to measure implementation; and direct-service funding to actually administer the program as described by the developer.
Many funders inflexibly favor this last type of funding, and even then, cover only an assortment of program components rather than the full set of core components that the research shows must be present for successful implementation. As a result, nonprofits must haphazardly cobble together enough funding to deliver an evidence-based program and cannot do so in a sustainable—much less scalable—fashion. OAH alleviated the funding scramble by structuring its grants to include all three types of financial support.
For those concerned with eventually bringing down the cost of an evidence-based program, we hear you. It is very likely that EBPs will need to travel down the cost curve if they are ever to scale. But funders who believe in evidence must structure grants to experiment with lowering cost per impact while measuring at each step to ensure that impact still occurs.
For the “what works” movement to succeed, funders need to give implementation a second look. If “sexy” is in the eye of the beholder, perhaps there is nothing sexier than an evidence-based program achieving its promised impact.