
Foundation Learning: The Case for Productive Anxiety

A greater sense of urgency can help foundations better use their strategy and evaluation data to learn, unlearn, and improve.

The Value of Strategic Planning & Evaluation: In this ongoing series of essays, practitioners, consultants, and academics explore the value of strategy and evaluation, as well as the limits and downsides of these practices.

Foundations collect reams of data about how the issues we fund are changing, what other funders are doing, and how grantees are performing. We do this to hold our grantees and ourselves accountable for what we set out to do, and to surface opportunities for improvement and increased impact. Yet we dedicate little space to reflecting on and making sense of the data, or to evolving and adapting our strategies; instead, we limit these practices to perfunctory meetings and stuff them into already full workloads.

At the Packard Foundation, we rely on our program staff members to act as trusted and intelligent filters for many information sources. They listen closely to grantees and partners working in the same fields, and tap into a variety of inputs—both qualitative and quantitative, including third-party evaluations, external research, and site visits—to make well-reasoned decisions that help drive positive change. We studied how Packard staff take in, make sense of, and use all of this strategic information, and found that while our processes for effective and efficient grantmaking transactions are many and strong, our supports for evolving our strategies and using evaluative information are few.

To be clear, we have well-established practices for tracking progress and for working with third-party evaluators to analyze and judge how we’re doing. But too often the motivation behind these activities is internal reporting requirements, or a sense that we ought to engage in systematic analysis because it’s good practice, rather than a genuine learning need. As a result, the data we collect and lessons we learn fail to deliver value to the program staff on the front lines, who are making day-to-day decisions about resource allocation. We can do better.

What would it look like if we truly used our data to generate meaningful insights, and then used those insights to drive better decisions and greater impact?

No doubt, user-friendly tools for managing and sharing data and information are part of the equation (at Packard we’re working on building such a platform with Fluxx Labs). It’s also important to have access to good data. There are a number of promising efforts underway to aggregate and improve data for social good, including the Gates Foundation’s Grand Challenge for increasing the interoperability of social good data, and Foundation Center’s IssueLab, Philanthropy In/Sight and Reporting Commitment.

But even if we have fabulous tools and accessible, usable data, we won’t be truly data-informed unless we evolve our habits around learning from that data.

We do not need more internal requirements and processes. We need more urgency. The great organizational psychologist Edgar Schein argues that true organizational learning and change happen only when there is a real threat of pain. Schein says: “Anxiety inhibits learning, but anxiety is also necessary if learning is going to happen at all. But to understand this, we're going to have to speak about something managers don't like to discuss—the anxiety involved in motivating people to ‘unlearn’ what they know and learn something new.”

How might we generate productive anxiety and create spaces where funders are using their abundant data to generate meaningful insight to inform better action? Here are a few things I’ve learned through trial and error over the past couple of years as Packard’s evaluation and learning director:

  1. The issues we are tackling and the communities we’re working to benefit need to drive learning, not foundation bureaucracy. At Packard, I have found that the more we try to formalize and “require” learning, the less applicable insight and strategic value we produce. While leaders must be champions of organizational learning, we can’t be too prescriptive about when and how it happens.
  2. Our evaluations need to address important learning needs and surface timely insights if they are to serve as real inputs to strategy evolution. Too often, mandatory evaluations end up recapping what program staff already know and/or the results come too late to inform decision-making. The good news is that we’re seeing more emphasis on use-driven evaluation; for example, there are growing bodies of work around developmental evaluation and strategic learning.
  3. We simply need to carve out the space for reflection and learning. Program staff members interact with massive amounts of data on a daily basis—data that they’re formally collecting, insights generated through their interactions in the field, and publicly available data sets that we can only expect to grow. But we aren’t reserving ample time for sense-making or prioritizing it. At Packard we’re experimenting with creating more intentional spaces so that we can learn and unlearn. The tricky part is making—not mandating—space and staying connected to the pressing (and anxiety-producing) issues in our fields.

Those are just a few starting points. I’m sure there’s a lot more we can do to provoke learning, including using what we know about cognitive biases, and thinking creatively about integration with workflow. What do you think? How else might funders generate productive anxiety? And do we need it?

See more from The Value of Strategic Planning & Evaluation


COMMENTS

  • BY Mario Morino

    ON April 18, 2014 06:47 AM

    Thanks for an excellent post. In the same vein as Edgar Schein, a colleague once shared that “real change doesn’t happen, until what you are used to stops, with extreme examples including the loss of your job, divorce, death of a loved one, etc.” I’ve watched people change for the better, as they learned to deal with situations they never imagined. We also found that unless there is a constructive tension, an edge, or, as Schein says, “anxiety,” meaningful change/learning doesn’t happen. I agree that while leaders want to advance organizational learning, it can’t be “too prescriptive about when and how it happens.” Your ending about dealing with cognitive biases and integrating learning with workflow will likely be provocative to those who want structure and definition, but it’s music to my ears. I’ve long believed in natural/organic learning that occurs through observation, hands-on doing, exchange, even confrontation at the point of discovery or challenge—especially when this approach is encouraged or even when you have to push someone out of their comfort zone. Thanks again; I really appreciated your post.

  • BY Lucy Bernholz

    ON April 18, 2014 10:45 AM

    Diana

    Thanks for reminding us of the real purpose of all this “stuff” - data, tools, performance measurement, evaluation - it’s about learning. Learning is fundamentally a human endeavor (there are those who will argue it’s definitional to humanity). It takes time, interaction, reflection, purpose and serendipity. Organizations, as you suggest, need to find ways to value it without structuring it into submission.

    Lucy

  • BY Tom David

    ON April 18, 2014 02:36 PM

    Diana -
    Thanks for sharing these observations, which in my experience are right on the mark.
    I definitely agree with you and Edgar Schein that real learning is hard, and is often accompanied by genuine discomfort. I think the challenge of promoting more effective learning practice in foundations is at root a matter of changing organizational culture…which Schein would also argue is a very difficult task at best. Just as most foundations are remarkably risk-averse, they also tend to avoid the potential unpleasantness of truly candid self-analysis and dialogue. If foundations are going to establish and maintain the kind of mutual trust and respect that underpins real candor, their boards of directors need to set the tone and make it clear they are really interested in the unvarnished truth, even about a cherished project. Too often, in my experience, the perceived potential for personal or organizational embarrassment presents a formidable tacit barrier to that kind of honesty…and real learning.
    Tom

  • BY Julia Coffman

    ON April 22, 2014 09:42 AM

    Thank you for this excellent post. As a long-time external evaluator with the Packard Foundation and many other foundations, all of your points certainly resonate, particularly those around evaluation needing to provide timely insights and the importance of carving out space for reflection and learning. As you suggest, these have to be carefully crafted and not overdone. For example, I’ve built many a “learning meeting” into my evaluations, but I’ve often been frustrated by their limited utility. My error was in timing and framing them around when external evaluation data or reports come in, so the question has been: “Here are data and evaluation findings; what can we learn from them?” That has not always led to the satisfactory application of data. So I’ve been working instead on framing learning meetings around strategic questions or decisions that are at hand or coming up (seems obvious, but this isn’t typical evaluation practice). When framed in that way, any relevant data or insights related to those questions or decisions will be on the table for consideration—external evaluation data, but also monitoring data, personal experiences and observations, and other relevant information. This makes the application of data and learning much more likely. In addition, occasionally, I will invite “outsiders” to join a learning meeting for the purposes of both bringing in additional expertise and protecting the learning process from the kinds of cognitive biases that can limit perspective or cloud judgment in small groups (e.g., confirmation bias, groupthink).

  • BY Heather Peeler

    ON April 23, 2014 02:06 PM

    Great post, Diana! While we strive to build learning organizations, too often we take a checklist approach to learning that is devoid of urgency and an authentic sense of purpose. Thanks for reminding us to keep it real. “…the communities we’re working to benefit need to drive learning, not foundation bureaucracy.” Well said.

  • BY Mary Shipsey Gunn

    ON April 30, 2014 12:57 PM

    Diana—

    Thanks so much for these thoughts. They helped me frame and reframe the hard work of staying focused on a generation of change for a place-based program. In addition to learning within a foundation and within the funding community, the value of committing to a culture of curiosity that permeates organizational, sector, and even personal opinion silos shouldn’t be underestimated! Building the willingness to un-learn, re-consider, and re-source is another way to think on that important practice of learning as we go.

    With appreciation for all you do to encourage learning.

  • BY Sarah Stachowiak

    ON May 2, 2014 10:48 AM

    This is great and thoughtful. I think there is much evaluators can do, as Julia Coffman’s comments above show, to foster the right kind of conversations that can really support learning. We have also been orienting our work toward the questions and decisions at hand and the various ways information and data can support them, while keeping some humility about evaluation’s role in supporting decisions as one part of a suite of activities and kinds of information. And the evaluation field can continue helping people see how systematically collecting data at the right times, for the right kinds of questions, can meaningfully support impact, rather than assuming evaluation is primarily about accountability and retrospection.

    The organizational issues you mention, such as space and time, are important. And the cultural aspects that support learning and welcome transparency, failure, open debate, and course correction seem critical as well.
