Participatory Learning and Evaluation: Participation in Philanthropy Beyond Grantmaking


This post is part of an ongoing series of updates to our Participatory Philanthropy Toolkit, a comprehensive resource based on our one-time exploratory project, the Participatory Climate Initiative. Each update aims to highlight relevant challenges, offer practical solutions, and ensure that the toolkit evolves to meet the changing needs of funders working to integrate participatory practices that share decision making, build trust, and shift power.

 

Practices of participatory learning and evaluation are relatively rare in philanthropy, even among funders committed to participatory grantmaking. But they are an important part of the philanthropic process, one that offers significant opportunities for funders to shift power to the communities they serve.

As conventional approaches to learning and evaluation are critiqued as extractive, harmful, and wasteful, participatory approaches offer funders a chance to rethink and reimagine their efforts. Participatory learning and evaluation practices allow foundations conducting an evaluation to reconsider why they are doing it, and whether there are better ways to be in relationship with community. And by definition, such practices open up participation in the learning and evaluation process in a way that shifts some of the power held by external or staff evaluators to the people most impacted by the work being examined.  

 

Participatory Learning and Evaluation Planning for Participatory Philanthropy Toolkit

A tool to help funders implement participatory approaches to learning and evaluation.

Participatory evaluation “is not just a matter of using participatory techniques within a conventional monitoring and evaluation setting. It is about radically rethinking who initiates and undertakes the process, and who learns or benefits from the findings.” --Institute of Development Studies

“Participatory learning is the process of involving grantee or community members with lived expertise in foundation supported learning activities, aimed at shifting decision-making to these individuals.” --Engage R+D

 

To support the small but growing field of participatory evaluation and learning, Fund for Shared Insight has identified three core principles that should underpin the work:

  • Learning and evaluation is a two-way street that benefits everyone involved
  • People closest to the issues are the experts of their experiences
  • People closest to the issues should be able to participate in learning and evaluation

Shared Insight has also added a new section on learning and evaluation to our Participatory Philanthropy Toolkit that includes recommendations to help funders get started. And we invited funders already doing this work to share their experiences at two recent meetings of Shared Insight’s Participatory Philanthropy Committee (co-chaired by Jean Ries at Packard and Arelis Díaz at Kellogg).

John Matthew Sobrato, from Sobrato Philanthropies, shared a timeline of the foundation’s two-year journey to explore, design, and launch a participatory evaluation process. This approach to learning and measuring progress will support the foundation’s new 10-year grantmaking strategy focused on advancing economic mobility for Silicon Valley’s most excluded residents. Beginning in 2022, Sobrato Philanthropies collaborated with researchers to engage more than 120 grantees in focus groups and shape a thoughtful plan, one that would draw on lessons learned from past experiences with evaluation. In addition to influencing Sobrato’s view of what mattered and what needed to change, grantee representatives played a crucial role in selecting the evaluation firms, sharing decision-making power and receiving compensation for their participation.

Aysha Pamukcu from the Partnership for the Bay’s Future, an initiative of the San Francisco Foundation, offered four guiding questions, along with advice for each. The questions are intended to promote trust and data integrity and to embody the spirit of mutual learning:

  1. What does success look like? Funder definitions of success should align with grantees’ perspectives – or ideally, go a step further and ask grantees to define success and important milestones. This will likely surface both tangible and intangible metrics.
  2. Who is taking on the labor? Respect partners’ time and effort, such as by collecting information through conversations or already-existing reports. Consider how to relieve grantees of demands on their time – for example, by converting your conversations into written reports, if those are required by your institution.
  3. Who has the power to make meaning? Involve grantees in interpreting data to understand what they consider important lessons, trends, and stories. See if you can produce information that is useful to grantees, not just your institution or the funder field.
  4. Why are you collecting the information? Be clear about the purpose of evaluation activities. The Partnership for the Bay’s Future, for example, collects information for learning and improvement, not for decision making about grants. This gives grantees the space to be honest about challenges, which can strengthen relationships and the Partnership’s ability to be a helpful partner.

Jessica Oddy from the Global Fund for Children (GFC) talked about how the fund is shifting from a more conventional evaluation model to a trust-based learning model. Instead of focusing on compliance-driven metrics, GFC uses reflective, narrative-based reporting. They prioritize learning, she said, as they shift to acting as “knowledge brokers” who facilitate learning across their network. This includes peer learning exchanges, partner-led participatory research, and capacity-building initiatives, like convenings and retreats. 

 

The diversity of the experiences shared by Sobrato, Pamukcu, and Oddy underscores that participation is not a binary; there is a range of power, ownership, and autonomy that can be shared with communities impacted by a funder’s work. And their presentations raise useful questions for funders to ask when embarking on participatory evaluation processes, such as how to value and understand expertise, and how to define and ensure “representation” when inviting participants.

Input and feedback

Please reach out to our program manager, Katy Love, if you have questions, suggestions, or would like to discuss participatory grantmaking further.

About the authors: Winifred Olliff and Katy Love led Shared Insight’s Participatory Climate Initiative.

Katy Love
Program Manager, Fund for Shared Insight
Winifred Olliff
Consultant

Have you seen funder listening in action?

Share your stories to demonstrate what’s possible when grantmakers center the people and communities most impacted by their decisions.