The Center for Employment Opportunities (CEO) provides employment services and other supports to individuals who have recently returned home from incarceration. CEO believes that organizations that listen to their participants and incorporate their voices into program decisions are better able to provide accessible and effective services. In 2016, as part of broader efforts to partner with participants in this way, the organization deployed a text message survey program to hear directly from clients. This text messaging program is part of CEO’s broader Constituent Voice (CV) programming, which focuses on providing a platform for participants to be heard and effect change. 

A few years later, to learn more about the value and the limitations of the text messaging program, explore the relationship between participant feedback and outcomes, and find opportunities for improving feedback mechanisms, CEO partnered with the firm MDRC on a research grant from Fund for Shared Insight. Using data analytics, our research team found that the people who responded to the text message surveys most often gave positive feedback and were also more likely to have success in the labor market. This suggested that the text message responses might not reflect the sentiments or perspectives of the people who needed more support or had negative experiences. To find out how to capture these potentially missing voices, MDRC suggested incorporating a qualitative research component, guided by one of CEO’s Participant Advisory Councils (PAC).

PACs are groups of alumni and participants who engage in leadership and development training opportunities to ensure that CEO centers people with lived experience in strategic and programmatic decision-making. PACs are usually organized around a specific topic or project, meeting for a discrete period of time. Over two months in 2021, MDRC’s research team worked with a seven-person PAC to design staff and participant interviews meant to help CEO better understand perspectives on and experiences with the text messaging feedback program.

Here are two important lessons MDRC learned from the PAC that were then applied to the interview design process:

1. Show genuine interest in the interviewee as a person: The PAC explained that interviewees want to know who the interviewer is, rather than simply sharing about themselves in a one-way conversation.

  • What did MDRC do? We added a section at the beginning of the interview that was focused on getting to know each other — interviewer, notetaker, and participant interviewee — on a more personal level and in a low-stakes way. For example, the interviewer would say, “Before we get started, we’d appreciate getting to know more about each other, so we want to share a fun fact about ourselves. I’ll go first and then I’ll pass it over to the next person.”
  • Why does it matter? This set a more personal tone and established more of a two-way conversation, in which the interviewee hears the interviewer share information about themselves before being asked to share their own experiences.

2. Value contextual information and integrate empathy into the interview methodology: The PAC emphasized that the research questions around soliciting feedback would not resonate with many participants. Instead, it would be important to ask questions that the interviewee could more easily relate to, such as about their personal experience with CEO’s programs, before segueing into their experience with giving feedback.

  • What did MDRC do? We included several interview sections focused on gathering interviewees’ feelings about and motivations for engaging or not engaging in CEO’s broader Constituent Voice initiative, while also asking questions about their experiences with staff and how those experiences affected their willingness to provide feedback. We also integrated what’s called empathy interview methodology, which incorporates “open-ended questions to elicit stories about specific experiences that help uncover unacknowledged needs.”1 For example, instead of asking a more general question like, “What has your experience with feedback been?”, we said, “Think about a time at CEO that you were asked for your feedback. What was it about? How did you feel? Tell me more about that.”
  • Why does it matter? Through the PAC, the research team was able to understand the importance of creating human-centered interview protocols that took into consideration the experiences of interviewees with the program and their potential perceptions of and lack of experience with research interviews. This allowed us to create an interview environment that centered a relational rather than transactional dynamic. We aimed to be respectful and responsive to the person’s experiences and developed a protocol that elicited detailed and honest responses.
Here are some overall lessons the MDRC team learned about itself and its relationship to the research:
  • Engaging with a PAC in the research design process requires acknowledging the power one has as a researcher and sharing that power with partners. The researcher or research team can consider how to leverage their decision-making power to facilitate participation, shared decision-making, and mutual learning with others,2 especially with those who are the target audience for the study (program participants) and those who work with them (site staff). To create spaces for authentic and meaningful participation, we aim to identify our “power levers” as researchers and use them to foster greater participation and mutual learning. By stepping into our power as researchers, we can encourage others to step into theirs, as well.
  • We need to provide structure and information for participant advisors to engage fully. This means being explicit about our goals, objectives, and our roles. It also may mean returning to this framework throughout the project to ensure that we are all still aligned on the goals and research questions. We found that sharing agendas beforehand and providing a chance for advisors to contemplate the information and questions before the next meeting was also helpful.
  • To propose new ideas based on the feedback we received in previous meetings from the PAC, our team needed to debrief and talk through our self- and team-reflections. Taking time to do that enabled us to center participants’ experiences by explicitly highlighting the ways that their input connects to our broader mission and the deeper purpose of the work. Our ongoing conversations included discussions about our social identities, especially racial identity, and how to engage with interviewees in a way that was respectful of their agency and experiences and would mitigate power dynamics.
  • It is crucial to “close the loop” by informing advisors how their input is being used. We did this by developing a presentation at the conclusion of our project that outlined the ways we incorporated the PAC’s insights and suggestions into the interview protocol and recruitment process.

Our work with the Participant Advisory Council — and these critical takeaways from it — demonstrate the importance of engaging participants during different stages of the feedback process and understanding which platforms beyond surveys, such as text messages or group workshops, are most appropriate for the feedback being requested. In partnership with CEO and its advisors, we gained insights that can inform and improve feedback practice and deepen our understanding of the significant relationship between participant feedback and long-term outcomes.

About the authors: 

Gloriela Iguina-Colón
Research Analyst, MDRC
Iguina-Colón is a qualitative researcher and facilitator who provides strategic support to institutions seeking to improve service delivery for low-income people.
Brit Henderson
Research Associate, MDRC
Henderson is a mixed-methods researcher who works to help nonprofits and government agencies use data to better serve communities.

This post is part of a series highlighting the work of Fund for Shared Insight’s 2019 feedback research grantees exploring the relationship between feedback and participant outcomes at customer-facing, direct-service nonprofits. We believe this research is the first of its kind in the United States and builds the case for the importance of including participant voice in program planning, execution, and evaluation.

Most of the work has now concluded, and in a recent blog post about feedback research for AEA365, our research manager, Penelope Huang, summarizes the projects and their findings, providing a good introduction to this series, which will include the Boys and Girls Clubs of the Peninsula, Center for Employment Opportunities, Nurse-Family Partnership, Pace Center for Girls, and YouthTruth.