Impacting Responsibly: Insights from Ben Goodwin

A report from Salesforce, Candid, Urban Institute, and New Philanthropy Capital

Constituent Feedback – Insights from Ben Goodwin

“But what are you going to do about it?”

“What do you mean, what am I going to do about it?”

“What are you going to do about the results? What’s the point of asking these questions if Our House isn’t going to take action?”

This question, posed by a young mother of three, challenges me, but it also encourages me that we are doing something right in our work to create a feedback culture in our organization. I am the Executive Director of Our House, a nonprofit organization in Little Rock, Arkansas, with a mission to empower people, especially families with children, to overcome homelessness in a lasting and permanent way. We serve more than 300 people each day on our seven-acre campus in downtown Little Rock, taking a two-generation approach through programs that include housing (emergency shelter and transitional housing), early childhood education, out-of-school-time programming, workforce training, homelessness prevention, case management, and more. We have grown rapidly in the nine years that I have been with Our House, more than quintupling in size, adding new programs and constantly changing. We have also developed into an organization that embraces the power of data, which we use to measure our impact, identify areas for growth and improvement, and attract investment in our work. We track all kinds of data for our adults, children, and families related to employment, housing, school performance, kindergarten readiness, and much more.

Yet, for most of the past nine years, we overlooked a wealth of data that was right under our noses: the ideas, opinions, and suggestions of the people participating in our programs. We had conducted focus groups and listening sessions to get input on things like the design of our new Children’s Center, but we had never seen client feedback as something we should systematically track and use to drive ongoing improvement. Then, in 2015, one of our funders, the W.K. Kellogg Foundation, suggested we give the Listen for Good initiative a look. In Listen for Good’s goals of creating best practices, standards, and national benchmarks for collecting and using participant satisfaction data, we recognized the same seriousness of purpose that we brought to our program data collection. So we decided to give it a shot.

The Listen for Good program guided us as we set up a system of regularly surveying our program participants. Our first instinct was to survey a small cross section of our clients, but Listen for Good pointed out that, in scenarios like that, you often wind up surveying the people who are most closely connected to your program, leaving out the voices of those who feel disconnected from or disgruntled with it, and missing the opportunity to find out why. So we decided to try to survey all our clients. We knew this would be no small feat. We planned an intensive week of survey collection that we called Speak Up Week. We set up “Speak Up Spots” all across our campus to collect as many surveys as we could. We recruited and trained volunteers from the community, including some of our board members, to staff the Speak Up Spots and collect the surveys. The Listen for Good technical assistance providers had taught us about the phenomenon of “courtesy bias,” in which participants in social services programs often rate them highly or refrain from providing negative feedback because they don’t want to seem ungrateful. We thought that having the surveys administered by people other than Our House staff, people our clients weren’t accustomed to seeing, would reduce the courtesy bias effect. What was the end result of these and many other efforts to collect a robust and authentic set of feedback from our clients? In that first round, we collected 199 surveys.

This is the point in the process where “What are you going to do about it?” needs to be asked.

We had the results of 199 surveys. We had summary data for quantitative questions but also hundreds of very specific written answers to questions like, “What are we doing well?” and “What could we be doing better?” How do you respond to such an outpouring of experiences, impressions, and suggestions? Here is where I think we added our own innovation to the field. We asked for help from our Community Council. This is the group that had already been serving as a kind of “focus group” of advisers, made up of participants from each of our programs. Instead of asking them to share their own opinions, this time we asked them to read, reflect on, and summarize the results from all the Speak Up Week surveys. We supported them as much as possible in doing so, organizing an evening meeting with dinner, childcare, and an interactive process of sorting the hundreds of survey comments to see what themes emerged. The Community Council carefully chose between five and ten recommendations for each of our programs. The staff of each program was provided with the full results of the surveys, and they were required to develop an action plan to respond to the Community Council’s recommendations. Each program’s staff organized a “talk back” event to share the results of the survey, the Community Council’s recommendations, and the program’s action plan for responding to the recommendations and making improvements and changes.

By giving our clients the power to tell us exactly how we should respond to the survey, we took our own biases out of the equation, but we were also taking a risk. This was a very public, very transparent process. What if it resulted in suggestions we couldn’t follow through on? This was challenging to me as a leader.

The recommendations we received in that first round were challenging. For instance, we were told we needed to provide evening child care to enable parents to attend our evening classes in our Career Center. We were also told we needed to improve security at the front door of our Children’s Center, to make sure only parents and staff were able to enter. Many of the recommendations came with price tags attached, some of them hefty. My first reaction to some of these pricier ideas was to say no: these things weren’t on the agenda, and they weren’t in the budget. But I knew that saying “no” wasn’t an option. We had to try.

In trying, I soon realized that I needed to expand my own understanding of what’s possible. We were able to begin providing evening child care in the Career Center, thanks to some new grant funding—funding we received, in part, because we were able to show clearly, through our survey data, that this was identified as a big need by our clients. That program continues to this day, and it enables hundreds more parents to access evening support groups, mental health counseling, career training, financial coaching, and other services each year.

Regarding the security at the front door of the Children’s Center, we initially responded with a series of ineffective measures, and the issue kept coming up in subsequent surveys. We finally bit the bullet and installed a video monitoring system so that all classrooms can see who is in the lobby and verify their identity before buzzing them in. This was a very legitimate safety concern, clearly expressed by our parents who drop their kids off each day, and unallayed by our half-measures. When we finally addressed the problem in the way it needed to be addressed, the next round of survey results showed the impact. For the first time, the Community Council made no recommendations related to safety of the children’s programs.

As we’ve repeated the survey process every six months – five times over the past two and a half years, growing each time to a high-water mark of 306 surveys collected – the recommendations have only gotten more challenging. There are dozens more examples of Our House making changes, big and small, in direct response to Community Council recommendations. And there have been plenty of recommendations that we haven’t been able to implement – yet – but in each of those cases, we had to explain why. By requiring ourselves to explain our successes, our shortcomings, and our decision-making to our clients, we have created a new dynamic of accountability that, as difficult as it can sometimes be, is oriented in the right direction – between nonprofit leaders and the people they are there to serve.

This process is also challenging for our staff. It takes a lot of work simply to collect the surveys. It takes a lot of humility and maturity to read through the comments and criticism, and it takes even more work to respond to the Community Council’s recommendations and make programmatic improvements. Our senior leadership tries to do everything we can to support our team in this, and in fact, we began surveying our staff as well, to make sure they know that their voices matter too. We use the staff survey feedback to identify ways to improve internal processes and support systems to improve team member satisfaction.

Despite the challenge it presents, our staff and our entire organization have embraced this culture of feedback enthusiastically, because it is so resonant with the spirit and energy of their work with our clients. Our House’s approach is oriented around helping clients navigate the systems they interact with in our community. The workforce system, health care, schools, public agencies – these and many other systems that play such a defining role in all our lives are complex, flawed, and too often unresponsive to the needs of people living in poverty. We help our clients build the skills and confidence to advocate for themselves within these systems, to find their voice and use it to overcome obstacles and accomplish their goals. Within this framework, it only makes sense that we ourselves should strive to be a system that listens closely to our clients, that heeds their concerns, that is responsive to their needs. We want our clients to be invested in the success of Our House’s programs the same way we want them to be invested in the success of their children’s school. We want them to ask hard questions of us the same way we want them to ask hard questions of their child’s doctor. We want them to speak up if they have safety concerns at Our House the same way we want them to speak up if they have safety concerns about their own apartment. We want them to believe change is possible, that new ideas are achievable, that they are valued members of a community. That starts right here at Our House.

So back to the young mother asking, “What are you going to do about it?” She is a member of the Community Council, and she is reacting to our introduction of our newest experiment in collecting client feedback. One drawback to the every-six-months survey process is that six months is a long time to go without checking in with your clients, and yet, the Speak Up Week process is too much work for us to do more often than that. We learned about an approach to customer satisfaction measurement being used in the for-profit sector that we thought might be a good complement to the survey work we were doing. This radically simple approach was pioneered by a company called Happy or Not, and it asks one very simple question (such as “How would you rate this bathroom?” which you may have seen in an airport recently).

Respondents answer by selecting one of four faces that range from a very sad frown to a very happy smile. We were attracted to the idea of collecting real-time data on our clients’ experiences in our programs. We reached out to Happy or Not and asked if they would partner with us to test whether this kind of approach could work in the context of a nonprofit social services organization. We were thrilled when they said yes and sent us a set of their terminals to put in every program on our campus. Now, we were presenting this idea to the Community Council to get their help in figuring out how to implement it on our campus, and the first thing we heard in response was “What are you going to do about it?” I don’t yet know what we are going to do about it, how we are going to translate all of this real-time data on how our clients are feeling about our programs into actionable guidance for improving our programs, but I am heartened to hear our clients express their expectation that we will honor their voices and respond to their concerns. I am confident that we, the community of people who are invested in the success of Our House, clients and staff alike, together, can figure it out.