Episode 69, Feb 07, 2024

Show Me the Results: A Practical Guide to Program Evaluation with Chari Smith

Nonprofit organizations, and their donors, have become increasingly vigilant about measuring success and effectiveness.

This new dynamic can significantly impact the type of resources a nonprofit attracts!

In this episode, Bob DiMeo and Devon Francis speak with Chari Smith, founder of Evaluation Into Action and author of Nonprofit Program Evaluation Made Simple. They tackle the pressing issue of program evaluation and its role in demonstrating impact and making data-driven decisions.

You will discover:

  • The need to look beyond numbers and articulate tangible impact
  • How program evaluation can benefit fundraising and capital campaigns
  • Practical guidance to get started (if you’re new to program evaluation)
  • Real-life examples of organizations benefiting from program evaluation
  • And more!

Full Podcast Transcript

[00:00:00] Welcome to Nonprofit Investment Stewards with Bob DiMeo and Devon Francis from Fiducient Advisors. Bob and Devon are passionate about helping nonprofit organizations prosper. Whether you oversee endowment, foundation, or retirement plan investments, this podcast exists to help stewards improve performance, reduce costs, and discover strategies that enable your charitable organization to prosper and advance its mission.

[00:00:26] Now onto the show. Hello, and welcome back to the Nonprofit Investment Stewards podcast. I'm Bob DiMeo. Always great to be joined by co-host Devon Francis. Today, I find that nonprofit organizations, and maybe even more so their donors, are becoming vigilant about measuring success and effectiveness. Seems like we're simply in a much more show-me-the-results environment, and that can really impact the type of resources the nonprofit attracts.

[00:00:57] Needless to say, I'm excited to have an expert on the topic as [00:01:00] today's guest, but first, Devon, how are you doing? I'm great, thanks. Great to be here as always, and I'm really eager to glean some insights today from our guest, Chari Smith. So, Chari is an author, speaker, and the founder of Evaluation Into Action.

[00:01:16] Her passion is helping individuals and organizations learn to do and use program evaluation. She coaches, trains and provides hands-on evaluation services. She conducts workshops helping nonprofit leaders understand the value and use of program evaluation. She's also an adjunct faculty member at the University of Portland and Chari's book, Nonprofit Program Evaluation Made Simple, provides step-by-step instructions on how to do program evaluation.

[00:01:44] Chari shares insights for nonprofit leaders and it's wonderful to have her on the show. So, Chari, welcome and thank you for being here. Thank you for having me here. It's delightful to be here with you both today. Chari, it is terrific to have you on the program, and it's early in the year, when folks [00:02:00] think about planning and strategy and such.

[00:02:02] And so the topic I think is super timely. Maybe we can start by you telling us how you began in evaluating success and program evaluation and the like. You bet. It was 2001 when I moved up to Portland, Oregon, which is where I still am, and I landed a job at Education Northwest, and there were so many program evaluators who all used different types of approaches and methodologies.

[00:02:27] So I really learned a lot on the job. I came from a research background, so I already had all the skills in survey design and facilitating focus groups and so on. But to apply it in that kind of a setting, it really ignited a passion for me on helping people to use data to make decisions as well as to demonstrate the impact that they're making.

[00:02:46] So, Chari, let's talk a little bit about the landscape of program evaluation. What are the trends that you see in that space? The trend I see is exactly what Bob was talking about earlier, right? Show me the results. [00:03:00] There's more and more a drive towards, don't just give me the numbers, how many people participated in your program.

[00:03:06] Tell me the difference your program is making for your program participants. I think that has become a critical piece that people need to start being able to do. The challenge, of course, is getting the training or understanding how to do it, because there are so many different ways to approach it. And so the one I offer is just one of many ways to do it, but I believe it's really linear and my passion is to make it very accessible to organizations.

[00:03:31] And Chari, can you articulate why program evaluation should be important to nonprofit stakeholders? I don't know that we have enough time if your listeners wanna snuggle in for several hours and listen to me talk about the passion of why it's so important. But really, at the end of the day, if you want to make your job easier, if you want to have data that's useful to your organization, program evaluation is the key to integrate into your organization, right?

[00:03:59] [00:04:00] So, right now, oftentimes when I work with organizations, they are in a culture of compliance, right? They're doing program evaluation because funders require it, right? But if they can shift into one of a culture of learning, which I also call a culture of evaluation, that's where the change can really happen, and that's where they can see the importance of it.

[00:04:21] It's not just doing something because the funders are requiring it. It's gathering data because it will help you learn what's working and what's not. And Chari, what type of nonprofits? I'm thinking of Fiducient working with hospitals, universities, churches, social agencies, museums, and the list goes on and on. What type of nonprofits are most attuned to program evaluation and perhaps getting the most out of it?

[00:04:46] I think any nonprofit can be attuned to doing program evaluation. It's a matter of looking at what's going to work for the organization, what's going to be most realistic for them to do. Let me give you an example. I'm gonna tell you a little story. I [00:05:00] worked with In4All. And they have a STEM Connect program, right?

[00:05:04] Where what it does is connect STEM business volunteers to students. So, they go into fourth and fifth grade classrooms and they deliver a STEM Connect program that's developed by In4All. So this gives the opportunity for these students to be exposed to real world STEM applications. Because the volunteers in the classrooms are actually STEM business professionals, right?

[00:05:24] So that's that connection point. So, it's an opportunity for the students to understand why STEM is so important. So, what we had to do collaboratively is define the impact that they expected that program to make. Because of what we were talking about earlier, the show-me-the-results dynamic, right? So many organizations operate by just showing the numbers.

[00:05:45] Initially when I started working with In4All, what they were doing was reporting out the number of students participating, the number of STEM business professionals participating, the number of schools. You get, you get the idea, it's all numbers, but we want to move past the numbers. [00:06:00] I will say it's important to still track your numbers and be able to report those.

[00:06:04] But you also wanna be able to report the difference you expect your program to make. And that requires you to sit down collaboratively and define what that impact looks like. So coming back to In4All, we all sat down and looked at, well, what difference do we expect this program to have for the students?

[00:06:23] Like what is going to change for them? We don't have enough time for me to read the entire evaluation plan to you. It's not story time, but I will tell you one of the things we defined, and I'm gonna read it verbatim so I don't miss any of the words. One of the measurable outcome statements we created is that students will increase their interest in STEM-related careers.

[00:06:46] So that provided direction for what we ask in the surveys. So, in pre and post surveys, we know that we're measuring did they increase their interest in STEM-related careers? So that goes back to the, show me the results, [00:07:00] right? It's more than just the number of students participating. Now we can speak to what difference it made for them in terms of did they increase their interest in those STEM-related careers.

[00:07:12] That's helpful. And I know there are many constituents, if you will, that have an interest in the results and, and the success. Are you finding, or have you found that donors are in fact being more discerning and asking more questions? And has program evaluation, the success evaluation helped?

[00:07:32] Did that help the donors, you mean? Well, it helped the donors, certainly, on perhaps where they'd like to give, but also helped the organization in terms of being able to satisfy donor questions, and possibly even have a positive economic impact on the nonprofit. Absolutely. I think what I've heard from my clients is that doing program evaluation has offered them the opportunity to concisely [00:08:00] communicate the difference their program is making.

[00:08:01] So that way the donors know, and it's very clear, it's very linear. If you expect them to increase interest in STEM-related careers, then you have the data to show progress towards that outcome. So it's that kind of linear relationship. Here's what we expect to have change.

[00:08:18] Here's the data that shows to what degree it did change. The funder community is very interested. From my experience, I'm not gonna speak for everybody, but that's what I see funders wanting. I also see opportunities for nonprofits to share their program evaluation plan and to share their impact model or their logic model, which is essentially a one-page visual summary that shows what the program does and the difference it is expected to make.

[00:08:47] So if the funders are asking for X, Y, and Z and the nonprofit’s plan really focuses on A, B, and C, I've encouraged my clients to share the plan with the funder, and more times than not, [00:09:00] I am not gonna promise this to your listeners, but more times than not, the funder is really excited that there is an actual program evaluation plan in place where they are systematically gathering data that aligns to measurable outcomes that they can report out.

[00:09:15] So that's not as common from my experience of what I see out there. However, there is the opportunity for more and more nonprofits to put themselves in the driver's seat on what data to collect rather than the funder. And that is where the shift happens. That's very compelling.

[00:09:36] So, Chari, how can an organization get started with program evaluation? Let's say it's a new endeavor for them. What's the first step? Well, this again is another question that I could talk a very long time about, so I will try and shorten it for your listeners unless they wanna cuddle up for a few hours. But essentially, you wanna really start with that question that we already looked at before, right?

[00:09:58] What difference is your [00:10:00] program trying to make for your program participants? That is starting to build curiosity, or evaluative thinking, you know, a culture of inquiry. What difference are you expecting your program to make? If you start with that, as opposed to looking at what our funders are requiring us to do, that's a very different mindset than asking what data we need to learn what's working

[00:10:22] and what's not working, right? So I like to think of this as a culture of compliance versus a culture of learning. A culture of compliance is where most organizations already sit, because they don't know there's an opportunity to do it differently. They don't necessarily know they can gather data to make their jobs easier.

[00:10:41] So a place to start is to really look at your organizational culture. Because if it's one of compliance, there's an unwillingness to do any kind of program evaluation work, right? There's this mindset of we're too busy, or it takes too much time, or it costs too much money. A culture of compliance is continuing [00:11:00] to gather data that funders require, then wiping your hands of it, and you're done.

[00:11:03] But if you shift it to a culture of learning, and that's where you wanna start, really looking at that culture and how you can shift it, then organizations start to learn and wonder, what kinds of data do I need to make program planning decisions? Let me give you another example going back to In4All.

[00:11:23] Okay. I worked with them, I was trying to count it up, at least a decade ago. Which, I don't know how that's possible since I'm 25, but I'm just kidding. But it was a long time ago that I worked with them, and in October, I had the honor of going to their annual meeting, and they now have a role called Impact Manager.

[00:11:41] I mean, what does that say right there to your funders? Right? You have a role, someone on your team who is the impact manager, which is fantastic, and they presented this report that outlined what was going well based on the student and staff survey results, where improvements were needed, [00:12:00] and what they were going to change in their program as a result of the data.

[00:12:05] I have to tell you, I could not take the grin off my face that entire time that report was being presented. I hadn't met the impact manager yet, and I rushed up to her afterwards. I was like, oh my goodness, that was amazing. Can I share out your report with everybody that I know? Because honestly, they are a model example of what an evaluation and learning culture looks like.

[00:12:27] And I just wanna give you a quick example from that report. It's not like mind-bending, lofty things that need to be changed, right? In this case, one of the things they found that both staff and students reported was a lack of materials. So, they had a systematic record of the number of people telling them they simply didn't have enough materials. That gave them something to point to when they apply for funding. Hey funders, the data is showing us that we don't have enough materials.

[00:12:56] We need to increase our budget around materials, and then they'll have more materials [00:13:00] for the next round that the program runs. So, it might seem really trite, that kind of thing, but honestly, rather than it being a couple of staff members who say we ran outta materials today and not really tracking or understanding how many classrooms this truly happened in.

[00:13:15] When you're doing program evaluation, that systematic process of gathering the data lets you know where the gaps are and where the successes are. And in this case, they were like, that's an easy gap to fill. Great. We'll just provide more materials. Yeah, that's super helpful, and everyone is looking to improve outcomes and have more effectiveness, frankly, both on the nonprofit and for-profit side.

It's interesting you say culture of compliance, because that means something a little bit different in our investment world. We're always talking about, let's have a culture of compliance.

[00:13:51] As I hear you talk, the words that came to mind for a nonprofit or a mission-based organization, I jotted down [00:14:00] embracing great outcomes, right? Let's understand what they are, let's track them, and let's make good decisions going forward with the input. And frankly, it leads to my next question: when you think about the various nonprofit stakeholders, can you talk a little bit more about the use of data in program planning?

[00:14:21] You bet. So, in program planning, if you have data to make decisions, it will make your job easier. I'm gonna say that again. If you have data for program planning, it will make your job easier. The reason I wanted to repeat that is that sometimes when I work with organizations, there's some resistance to putting new processes in place, right?

[00:14:43] Because initially, it will take more time. Because it's a new process, it will take more time. But when you have the data that you need to make decisions, it should ultimately save you time and make your job easier. Let me give you an example. You're gonna hear a little paper rustle, listeners, [00:15:00] because I wanna put something in front of me so I get everything right.

[00:15:02] I worked with Northwest Housing Alternatives, and they have a resident services program. Okay? So, this means one time a week a resident services coordinator would go into whatever the apartment complex was and offer resident services activities to help those residents in whatever way they needed. So, in preparing to go in, again, they just go in once a week.

[00:15:26] They're not there every day. Once a week, preparing to go in, they would get some ideas out of the 40-plus activities they offer, from social events to budgeting workshops to everything in between. They would go in, and sometimes there would be a crisis or something reactive, or a property manager needing help with a resident that's on the verge of being evicted, you know, and the list goes on.

[00:15:49] It was stressful. They seemed stressed when I first met with them. And I said, well, what is your ultimate goal? Like, why are you doing this at all? And they [00:16:00] were very much unanimous: it is to promote housing stability. Okay, let's break that down. If you wanna promote housing stability.

[00:16:09] How are all of these 40-plus activities doing that? Because everybody's different. All of those residents are different. So, we realized what would be helpful in program planning is to have a housing stability rating for each of those residents, updated on a regular basis. Without going into a ton of detail about how we developed this, suffice it to say, we created a housing stability scale, where a one means the resident is at a low risk of losing housing and a five means they're at a high risk of losing housing.

[00:16:43] And you know, you can imagine one to five, right? So, each resident coordinator would rate all of the residents in the building and then they would continue to update it over time. And guess what happened? They would start to print out that report before they even went in the building. Devon's smiling.

[00:16:57] She knows where this is going. Right? So, [00:17:00] uh, Bob, you're joining right in. That's great. So what we're doing is we're developing this plan so they aren't stressed anymore. They can see, okay, most of the people in this building are somewhere between a one and a three. So, I'm gonna do some social events. I'm gonna do a tax assistance workshop.

[00:17:16] I think that's what will work best for this group. So, they walk into the building already knowing who's there, because, you know, they each were managing at least five different buildings. So that's a lot of people, and it's a lot to hold in your head and keep track of. But by having this housing stability rating, they were able to look at the data, see where everybody stood, whether they were at a low risk or a high risk of losing housing, and make decisions on what activities they were going to do before they even walked in the building.

[00:17:36] And I had high hopes for this process. And sure enough, a couple months later, their shoulders were not up to their ears anymore.

[00:17:58] They were excited about the data. And [00:18:00] that's where the big shift happened in that group, really becoming a culture of learning. They're like, great, what other kinds of data do we need to gather to make our jobs easier? When they saw how the data could help them in their jobs and make their jobs easier,

[00:18:17] and the data were useful to them, that's where the shift happened. And again, I don't work with them anymore. That's the measure of my success. If you still need me, I'm not doing my job. So, Chari, you shared with us some really compelling successes. Can you speak to any potential pitfalls that you see within the program evaluation space?

[00:18:37] You bet. So one of the things I see when I start working with an organization, when they've tried to do program evaluation on their own is not leading with what's realistic to do. Right? So, they get very excited. They wanna do a survey, so they do a survey, they send the survey out, they get 500 responses, which is great.

[00:18:58] And then they realize they [00:19:00] don't have anybody on staff available to analyze the data and report out the results. So now participants have taken the time to provide these responses, but they go nowhere. So a common pitfall is committing to collecting more data than it's realistic to gather. And I really think it's important for nonprofit organizations to lead with what is realistic, which is why I'm such a big advocate of doing this process collaboratively.

[00:19:26] You really have to have everybody in the room who gathers the data, analyzes the data, or does the reports or anything else with the data, and make sure they are helping to make the decisions on what data will be collected, and that it's really clear who's going to do all of those follow-up steps. Because that is probably the most common story that I get from people.

[00:19:48] And they hang their heads in shame and they're like, oh my goodness, we have all these surveys, we don't know what to do with them. That's super helpful. I think of that type of [00:20:00] scenario as perhaps being better off not even asking if you're not going to act on the data. Right, right. Folks put time and effort in,

[00:20:08] and then nothing comes back. So let's circle back to something you mentioned or we talked a little bit about earlier, which I'm putting in the category of fundraising, capital campaigns, and so on. Yeah. And I just think even personally, as a donor, my wife and I will be more discerning.

[00:20:25] So just curious, is there anything else you'd like to add on program evaluation and its tie into either capital campaigns or fundraising, any of that? You bet. So I have a wonderful quote to share with you, but let me give you a little backstory first about this person. When I was researching my book, Nonprofit Program Evaluation Made Simple, four organizations generously let me include their behind-the-scenes work in the book.

[00:20:58] For each one, I wanted to sit down with [00:21:00] them and learn a little bit more about what they're doing with program evaluation, how it looks today. And one of these organizations is Path Home. They were formerly called Portland Homeless Family Solutions, and I worked with them in 2013. I was working on the book in 2019, and I sat down with the executive director, Brandi Tuck, and asked her a lot of different questions.

[00:21:22] But I literally almost fell out of my chair when she said one of the biggest changes they've had is how their fundraising capacity increased because they could clearly communicate the program impact. Here's some more rustling of paper, listeners, because I wanna read this to you. Okay, here's what she said.

[00:21:40] Because of having this program evaluation system in place, our annual fundraising from foundations increased by 677%. And our annual fundraising from individual donors increased by 753%. [00:22:00] Because they had that linear piece, right? They could say, here's what we expect to have change. Here are our data to show how it changed, and that is compelling.

[00:22:09] I wanna go on the record saying, program evaluation is not a fundraising activity, marketing or PR, or anything else. The core reason to do program evaluation is to learn, but what I've heard from my clients is how it has the capacity to support their development efforts. But if you lead with it, if I get a call and someone says, I need data to show what a great organization we are, I'm like, then you should call a marketing person.

[00:22:34] If you want to learn what's working and what's not, then that's program evaluation. It's really important to define the difference between the two. This whole time that you've been speaking, Chari, first of all, you clearly have such passion for the field, which is wonderful to see and hear.

[00:22:51] It keeps bringing back to mind something a guest said on a webinar several years ago, where we interviewed Roy Swan from [00:23:00] the Ford Foundation. And I remember very clearly him saying, what you measure matters. And it seems like most of your comments come back to that, and it's not only gathering the data, but then using the data to learn, to make adjustments, to be able to articulate your value to participants and to potential donors.

[00:23:21] So, really compelling, and thank you for sharing all of this with us. Happy to. We've covered a lot, but is there anything else that you'd like to add or perhaps underscore for listeners? Yes. Two things. First is to be realistic in what you can collect. You have to lead with being realistic.

[00:23:40] You need to include the people on the front lines. If you are the boss, you need to give those people permission to say that's not feasible to collect. It's really important, right? Because that's how you build ownership and buy-in to the whole process. You wanna make sure everyone is on that program evaluation train before it leaves the station.

[00:23:58] And that means building buy-in [00:24:00] which leads to building ownership, which leads to it happening. You know, so that's, that's the big thing. The second thing is you can start right now. So if you are gathering just the numbers, which you should be, which is great, you can start having data learning communities on a regular basis where you look at the data that you have and you have just an open dialogue about what the data means.

[00:24:20] How can you use the data to inform program planning? It could likely lead to discussions around what data you need to collect that isn't currently being collected. Maybe you all sit down, use your magic paintbrush, and create a report you wish existed. I think reverse engineering what you need to collect is a really good way to arrive at what you expect to have change. How many hours do we have? Can I talk a little longer?

[00:24:49] This is super helpful, and we very much appreciate you sharing so much. And we wondered, as we wrap things up, if we might learn a little bit [00:25:00] about Chari, the individual. So, when you're not passionate about evaluating programs, sort of outside of the professional world, what do you enjoy doing on a weekend or a day away from the office?

[00:25:14] Well, I love to write stories, songs, plays, musicals. I'm a piano player. I'm holding up the book here. The last thing I did was this children's picture book called The Piano, and it tells the story of a girl and her piano and how their friendship evolves over time, and it's told from the perspective of the piano.

[00:25:32] So that's one of my latest creative projects. How fun. That's wonderful. Thanks. So, Chari, we are so appreciative of your time and your expertise today. If folks want to learn more about the work that you do, how can they do that? How might they be able to access resources? Sure. There are two ways.

[00:25:53] The first is to go to my website, evaluationintoaction.com. There's a resources page [00:26:00] there that has lots of free things that you can download, take a look at, and learn from. There's also my book, Nonprofit Program Evaluation Made Simple. And I wanna point out that it also comes with a companion website that I update on a periodic basis.

[00:26:14] So it's loaded with evaluation plans, a survey template, and more. You can access that web address and passcode in the book, and you can get the book wherever you like to buy your books. Chari, this has been absolutely wonderful. Thanks so much for joining us. You bet. Thanks for having me. And to our listeners, this might be a volatile year in the making.

[00:26:36] Who knows? We have so many resources on our website, fiducient.com. We've got our three themes for 2024, all the past episodes of this podcast, and a lot more. So again, that's fiducient.com. And to all you good stewards, thanks for investing time to help your nonprofits prosper. We'll connect with you soon on the next episode. [00:27:00]

[00:27:00] Thank you for listening to the Nonprofit Investment Stewards podcast. Click the subscribe button below to be notified of new episodes and visit fiducientadvisors.com for more information. The information covered and posted represents the views and opinions of the guest and does not necessarily represent the views or opinions of Fiducient Advisors.

[00:27:19] Content is made available for informational and educational purposes only and does not represent a specific recommendation. Always seek the advice of qualified professionals familiar with your unique circumstances.
