Industry Outlook: Nonprofits
Businesses have a very easy way to measure their outcomes: profit and loss. But beyond those metrics, many of the benefits nonprofits offer are social in nature. How can we measure the impact of our work?
BALKEN: Equality Utah is a policy organization, and that’s supported by our education and outreach as well as our work with elected officials. Initially, we were really looking at our measurements as being statewide policy—and I don’t think it would surprise anyone around this table that it’s not an easy thing to achieve when you’re working on behalf of gay and transgender people in this state.
What we found was it was really important for us to begin to look at all the areas where we could have policy impacts. We had an opportunity to work with businesses, talking to them about their internal nondiscrimination policies, which is very measurable.
Another metric that we’ve begun to implement is measuring the increase in our membership base. In each community that we go into, we measure how many new members we’ve picked up and how many low-level donors we’ve picked up. These are the $5 to $10 gifts a month. That’s been a way for us to demonstrate to our donors that not only is the message getting to the population, but it’s helping to build the internal capacity and strengthen the organization toward our long-term goals.
WASHBURN: It’s tricky measuring impact. We do a lot of online surveying regarding guest experiences, and we’re getting better at it. There are other institutions that have 12 or 15 people doing that; we just hired our first one. But we’ve been doing it for a year, so we have a benchmark, and now we’re building on that by deepening the relationship and deepening the experience.
One of the ways we measure it is in our membership program: year over year, we’re up 22 percent in our membership. We do specific surveys with members as well, trying to gauge how long they’ve been members and whether they renew. Then we’ll survey those who don’t renew to understand why. So we’re getting a lot more targeted.
WUNDERLI: We feel very strongly about showing return on investment not only to our investors but also to our participants in the program, so we’ve done a lot of collaborations in the community, working with the Social Research Institute from the University of Utah. We work with low-income people to increase their financial capability, and with the economy the way it is, a lot of middle-class people have fallen into poverty who have never been there before. So we’re working with the Department of Workforce Services to really help people with practical financial skills.
We have other concrete measurements. In our program, when people buy homes or start businesses or go back to school, we measure all of that. What was your job when you came into the program? What were you earning? What is your earning capacity when you complete the program? Were you in authority housing when you came into the program? What kind of a home did you buy? What kind of an interest rate did you get? What kind of mortgage payment do you have?
Everything we do, we want to make sure is measurable not just for our own internal controls, but also to the stakeholders in the community and the people who fund us and the people we serve.
TRENBEATH-MURRAY: We pinpointed three things we’re not very good at, and we have been working on them for about a year. One is correlating the data so we can show an outcome for, say, children’s academic gains or for parents’ self-sufficiency. I can show that, but what I struggle with is having the research-based knowledge within my shop to make really strong correlations between the two, so that when children’s attendance is at a certain level, it predicts what their scores will be, those kinds of things. I don’t have the knowledge base within my organization to take it to the level of research I need in order to make a strong case.
Second is the presentation piece. There’s a real science to how it’s presented to policy makers, to donors. What does it look like visually on the paper? When do you use a bar graph versus a pie chart? How do you show that correlation in a quick and strong, powerful way that has a marketing message to it? We’re not good at that, so we’re trying to find somebody to help us with that.
Third, we’re not very good at demonstrating impact: turning it into dollars saved or showing how it contributes to and helps our state. I can tell you that 92 percent of our children left Head Start academically ready for kindergarten last year, but I don’t have it refined to the point that I can tell you how much money that saved the state in long-term benefits.