By Tara Leonard
WATSONVILLE, CA (September, 2011) - As the staff and volunteers at Second Harvest Food Bank work to combine food distribution with community-based nutrition education, the obvious questions arise: Do these peer education programs actually make a difference? Do participants change their eating habits for the better? And do these behavioral changes create measurable differences in participants' health? While anecdotal evidence points towards a positive impact, hard data is not yet available. But with scarce social service resources increasingly allocated to evidence-based practices, data collection is becoming a bigger focus at Second Harvest.
Second Harvest plans to "institute some more discipline around program evaluation" in the coming months and years, said Food Bank CEO and Executive Director Willy Elliot-McCrea. However, the collection of personal health data can create a disincentive for participants, especially those who are already wary of the system.
"We don't want evaluation efforts to get in the way of program effectiveness," Elliot-McCrea explained. "But we need to measure program impact in order to attract resources. So what potential assets can we leverage to do that? Trusted neighbors? Trusted churches? It's a big issue and we're working carefully to get it right."
LACK OF FUNDING, STAFFING MAKES EVALUATION DIFFICULT
There are a number of challenges involved, according to Second Harvest's Chief Operations and Programs Officer Brooke Johnson.
"There are so many variables," she said. "If we're giving someone fruits and vegetables twice a month, that means they're still eating a lot of meals that don't contain those foods. You hope that they're making healthier choices as a result of participation in our programs, but that's not always possible within the economic constraints of their household. Is it reasonable to expect to see a measurable difference in one's BMI (body mass index)? They're still facing significant barriers in terms of finances, food access, and other issues."
Many of the evaluation tools that other social service agencies use aren't appropriate to the literacy level of people coming to Second Harvest classes, according to Johnson. As a result, it's difficult to get people to complete the assessment or it takes a lot of staff time to walk people through it.
That leads to questions of cost containment. In 2010 Second Harvest spent only 5% of its annual operating budget on administration and fundraising efforts; 92% went directly to food purchase, storage, and distribution, with another 3% going toward education and outreach. That stellar financial stewardship earned Second Harvest a four-star rating from Charity Navigator, a firm that evaluates America's largest nonprofits.
If every nutrition education program requires a slightly different evaluation tool, development and implementation costs can add up.
"We try to go where the people are and make our nutrition education accessible to them," Johnson said. "We have programs like Food for Children where the nutrition education is more on-the-go. We have drop-in, sit-down programs like the Passion for Produce sites and then we have more long-term, involved programs like Dominican."
In 2004 Second Harvest began a collaborative nutrition education effort with the pediatric clinic at Dominican Rehabilitation Hospital. Twice a year, 30 families sign a contract committing to attend nutrition classes once a week for six months in exchange for fresh produce. As a part of the program, they get a free visit with a nutritionist and their basic health statistics are taken. This program became the basis for more recent nutrition outreach programs, including Passion for Produce and Nutrition Ambassadors. Moving forward, Second Harvest hopes to consolidate the Dominican data into an easily accessible database.
PROCESS EVALUATION VERSUS IMPACT EVALUATION
But is it really necessary to collect weight, blood pressure, and other specific health data to assess program effectiveness? Or is it enough to show that behavior changes have taken place? And in the end, how much money and staff time should be spent on evaluation efforts?
"It's a common challenge that all service-based organizations are facing," explained Leslie Goodfriend, MPH, Senior Health Services Manager for Santa Cruz County Health Services Agency. "As much as they understand the importance of data, when looking at how to spend limited money, community-based organizations want to focus on getting critical services to people, not go through an expensive evaluation process. Funders generally don't provide enough funding to do true impact evaluation, so programs focus on less-expensive process evaluation – such as the amount of food distributed, or how many people have been reached."
Second Harvest makes such data readily available in their annual report. In addition, they have created a simple evaluation tool for education outreach participants, designed to measure self-reported behavior change rather than pre- and post-program health measurements. It asks participants questions such as "Since you began the Passion for Produce program have you increased the amount of fruits and vegetables you eat? Adopted healthier cooking habits, for instance cooking with less oil? Started reading nutrition labels at the grocery store? Decreased the amount of soda you and your family drink?"
Johnson is excited about the early results, along with anecdotal evidence from participants.
"At the graduation ceremonies, our new promotores talk about how the program impacted them," she said. "Participants talk about reading food labels while shopping with their children. At work, they tell their fellow employees about the sugar content in soda. They're definitely learning the information and sharing it with people in all aspects of their lives. They support one another in making behavioral changes that are pretty major."
"We don't have the data, which is frustrating!" Elliot-McCrea said. "Moving forward, we need to create community measures to support what we already know – we're doing important work."
Tara Leonard's reporting on food and nutrition banking was undertaken as part of a health journalism program offered through The California Endowment Health Journalism Fellowships, administered by the University of Southern California's Annenberg School for Communication & Journalism. We thank them for their support.