EPIC Evaluation: Measuring Community Change
Cynthia Matthias, Program Evaluator, College of Education and Human Development, University of Minnesota
Mike Greco, Director, Resilient Communities Project, University of Minnesota
Overview
● Ripple Effects Mapping (REM): what it is, why it’s useful in this context
● Resilient Communities Project (RCP) evaluation findings
● Using evaluation findings to inform the evolution of RCP
Why did RCP do this?
We wanted to answer some questions about the program:
● What project-specific and community-wide changes result from RCP partnerships?
● What’s working well from partner perspectives?
● What could we do to improve the partnership experience for communities?
● Is the program making good on its promises to partner communities?
● What goals is the program achieving? Are these the goals we want to achieve?
Evaluation Approach & Rationale Interrogating our program model using: ● Bottom-up approach : partners tell us what benefits/outcomes they realize ● Reflection: are these the outcomes we want to see? Developmental evaluation approach — incorporate ongoing feedback into program processes in real time ● Observe, propose, test, repeat ● Not formative or summative evaluation ● What are the standards for “measuring merit, value, and worth”?
Methods: Surfacing Community Outcomes
Method: Ripple Effects Mapping (reflecting on work, collecting data)
● Participatory group process with 6-14 people in a room
● Pair-share using a short interview, report back to the group
● Facilitators frantically record, move comments around, organize
● “But for…” principle
Approach: Appreciative Inquiry
● Pair-share interview questions focus on outcomes
● Used seven community capitals as a heuristic device
● Did some projects make leaps forward?
● Also asked participants to air grievances
Overall Data Collection Process
REM Sessions (worked with REM expert from UMN Extension)
● One session for each partner community (4 total, 6-8 attendees each)
● Communities chosen based on how much time had passed
Post-Session Follow-Ups
● Emailed interview questions to people who couldn’t attend
● Follow-up interviews to get more depth
Generate a map during the REM session
● Note: We used XMind software, but could do a low-tech version with Post-Its + flipcharts (a scriptable sketch follows below)
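For teams that want a scriptable alternative to XMind or Post-Its, the sketch below shows one way a ripple map could be captured as a nested data structure and printed as an indented outline. It is a minimal illustration only: the class names, fields, and example entries are assumptions for this sketch, not part of the RCP process or its data.

```python
# Minimal sketch of capturing a ripple map digitally instead of in XMind.
# All class names, fields, and example entries are hypothetical, not RCP data.
from dataclasses import dataclass, field

@dataclass
class Ripple:
    """One effect reported in a REM session, plus any downstream ripples."""
    description: str
    capital: str                       # one of the seven community capitals
    ripples: list["Ripple"] = field(default_factory=list)

@dataclass
class RippleMap:
    """The map for one partner community's REM session."""
    community: str
    session_date: str
    roots: list[Ripple] = field(default_factory=list)

def print_outline(ripples, depth=0):
    """Print ripples as an indented outline, one line per effect."""
    for r in ripples:
        print("  " * depth + f"- [{r.capital}] {r.description}")
        print_outline(r.ripples, depth + 1)

# Hypothetical example: one root effect with a downstream ripple.
m = RippleMap(community="Example City", session_date="2016-05-01")
m.roots.append(
    Ripple("Adopted a complete-streets policy", "Political",
           ripples=[Ripple("New bike lanes added downtown", "Built")])
)
print_outline(m.roots)
```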
Making Sense of Outcomes for RCP
Recontextualizing REM data for RCP program improvement
● Read REM responses, thought about categories of outcomes
● Generated (as a team) new themes for organizing outcomes
● Used an iterative process to name outcomes
● Mapped outcomes backward to program activities
● Generated an “outcome map” to visualize how RCP works (see the sketch after this list)
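To make the sense-making step concrete, here is a minimal sketch of how coded REM responses could be grouped by theme and linked back to program activities to produce a simple text “outcome map.” The theme names echo the findings slides that follow, but the example responses, the activity pairings, and the coding itself are hypothetical; RCP did this work iteratively as a team, not with a script.

```python
# Minimal sketch of turning coded REM responses into a simple "outcome map".
# Example responses and the theme-to-activity pairings are hypothetical.
from collections import defaultdict

# Each coded response: (community, verbatim outcome, assigned theme).
coded_responses = [
    ("Example City",
     "Council adopted the stormwater ordinance drafted in the course",
     "Impacted municipal programs, policies, & processes"),
    ("Example City",
     "Planning and public works staff now meet monthly",
     "Strengthened relationships"),
]

# Map each theme back to the program activities thought to produce it
# (assumed pairings, for illustration only).
theme_to_activities = {
    "Impacted municipal programs, policies, & processes":
        ["Course projects", "Final student reports"],
    "Strengthened relationships":
        ["Project scoping meetings", "Ongoing staff liaison role"],
}

# Group outcomes by theme, then print theme -> activities -> outcomes.
outcomes_by_theme = defaultdict(list)
for community, outcome, theme in coded_responses:
    outcomes_by_theme[theme].append(f"{outcome} ({community})")

for theme, outcomes in outcomes_by_theme.items():
    print(theme)
    print("  from activities:", ", ".join(theme_to_activities.get(theme, [])))
    for o in outcomes:
        print("  outcome:", o)
```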
“Pros” of the Process
● Group discussion surfaces lots of information
● Structure helps participants think about outcomes systematically
● Participants connected ideas and events they hadn’t before
● Sessions produced a tangible reminder of the partnership ⇢ validation of all the work partner communities did
● Sessions give partner communities a chance to “air grievances”
● Helped RCP recognize consistent challenges across communities
● Generated metadata about projects
“Cons” of the Process
● It’s hard to get a bunch of busy people in the room at the same time
● You only hear about the projects that are represented in the room
● Time is short, and revelations are incomplete
● Staff dynamics have a strong impact on the success of these sessions
What We Learned
Impacted Municipal Programs, Policies, & Processes
● Adopting new policies
● Undertaking new programs and initiatives
● Incorporating new design ideas
● Hiring new staff
Strengthened Relationships
● Establishing ongoing relationships between city/county staff and the University of Minnesota
● Strengthening interagency relationships
● Improving communication with residents and businesses
Changed Staff Attitudes and Perceptions
● Working across silos
● New perspectives and ideas bring new energy and motivation
● Shifting focus to devote time to neglected issues
Reframed Issues
● Rethinking problems and strategies
● Refocusing community goals
● Validating existing community initiatives
● Building momentum for ongoing projects
Created Space to Consider New Ideas
● Floating unpopular or “radical” ideas
● Raising awareness of an issue
● Creating opportunities to engage with the public
Saved Money/Was Cost-Effective
● Monetary savings from process efficiencies and cost-beneficial solutions
● Laying groundwork to engage professionals more effectively/efficiently
● Providing professional-level assistance more cheaply (sometimes at the expense of consultants)
● Ultimately, a good return on investment
Challenges
● Workload associated with RCP projects
● Projects that lead to “less useful” results
● Projects assigned to unwilling/unmotivated staff
● Staff turnover resulting in lack of continuity/follow-through
● Information overload
What We Learned
● Validated many of the “benefits” claimed for the EPIC model (...and highlighted contingencies that can intervene)
● A lot of impactful program work happens before students begin work on projects (the pre-work can make or break a partnership)
● Many outcomes involve changes in how staff see, think about, and do things
● Some staff found the demands of the partnership overwhelming
● Some partners were unable to fully capitalize on the partnership once the formal relationship ended (free kittens vs. free beer)
Building the Plane While Flying It: Responding to What We Learned
Potential Program Changes
Scale and length of partnerships (“deep immersion”)
Pre-partnership preparation
● Community involvement (listening session? charrette? advisory committee?)
● “Resiliency” assessment
● Asset mapping
● Project lead orientation/training (job descriptions?)
Non-course-based assistance
Potential Program Changes
Program efficiencies (survey example)
Post-partnership processing
● Distill and aggregate findings
● Focused follow-up work on selected projects
● Prioritize projects for further action (longitudinal implementation plan)
Assessment and Evaluation Changes
● Document feedback from project leads on the heels of presentations
  ○ Reaction to presentation and key takeaways
  ○ Next steps
  ○ Resources to assist (poster, project brief, community presentation…)
● Redesigned and streamlined end-of-semester surveys for students, faculty, and staff (what, so what, now what)
● Two-year follow-up with partners using REM/interviews
● Two-year retrospective survey to students and faculty to assess outcomes
● Regularly incorporate feedback to inform program operation/strategic plan
Resources
Ripple Effects Mapping
Emery, M., Higgins, L., Chazdon, S., & Hansen, D. (2015). Using ripple effect mapping to evaluate program impact: Choosing or combining the methods that work best for you. Journal of Extension, 53(2), n2.
Kollock, D. H., Flage, L., Chazdon, S., Paine, N., & Higgins, L. (2012). Ripple effect mapping: A “radiant” way to capture program impacts. Journal of Extension, 50(5), 1-5.
Chazdon, S., Emery, M., Hansen, D., Higgins, L., & Sero, R. (2017). A field guide to Ripple Effects Mapping. https://conservancy.umn.edu/handle/11299/190639
EPIC-N Evaluation and Assessment Resources
Elise Amel, Faculty Director, Sustainable Communities Partnership, University of St. Thomas
Marshall Curry, EPIC-N Program Associate, University of Oregon
Questions? rcp@umn.edu
www.rcp.umn.edu | @RCPumn
www.epicn.org | @epicn.org | @EPICNtweet