  1. EVALUATION MODELS AND METRICS PART II: UTILIZING YOUR GRANTEES’ REPORTS TO MEASURE IMPACT ROBERT E. HOKE INDEPENDENT EVALUATION CONSULTANT

  2. GOALS FOR THIS SESSION
• Build on the lessons from part one
• Apply tools of the evaluator in analyzing grantee reports
• Introduce a framework of population and performance metrics
• Share successes and challenges of measuring and reporting impact

  3. My question is: are we making an impact?

  4. EXERCISE
• Think of a previous report with aggregated grantee performance/impact information that you thought was particularly effective.
• What were the characteristics of that report that made it effective?
• Share with your colleagues at your table the background of the report and what made it effective.
• Any common characteristics?

  5. SO YOU HAVE A STACK OF EVALUATIONS FROM YOUR GRANTEES: NOW WHAT?

  6. UTILIZATION-FOCUSED EVALUATION (UFE): INTENDED USES FOR INTENDED USERS

  7. POTENTIAL USES OF EVALUATION REPORTS FROM GRANTEES
• Accountability: Oversight and Compliance
• Monitoring: Progress on Process Benchmarks
• Effectiveness: Desired Outputs and Results
• Program Improvement: Implementation Changes
• Knowledge Generation: General Understanding of Problem
• Impact: Individual and Collective Impact of Grantees

  8. POTENTIAL USERS OF EVALUATION REPORTS FROM GRANTEES
• Program Officer or Other Foundation Staff
• Foundation Volunteer Leadership
• Donors
• Community Partners
• Grantees

  9. WHAT EVERYONE IS TALKING ABOUT
• Collective Impact: a very specific model of organizing community-wide efforts based on Cincinnati's Strive Initiative (e.g., backbone organization)
• collective impact: a more generalized concept of joint decision making about community goals, outcomes, and priorities

  10. KEY EVALUATION QUESTIONS
• Results-Based Evaluation Framework: What did you do? How well did you do it? Is anyone better off?
• Michael Quinn Patton: What? So what? Now what?

  11. COMBINED PROGRAM LOGIC MODEL

  12. EVALUATING PORTFOLIOS: IT'S NOT JUST COMPARING APPLES AND ORANGES

  13. WHAT DO WE CHOOSE TO REPORT? What we aspire to report vs. what we tend to report.

  14. RESULTS-BASED ACCOUNTABILITY (RBA) FRAMEWORK
Mark Friedman, Fiscal Policy Studies Institute: Results-Based Accountability™ (RBA), also known as Outcomes-Based Accountability™ (OBA), is a disciplined way of thinking and taking action that communities can use to improve the lives of children, youth, families, adults, and the community as a whole.
• Performance Accountability
• Population Accountability

  15. PERFORMANCE ACCOUNTABILITY
Performance-level accountability for the performance of programs, agencies, and service systems. We identify measures/indicators that tell us the service/program is performing well. In RBA, there are three different types of performance measures:
(1) How much service was provided?
(2) What was the quality of service?
(3) Are the clients better off because of the service?
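The three RBA performance-measure types can be illustrated with a small tally over client records. This is only a sketch: the client data and field names (`hours_served`, `satisfied`, `improved`) are invented for illustration, not part of the RBA framework itself.

```python
# Illustrative sketch of RBA's three performance-measure types.
# The records and field names here are hypothetical examples.
clients = [
    {"hours_served": 40, "satisfied": True,  "improved": True},
    {"hours_served": 25, "satisfied": True,  "improved": False},
    {"hours_served": 10, "satisfied": False, "improved": False},
]

# (1) How much service was provided? (quantity of effort)
total_hours = sum(c["hours_served"] for c in clients)

# (2) What was the quality of service? (e.g., satisfaction rate)
quality = sum(c["satisfied"] for c in clients) / len(clients)

# (3) Are the clients better off because of the service? (outcome rate)
better_off = sum(c["improved"] for c in clients) / len(clients)

print(total_hours)           # 75
print(round(quality, 2))     # 0.67
print(round(better_off, 2))  # 0.33
```

Note that only measure (3) speaks to impact; (1) and (2) describe effort and quality, which is why the later slides pair performance measures with population measures.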

  16. POPULATION ACCOUNTABILITY
Population-level accountability for the well-being of an entire population in a geographic area. Population-level accountability concerns conditions we hope to attain. Example: "Children in our community are healthy." We identify measures/indicators that tell us the outcome is achieved for the population. Measures related to population accountability are solely focused on whether the populations are better off because of the services provided.

  17. RBA: FROM THE ENDS TO THE MEANS
• What are the quality-of-life conditions we want for the children, adults, and families who live in our community?
• What would these conditions look like if we could experience them?
• How can we measure these conditions?
• How are we doing on the most important measures?
• Who are the partners that have a role to play in doing better?
• What works, including low-cost and no-cost ideas?
• What do we propose to do?

  18. APPLY RBA TO IMPROVING HIGH SCHOOL GRADUATION RATES
• PERFORMANCE (examining our portfolio of programs): How much service did we provide? How good was that service? Are the clients better off?
• POPULATION (examining the population in the community): Was there a positive change in the high school drop-out rate as a result of our programs?

  19. WHAT IS THE OVERLAP BETWEEN POPULATION AT RISK AND THE CLIENTS OF YOUR PROGRAMS? (Venn diagram: "Clients of Programs" vs. "Population at Risk")

  20. WHAT IS THE OVERLAP BETWEEN POPULATION AT RISK AND THE CLIENTS OF YOUR PROGRAMS? (Venn diagram: "Clients of Programs" vs. "Population at Risk") The larger the overlap, the greater the connection between performance and population measures.
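The overlap the slide describes is just a set intersection, and quantifying it gives two useful ratios. The IDs and numbers below are hypothetical, purely to show the arithmetic:

```python
# Hypothetical sketch: how much of the at-risk population do our programs reach?
population_at_risk = {"s01", "s02", "s03", "s04", "s05", "s06"}
program_clients = {"s04", "s05", "s06", "s07"}  # s07 is not in the target population

overlap = population_at_risk & program_clients  # set intersection

# Share of the at-risk population reached by the programs
coverage = len(overlap) / len(population_at_risk)

# Share of program clients who are actually in the target population
targeting = len(overlap) / len(program_clients)

print(len(overlap), coverage, targeting)  # 3 0.5 0.75
```

A low `targeting` ratio corresponds to the earlier challenge that "program participants may not be in the target population"; a low `coverage` ratio means even strong performance measures say little about population-level change.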

  21. PERFORMANCE ACCOUNTABILITY: BENEFITS AND CHALLENGES
BENEFITS
• Timely data
• Causal link fairly strong
• Shows accountability to donors
• Helpful for program improvement
CHALLENGES
• No information on impact on target population
• Blind adherence to program plans
• Program participants may not be in target population

  22. POPULATION ACCOUNTABILITY: BENEFITS AND CHALLENGES
BENEFITS
• More flexibility
• Stronger focus on at-risk populations
• Community-wide collaborations and buy-in
• Speaks to donors' interest in results
CHALLENGES
• Data may not be timely
• More difficult to show causal link
• Increased dependence on outside partners

  23. PERFORMANCE MEASUREMENT IS ABOUT PROVING WE DID THINGS IN THE RIGHT WAY: "WE FUND GOOD PROGRAMS." POPULATION MEASUREMENT IS ABOUT PROVING WE DID THE RIGHT THINGS: "OUR FUNDING HAD POSITIVE IMPACT."

  24. TRIANGULATION
Triangulation in evaluation:
• multiple methods
• quantitative methods (data and statistics)
• qualitative methods (interviews, focus groups)
• various stakeholders
Using both performance and population measures: "WE FUND GOOD PROGRAMS THAT HAVE POSITIVE IMPACT ON OUR COMMUNITY"

  25. LOGIC MODELS AND RBA

  26. EXERCISE: APPLY RBA TO HIGH SCHOOL GRADUATION RATES

  27. RESOURCES AND GOOD EXAMPLES

  28. BACK TO THE EVALUATORS' TOOLKIT
Apply the lessons from the first session on evaluating grantee measurement and methods to the aggregation of their results:
• Are your methods rigorous and appropriate?
• Are you applying a one-size-fits-all approach or focusing only on reporting results from programs that had rigorous evaluations?
• Are the evaluation findings consistent with your theory of change or program logic?

  29. EVALUATION TERMS YOU SHOULD KNOW
• Meta-Evaluation
• Evaluating Strategy

  30. CONVERSATIONS WITH YOUR GRANTEES
• Are you sharing the information you learned from your analysis of the grant reports?
• Are you giving the grantees feedback on how to improve their performance reports?
• Are you engaging the grantees in a post-grant review of how to improve the process?
• Have you considered holding a focus group of grantees to discuss how to change the grant program to have a greater impact on the community change desired?
• Have you considered allowing the grantees to work together to recommend a standard set of reporting measures?

  31. UNANSWERED QUESTIONS?

  32. CONTACT INFORMATION
Robert E. Hoke, Independent Consultant
5301 E. 9th Street, Indianapolis, Indiana 46219
317-356-6367 | robert@roberthoke.com
