
Evaluation in Today's Political, Social and Technological Climate - PowerPoint PPT Presentation



  1. Evaluation in Today's Political, Social and Technological Climate
  Kathryn Newcomer
  September 8, 2017

  2. "Evidence-based Policy," "Data-Driven Decision-making" – the New Normal?

  3. Question to Address Today
  • What is the impact of the "evidence-based policy" imperative, as well as the current political, social and technological climate, on evaluation practice in the public and non-profit sectors?

  4. What are Challenges for Evaluators in Providing Evidence to Inform Policymaking?
  • What constitutes sufficient evidence?
  • How transferable is evidence?
  • When and where do we underestimate the role played by the "impactees"?
  • Where is the capacity to support both the demand and supply of evidence?

  5. Contrasting Views on Evidence-Based Policy #1
  • We need to collect data to test whether programs work or do not work.
  Versus
  • We need to learn which program mechanisms work for whom, where, and under what circumstances.

  6. Contrasting Views on Evidence-Based Policy #2
  • Policy should be made at the top and based on evidence.
  Versus
  • Policy is "made" through implementation processes at multiple levels by multiple actors with different types of data available to them.

  7. Contrasting Views on Evidence-Based Policy #3
  • Program impact can be measured precisely.
  Versus
  • Measuring program impact is difficult, as programs and intended impactees change and evolve.

  8. Contrasting Views on Evidence-Based Policy #4
  • Randomized Controlled Trials (RCTs) are the gold standard for research and evaluation design.
  Versus
  • Research designs must be matched to the question raised; RCTs are appropriate for certain impact questions.

  9. Contrasting Views on Evidence-Based Policy #5
  • Proven program models can be replicated in multiple locations as long as they are implemented with fidelity to the original design.
  Versus
  • Program mechanisms may be replicated in multiple locations as long as they are adapted to meet local conditions.

  10. Contrasting Views on Evidence-Based Policy #6
  • Benefit-cost analysis should be used to compare social programs.
  Versus
  • Benefit-cost analysis is difficult to use to compare social programs, given the challenge of costing out benefits, especially those accruing over time.
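  To make the "accruing over time" challenge concrete, here is the standard discounted net-benefit formulation (an illustration added here, not from the slides): with benefits B_t and costs C_t in year t, a horizon of T years, and discount rate r,

  $$\text{NPV} = \sum_{t=0}^{T} \frac{B_t - C_t}{(1+r)^t}.$$

  Benefits that arrive many years out are shrunk by the factor (1+r)^t, so comparisons across social programs hinge on how long-run benefits are monetized and on the assumed value of r.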

  11. Why Isn't There Agreement About the Quality of Evidence?
  • Differing professional standards and "rules" or criteria for evidence, e.g., among lawyers, engineers, and economists
  • Disagreements about methodologies within professional groups, e.g., over RCTs
  • The constancy of change in problems and in the characteristics of the targeted "impactees"

  12. We Overstate the Ease of Flow of Evidence
  Study conclusion: It plays a causal role there → It plays a wide (enough) causal role → Policy prediction: It will play a causal role here.
  Source: Cartwright, N. (2013). "Knowing what we are talking about: why evidence doesn't always travel." Evidence & Policy: A Journal of Research, Debate and Practice, 9(1), 97-112.

  13. We Underestimate the Role of Volition Among Impactees and Their Decision-making

  14. We Underestimate the Evolving Sources of Complexity Affecting the Production of Relevant Evidence
  • Change in the nature of problems to be addressed by government and the philanthropic sector
  • Change in the context in which programs and policies are implemented
  • Changing priorities of political leaders – and under Trump?

  15. We Overstate the Current Evaluation Capacity among Decision-Makers in Government

  16. Evaluation Capacity = Both Demand and Supply
  • How clear is the understanding between providers and requestors on what sort of data (evidence) is needed?
  • Are there sufficient resources to respond to demands for specific sorts of evidence?
  • How can evaluators instruct users about how to assess the quality and appropriateness of evidence?

  17. Transmission Process
  • Just as there are many producers, there are many potential users of the evidence provided, e.g., different policy designers and implementers in complex service delivery networks
  • Understanding and strengthening the linkage between the producers of evaluative data and the many potential users of that information requires time and resources

  18. Evaluators Need to Help Information Users Frame Pertinent Questions and then Match the Questions with the Appropriate Evaluation Approach
  Questions Relevant to Users → Evaluation Design

  19. Match Evaluation Approach to Questions
  Objective #1: Describe program activities
  • Illustrative Questions: How extensive and costly are the program activities? How do implementation efforts vary across sites, beneficiaries, regions? Has the program been implemented sufficiently to be evaluated?
  • Possible Designs: Monitoring; Exploratory Evaluations; Evaluability Assessments; Multiple Case Studies
  Objective #2: Probe targeting & implementation
  • Illustrative Questions: How closely are the protocols implemented with fidelity to the original design? What key contextual factors are likely to affect achievement of intended outcomes? How do contextual constraints affect the implementation of an intervention? How does a new intervention interact with other potential solutions to recognized problems?
  • Possible Designs: Multiple Case Studies; Implementation or Process Evaluations; Performance Audits; Compliance Audits; Problem-Driven Iterative Adaptation
  Objective #3: Measure the impact of policies & programs
  • Illustrative Questions: What are the average effects across different implementations of the intervention? Has implementation of the program or policy produced results consistent with its design (espoused purpose)? Is the implementation strategy more (or less) effective in relation to its costs?
  • Possible Designs: Experimental Designs/RCTs; Non-experimental Designs (difference-in-differences, propensity score matching, etc.); Cost-effectiveness & Benefit-Cost Analysis; Systematic Reviews & Meta-Analyses
  Objective #4: Explain how/why programs & policies produce (un)intended effects
  • Illustrative Questions: How/why did the program have the intended effects? To what extent has implementation of the program had important unanticipated negative spillover effects? How likely is it that the program will have similar effects in other communities or in the future?
  • Possible Designs: Impact Pathways and Process Tracing; System Dynamics; Configurational Analysis
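  As one concrete illustration of the "measure the impact" row (added here, not part of the original slide), the basic difference-in-differences estimator compares pre/post changes for treated and comparison groups:

  $$\hat{\tau}_{DiD} = \left(\bar{Y}_{\text{treated, post}} - \bar{Y}_{\text{treated, pre}}\right) - \left(\bar{Y}_{\text{comparison, post}} - \bar{Y}_{\text{comparison, pre}}\right),$$

  which recovers the average program effect only under the assumption that the two groups would have followed parallel trends in the absence of the intervention.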

  20. A Delicate Balancing Act: Accountability vs. Learning
  There is an ongoing tension between producing evidence to demonstrate accountability and producing evidence to promote learning.

  21. Please join us in November 2017 in Washington, D.C.!
  • "From Learning to Action" is the theme of our American Evaluation Association Annual Conference (3,500+ attendees and 120+ workshops & panels), and in line with this theme, I have worked with a committee of 17 (from 7 countries) to plan our approach, and we have challenged participants to:
  • think creatively about innovative ways to engage audiences at the annual conference – beyond panels and posters;
  • invite evaluators or evaluation users who might not normally attend AEA, but are clearly stakeholders in our work, to participate in conference sessions; and
  • submit a 60-second video on Learning from Evaluation to highlight how we can foster learning from evaluation in a variety of settings.

  22. Relevant References
  • Dahler-Larsen, Peter. 2012. The Evaluation Society. Stanford University Press.
  • Donaldson, S., C. Christie, and M. Mark (editors). 2015. Credible and Actionable Evidence, 2nd Edition. Sage.
  • Head, B. 2015. "Toward More 'Evidence-Informed' Policy Making?" Public Administration Review, Vol. 76, Issue 3, pp. 472-484.
  • Kahneman, D. 2011. Thinking, Fast and Slow. Farrar, Straus and Giroux.
  • Mayne, J. 2010. "Building an evaluative culture: The key to effective evaluation and results management." Canadian Journal of Program Evaluation, 24(2), 1-30.
  • Newcomer, K. and C. Brass. 2016. "Forging a Strategic and Comprehensive Approach to Evaluation within Public and Nonprofit Organizations: Integrating Measurement and Analytics within Evaluation." American Journal of Evaluation, 37(1), 80-99.
  • Olejniczak, K., E. Raimondo, and T. Kupiec. 2016. "Evaluation units as knowledge brokers: Testing and calibrating an innovative framework." Evaluation, 22(2), 168-189.
  • Sunstein, C. and R. Hastie. 2015. Wiser: Getting Beyond Groupthink to Make Groups Smarter. Harvard Business Review Press.
  • World Bank Group. 2015. Mind, Society, and Behavior.

  23. Thank You! Questions? I can be reached at newcomer@gwu.edu
