How Are We Doing and Where Do We Go From Here? Practical Methods and Approaches to Evaluating Your ADR Program
Russell Saltz & Zachary Miller
Center for Alternative Dispute Resolution Annual Conference
June 27, 2014
Presentation Overview
• Introductions
• Evaluation & ADR
• Evaluation Design Exercise
• Q & A Discussion
Getting to Know Us
• Russell Saltz, Senior Research Associate; M.P.A., Public and Economic Policy, London School of Economics and Political Science
• Zachary Miller, Senior Research Analyst; M.P.A., Nonprofit Management, George Washington University
Getting to Know You
• Name, affiliation
• Where does your work or program fall on the ADR continuum?
• What do you hope to take away from today’s session?
• Where on the conflict management mechanisms continuum do you fall?
[Figure: Conflict Management Mechanisms continuum]
Evaluation Overview
• What does evaluation mean?
– Oxford Dictionary: to form an idea of the amount, number, or value of; assess [something]
– AEA: to assess the strengths and weaknesses of programs, policies, personnel, products, and organizations to improve their effectiveness
– Rossi: use of social research methods to systematically investigate the effectiveness of social intervention programs in ways that are adapted to their political and organizational environments and are designed to inform action to improve conditions
Evaluation Overview
[Diagram: Program Evaluation shown in relation to Performance Management and Policy Analysis]
Deciding Whether to Evaluate
When should you evaluate a program?
• Dimension 1: Utility. Will the evaluation be useful? For whom? Is the evaluation mandated?
• Dimension 2: Significance. Is there an important issue with your ADR program? Is your ADR program, or are components of it, a strategic priority?
• Dimension 3: Feasibility. Is your ADR program evaluable and ready to be evaluated?
• Dimension 4: Accuracy. Can your ADR program be evaluated well?
Deciding Whether to Evaluate
• What questions can an ADR evaluation address?
– Formative (programmatic elements):
• What worked well?
• What did not work well?
• What needs to improve or change?
– Summative (outcomes):
• How many people were served?
• How many cases were resolved?
• How much did the program cost?
• What was the value of the time and money saved?
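To make the summative questions concrete, here is a minimal sketch of how such outcome metrics might be computed. All figures and per-case assumptions are hypothetical placeholders for illustration; they are not drawn from the presentation or from any actual ADR program.

```python
# Minimal sketch: rough summative metrics for a hypothetical ADR program.
# Every number below is an illustrative placeholder, not real program data.

ADR_CASES_RESOLVED = 120              # hypothetical cases closed through ADR
AVG_ADR_COST_PER_CASE = 1_500         # hypothetical program cost per case ($)
AVG_LITIGATION_COST_PER_CASE = 6_000  # hypothetical cost if litigated instead ($)
AVG_HOURS_SAVED_PER_CASE = 25         # hypothetical hours saved per case
AVG_HOURLY_VALUE = 40                 # hypothetical dollar value of an hour saved

def summative_metrics(cases, adr_cost, lit_cost, hours_saved, hourly_value):
    """Return simple outcome measures of the kind a summative evaluation reports."""
    total_program_cost = cases * adr_cost
    dollars_saved = cases * (lit_cost - adr_cost)
    time_value_saved = cases * hours_saved * hourly_value
    return {
        "cases_resolved": cases,
        "total_program_cost": total_program_cost,
        "estimated_dollars_saved": dollars_saved,
        "estimated_value_of_time_saved": time_value_saved,
    }

print(summative_metrics(ADR_CASES_RESOLVED, AVG_ADR_COST_PER_CASE,
                        AVG_LITIGATION_COST_PER_CASE,
                        AVG_HOURS_SAVED_PER_CASE, AVG_HOURLY_VALUE))
```

In practice, the per-case cost and time assumptions would come from the program's own administrative data rather than fixed constants.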
Deciding Whether to Evaluate
• Decisions:
– What are the evaluation questions?
– Identify and review the data sources (adequate & available).
– Determine resource requirements (financial & human/professional).
– Timing (enough for a good study?)
– Institutional support?
– What is the program’s status? (proposed vs. new; steady-state vs. startup; temporary vs. permanent/ongoing)
Deciding Whether to Evaluate
• What type of evaluation is appropriate?

Type of Evaluation: Recognizable Features
– Implementation Analysis: fieldwork, observation, surveys, administrative data
– Performance Analysis: program outcome data and performance measures
– Cost-Benefit / Cost-Effectiveness Analysis / Return on Investment: cost analysis
– Quasi-Experimental Net Impact and Outcome Analysis: multivariate statistical modeling; comparison group
– Experimental Net Impact Analysis: random assignment; control group
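The following is a minimal sketch of the comparison-group idea behind the net impact designs in the table above. The outcome measure (resolution within 90 days) and the data are hypothetical, and the calculation is deliberately naive: it only shows the basic contrast between an ADR group and a comparison group, not the adjustments a real quasi-experimental study would make.

```python
# Illustrative comparison-group "net impact" estimate.
# Each list holds hypothetical 0/1 outcomes (1 = case resolved within 90 days).
adr_group = [1, 1, 0, 1, 1, 0, 1, 1]         # cases that used ADR
comparison_group = [1, 0, 0, 1, 0, 0, 1, 0]  # similar cases that did not

def mean(xs):
    return sum(xs) / len(xs)

# Naive net impact: difference in resolution rates between the two groups.
# A real quasi-experimental design would also adjust for case characteristics
# (e.g., with multivariate regression or matching); an experimental design
# would rely on random assignment to make the groups comparable.
net_impact = mean(adr_group) - mean(comparison_group)
print(f"ADR resolution rate: {mean(adr_group):.0%}")
print(f"Comparison resolution rate: {mean(comparison_group):.0%}")
print(f"Estimated net impact: {net_impact:+.0%}")
```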
Examples
• ADR at the World Bank
– Outcomes examined (direct impact): cost; time; client satisfaction; jobs lost, retained, or created; avoidance of bankruptcy; avoidance of a negative public image
– Outcomes examined (indirect impact): increased effectiveness of courts; improved perceptions of the quality of the legal system and greater trust in the fair resolution of conflicts; influence on investors’ perceptions; improvements in business relationship quality
Logic Model: A Helpful Tool
• What is a logic model?
– A tool to help evaluate the effectiveness of a program
– A visual representation of inputs and how they translate into outputs and, ultimately, outcomes
– Identifies the “What”, “If” and “Then”
Example: Logic Model for an ADR Program
[Figure: logic model diagram for an ADR program]
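The original slide presents the example as a diagram. As a rough, illustrative rendering only, the sketch below expresses a generic ADR logic model as a simple data structure; every entry is a hypothetical example of the "if X, then Y" chain and is not taken from the figure.

```python
# Illustrative only: a generic ADR program logic model as a data structure.
# The entries are hypothetical, not drawn from the original figure.
logic_model = {
    "inputs": ["staff mediators", "program funding", "case referrals"],
    "activities": ["intake and screening", "mediation sessions", "follow-up"],
    "outputs": ["cases mediated", "agreements reached", "participants served"],
    "short_term_outcomes": ["disputes resolved faster", "lower cost per case"],
    "long_term_outcomes": ["reduced court backlog", "greater trust in fair resolution"],
}

# The "if/then" reading: IF the inputs support the activities, THEN the outputs
# should follow, and THEN the intended outcomes should result.
for stage, items in logic_model.items():
    print(f"{stage}: {', '.join(items)}")
```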
Evaluation Design Exercise
Please fill out the provided Evaluation Design Worksheet.
Evaluation Design Exercise
1. Involve intended users (key informant interviews/surveys; participatory approach)
2. Clarify program design (goals, priorities, structure, population target groups, client flow)
3. Explore program reality (data systems, structure, implementation status)
4. Assess plausibility of the program (readiness, demo, test, etc.)
5. Reach agreement on any needed changes to program design or implementation
6. Reach agreement on evaluation questions, focus, and use of findings
Contact Information
Russell Saltz, Senior Research Associate
rsaltz@impaqint.com | Office: (443) 718-4342
Zachary Miller, Senior Research Analyst
zmiller@impaqint.com | Office: (443) 283-6231