
Pro Bono Design & Management Accelerator, December 12, 2018 - PowerPoint PPT Presentation



  1. Pro Bono Design & Management Accelerator December 12, 2018 1

  2. 2

  3. Session 3 Impact Evaluation & Data Tracking 3

  4. Coach introductions Peter James Senior Manager of Research & Evaluation Renée J. Schomp Senior Staff Attorney, Pro Bono Consulting 4

  5. Logistics - Nuts and bolts • Thank you to DREDF & Ed Roberts Campus! • Restrooms • Water • Lunch • Snacks 5

  6. Mindfulness moment 6

  7. Pro bono accelerator objectives 1. Shared pro bono language 2. Inspiration from peers 3. Role of pro bono in larger civil justice movement 4. Lens of equity & inclusion 5. Support on concrete action steps towards organizational change & pro bono design 7

  8. Pro bono accelerator roadmap 1. October 10: Volunteerism Overview 2. November 14: Recruitment, Cultivation, & Training 3. December 12: Impact Evaluation & Data Tracking 4. January 9: Placement, Supervision, & Technical Assistance 5. February 13: Capstone Project Presentations & Organizational Change Planning 8

  9. Today’s agenda ... 1. Morning: the theory and the methods a. Evaluation frameworks b. Quantitative methods c. Qualitative methods 2. Afternoon: the realities a. Doing evaluation in practice b. Working with funders 3. Capstone work time 9

  10. Grounding pro bono programs in a larger civil justice movement Who benefits from evaluation and data tracking? 10

  11. Ground rules • Beach ball conversations • One diva, one mic • Make space, take space • Be here now • Confidentiality 11

  12. Introduction to Impact Evaluation and Data Tracking 12

  13. A simple example 13

  14. Insights from data: Understanding -- usually a bit late, but sometimes really late! 14

  15. Why evaluate legal aid? 15

  16. Why evaluate pro bono programs? 16

  17. Evaluation Frameworks • Evaluation frameworks help ensure that your evaluation is answering relevant questions 17

  18. Defining your problem • Define and understand the problem that your org/program is trying to solve 18

  19. Theory of change • Explains how your org/program/new initiative solves the problem and achieves its goals • Think about comparison and counterfactuals: how does it bring about a change that otherwise would not happen? • Example: We believe that more people will file successfully with the help of a pro bono attorney (than without) because the applications are complex and daunting for most people without legal training. 19

  20. Logic models • A step-by-step diagram showing how your program will achieve its results in the real world • Inputs: Staff time → Outputs: Pro bono services → Outcomes: Completed applications → Impact: Improved status 20
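
To make the logic model concrete, here is a minimal sketch (in Python, not from the slides) that records the model above as a plain data structure, so each measure you later track can be tied back to a stage; the stage contents are the ones shown on the slide.

```python
# A minimal sketch: the slide's logic model recorded as a plain dictionary so
# each tracked measure can be linked to a stage. Contents come from the slide.
logic_model = {
    "inputs": ["staff time"],
    "outputs": ["pro bono services"],
    "outcomes": ["completed applications"],
    "impact": ["improved status"],
}

for stage, items in logic_model.items():
    print(f"{stage}: {', '.join(items)}")
```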

  21. Focusing evaluation • For detailed evaluation, you can typically only focus on parts of the theory or logic model • Choose and define your focus area 21

  22. Focus area: process evaluation • Sometimes we are most interested in evaluating a specific process (rather than full program) • In this context, the process measures become the outcomes/impacts of interest

  23. Focus area: service evaluation • We are often interested in one or more phases of service provision

  24. Focus area: training evaluation • The Kirkpatrick Model: 4 Levels • Useful for defining outcomes and impacts
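
The slide names the Kirkpatrick Model without listing its four standard levels (Reaction, Learning, Behavior, Results). The sketch below pairs each level with a hypothetical training-evaluation measure for a pro bono program; the example measures are illustrative, not from the slides.

```python
# The four standard Kirkpatrick levels, paired with hypothetical example
# measures for a pro bono training (the measures are illustrative).
kirkpatrick_levels = {
    1: ("Reaction", "post-training satisfaction survey"),
    2: ("Learning", "pre/post knowledge quiz scores"),
    3: ("Behavior", "volunteer takes a case within 3 months"),
    4: ("Results", "client outcomes on cases handled by trained volunteers"),
}

for level, (name, example_measure) in kirkpatrick_levels.items():
    print(f"Level {level} ({name}): e.g. {example_measure}")
```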

  25. Creating questions • Ultimately, all evaluation activity needs to be grounded in a question (or set of questions) What proportion of clinic participants successfully file? 25

  26. Today’s exercise: Part 1 Introduction • Problem + goal • Theory & logic model • Evaluation question 26

  27. The Evaluation Toolbox: overview of mixed methods 27

  28. Introduction to mixed methods • Actively choose your method(s) • Consider using multiple methods • Tailor the methods to the question (especially whether you are interested in causation) 28

  29. Quantitative methods • For things that you can count or measure • Use to either explore or to confirm theories • Examples for legal aid/pro bono: • What proportion of clients in our pro bono program achieve a successful outcome? • Did our new retention strategy result in increased retention of pro bono volunteers? 29
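
As an illustration of the first example question, here is a minimal sketch that computes the proportion of clients with a successful outcome from a handful of hypothetical case records; the field names and values are made up.

```python
# Hypothetical case records exported from a case management system;
# "client_id" and "outcome" are illustrative field names, not from the slides.
cases = [
    {"client_id": 1, "outcome": "successful"},
    {"client_id": 2, "outcome": "unsuccessful"},
    {"client_id": 3, "outcome": "successful"},
    {"client_id": 4, "outcome": "successful"},
]

successes = sum(1 for case in cases if case["outcome"] == "successful")
proportion = successes / len(cases)
print(f"{successes} of {len(cases)} clients succeeded ({proportion:.0%})")
```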

  30. Selecting a measure for your question • Two broad types of quantitative measure that may be relevant to your evaluation question • Categorical variables (e.g. win/lose): counts/totals and proportions • Numerical variables (e.g. # cases handled): descriptive statistics (mean, median etc) 30
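
A short sketch of both measure types, using pandas on hypothetical program data; the column names and values are illustrative, not from the slides.

```python
import pandas as pd

# Hypothetical program data; column names are illustrative.
df = pd.DataFrame({
    "case_result": ["win", "lose", "win", "win", "lose"],  # categorical
    "cases_handled": [3, 1, 5, 2, 4],                       # numerical
})

# Categorical variable: counts/totals and proportions.
print(df["case_result"].value_counts())
print(df["case_result"].value_counts(normalize=True))

# Numerical variable: descriptive statistics (mean, median, etc.).
print(df["cases_handled"].describe())
print("median:", df["cases_handled"].median())
```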

  31. Finding a data source • Internal administrative data - case management system and data tracking tools • External administrative data - matching to records held by courts, firms etc • Survey data - data provided by clients/pro bonos • Think about: • Appropriateness • Completeness (often a limitation of surveys) • Accuracy 31
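
Completeness is flagged above as a common limitation of survey data. The sketch below shows one simple way to check it, the share of missing responses per question, on a small made-up survey extract; the column names are illustrative.

```python
import pandas as pd

# Hypothetical survey export; respondents and questions are made up.
survey = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4],
    "satisfaction": [5, None, 4, None],
    "would_recommend": ["yes", "yes", None, "no"],
})

# Completeness check: fraction of missing responses per question, a quick
# signal of whether the survey data can support your evaluation question.
missing_share = survey.isna().mean()
print(missing_share)
```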

  32. Observational frameworks • Caseload with post-intervention outcomes known • Caseload with post-intervention outcomes unknown 32

  33. Limitation of observational frameworks • Caseload: post-intervention outcomes known • Caseload (- help): counterfactual unknown 33

  34. Causal frameworks • Caseload (- help): outcomes without intervention • Caseload: outcomes with intervention 34

  35. Pragmatic approaches • Causal framework is gold standard but experiments often require professional assistance • Observational data is still valuable: • Understand what is happening • Develop new questions and theories • You can also try to approximate causal methods by using comparisons: • Comparing groups • Change over time 35
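
One way to approximate those comparisons in practice, sketched on made-up data (an observational comparison, not a true experiment): success rates by group and by year; the column names are illustrative.

```python
import pandas as pd

# Hypothetical case-level data; "group" and "filed_successfully" are
# illustrative column names, and this is not a randomized comparison.
df = pd.DataFrame({
    "group": ["pro bono", "pro bono", "pro bono", "no help", "no help", "no help"],
    "year": [2017, 2018, 2018, 2017, 2018, 2018],
    "filed_successfully": [1, 1, 0, 0, 1, 0],
})

# Comparing groups: success rate with vs. without pro bono help.
print(df.groupby("group")["filed_successfully"].mean())

# Change over time: success rate by year.
print(df.groupby("year")["filed_successfully"].mean())
```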

  36. Example of observational study • Jessica Steinberg (2011), In Pursuit of Justice • Question: How effective are limited scope services in the housing context? • Methods: Comparison of case outcomes between three groups: tenants with no representation, tenants with limited scope representation, and tenants with full scope representation. 36

  37. Learning from In Pursuit of Justice • Outcome measures by representation level: • Possession: unrepresented 14%, limited scope 18%, full scope 55% • Mean days to move: unrepresented 47, limited scope 54, full scope 97 • Insight: in this jurisdiction, a lawyer may be needed at each step • Return to theory: what have we learned about the problem we are trying to solve? • Return to logic model: how might we adapt our program in light of new information? 37
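
For reference, here is the slide's comparison rebuilt as a small pandas table; in your own evaluation the rows would come from aggregating case records by representation level. The figures are the ones reported on the slide.

```python
import pandas as pd

# The slide's reported comparison, rebuilt as a small table; in practice these
# rows would be computed by aggregating your own case records by group.
results = pd.DataFrame(
    {
        "possession_rate": [0.14, 0.18, 0.55],
        "mean_days_to_move": [47, 54, 97],
    },
    index=["unrepresented", "limited scope", "full scope"],
)
print(results)
```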

  38. Demo of data analysis 38 https://docs.google.com/spreadsheets/d/1RUobI78KlCXhq5_qCaV3KgRcydv6zUvXYgJrR_52Aj0/edit?usp=sharing
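
If you want to follow along with the demo outside Google Sheets, a minimal sketch is to export the linked sheet as a CSV and load it locally; the file name below is a placeholder for whatever you save the export as.

```python
import pandas as pd

# Export the linked Google Sheet as a CSV, then load and summarize it locally.
# "demo_data.csv" is a placeholder file name, not part of the original demo.
df = pd.read_csv("demo_data.csv")

print(df.head())      # inspect the first rows
print(df.describe())  # descriptive statistics for numeric columns
```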

  39. Tech tips: new skills, new tools 39

  40. Quick brainstorm What is one way that you could use a quantitative method to evaluate your pro bono program? 40

  41. Qualitative methods • To understand the nuance - expectations, experiences, reasons, perceptions • Good for why and how questions • Inductive: generating ideas and theories • Examples for legal aid/pro bono: • Individual interviews with clients to explore how comfortable they felt working with their pro bono attorney • Focus groups with pro bono attorneys to explore experiences of new mentorship model 41

  42. Qualitative methods detail Formats • Semi-structured interview • Topic guides: sequence of prompts/questions • Individual interviews: individualized • Focus groups: collective • Take notes or use a recorder Samples: • Principle of saturation • Professional rule of thumb = 25 participants • But think about subgroups 42
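
The principle of saturation is ultimately a judgment call, but a rough, illustrative tally can help: count how many new theme codes each additional interview introduces. The theme labels below are made up.

```python
# A rough, illustrative saturation check (not a formal test): how many new
# theme codes does each additional interview add? Theme labels are made up.
interviews = [
    {"trust", "cost"},
    {"cost", "time"},
    {"trust", "stigma"},
    {"trust", "cost"},  # no new themes: a sign you may be approaching saturation
]

seen = set()
for i, themes in enumerate(interviews, start=1):
    new = themes - seen
    seen |= themes
    print(f"Interview {i}: {len(new)} new theme(s): {sorted(new)}")
```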

  43. Qualitative methods: example • Sara Sternberg Greene (2016), Race, Class, and Access to Civil Justice • What explains inaction in response to legal problems? Why does this differ by race? • 97 interviews with public housing residents in Cambridge, MA • Interviews explored a range of related topics that might influence decision-making on using legal aid (e.g. past experience with the justice system and other social institutions) 43

  44. Key findings • Negative experiences/perceptions spill over from the criminal justice system • View that treatment in the justice system depends on money • Prior experience w/ public institutions - “ashamed, inadequate, degraded, and confused” • Desire for self-sufficiency • Racial differences in levels of trust and in level of comfort seeking help from formal systems 44

  45. Quotation from study “More money, more justice. I mean it. More money, more justice. It is true. The more money you have for an attorney, whether you are a big case or not, the more justice. If you have more money, they have more time to do the paperwork, investigate, that kind of thing. Oh I can get an attorney, let me tell you. No problem at all. But it won't be one of the good ones.” 45

  46. Listening exercise https://www.legalaidsmc.org/videos/ 46

  47. Quick brainstorm What is one way that you could use a qualitative method to evaluate your pro bono program? 47

  48. Observational methods • Specifically motivated observation • Often used in design research (recall Hagan) • Best for understanding (inter)actions - often phenomena without a formal record • Consider the impact of the observer • Examples for legal aid/pro bono: • Observing client flow at workshop • Observing client interviews • Observing court operations & hearings 48
