Preventing a Technology Headache

Preventing a Technology Headache - PowerPoint PPT Presentation



  1. Preventing a Technology Headache Presented by Trina Willard, Founder and Principal of The Knowledge Advisory Group

  2. Preventing a Technology Headache Welcome to the ClientTrack webinar: We will begin shortly.  There will be a Q&A following today’s presentation.  This webinar is being recorded and will be available on our website.  Thank you for joining us. #evaluationdata

  3. Today’s Agenda  Welcome and introductions  Why organizations need evaluation data  Case studies  Creating a framework for identifying appropriate outcomes  Demo  Questions/Answers  Contact Information #evaluationdata

  4. Preventing a Technology Headache Trina Willard, Founder and Principal, Knowledge Advisory Group; Sam Taylor, Director, Product Management, ClientTrack #evaluationdata

  5. Overview  Why organizations need evaluation data  The problem with evaluation tech efforts  Creating more successful tech solutions #evaluationdata

  6. Why Organizations Need Evaluation Data #evaluationdata

  7. What is Evaluation?  Program evaluation is carefully collecting and analyzing information about a program or some aspect of a program in order to make necessary decisions  The type of evaluation you undertake to improve your program depends on what you want to learn  Continual improvement is an unending journey #evaluationdata

  8. Types of Evaluation  Process • Descriptions of what you do, how much you produce, and how you do it  Outcomes • Changes that occur within the client, member, or system as a result of your activities #evaluationdata

  9. Consider These Influences  External pressures • Increased public scrutiny • Funder accountability • Economic decline  Internal motivations • Continuous improvement • Greater mission focus #evaluationdata

  10. Recognize the Advantages of Data  Data can provide powerful evidence  Data can be objective and reduce bias  Balanced information provides the opportunity for independent interpretation #evaluationdata

  11. Examples of Evaluation-Relevant Systems  Case management or tracking  Outcomes reporting  Organizational dashboards  Data/survey collection interfaces  Web-based information platforms

  12. The Problem with Evaluation Tech Efforts #evaluationdata

  13. Have you ever seen this happen?  IT provider & customer (nonprofit/government agency) make agreements and implement new system  Customer uses new system  Customer experiences difficulties  IT provider/system perceived as failure #evaluationdata

  14. Case Study 1  Description: Small parenting education nonprofit  Existing conditions: Managing data in Excel; limited staff capacity; limited training in data; cash-strapped  Data needs: Increasing demand from funders to demonstrate outcomes; wanted a case management and outcomes database  Solution: Found $$ for desired database and began implementation #evaluationdata

  15. Case Study 1 (continued)  What happened? • Leadership changed two weeks before system implementation • Board relatively disengaged • No buy-in from new leadership • Nonprofit terminates contract • Software is described as a “poor fit” with an “overambitious sales pitch” #evaluationdata

  16. Case Study 2  Description: Small start-up nonprofit coalition with a systems-change focus  Existing conditions: Very limited staff capacity; has a grant award to implement software  Data needs: Interested in creating a joint database with service providers in the area  Solution: Area university comes to the rescue as pro-bono data manager! Software is purchased and implementation begins #evaluationdata

  17. Case Study 2 (continued)  What happened? • No formal agreement with the university • Lack of engagement by key service partner • Collaborative data collection strategy remained undeveloped • Nonprofit terminates contract and blames the software system • Nonprofit is stuck with the bill • Foundation that funded the software no longer funds this nonprofit or efforts involving this software vendor #evaluationdata

  18. Creating More Successful Tech Solutions #evaluationdata

  19. Step 1: People  Find a champion • Usually someone in leadership • Integrate communication about the effort into operations  Select someone who can implement and propel the effort forward • Choose a person who has a compatible mindset • Choose a clear communicator • Provide professional development if needed • Give authority that allows dedicated time  Ensure ongoing capacity #evaluationdata

  20. Step 2: Process  What outcomes will you measure?  How will you define each outcome?  What tools will you use to do so?  Who is responsible for data collection?  What are the timelines for doing so?  What questions do you want to be able to answer at the end of the year? #evaluationdata
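As a purely illustrative aside (not from the webinar), the six questions on this slide map naturally onto a simple written measurement plan. The Python sketch below shows one hypothetical entry; every outcome, tool, owner, and timeline named here is an invented placeholder:

    # A minimal, hypothetical measurement plan: one entry per outcome,
    # answering the questions from Step 2.
    measurement_plan = [
        {
            "outcome": "Increased parenting knowledge",              # what will you measure?
            "definition": "Score gain on a pre/post knowledge quiz", # how is it defined?
            "tool": "10-item quiz at intake and exit",               # what tool will you use?
            "owner": "Case manager",                                 # who collects the data?
            "timeline": "At enrollment and within 2 weeks of exit",  # what are the timelines?
            "year_end_question": "Did participants' knowledge improve this year?",
        },
    ]

    # A quick review of who owes what, and when.
    for item in measurement_plan:
        print(f"{item['outcome']}: {item['owner']}, {item['timeline']}")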

  21. Know Your Audience  Who are the possible audiences for your data?  What do they already know?  What do they want to know?  How will they use the data? #evaluationdata

  22. Creating a Framework for Identifying Appropriate Outcomes What is a logic model? A logic model is a simple description of how your program, service, or initiative works that shows the linkages between:  Problem you are attempting to address  Program components  Program activities  Outcomes (both short- and long-term) #evaluationdata
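To make those linkages concrete, here is a minimal sketch, assuming nothing beyond the four elements listed on the slide, of a logic model represented as a small Python data structure; the example program, activities, and outcomes are hypothetical placeholders, not content from the presentation:

    from dataclasses import dataclass, field

    @dataclass
    class LogicModel:
        # The linked elements a logic model describes.
        problem: str
        components: list[str] = field(default_factory=list)
        activities: list[str] = field(default_factory=list)
        short_term_outcomes: list[str] = field(default_factory=list)
        long_term_outcomes: list[str] = field(default_factory=list)

    # Hypothetical example, loosely echoing the parenting-education
    # nonprofit from Case Study 1.
    model = LogicModel(
        problem="Parents lack access to evidence-based parenting skills training",
        components=["Weekly parenting classes", "Home visits"],
        activities=["Deliver a 10-week curriculum", "Conduct monthly follow-up visits"],
        short_term_outcomes=["Increased knowledge of child development"],
        long_term_outcomes=["Sustained positive parenting behaviors"],
    )

Reading the structure top to bottom is the point: each outcome should trace back through activities and components to the problem the program addresses.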

  23. What You Should Measure Designing a good tool requires more time and attention than you may think. Questions can tap into:  Awareness  Knowledge  Attitudes  Skills  Behaviors #evaluationdata
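As one hedged illustration (the wording of every item below is invented, not drawn from the deck), a single survey can include questions that tap each of these constructs:

    # Hypothetical survey items, one per construct from the slide.
    survey_items = {
        "Awareness": "Before this program, had you heard of these services?",
        "Knowledge": "Name two signs of healthy child development.",
        "Attitudes": "How important is reading with your child every day?",
        "Skills":    "Which calming technique would you use in this scenario?",
        "Behaviors": "In the past week, on how many days did you read with your child?",
    }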

  24. Types of Simple and Meaningful Tools  Surveys  Personal interviews  Focus groups  Other data collection tools  Existing databases #evaluationdata

  25. Step 3: System  Document processes first  Choose the simplest solution that will get the job done  Include leadership, data entry personnel, and data analysts in these conversations  Focus on how data relates to reporting  Clarify/understand the organization’s responsibilities for making the effort successful  Software is not the silver bullet #evaluationdata

  26. Wrap Up! #evaluationdata

  27. #evaluationdata

  28. ClientTrack Demonstration #evaluationdata

  29. #evaluationdata

  30. Contact Information Questions? Trina Willard, Founder and Principal, Knowledge Advisory Group trina@KnowledgeAdvisoryGroup.com 804-564-6969 www.knowledgeadvisorygroup.com #evaluationdata
