

  1. Performance Management in VR: What is it and Why do it? Northern Arizona University (NAU)-Institute for Human Development (IHD) Evidence for Success Combined Disability Conference Scottsdale/Fountain Hills, Arizona July 10, 2018 Cayte Anderson, Ph.D., CRC

  2. Session Overview
     • Evaluation principles
     • Why should we integrate evaluation into our programs?
     • Demystifying the process
     • Culturally responsive approaches to evaluation
     • Program evaluation and quality assurance within VR and AIVRS programs
     • Resources (AIVRTTAC, PEQA, AEA, conferences)

  3. What happens when you hear the words “data” or “evaluation”?
     a) Do your eyes glaze over?
     b) Do you picture a room full of serious people, hunched over their keyboards and calculators, crunching numbers?
     c) Do you embrace your inner data nerd?

  4. What if using data is really more like this?

  5. What is Program Evaluation? “The systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future program development.” [1]
     Program Evaluation or informal assessment? The difference is that program evaluation is conducted according to guidelines and grounded in principles.

  6. Program Evaluation Guiding Principles
     • Systematic Inquiry
     • Competence
     • Integrity
     • Respect for People
     • Common Good and Equity

  7. Principle A: Systematic Inquiry [2]
     Evaluators conduct systematic, data-based inquiries:
     • Conduct inquiries that are thorough, methodical, and contextually relevant
     • Explore the strengths and shortcomings of evaluation questions and approaches
     • Communicate approaches, methods, and limitations accurately

  8. Principle B: Competence
     Evaluators provide competent performance to stakeholders:
     • Possess appropriate skills and experience
     • Demonstrate cultural competence
     • Practice within the limits of their competence
     • Continually improve their competencies

  9. Principle C: Integrity/Honesty
     Evaluators display honesty and integrity and attempt to ensure them throughout the entire evaluation process:
     • Practice honesty with clients and stakeholders
     • Disclose values, interests, and conflicts of interest
     • Accurately represent methods, data, and findings
     • Disclose the source of the request and the financial support for the evaluation

  10. Principle D: Respect for People
     Evaluators respect the security, dignity, and self-worth of all stakeholders:
     • Honor the dignity, well-being, and self-worth of individuals
     • Acknowledge the influence of culture within and across groups

  11. Principle E: Common Good and Equity
     Evaluators take into account the general and public interest:
     • Include relevant stakeholders
     • Balance client and stakeholder needs
     • Examine assumptions and potential side effects
     • Present results in understandable forms

  12. Debunking Evaluation Myths

     Evaluation is thought to be...   Evaluation can be...
     Expensive                        Cost-effective
     Time-consuming                   Strategically timed
     Tangential                       Integrated
     Technical                        Accurate
     Not inclusive                    Engaging
     Academic                         Practical
     Punitive                         Helpful
     Political                        Participatory
     Useless                          Useful

  13. Are You Being Pushed or Pulled Into Evaluation?
     Pushed: external mandates from funders, authorizers, or others. Performance measures are really just another way of asking, “How are we doing?”, leading to a deeper exploration of “Why?”
     Pulled: an internal need to determine how the program is performing and what can be improved.
     It’s usually a combination of both!

  14. A Framework for Program Evaluation
     Standards: Utility, Feasibility, Propriety, Accuracy
     https://www.cdc.gov/mmwr/PDF/rr/rr4811.pdf

  15. Research or Evaluation?

  16. Research vs. Evaluation

     Research                            Evaluation
     Seeks to generate new knowledge     Provides information for decision making
     Researcher-focused                  Stakeholder-focused
     Hypotheses                          Key questions
     Makes research recommendations      Makes recommendations based on key questions
     Publishes results                   Reports to stakeholders

  17. Distinguishing Principles of Research and Evaluation (CDC, Introduction to Program Evaluation for Public Health Programs, 2012)
     • Planning. Research: the scientific method (state hypothesis, collect data, analyze data, draw conclusions). Evaluation: the framework for program evaluation (engage stakeholders, describe the program, focus the evaluation design, gather information, justify conclusions, ensure use and share lessons learned).
     • Decision making. Research: investigator-controlled. Evaluation: stakeholder-controlled.
     • Standards. Research: validity, internal (accuracy, precision) and external (generalizability). Evaluation: the program evaluation standards (utility, feasibility, propriety, accuracy).
     • Questions. Research: facts (descriptions, associations, effects). Evaluation: values (merit, i.e., quality; worth, i.e., value; significance, i.e., importance).

  18. Distinguishing Principles of Research and Evaluation (cont’d)
     • Design. Research: isolate changes and control circumstances (minimize context). Evaluation: incorporate changes and account for circumstances (maximize context; encourage flexibility).
     • Data collection. Research: limited number of sources; sampling strategy critical; concern for protecting human subjects. Evaluation: multiple sources (triangulation preferred); sampling strategy critical; concern for protecting human subjects.
     • Analysis & synthesis. Research: at the end; scope focused on specific variables. Evaluation: ongoing; scope integrates all data.
     • Uses. Research: disseminate to interested audiences in various formats. Evaluation: feedback to stakeholders; build capacity; disseminate to interested audiences.

  19. Why Evaluate Our Programs?
     • Monitor progress toward program goals
     • Determine whether program components are producing the desired progress on outcomes
     • Permit comparisons among groups, particularly among populations with disproportionately high risk factors and adverse health outcomes
     • Justify the need for further funding and support
     • Find opportunities for continuous quality improvement
     • Ensure that effective programs are maintained and resources are prioritized

  20. The Top Six Reasons to Love Your Data
     Data...
     1. Helps us make better decisions
     2. Tells us more about our “customers”
     3. Helps us improve services
     4. Allows us to do really cool things
     5. Helps us analyze processes
     6. Helps us learn what’s working and what’s not

  21. “Vision without measurement is just dreaming. Measurement without vision is just wasted effort.” (IRI, 2011)

  22. Understanding & Using the Data: DATA → KNOWLEDGE → ACTION

  23. [Chart: “no data,” 22%]

  24. Performance Management
     Evaluation:
     • Design program
     • Data collection
     • Data analysis
     • Report results
     • Quality of the data
     Quality Assurance:
     • Ensure standards of quality are met
     • Corrective actions
     • Evaluation of actions taken
     Goal: continuous improvement through systematic and constructive action

  25. Performance and WIOA
     • As of July 1, 2017: 392 data elements
     • 150 new elements: WIOA, barriers to employment, education, credentials, customized training, measurable skills gains, employment, post-exit
     • From annual to quarterly reporting, and from the federal fiscal year (FFY) to the program year (PY, July 1 to June 30)
     • Reports due 45 days after the end of each quarter, a firm deadline: Nov. 15th, Feb. 15th, May 15th, Aug. 15th (a sketch of this reporting calendar follows below)
     • RSA Portal and edit checks
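A minimal sketch of that reporting calendar, assuming the PY quarters end Sep. 30, Dec. 31, Mar. 31, and Jun. 30 and using the fixed deadline dates listed on the slide. The `report_deadline` helper is hypothetical, not an RSA tool:

```python
# Hypothetical helper: map a date to the report deadline for its WIOA
# program-year (PY) quarter. Deadlines are the slide's fixed dates
# (roughly 45 days after each quarter ends); quarter ends are assumed.
from datetime import date

# (quarter-end month, day) -> (deadline month, day)
QUARTER_DEADLINES = {
    (9, 30):  (11, 15),  # PY Q1 (Jul 1 - Sep 30) -> Nov. 15
    (12, 31): (2, 15),   # PY Q2 (Oct 1 - Dec 31) -> Feb. 15
    (3, 31):  (5, 15),   # PY Q3 (Jan 1 - Mar 31) -> May 15
    (6, 30):  (8, 15),   # PY Q4 (Apr 1 - Jun 30) -> Aug. 15
}

def report_deadline(d: date) -> date:
    """Return the report due date for the PY quarter containing d."""
    quarter_ends = [date(d.year - 1, 12, 31), date(d.year, 3, 31),
                    date(d.year, 6, 30), date(d.year, 9, 30),
                    date(d.year, 12, 31)]
    q_end = min(q for q in quarter_ends if q >= d)  # end of current quarter
    month, day = QUARTER_DEADLINES[(q_end.month, q_end.day)]
    year = q_end.year + 1 if month < q_end.month else q_end.year
    return date(year, month, day)

print(report_deadline(date(2018, 7, 10)))  # 2018-11-15 (PY Q1 report)
```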

  26. Performance and WIOA
     1. % of participants who are in competitive integrated employment during the 2nd quarter after exit
     2. % of participants who are in competitive integrated employment during the quarter 12 months after exit
     3. Of those in competitive integrated employment during the 2nd quarter after exit, the median earnings (the middle value of the ordered earnings distribution: half of participants earn more, half earn less)
     4. Credential Rate
     5. Measurable Skills Gains
     6. Effectiveness in Serving Employers, determined jointly with core partners

  27. WIOA Common Performance Measures
     • % of participants who are in competitive integrated employment during the 2nd quarter after exit
     • % of participants who are in competitive integrated employment during the quarter 12 months after exit
     • Of those in competitive integrated employment during the 2nd quarter after exit, the median earnings (the middle value of the ordered earnings distribution)
     • Credential Rate
     • Measurable Skills Gains
     • Effectiveness in Serving Employers, determined jointly with core partners
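To make the first three measures concrete, here is a minimal sketch assuming hypothetical exit records. The field names (`employed_q2_after_exit`, `employed_q4_after_exit`, `q2_earnings`) are illustrative only and do not correspond to actual RSA-911 data elements:

```python
# Sketch of the first three WIOA common performance measures,
# computed over hypothetical participant exit records.
from statistics import median

records = [
    {"employed_q2_after_exit": True,  "employed_q4_after_exit": True,  "q2_earnings": 5200},
    {"employed_q2_after_exit": True,  "employed_q4_after_exit": False, "q2_earnings": 3900},
    {"employed_q2_after_exit": False, "employed_q4_after_exit": False, "q2_earnings": 0},
    {"employed_q2_after_exit": True,  "employed_q4_after_exit": True,  "q2_earnings": 6100},
]

# Measure 1: % in competitive integrated employment in Q2 after exit.
q2_rate = sum(r["employed_q2_after_exit"] for r in records) / len(records)

# Measure 2: % employed in the quarter 12 months (Q4) after exit.
q4_rate = sum(r["employed_q4_after_exit"] for r in records) / len(records)

# Measure 3: median Q2 earnings of those employed in Q2 after exit
# (the middle value of the sorted earnings, not the midrange).
q2_earnings = [r["q2_earnings"] for r in records if r["employed_q2_after_exit"]]
median_earnings = median(q2_earnings)

print(f"Q2 employment rate: {q2_rate:.0%}")       # 75%
print(f"Q4 employment rate: {q4_rate:.0%}")       # 50%
print(f"Median Q2 earnings: ${median_earnings}")  # $5200
```

Note that the median is used rather than the mean because it is the middle value of the sorted earnings, so a few unusually high or low earners cannot distort the reported figure.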
