
Monash Library Performance Metrics Project, Megan Lee, October 2015 (PowerPoint presentation)



  1. Monash Library Performance Metrics Project Megan Lee, October 2015 monash.edu/library monash.edu

  2. “Which metrics have value?”

  3. “What is the Library trying to achieve?”

  4. Use metrics to measure how well we’re achieving our goals

  5. Use metrics to measure how well we’re achieving our goals

  6. We want to measure the OUTCOMES, VALUE or IMPACT of the work

  7. The value of an academic library is complex, because the total value is composed of many separate values for each type of collection or service and because the value differs for different constituents and over time. (Tenopir, 2013)

  8. Libraries need to be [equally] deliberate and systematic in communicating the value proposition. …they need to be bold – sometimes even audacious – to convey an authentic belief that the information is important and that others want and need to know it. (Lewis, 2015)

  9. 4:19

  10. Project plan: Milestone/Deliverable (Date / to be updated)
      - Consider environment and identify all strategic documents as a basis for developing the Library performance measurement matrix. (April 2015)
      - Establish Library performance metric data collection principles. (April)
      - Finalise Library performance metric project plan. (April)
      - Review Library strategic documents as a basis for developing performance measurement models. (April)
      - Evaluate and accept appropriate performance measurement model. (April)
      - Develop template of Library performance metrics for strategic goals and add draft metrics. (April)
      - Solicit feedback on Library performance metrics template and draft performance metrics from Library Strategy Group input. (April)
      - Identify capacity of Library systems to automate the generation of Library performance metrics. (May / June)
      - Identify goals from section 4, The Library’s role, in the Library Annual Plan that are not included in the 2014 strategic initiatives; write Actions / Hypothesis / Target audience and Performance Metrics for each, to be reported against in the Library annual report.
      - Review capacity of University applications to provide data for comparison against Library performance metrics. (Deferred)
      - Consider a cost/benefit analysis comparing University system metrics against Library metrics, eg map Aspire reading list unit penetration against the Moodle unit breakdown to demonstrate Library impact on student learning (see the sketch after this list). (Deferred)
      - Review all proposed metrics and minimise manual data collection. (June / July)
      - Develop and document the process for extracting all required system reports. (July / August)
      - Develop recommendations for Library metrics collection and assign ownership of roles and responsibilities at the individual level. (July / August)
      - Develop a secure data location for manual and system-generated metrics data. (July / August)
      - Set up processes to map collected Library performance metrics against Library strategies and objectives, for inclusion in the Library annual report. (July / August)
      - Develop a calendar for release of mini reports that map collected Library performance metrics against Library strategies and objectives, based on annual University events, eg a report in July on the number of exam prep sessions attended in May / June, which Research & Learning staff can use in discussion with academics prior to exams in 2nd semester. (July / August)
      - Develop a process to create presentation tools / Library metrics packages (eg infographics, videos, case studies, stories) for staff to use in communicating the Library’s contribution to the University. (July / August)
      - Develop an eLearning (Captivate) walk-through to help stakeholders understand why, what, how, who and when Library performance metrics are collected and used. (August)
      - Build staff engagement in the collection of performance metrics, as staff experience the value of access to packaged data that demonstrate the impact of the Library’s contributions to the University. (August)
      - Develop report of project outcomes and recommendations for submission to LMC, IRSC & ILFC. (August)
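The deferred cost/benefit milestone above hinges on joining two system exports (Aspire reading lists and Moodle units). The sketch below is a minimal, hypothetical illustration of how that comparison could be computed; the file names, column names and faculty-prefix convention are assumptions, not details from the project.

```python
# Minimal sketch (assumptions): map Aspire reading-list penetration against the Moodle
# unit breakdown. "moodle_units.csv", "aspire_reading_lists.csv", the "unit_code" column
# and the 3-letter faculty prefix are hypothetical placeholders, not real Monash exports.
import csv
from collections import defaultdict

def load_unit_codes(path, unit_column):
    """Read a CSV export and return the set of unit codes it contains."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[unit_column].strip().upper()
                for row in csv.DictReader(f) if row[unit_column]}

def penetration_by_faculty(moodle_path, aspire_path):
    """Share of Moodle units that have an Aspire reading list, grouped by faculty prefix."""
    moodle_units = load_unit_codes(moodle_path, "unit_code")   # all units taught in Moodle
    aspire_units = load_unit_codes(aspire_path, "unit_code")   # units with a published reading list
    totals, with_list = defaultdict(int), defaultdict(int)
    for unit in moodle_units:
        faculty = unit[:3]                                     # assumed prefix convention, eg "FIT"
        totals[faculty] += 1
        if unit in aspire_units:
            with_list[faculty] += 1
    return {fac: with_list[fac] / totals[fac] for fac in totals}

if __name__ == "__main__":
    rates = penetration_by_faculty("moodle_units.csv", "aspire_reading_lists.csv")
    for faculty, rate in sorted(rates.items()):
        print(f"{faculty}: {rate:.0%} of Moodle units have a reading list")
```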

  11. Recognise achievements & Prioritise remaining work

  12. Recognise achievements to date & Prioritise remaining work (Hampton, 2010)

  13. Reflective report development tool (diagram): Initial understanding of the LPM project; Current understanding of the LPM project; Significant project actions to date; What’s been done? What’s left to be done? Critical evaluation.

  14. Reflective report development tool

  15. Reflective report development tool (diagram): Initial understanding of the LPM project; Current understanding of the LPM project; Significant project actions to date; What’s been done? What’s left to be done? Critical evaluation.

  16. Developing Performance Metrics
      - Strategic Initiative: Partnering with faculty to ensure explicit development of information research and learning skills in the curriculum and to develop in- and extra-curricular resources and programs.
      - Action: Design & implement Research & Learning Skills sessions, based on the Research Skills Development framework, that broaden student independent research and reporting capacities.
      - Hypothesis of success (“Why do we decide we want to work on a project?”): By explicitly articulating a development path for achieving the skills and capacities of a mature researcher, within the context of the student’s units of study, the Library will escalate student research skills development.
      - Target Audience: Looking Out: Students, Academic Staff, Professional Staff; Looking In: Library Staff, Research & Learning Staff.
      - Performance Metrics: Improved student performance in online learning assessment modules (Captivate, Moodle analytics); analysis of post-session student feedback, solicited via Survey Monkey (Survey Monkey analytics); monitor grade average of students who use the Library, via student self-reported grade average, in the Library user survey (see the sketch below).
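The last metric on slide 16 (grade average of students who use the Library, taken from self-reported survey responses) is the most concrete to compute, so a small worked illustration may help. The sketch below is hypothetical: the survey export name, the column names and the Yes/No coding of library use are assumptions, not part of the project.

```python
# Minimal sketch (assumptions): compare self-reported grade averages of Library users and
# non-users from a Library user survey export. "library_user_survey.csv",
# "self_reported_grade_average" and "uses_library" are hypothetical placeholders.
import csv
from statistics import mean

def grade_average_by_library_use(survey_csv):
    """Return the mean self-reported grade average for Library users vs non-users."""
    groups = {"users": [], "non_users": []}
    with open(survey_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            try:
                grade = float(row["self_reported_grade_average"])
            except (KeyError, ValueError):
                continue  # skip blank or malformed responses
            key = "users" if row.get("uses_library", "").strip().lower() == "yes" else "non_users"
            groups[key].append(grade)
    return {key: mean(values) if values else None for key, values in groups.items()}

if __name__ == "__main__":
    print(grade_average_by_library_use("library_user_survey.csv"))
```

Because the grades are self-reported and the comparison is observational, a gap between the two group means indicates association with Library use, not a causal impact on student results.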
