Personal Analytics: Getting off the deficit path
Professor Gregor Kennedy, The University of Melbourne
What are Personal Analytics? http://birdsontheblog.co.uk/getting-fitter-and-healthier-with-the-fitbit/
What are Personal Analytics? https://gigaom.com/2011/11/07/is-klout-crossing-the-line-when-it-comes-to-privacy/
What are Personal Analytics?
Big Data = Analytics https://www.linkedin.com/today/post/article/20140312180810-246665791-the-future-of-big-data-and-analytics
Learning Analytics is all the Rage
With Great Promise
• Detect potential “at risk” students
• Formative and summative feedback to students on their learning processes and outcomes
• Assist with evidence-based resource allocation
• Improve institutional decision-making and responsiveness to known challenges
• Promote a shared understanding of institutional successes and challenges
• Academic research and development
(Long & Siemens, 2011)
Defining Analytics
Learning Analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for the purposes of understanding and optimizing learning and the environments in which it occurs. [Learners, Educators, Teachers]
Academic Analytics is the improvement of organizational processes, workflows, resource allocation, and institutional measurement through the use of learner, academic, and institutional data. [Managers, Administrators, Funders] (slide annotation: THIS BIT IS NOT NEW)
Society for Learning Analytics Research (2011)
Two Traditions
• Interactivity Research
• Intelligent Tutoring Systems
Two Traditions Interactivity Research
Taxonomies of Interaction (Interactivity Research)
Taxonomies and classifications, e.g. Thompson & Jorgenson (1989): Reactive, Interactive, Proactive
Taxonomies of Interaction (Interactivity Research)
Taxonomies and classifications, e.g. Schwier & Misanchuk (1993): Reactive, Proactive, Mutual
Concerns about the past (Interactivity Research)
• Often use fairly raw metrics, simple student measures and inputs (e.g. MCQs, simple access counts).
• Largely descriptive (useful), but often fails to complete the feedback loop to students and/or teachers.
Two Traditions
• Interactivity Research
• Intelligent Tutoring Systems
Two Traditions: Intelligent Tutoring Systems
[Diagram: ITS architecture - a Student Model, a Pedagogical Model and Domain Knowledge combine to generate Feedback]
Two Traditions: Intelligent Tutoring Systems
Feedback strategies: Give a hint → Flag the error → Explain the error → Show a worked example
(Mike Timms)
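The escalating feedback sequence above (hint → flag → explain → worked example) can be sketched as a simple ladder. This is an illustrative sketch only; the function name and the "escalate on each repeated error" policy are assumptions, not Timms's implementation.

```python
# Hypothetical sketch: each repeated error on the same step moves the
# tutoring system one rung up the feedback ladder.
FEEDBACK_LADDER = [
    "give a hint",
    "flag the error",
    "explain the error",
    "show a worked example",
]

def next_feedback(error_count):
    """Feedback level for the nth consecutive error on a step (1-based)."""
    level = min(error_count - 1, len(FEEDBACK_LADDER) - 1)
    return FEEDBACK_LADDER[level]
```

After the fourth error the system stays at the most explicit level, the worked example.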
Concerns about the past (Intelligent Tutoring Systems)
• “ITS were recognised as narrow and brittle” (Cumming & McDougall, 2000)
• … heavily reliant on educational programs and applications that had defined or discrete stages and steps.
• They were often tied to a program and were not generalisable.
Two Traditions Combined
Interactivity Research + Intelligent Tutoring Systems: assess, diagnose and recognise → a smart, personal, adaptive student system
Two Traditions (Interactivity Research | Intelligent Tutoring Systems)
• Drill and Practice
• Procedural Simulation
• Conceptual Simulation
Drill and Practice
[Diagram: student path through a sequence of content items (A, B, C …); an error (X) is detected and feedback is delivered before the path continues]
Procedural Simulation
[Diagram: student path through the simulation; a deviation (X) triggers implicit and explicit feedback]
Conceptual Simulation
[Diagram: student path through the simulation; a deviation (X) triggers implicit and explicit feedback]
Back to (Today’s) Analytics
How Today’s Analytics are Used
• Detect “At Risk” Students for Retention ✓
• Teaching & Learning Research, Evaluation & QA ✓
• Personalised or Adaptive Feedback for Learning ✓
#1 … “At Risk” Analytics
• Purdue University’s “Signals”
• Used to predict students who are “at risk”
• Individual student risk is predicted using an algorithm based on data from four sources:
  - Performance … “points earned in a course to date”
  - Effort … interaction with the learning management system as compared to peers
  - Academic history … e.g. GPA, prior academic history
  - Student characteristics … e.g. residency, age
(Arnold & Pistilli, 2012)
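To make the four-source idea concrete, here is a hedged sketch of a composite risk score. Signals' actual algorithm and weights are not public at this level of detail; every name, weight, and scale below is an invented assumption.

```python
# Illustrative only: combine four Signals-style data sources into a 0-1
# risk score. Weights are made up; a real model would be fitted to
# historical outcome data rather than hand-tuned.
def risk_score(points_earned, points_possible,
               lms_logins, peer_median_logins,
               gpa, is_mature_age):
    performance = 1.0 - points_earned / points_possible         # low marks -> risk
    effort = max(0.0, 1.0 - lms_logins / max(peer_median_logins, 1))
    history = max(0.0, (2.0 - gpa) / 2.0)                       # GPA on a 4-point scale
    characteristics = 0.2 if is_mature_age else 0.0             # stand-in demographic term
    return (0.4 * performance + 0.3 * effort
            + 0.2 * history + 0.1 * characteristics)
```

A strong, engaged student scores near zero; a student with low marks, little LMS activity and a weak academic history scores much higher, which would trigger one of the interventions on the next slide.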
“At Risk” Analytics
• Post to students’ LMS
• Email or text students
• Refer them to an advisor
• Call for a chat
(Arnold & Pistilli, 2012)
How Today’s Analytics are Used
Analytics = Diagnosing Deficit Path
The field of educational technology has always been interested in using students’ digital traces to assess and diagnose when they move away from preferred learning pathways.
[Diagram: assess, diagnose and recognise “deficit” - an error (X) on the student path, measured against preferred parameters or a preferred pathway, triggers feedback]
Deficit Pathways
This approach has useful pedagogical applications … but
• Macro: Attrition
• Micro: Drill & Practice
So what? Is this a problem?
The Promise of Learning Analytics
A core promise of learning analytics is to improve students’ micro learning processes in order to enhance their learning outcomes.
How can we …
• harness different data analysis techniques
• for the provision of more meaningful feedback
• to students on their learning processes
• in real time
• for genuinely personalised learning environments?
Getting off the Deficit Path
From Personal Deficit Analytics … to Personal Learning Analytics
Example 1: Surgical Skills Simulation
• James Bailey - Professor, Computing & Information Systems
• Ioanna Ioannou - Research Fellow, Otolaryngology
• Stephen O'Leary - Professor, Otolaryngology
• Patorn Piromchai - PhD Student, Otolaryngology
• Sudathi Wijewickrema - Research Fellow, Otolaryngology
• Yun Zhou - PhD Student, Computing & Information Systems
Example 1: Surgical Skills Simulation
O'Leary, S., et al. (2008). Validation of a networked virtual reality simulation of temporal bone surgery. The Laryngoscope, 118(6), 1040-1046.
Metrics from the Simulator
• Tool position, orientation and force metrics - e.g. current force applied by the drill
• Burr metrics - e.g. radius of the current burr
• Anatomical structure metrics - e.g. distance of the drill tip to the closest point of one of three key anatomical structures
• Bone specimen metrics - e.g. rotation of the bone
----- 15 records of 48 metrics generated per second -----
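As a rough illustration, the stream can be treated as 15 vectors of 48 metrics per second. Which index holds which metric is an assumption; here indices 10-12 are taken to be the distances to the three key anatomical structures.

```python
# Sketch of consuming the simulator's output stream: 15 records per
# second, each a vector of 48 metrics. The metric layout is invented
# for illustration.
import numpy as np

SAMPLE_RATE_HZ = 15
N_METRICS = 48

def closest_structure_distance(record, distance_indices=(10, 11, 12)):
    """Distance of the drill tip to the nearest of the three structures."""
    return float(np.min(record[list(distance_indices)]))

# One second of (randomly generated) stream data.
stream = np.random.rand(SAMPLE_RATE_HZ, N_METRICS)
per_tick = [closest_structure_distance(r) for r in stream]
```

A per-tick minimum distance like this is what a proximity trigger in the feedback system would watch.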
A Key Metric: Stroke
• A sequence of points containing a continuous drilling motion
• The end of a stroke is reached:
  - when drilling ceases; or
  - when there is an abrupt change in the direction of drilling
• Once a way of identifying strokes has been determined, a range of “stroke metrics” can be calculated from the data stream output by the simulator (e.g. stroke duration, stroke length, average stroke speed, minimum distance of stroke to structures, etc.)
(Hall, Rathod, et al., 2008)
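The stroke-segmentation rule above might be sketched as follows. The force-based "drilling ceases" test, the angle threshold, and the data layout are assumptions, not the published implementation.

```python
# Sketch: split a sequence of drill-tip positions into strokes. A stroke
# ends when drilling ceases (no force) or when the drilling direction
# changes abruptly (angle above a threshold). Returns lists of record
# indices, one list per stroke.
import numpy as np

def segment_strokes(positions, forces, angle_threshold_deg=60.0):
    strokes, current = [], []
    prev_dir = None
    for i, (p, f) in enumerate(zip(positions, forces)):
        if f <= 0.0:                        # drilling has ceased
            if current:
                strokes.append(current)
                current = []
            prev_dir = None
            continue
        current.append(i)
        if len(current) < 2:
            continue
        d = np.asarray(p, float) - np.asarray(positions[current[-2]], float)
        norm = np.linalg.norm(d)
        if norm == 0:
            continue
        d /= norm
        if prev_dir is not None:
            cos_a = float(np.clip(np.dot(prev_dir, d), -1.0, 1.0))
            if np.degrees(np.arccos(cos_a)) > angle_threshold_deg:
                strokes.append(current[:-1])   # abrupt direction change
                current = [i]
        prev_dir = d
    if current:
        strokes.append(current)
    return strokes
```

Stroke metrics (duration, length, average speed, minimum distance to structures) can then be computed over each returned index list.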
Data Mining for Personal Feedback
• We needed to provide personalised feedback to trainees across multiple dimensions or features in an open, complex, procedural simulation.
• Not just deficit feedback about manifest error or procedural stage:
  - “You hit the facial nerve”
  - “You should have completed X before Y”
• For example:
  - force used
  - stroke length
  - stroke smoothness
  - distance to critical structures, etc.
Data Mining for Personal Feedback
• Prototype 1: Hidden Markov Models built to discriminate patterns of novice and expert behaviour on a single association rule.
• Prototype 2: A range of analysis techniques used to develop models to provide feedback on multiple features:
  - A random forest model to determine expert/novice behaviour
  - Nearest neighbour techniques along with a random forest model to generate feedback in the case of novice behaviour
  - An independent feedback system (application) was built
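A sketch of Prototype 2's feedback step. To keep the example dependency-free, the random forest is replaced here by a much simpler nearest-centroid classifier (a plainly weaker stand-in); the nearest-neighbour feedback step follows the description above. All training data and feature names are invented for illustration.

```python
# Sketch: classify each stroke as expert/novice, and for novice strokes
# use the nearest expert stroke to decide which feature to correct.
import numpy as np

FEATURES = ["force", "stroke_length"]

# Toy training data: experts use low force and longer strokes.
expert = np.array([[0.2, 8.0], [0.3, 7.5], [0.25, 9.0], [0.2, 8.5]])
novice = np.array([[0.9, 2.0], [0.8, 2.5], [1.0, 1.5], [0.85, 2.2]])
centroids = {"expert": expert.mean(axis=0), "novice": novice.mean(axis=0)}

def classify(stroke):
    """Stand-in for the random forest: label by the nearer class centroid."""
    return min(centroids, key=lambda c: np.linalg.norm(stroke - centroids[c]))

def feedback_for(stroke):
    if classify(stroke) == "expert":
        return "good technique"
    # Nearest expert stroke; the most deviant feature drives the message.
    nearest = expert[np.argmin(np.linalg.norm(expert - stroke, axis=1))]
    worst = int(np.argmax(np.abs(stroke - nearest)))
    return f"adjust {FEATURES[worst]}"
```

A real system would normalise features before computing distances (here stroke length dominates) and would phrase the message pedagogically rather than emit a feature name.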
A Personal Feedback System
[Diagram: system architecture. The Simulator emits Simulator Metrics; a Stroke Detector derives Stroke Metrics and, with Proximity Triggers, drives a Feedback Generator producing Technique Feedback and Stroke Metrics Feedback, delivered via a Feedback Parser.]
A Personal Feedback System
Feedback System Test
• 24 medical students:
  - 12 were provided with automated feedback
  - 12 were not
• Knowledge of anatomy but not surgery; video tutorial of surgery and simulator familiarisation.
• Two-group comparison of students’ performance on a cortical mastoidectomy:
  - Effectiveness of technique feedback
  - Accuracy of feedback
  - Usability of system
Effectiveness of Technique

% of Expert Strokes:
                 With Feedback    Without Feedback
                 M (SD)           M (SD)              F        p
                 61.59 (16.19)    38.86 (13.11)       14.29    <.001
Effectiveness of Technique
Accuracy of Feedback
• A surgeon undertook a post hoc analysis of the feedback provided by the system:
  - False Positives: feedback was provided when stroke technique was acceptable
  - False Negatives: feedback was not provided when technique was unacceptable
  - Wrong Feedback: participants’ technique was accurately classified as “trainee” but the content of the feedback was inaccurate
Accuracy of Feedback

                   # of Feedback Messages    Percentage (of Total)
False Positives    39                        6.8%
False Negatives    69                        11.4%
Wrong Feedback     52                        9.0%
Total Feedback     576