  1. Just Culture: Recognizing and Reporting Errors, Near Misses and Safety Events. Robert McKay, M.D., Department of Anesthesiology, KUSM-Wichita

  2. Why A Cultural Change? — The single greatest impediment to error prevention in the medical industry is “that we punish people for making mistakes.” – Dr. Lucian Leape, Professor, Harvard School of Public Health in Testimony before Congress — Insanity: Doing the same thing over and over again and expecting different results –Albert Einstein — Change the people without changing the system and the problems will continue – Don Norman, Author of “The Design of Everyday Things”

  3. Are We Improving? — Let’s form a few small groups and discuss recent interactions you have observed between a friend or family member and the health care system during their medical care. — Did everything go perfectly? — If not, how did that make you feel? — What could have been different? — Was anything changed to keep a similar event from happening again?

  4. Is Our Culture Changing? — In your groups, now think of yourselves as workers in health care. — Discuss: do you feel we are providing safer care? — What are the impediments to safe care? — Are you being supported by the health care system? — Are you being supported by your co-workers and managers? — Are you being supported by your patients?

  5. What Are Errors? — Acts of commission or omission leading to undesirable outcomes or significant potential for such outcomes — Errors may be active (readily apparent) or latent (less apparent) — Latent (less apparent) errors can lead to “Normalization of Deviance” wherein behaviors leading to such errors become “normal” and stripped of their significance as warnings of impending danger.

  6. Is It a “Slip” or a “Mistake”? — A “slip” is a lapse in concentration or inattentiveness — Slips are increased by fatigue, stress and distractions, including emotional distraction — Mistakes are failures during conscious thought, analysis and planning — Methods of addressing mistakes (training, supervision, discipline) are ineffective and often counterproductive in addressing slips.

  7. Are Human Errors Inevitable? — Only two things in the universe are infinite, the universe and human stupidity, and I’m not sure about the former. –Albert Einstein — Yes, Virginia, humans will continue to make errors. –Apologies to Francis Pharcellus Church (1839–1906) — Thus, if humans are involved, the system MUST be designed to either prevent errors or to prevent the adverse outcomes associated with errors. — Errors must be reported and analyzed to improve safety

  8. Sources of Human Error — Irrationality (Rationality = good judgment) — Negligence, conscious disregard of risks (including risks resulting from an error), gross misconduct (e.g., falsifying records, intoxication, etc.) — Cognitive Biases (Wikipedia lists about 100 types) — Heuristics – rules governing judgment or decision making — As shortcuts, heuristics and cognitive biases are used more often in complex, time-pressured (production-pressured) systems such as healthcare — Motivational Biases (wishful thinking) – believing something is true (or false) simply because you want it to be so (e.g., Barry Sanders will win in 2016)

  9. “If you see it on the Internet, it’s So!” The 1897 version: If you see it in THE SUN, it’s so.

  10. Heuristics – Mental shortcuts that decrease cognitive load — Availability Heuristic – recalling the most recent information or the first possibility that comes to mind — Representativeness Heuristic – assuming similar observations have similar causes, e.g., fever in the last 2 patients was from atelectasis, thus it must be from atelectasis this time — Affect Heuristic – “going with your gut feeling”, e.g., “I can do this safely this time” (estimating risk as lower than it is) or “I’m afraid of this (very rare) outcome” (overestimating risk due to fear)

  11. Uses of the Affect Heuristic — Smiling (and better-looking) people are more likely to be treated with leniency and are seen as more trustworthy, honest, sincere and admirable — Negative affect – feeling negative increases the perceived risk of a negative outcome (e.g., terrorism in the U.S.) and also increases its perceived frequency — Lack of affect can likewise lower perceived risk – climate change is thought unlikely by those unexposed to significant weather changes — Affect (feeling) trumps numbers (statistics), which explains why terrorism is scarier than driving even though you are far more likely to be killed just driving to work

  12. Is Your “Gut Feeling” this Lecture will be Too Long? — Mine says yes! — I have confirmation bias – my children tell me I lecture them far too much! — I may have a negative affect — (Some of you may be asleep) So I must ask … — Am I at fault? — or Did you have a poor night’s sleep? — or Is this mandatory and you have no interest in the subject? — or Did you have a great lunch and have postprandial fatigue? — All of these might affect my conclusion in a biased manner

  13. Common Cognitive Biases That Lead to Errors — Status quo bias – my stable patient will remain so — Planning fallacy – the tendency to underestimate task-completion times — Time pressure further increases the use of cognitive biases as shortcuts — Information bias – the tendency to seek information even when it cannot affect action — Focusing effect – placing too much importance on one aspect of an event

  14. Normal Accident Theory (Perrow) — Highly complex settings (e.g., medical care), in which no single operator can immediately foresee the consequences of a given action — plus — Tight coupling of processes, which must be completed within a certain time period (such as a crash cesarean section) — equals — Potential for error that is intrinsic to the system, i.e., a major accident becomes almost inevitable

  15. Normal Accident Theory versus a High Reliability Organization — Though normal accident theory is likely true, it is also probable that most medical errors are NOT related to the complexity of the system — Moreover, some organizations are remarkably adept at avoiding errors – even in complex systems — HROs operate with nearly failure-free performance records, e.g., at Six Sigma (3.4 errors per 1,000,000 events).
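
  (As a back-of-the-envelope illustration of the Six Sigma figure, not taken from the slide itself: 3.4 errors per 1,000,000 events is an error rate of 3.4 / 1,000,000 = 0.00034%, i.e., roughly 99.99966% of events are completed without error.)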

  16. So What Characterizes a High Reliability Organization? — Preoccupation with failure — Commitment to resilience — Detecting unexpected threats and containing them before they can cause harm — Sensitivity to operations — A culture of safety, in which staff can draw attention to hazards, failures and errors without fear of censure from management.

  17. How Can Medicine Become Highly Reliable? — Increased Use of (Unbiased) Technological Aids — Triggers and Flags, Forcing Functions, Decision Support, Checklists, Protocols, CPOE, Medication Scanners — Use of Rapid Response Teams (intervention before harm) — Culture of Safety with Root Cause Analyses and Reporting of Actual or Potential Safety Breaches, i.e., of Critical Incidents — Quality Improvement Cycles (e.g., PDSA) to Address Error Chains — Team Training and Crisis Resource Management (CRM) — Education, e.g., the Five Rights (right medication, right dose, right time, right route, right patient), EBM protocols, etc.
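
  To make the forcing-function idea above concrete, here is a minimal, hypothetical sketch in Python of a Five Rights check that blocks medication administration until every check passes. It is not drawn from the slides or from any real CPOE or barcode-scanning product; the data fields, function names and 30-minute timing window are illustrative assumptions only.

      # Hypothetical Five Rights forcing function (illustrative only; real
      # CPOE / barcode-administration systems use different data models).
      from dataclasses import dataclass
      from datetime import datetime, timedelta

      @dataclass
      class MedicationOrder:
          patient_id: str
          drug: str
          dose_mg: float
          route: str
          due_time: datetime

      def five_rights_failures(order, scanned_patient_id, scanned_drug,
                               scanned_dose_mg, intended_route, now,
                               window=timedelta(minutes=30)):
          """Return the list of failed checks; an empty list means all five rights pass."""
          failures = []
          if scanned_patient_id != order.patient_id:
              failures.append("right patient")
          if scanned_drug.lower() != order.drug.lower():
              failures.append("right medication")
          if abs(scanned_dose_mg - order.dose_mg) > 1e-9:
              failures.append("right dose")
          if intended_route.lower() != order.route.lower():
              failures.append("right route")
          if abs(now - order.due_time) > window:
              failures.append("right time")
          return failures

      def administer(order, **scan):
          failures = five_rights_failures(order, **scan)
          if failures:
              # Forcing function: the step is blocked, not merely flagged with
              # a dismissible warning.
              raise RuntimeError("Administration blocked; failed: " + ", ".join(failures))
          print("All five rights verified; administration may proceed.")

      # Example use (all values invented):
      order = MedicationOrder("MRN123", "cefazolin", 1000.0, "IV",
                              datetime(2016, 5, 1, 8, 0))
      administer(order, scanned_patient_id="MRN123", scanned_drug="Cefazolin",
                 scanned_dose_mg=1000.0, intended_route="IV",
                 now=datetime(2016, 5, 1, 8, 10))

  The design point is that a failed check stops the workflow rather than producing an alert that can be clicked past, which is what distinguishes a forcing function from a simple warning.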

  18. So What Is Just Culture? — Addressing the twin needs for a no-blame culture and appropriate accountability — An open, transparent environment where human errors are expected to occur but are uniformly reported and used as learning events to improve systems and individual behaviors — A culture of safety is foremost — Example: the FAA reporting system — Zero tolerance is given to conscious disregard of clear risks to patients (e.g., taking shortcuts), reckless behavior (e.g., refusing to perform safety steps, not reporting errors) or gross misconduct — A purely blameless culture would allow willfully disruptive, negligent or harmful behavior to persist and lead to patient harm.

  19. Error Reporting — Anonymous reporting: higher likelihood of errors being reported — “Safe” reporting with less fear of reprisal — Less concern about the need for legal protection — Can be associated with an increased level of false reports — May include malicious and untrue reports — Error causes may be difficult to investigate as you can’t seek additional information — Identifiable-source reporting: errors less likely to be reported, though a just culture with punishment for non-reporting can help — Can verify accuracy of the report — Can usually obtain more details about the error, including investigation into the error chain — Less likely to be a false or malicious report

  20. Incident Reporting Systems — Supportive environment that protects privacy of staff who report — In a fully just culture, such protection of privacy would be unnecessary — Any personnel should be able to report — Summaries of reported events must be disseminated in a timely fashion — A structured mechanism must be in place for reviewing reports and developing action plans — Incident reporting is a passive form of surveillance — May miss many errors and latent safety problems
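
  As a rough sketch of how the ingredients listed above might fit together (invented for this transcript, not a description of any real incident reporting product), the following Python outline keeps reports easy to submit from any role, protects reporter identity by default, supports a structured review queue, and can summarize events back to staff:

      # Hypothetical incident reporting store; all field names are assumptions.
      from dataclasses import dataclass
      from datetime import datetime
      from collections import Counter

      @dataclass
      class IncidentReport:
          event_type: str                     # e.g., "near miss", "adverse event"
          description: str
          occurred_at: datetime
          reporter_role: str = "unspecified"  # any personnel should be able to report
          anonymous: bool = True              # privacy protected unless reporter opts in
          harm_reached_patient: bool = False

      class IncidentRegistry:
          def __init__(self):
              self._reports = []

          def submit(self, report):
              """Accept a report from any member of staff."""
              self._reports.append(report)

          def review_queue(self):
              """Structured review: events that reached a patient are triaged first."""
              return sorted(self._reports, key=lambda r: not r.harm_reached_patient)

          def summary(self):
              """Counts by event type, for timely dissemination back to staff."""
              return Counter(r.event_type for r in self._reports)

      # Example use (invented data):
      registry = IncidentRegistry()
      registry.submit(IncidentReport("near miss", "wrong-dose syringe caught at bedside",
                                     datetime(2016, 5, 1, 14, 30)))
      print(registry.summary())   # Counter({'near miss': 1})

  Even with such a system in place, the last two bullets above still apply: incident reporting is passive surveillance and will miss events that no one reports.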

  21. Small Group Discussion — Please discuss why you might not report an error that you have made. — If you chose to report it, how would you do it? — Please discuss why you might not report an error that you have observed a co-worker make. — If you chose to report it, how would you do it?
