1. Patient Safety
It is imperative to change our attitude toward error. Everyone makes mistakes; we must abandon defensiveness. The IOM report is a call to arms and a recognition that medicine is a hazardous industry.
Safe care is what it's all about. Everyone here probably knows how bad it feels when an error occurs (for example, giving calcium to a digoxin-toxic patient). The slip may be purely cognitive, but a drug ordering system that asked whether the patient was on digoxin before releasing the drug could have prevented it (a toy sketch of such a check follows this slide).
We have to change things! It's how we deal with error that defines us, and there is an enthusiasm for change. We don't have to look far to see results: aviation, nuclear power, anesthesia.
Physicians have a great opportunity to regain control.
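A minimal sketch of the kind of ordering-system check described above, assuming a simple in-memory medication list; the rule table, function, and patient data are hypothetical, not any real CPOE system's API.

# Hypothetical interaction check: flag calcium ordered for a patient on
# digoxin before the order is released.

ACTIVE_MEDS = {"patient-123": {"digoxin", "lisinopril"}}

INTERACTION_RULES = {
    ("calcium", "digoxin"):
        "Patient is on digoxin; calcium may potentiate digoxin toxicity.",
}

def check_order(patient_id, new_drug):
    """Return warnings that must be acknowledged before the order is released."""
    warnings = []
    for current in ACTIVE_MEDS.get(patient_id, set()):
        message = INTERACTION_RULES.get((new_drug, current))
        if message:
            warnings.append(message)
    return warnings

if __name__ == "__main__":
    for warning in check_order("patient-123", "calcium"):
        print("ALERT:", warning)  # clinician must acknowledge before release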
2. Call to Arms in 1999
The IOM report was a political document that grew out of the Harvard Medical Practice Study, which showed that nearly 4% of hospitalized patients suffered an adverse event and that roughly 30% of those events were related to "human error."
Its most important outcome was to introduce fundamental concepts of safety in complex systems, knowledge that is commonplace in most other high-reliability organizations: focus on the process, not on individuals.
3. Significance of Medical Error 44,000 - 98,000 deaths per year
3 jumbo jets crashing every other day
5th leading cause of death
More in 6 months than in Vietnam
Annual cost: $37-50 billion
The magnitude is startling, but many have become too focused on the numbers. Don't get caught up in them; just realize that care can be safer.
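A quick check of the arithmetic behind the jumbo-jet analogy, using the slide's upper estimate (the per-aircraft load is back-calculated here; the original states no aircraft capacity):

\[
\frac{98{,}000\ \text{deaths/year}}{365/2\ \text{two-day periods/year}} \approx 537\ \text{deaths per two days} \approx 3 \times 179\ \text{deaths per crash}
\]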
4. How Hazardous Is Health Care?
Recognize that health care is hazardous. We need to look at how to become a high-reliability organization, learning and adapting from safe industries such as aviation and nuclear power.
The barriers to becoming safe are described in the Barach article in Annals of Internal Medicine:
Amalberti R, Auroy Y, Berwick D, Barach P. Five System Barriers to Achieving Ultrasafe Health Care. Ann Intern Med. 2005;142(9):756-764. © 2005 American College of Physicians.
Although debate continues over estimates of the amount of preventable medical
harm that occurs in health care, there seems to be a consensus that health care
is not as safe and reliable as it might be. It is often assumed that copying and
adapting the success stories of nonmedical industries, such as civil aviation
and nuclear power, will make medicine as safe as these industries. However, the
solution is not that simple. This article explains why a benchmarking approach
to safety in high-risk industries is needed to help translate lessons so that
they are usable and long lasting in health care. The most important difference
among industries lies not so much in the pertinent safety toolkit, which is
similar for most industries, but in an industry's willingness to abandon
historical and cultural precedents and beliefs that are linked to performance
and autonomy, in a constant drive toward a culture of safety. Five successive
systemic barriers currently prevent health care from becoming an ultrasafe
industrial system: the need to limit the discretion of workers, the need to
reduce worker autonomy, the need to make the transition from a craftsmanship
mindset to that of equivalent actors, the need for system-level (senior
leadership) arbitration to optimize safety strategies, and the need for
simplification. Finally, health care must overcome 3 unique problems: a wide
range of risk among medical specialties, difficulty in defining medical error,
and various structural constraints (such as public demand, teaching role, and
chronic shortage of staff). Without such a framework to guide development,
ongoing efforts to improve safety by adopting the safety strategies of other
industries may yield reduced dividends. Rapid progress is possible only if the
health care industry is willing to address these structural constraints needed
to overcome the 5 barriers to ultrasafe performance.
5. Five Precepts for Error Management (Helmreich and Merritt, Culture at Work in Aviation and Medicine)
Human error is inevitable in complex systems
Human performance is limited by cognitive capabilities
High workload and stress increase error
Safety is a universal value, but there is a continuum: how much safety do we want, and what can we afford?
High-risk organizations must develop a safety culture that makes individuals and teams responsible
Make physicians understand and accept these precepts, and then they can move forward.
While these precepts are basic, the culture of medicine trains us to believe that we as individuals can overcome bad systems. We should instead focus on how systems should be designed to provide safety barriers and plug the holes in Reason's Swiss cheese model.
6. Error, Stress, and Teamwork in Medicine and Aviation: Cross-Sectional Surveys (Sexton, BMJ, 2000)
Medical staff were more likely to deny the effects of stress and fatigue
MD 60% vs. CC 26%
Staff did not acknowledge that they make mistakes
Surgeons were more likely than intensivists and pilots to advocate steep hierarchies
45% vs. 6% and 3%
This is good to talk about early, because recognizing our biases and addressing them will help us move forward. I know I was skeptical when we first got into teamwork, and many still are.
The culture of medicine will be slow to change, and different specialties may be more or less amenable to nontechnical and cognitive interventions to improve safety.
7. Clinician Attitudes About Teamwork
Operating Room (Sexton JB et al. BMJ. 2000; 320(7237): 745-9)
Only 55% of consultant surgeons rejected steep hierarchies
A minority of anesthesia and nursing staff reported high levels of teamwork
Critical Care (Thomas EJ et al. Crit Care Med. 2003; 31(3): 992-3)
Physicians and nurses held discrepant attitudes about teamwork
73% of physicians rated teamwork "High" or "Very High"
33% of nurses rated teamwork "High" or "Very High"
Teamwork is essential to the delivery of safe care. In medicine, however, there is little formal teamwork training, and aside from some routinely rostered teams, such as cardiac surgery, most "teams" are actually working groups with limited teamwork skills. Our goal is to make them high-performing teams by teaching the skills and behaviors taught in aviation teamwork training.
8. 2001 AAMC Policy Statement 80 hour week maximum
No more than 24 continuous hours
ED and critical care only 12 hours
8 hours between duty shifts
Maximum call 1 in 3
Day off every seven
This is just one example: while not aggressive enough, it finally recognized the effect of fatigue on performance, something long recognized in other high-risk industries (a toy encoding of these limits appears below).
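A toy sketch encoding the limits above as schedule checks; the shift representation and function name are hypothetical, for illustration only (the 1-in-3 call and day-off rules are omitted for brevity).

from datetime import timedelta

MAX_WEEKLY_HOURS = 80       # 80-hour week maximum
MAX_CONTINUOUS_HOURS = 24   # no more than 24 continuous hours
MIN_REST_HOURS = 8          # 8 hours between duty shifts

def check_week(shifts):
    """shifts: chronologically ordered (start, end) datetimes for one week."""
    problems = []
    total = sum((end - start).total_seconds() / 3600 for start, end in shifts)
    if total > MAX_WEEKLY_HOURS:
        problems.append(f"{total:.0f} h worked; weekly limit is {MAX_WEEKLY_HOURS} h")
    for start, end in shifts:
        if end - start > timedelta(hours=MAX_CONTINUOUS_HOURS):
            problems.append(f"shift starting {start} exceeds {MAX_CONTINUOUS_HOURS} continuous hours")
    for (_, prev_end), (next_start, _) in zip(shifts, shifts[1:]):
        if next_start - prev_end < timedelta(hours=MIN_REST_HOURS):
            problems.append(f"only {next_start - prev_end} rest after {prev_end}; minimum is {MIN_REST_HOURS} h")
    return problems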
9. What is a Medical Error? “An act or omission that would have been judged wrong by knowledgeable peers at the time it occurred”
Institute of Medicine
Error is difficult to define, and consensus on a definition may be hard to reach, but this is a useful guideline.
To improve safety and reduce medical error, we need to accurately identify, disclose, and report errors.
10. Other Definitions
Sentinel Event: An unexpected incident involving death or serious physical or psychological injury, or the risk thereof. Example: incompatible blood given to a patient, resulting in death.
Incident: An error that reaches the patient; harm is not required.
Near Miss / Close Call: Any variation that did not affect the outcome, but whose recurrence carries a significant chance of a serious outcome. Example: the wrong medication is dispensed for a patient, but the error is identified before the patient receives it.
11. Schematic of Definitions
Not all adverse events are medical errors and, thankfully, many medical errors do not result in harm.
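A minimal sketch mapping these definitions onto event categories; the field names and category labels are hypothetical, not a standard taxonomy, but they show why the sets overlap only partially.

from dataclasses import dataclass

@dataclass
class Event:
    error_occurred: bool    # judged wrong by knowledgeable peers
    reached_patient: bool
    caused_harm: bool

def classify(e):
    if e.error_occurred and not e.reached_patient:
        return "near miss / close call"
    if e.error_occurred and e.reached_patient:
        return "incident (with harm)" if e.caused_harm else "incident (no harm)"
    if e.caused_harm:
        return "adverse event without error"  # e.g. a known drug side effect
    return "no event"

# Wrong drug dispensed but caught before administration: a near miss
print(classify(Event(error_occurred=True, reached_patient=False, caused_harm=False)))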
12. Human Error Models
Person Approach
Traditional approach
Unsafe acts, aberrant mental processes
Counter-measures directed at human behavior
System Approach
Accepts fallibility
Errors are consequences, not causes
System defenses
We need to create a shift away from the traditional model of focusing only on the individual's contribution to error.
Person approach: the person at the sharp end is seen as forgetful, unmotivated, or negligent. This does not work, even though it may be more satisfying to blame someone.
System approach: people make errors, and it is difficult to change the human condition, so create defenses.
Safety is a balance of individual and system contributions; both contribute to error occurrence and, conversely, to error mitigation.
13. This is a commonly used illustration of how holes in the safety barriers, when they align, can result in error. These holes are latent system weaknesses, which should be the target of patient safety interventions. The slices could represent the process of medication administration, and computerized order entry could be an initiative that decreases the number of holes in one slice of cheese. Be aware that new technology may close one hole and open others (a toy simulation follows below).
Reason J. Managing the Risks of Organizational Accidents. Aldershot, UK: Ashgate Publishing; 1997.
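A toy Monte Carlo sketch of the Swiss cheese idea, with invented layer failure probabilities: harm occurs only when the hazard passes through a hole in every defensive layer, and shrinking the holes in any one layer (e.g., adding computerized order entry) cuts the end-to-end harm rate.

import random

def reaches_patient(layer_hole_probs):
    """One trajectory: True if the hazard slips through every layer."""
    return all(random.random() < p for p in layer_hole_probs)

def harm_rate(layer_hole_probs, trials=200_000):
    return sum(reaches_patient(layer_hole_probs) for _ in range(trials)) / trials

# Three defenses (prescriber, pharmacist, bedside check), each missing 5% of hazards
print(harm_rate([0.05, 0.05, 0.05]))   # expected ~0.05**3 = 1.25e-4
# Computerized order entry shrinks the prescribing holes to 1%
print(harm_rate([0.01, 0.05, 0.05]))   # expected ~2.5e-5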
14. System vs. Person
Balance between system and person
Help clinicians be part of a high-reliability organization
Address human factors training
Integrate people and technology
Dekker S. The Field Guide to Human Error Investigations. Burlington, VT: Ashgate Publishing; 2002.
Shapiro MJ, Jay GD. High Reliability Organizational Change for Hospitals: Translating Tenets for Medical Professionals. Qual Saf Health Care 2003; 12(4): 238-9.
Safety is a balance of individual and system contributions; both contribute to error occurrence and, conversely, to error mitigation. While the pendulum has swung toward systems, and sometimes too far toward technology, we also need to train our physicians to be an effective part of the system and to apply human factors principles.
The human contribution to safety should not be minimized.
15. Finally, don't rely exclusively on new technology to make patients safer…
16. System Approach Advantages
Effects a cultural change
Enhances reporting
Identifies recurrent patterns
Promotes safeguards
Even the best commit errors; no one is flawless under stress.
Effective risk management depends on reporting.
Identifying latent system failures improves the process of care delivery.
17. This is an expanded, more advanced schematic of Reason's error model, illustrating the complexity of the causes of medical error. It shows the impact of organizational decisions, the work environment, resources, and individuals. Most analyses of error focus only on the active error (the care management problem) and do not dig back to the latent system problems or error-producing conditions.
This slide is busy, but we want to give participants a deeper understanding of the complexity of how an incident occurs.
18. SYSTEM THINKING in Other High-Risk Industries
Aviation: zero deaths in 1998
Anesthesia deaths: 20 years ago, 1 in 20,000; today, 1 in 200,000
Aluminum refining (ALCOA): "You can't make the safety better without having a profound understanding of the process."
Use this to motivate participants: the power of system versus individual in helping medicine transform its performance. Anesthesia led the way by looking at technology and teamwork earlier than most medical specialties.
Gaba DM, Howard SK, Fish KJ, Smith BE, Sowb YA. Simulation-based training in anesthesia crisis resource management (ACRM): A decade of experience. Simulation and Gaming. 2001;32(2):175-93.
19. Error Management: Lessons from High Reliability Organizations
Airline fatality rate: 0.27 per 1,000,000 departures
Serious medication errors: 6.7 per 100 patients
Human variability is desired
Need to be preoccupied with failure
Train for the eventual error
Greater use of simulation
Examples of high-reliability organizations: nuclear power, aircraft carriers, aviation, and perhaps anesthesia.
Goals: managing complex technology and systems to prevent catastrophe, and maintaining the capacity to meet high demand.
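Converting the two rates above to a common denominator makes the gap concrete (a rough comparison only; departures and patients are not equivalent exposure units):

\[
\frac{6.7}{100} = 67{,}000\ \text{per } 10^{6}\ \text{patients} \quad \text{vs.} \quad 0.27\ \text{per } 10^{6}\ \text{departures}, \qquad \frac{67{,}000}{0.27} \approx 2.5 \times 10^{5}.
\]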
20. Your role? Seek non-technical safety education
Error Models and Process Improvement
Teamwork
Decision Making
Error Disclosure
Identify and report incidents
Participate in error disclosure
Participate in local safety improvements and national goals (JCAHO)
21. Mandates for Reporting JCAHO 2001 Standards
“Inform patients and, when appropriate, their families about the outcomes of care, including unanticipated outcomes”