Regulating Patient Safety: Is it Time for a Technological Response? Roger Brownsword Centre for Technology, Ethics, Law & Society (TELOS), Dickson Poon School of Law, King’s College London
Overview • The Francis Report (February, 2013): changing the culture; patients come first • Why traditional regulation fails • Should we try a more technological approach? Surveillance and ‘nursebots’.
The Francis Report • Shocking tales of patients being neglected. • ‘We need a patient centred culture, no tolerance of non-compliance with fundamental standards, openness and transparency, candour to patients, strong cultural leadership and caring, compassionate nursing, and useful and accurate information about services.’
Five essential elements • Clear standards geared for patient safety. • Openness, transparency and candour. • Improved support and training. • Patient-centred leadership. • Accurate, useful and relevant information.
Why Traditional Regulation Fails • The Government accepted nearly all of Francis’ recommendations, so why might the regulatory response still fall short? • Problems with the competence, commitment, and resources available to regulators. • Problems with regulatees.
Regulatee resistance and response • According to Francis: • ‘In the maelstrom of discussions and efforts devoted to reorganisation, devising and implementing new systems and so on, the core purpose of healthcare services has all too often been overlooked. This Inquiry has seen evidence of many different examples of leaders, managers, regulators and others failing to have the interests and needs of patients at the forefront of their minds. Very few, if any, of the individuals involved have deliberately or consciously acted in this way. However, the pressures of their work and circumstances have led to this.’
Recent concern • Laura Donnelly, ‘“Alarming” culture of NHS care’ The Sunday Telegraph, February 2, 2014, p 1 (reporting the view of David Prior, the chairman of the CQC, that a dysfunctional rift between managers and clinical staff jeopardises the safety of vulnerable patients, that bullying, harassment, and abuse persist, and that whistle-blowers are ostracised).
Should We Try a More Technological Approach? • Technologies may function as regulatory tools (Lawrence Lessig: ‘code’). • The technologies might involve the design of products, places, or (one day perhaps) people. • Imagine hospitals that use surveillance technologies (e.g. CCTV) for patient safety; or that use robots to dispense medicines and to care for patients (‘nursebots’). • Assuming reliability, is there any reason to forgo such an approach?
Surveillance technologies • Standard objection: the impingement on privacy. • But, if a patient consents to surveillance for his/her own safety, that meets the objection. • And, if surveillance is designed to protect P1 from harm caused by P2, this is acceptable if proportionate.
Surveillance and the ‘complexion’ of the regulatory environment • Surveillance amplifies the prudential (self-interested) reasons for compliance. • But, in a moral community, the aspiration is to do the right thing for the right reason. • Even if, because of surveillance, regulatees do the right thing, they do not do it for the right reason.
Hard technological management • CCTV supplements the norms of law; the signal is still normative. • But, regulators might go beyond this, using technological management to give regulatees no choice. • The regulatory environment is no longer normative; the signals concern only what can and cannot be done. • This seriously compromises moral community (Ian Kerr: moral virtue cannot be automated).
Nursebots ‘Phred’ (Pharmacy Robot-Efficient Dispensing) at a hospital in Birmingham. ‘This sort of state-of-the-art technology is becoming more popular in pharmacy provision, both in hospitals and community pharmacies. It can dispense a variety of different medicines in seconds—at incredible speeds and without error. This really is a huge benefit to patients at UHB.’
Is There a Problem? • Robots can care for us, but they do not really care about us, leading to concerns about ‘authenticity’. • Sandel and concerns about human dignity. • We might fail or abuse nursebots by neglecting their fundamental rights or interests.
Nursebots do not really care • Possibly a problem in relation to child development but surely not for adults who understand the limits of nursebots. • Some adults might prefer human carers even if their safety is jeopardised. Sherry Turkle’s story of ‘Richard’. For Richard, ‘dignity requires a feeling of authenticity, a sense of being connected to the human narrative. It helps sustain him. Although he would not want his life endangered, he prefers the sadist to the robot.’
Do Nursebots compromise human dignity? • Michael Sandel: ‘It is commonly said that enhancement, cloning, and genetic engineering pose a threat to human dignity. This is true enough. But the challenge is to say how these practices diminish our humanity.’ • So, how would nursebots diminish our humanity? Are we shifting our moral caring responsibilities?
Neglecting the fundamental rights or interests of nursebots • Do nursebots have fundamental rights or interests? • Do they have the capacity for moral agency, to do the right thing for the right reason? • If nursebots can be programmed so that they really do care, would this give us pause? (Risk of ‘speciesism’?)
Concluding remarks • The objections to nursebots do not look compelling; but, for those who prefer to be cared for by human nurses, this can be accommodated. • The use of technological regulatory tools might change the complexion of the regulatory environment in ways that challenge moral community. • More particularly, if (like Francis) our ambition is to change the culture so that patients come first, we should be careful: morality cannot be automated.