A doctor’s daily tasks can be compared to three different types of fires.
The fire in the garden: for example, an inpatient who is rapidly deteriorating, or a patient who urgently needs a scan when the waiting list is too long. The fire down the road: the broader challenges of delivering care in outpatient clinics that result in unacceptable delays, or of ensuring adequate staffing for minimum viable care. Then there are the fires far away: the more distant priorities. This is the category where most service improvement tends to happen. Traditionally, the adoption of new technology such as Artificial Intelligence (AI) fits here, because it has (understandably) taken second or third place to patient care. Adam Kay’s brilliant “This is Going to Hurt” paints a detailed picture of the reality of life as a doctor. If you want to understand more about working in the system, “Do No Harm” by Henry Marsh clearly articulates how challenging the life of a consultant can be. Both books show that doctors are driven to deliver patient care; systems need to enable them to do so.
It’s true that patients should be prioritised. Unfortunately, workforce limitations have stalled progress on the large-scale digital transformation of how we work with data. Only a limited number of doctors are even joining the conversation, and we won’t be able to advance patient care if we continue this way.
While the popular press sensationalises AI somewhat, the Accenture Executive Survey on AI in Healthcare reveals that healthcare executives and organisations have a more realistic understanding of where it can help. “Operational areas, less likely to cause anxiety among patients and clinicians, are preceding more clinical, life-critical functions in terms of adoption. This approach shows wisdom and could help to limit the disappointment phase often present when new technologies are oversold to the market.”
Additionally, clinical responsibility cannot be fully assigned to a machine. In technical terms it is not yet possible, and in ethical terms the implications still have to be thought through and properly controlled. Clinicians are well trained by their respective professional bodies; there is simply no equivalent for machines. We won’t be handing over full clinical decision making to an algorithm for a while yet. While humans do make mistakes, there is an expectation (rightly or wrongly) that machines shouldn’t. In the meantime, clinical AI needs to learn from humans and improve, while minimising risks to patients. Effectively, we’re talking about augmented rather than artificial intelligence.
The conundrum, then, is a bit like that of a first-time job seeker who needs experience to be employed but can’t get experience because nobody will employ her. How can health systems introduce AI when doctors are already stretched, and how can AI be trialled when the machines need clinical help to learn?
The health system of the future will be enhanced by AI, just as we are seeing in all other industries. The missing link for health is the poor state of current data access and quality: this is the foundation we need to “train” AI solutions. We should never expect a doctor to prioritise AI development over patient care, but we should facilitate engagement and the inclusion of ideas from many doctors rather than just a few. We do this by fixing the basics.
Many doctors have overflowing clinics with patients who are struggling to understand and battle their diseases. It’s not very helpful, relevant or realistic to suggest that their hardworking integrated care team will be replaced with an AI robot.
How about we look at ways to improve that clinic? Could some appointments be done through telemedicine? If so, which ones? What digital tools do the patients have access to? How is the workforce enabled to use digital tools? We need analytics to generate insights into the experience of patients and of the healthcare workforce.
Let’s use data and insights to put out some of the most immediate fires for the health service. This will free up the health workforce to spend more time with patients. Some of them may even have time to develop the artificial intelligence of the future! Please get in touch if you have thoughts to share, or questions to ask. I’d love to hear from you.