Healthcare, AI and Human Factors

The full recorded seminar on the same topic, hosted by the International Ergonomics Association, can be found here.

The National Health Service (NHS) in 2023

Data from the British Medical Association (BMA) in July 2023 indicated that the NHS England waiting list had reached 7.68 million patients. Moreover, the median time that patients spent waiting was 14 weeks.

Understaffing is also a prominent issue. The BMA states that in July there were over 112,000 secondary care vacancies, indicating a significant shortfall in staffing across the NHS. Further to this, 20.5% of all sickness absences taken by NHS staff were attributed to poor mental health, an indicator (along with anecdotal staff accounts) of high burnout and the extensive workload demands placed on healthcare professionals.

A long wait

Data from the British Medical Association (BMA) in July 2023 indicated that the NHS England waiting list had reached 7.68 million patients.


Enter AI

To address these issues, the NHS Long Term Plan sets out an ‘ambitious but realistic’ vision for the future, aiming to modernise and redevelop how the NHS delivers care. While the plan has multiple areas of focus, including staff support and care model redevelopment, one area stands out: Artificial Intelligence (AI).

AI has caught the attention of public and private sectors alike, with Google CEO Sundar Pichai describing AI as ‘more profound than fire’. While this may sound hyperbolic, AI offers a tremendous amount of opportunity to stretched healthcare institutions like the NHS.

A study published in The Lancet showed that AI can be used to help radiologists screen for breast cancer, with a false-positive rate of 1.5%. Commenting on this, Dr. Katharine Halliday, president of the Royal College of Radiologists, stated:

“AI holds huge promise and could save clinicians time by maximising our efficiency, supporting our decision-making and helping identify and prioritise the most urgent cases. There is a great deal of research interest in how AI could support reporting for mammograms because they are complex, requiring significant oversight and interpretation by clinical radiologists. The UK's shortfall in radiologists, at 29%, makes this challenging.”

From a Human Factors perspective, this offers an interesting insight. Could the efficiency and accuracy of AI help compensate for understaffing, and lead to reduced team sizes in future?
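As a rough sense check of what a 1.5% false-positive rate means at screening scale, the short sketch below works through the arithmetic. The screening volume and cancer prevalence are assumptions made purely for illustration, not figures from the Lancet study.

```python
# Illustrative arithmetic only: the screening volume and cancer prevalence
# below are assumptions for this example, not figures from the Lancet study.
SCREENS_PER_YEAR = 100_000       # assumed annual mammograms in a screening programme
CANCER_PREVALENCE = 0.007        # assumed fraction of screens with cancer present
FALSE_POSITIVE_RATE = 0.015      # the 1.5% false-positive rate quoted above

negatives = SCREENS_PER_YEAR * (1 - CANCER_PREVALENCE)
expected_false_positives = negatives * FALSE_POSITIVE_RATE

print(f"Expected false-positive recalls per year: {expected_false_positives:,.0f}")
# Roughly 1,500 recalls that clinicians must still review: AI reshapes the
# workload rather than removing the human from the loop.
```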

Technology has bridged gaps in training before. One example is nurse prescribing, where nurses can now prescribe medication using applications that recommend drugs while flagging unwanted interactions with other medications or underlying health conditions. What was once exclusively the responsibility of the physician is now shared with the nurse through the aid of technology, which bridges the gap in specialist knowledge between the two roles and distributes team workload more evenly.
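As a toy illustration of the kind of decision support described above, the sketch below flags interactions for a proposed prescription. The drug names, rules and patient record are entirely hypothetical; a real system would draw on a maintained clinical knowledge base rather than a hard-coded list.

```python
# Minimal sketch of interaction flagging; all names and rules are invented.
from typing import NamedTuple

class Patient(NamedTuple):
    current_medications: set[str]
    conditions: set[str]

# Hypothetical rules: (proposed drug, conflicting medication or condition, warning)
INTERACTION_RULES = [
    ("drug_a", "drug_b", "Increased bleeding risk when combined"),
    ("drug_a", "renal_impairment", "Dose adjustment needed in renal impairment"),
]

def flag_interactions(proposed_drug: str, patient: Patient) -> list[str]:
    """Return warnings for the proposed drug against the patient's record."""
    warnings = []
    for drug, conflict, message in INTERACTION_RULES:
        if drug == proposed_drug and (
            conflict in patient.current_medications or conflict in patient.conditions
        ):
            warnings.append(message)
    return warnings

patient = Patient(current_medications={"drug_b"}, conditions=set())
print(flag_interactions("drug_a", patient))  # ['Increased bleeding risk when combined']
```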

AI offers similar promise. With its ability to take on complex tasks and process high volumes of data quickly, it may help overburdened healthcare professionals work through patient waiting lists in less time. However, the human factor in the adoption and integration of AI will be key to its success.

Bridging the gap

Technology helps bridge the specialist-knowledge gap for non-specialist team members, allowing workload to be distributed more widely across the team rather than creating bottlenecks.


Trust and transparency

Trust will play a vital role in how much AI can be used to assist staff.

It would be wrong to treat AI as a human team member - there have been plenty of examples where technology is over-trusted because of an inaccurate mental model of how it operates. Consider the multiple car crashes in which Tesla’s driving assistance mode was treated as a chauffeur rather than a digital aid, with the human driver reading or watching a movie while behind the wheel.

Conversely, AI may not entirely fit the role of a ‘tool’ either. The way it adapts and learns from experience over time, as well as the way it can be interacted with (e.g. via natural language), imbues it with some human-like qualities. AI thus appears to occupy a grey area between a person and a tool.

The accuracy and trustworthiness of AI are tied to the quality of the data it receives. Early implementations of machine learning in US healthcare have shown that AI can exhibit racial bias because of the data it was trained on.
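One practical response is to audit model performance by subgroup before and after deployment. The sketch below shows the basic idea on a toy dataset; the groups, labels and predictions are invented for illustration and are not drawn from any cited study.

```python
# Minimal subgroup audit sketch; the data and column names are hypothetical.
import pandas as pd

results = pd.DataFrame({
    "group":      ["A", "A", "A", "B", "B", "B"],
    "label":      [1, 0, 1, 1, 0, 0],   # ground truth
    "prediction": [1, 0, 0, 0, 1, 0],   # model output
})

# Compare simple error rates per group; large gaps are a prompt to
# investigate the training data, not proof of a specific cause.
for group, sub in results.groupby("group"):
    fn_rate = ((sub.label == 1) & (sub.prediction == 0)).sum() / max((sub.label == 1).sum(), 1)
    fp_rate = ((sub.label == 0) & (sub.prediction == 1)).sum() / max((sub.label == 0).sum(), 1)
    print(f"group {group}: false-negative rate={fn_rate:.2f}, false-positive rate={fp_rate:.2f}")
```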

Designing for system transparency is therefore vital. Human Factors can help shape the design of AI systems so that they clearly explain the model’s logic, giving users an accurate mental model of how the system reaches its conclusions for diagnosis and treatment. This will allow human team members to spot potential system issues, e.g. poor input data and the ill-informed decisions that stem from it. It will also give users an opportunity to train the AI on the mistakes it has made so they can be rectified in future.
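There are many ways to surface a model’s logic to users. As one hedged example, the sketch below uses scikit-learn’s permutation importance on a toy model to show which inputs are driving its predictions; the data and clinical-sounding feature names are invented for illustration.

```python
# Toy explainability sketch using permutation importance from scikit-learn.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.inspection import permutation_importance

X, y = make_classification(n_samples=500, n_features=4, random_state=0)
feature_names = ["age", "blood_pressure", "biomarker_1", "biomarker_2"]  # hypothetical

model = LogisticRegression(max_iter=1000).fit(X, y)

# How much does shuffling each feature degrade performance?
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
for name, importance in sorted(zip(feature_names, result.importances_mean),
                               key=lambda pair: pair[1], reverse=True):
    print(f"{name}: {importance:.3f}")
```

Presenting a ranking like this alongside a recommendation gives clinicians something concrete to question when a driver of the decision looks implausible.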

Transparency is key

Understanding the logic behind an AI model’s decision-making is key to users building an appropriate mental model.


Final thoughts

Beyond this, there is a plethora of areas for Human Factors to explore when integrating AI into healthcare, with Human-AI teaming, effective staff training and user experience design being just a few. These factors are explored further in the International Ergonomics Association’s seminar, ‘The future of work and challenges for HFE from a UK perspective’, co-presented by myself and Nigel Heaton.

While AI offers tremendous promise for the future of healthcare, it may not be a panacea for the systemic ailments that institutions such as the NHS face. Human Factors can certainly help it offer a bit of pain relief, though.
