GPs are using artificial intelligence to write ‘fake apology’ letters to complaining patients, research shows

British doctors are using artificial intelligence to respond to patient complaints and make their jobs easier, according to a medical group.

A report by the Medical Defence Union (MDU), which provides legal advice to doctors, warns that “some doctors are turning to artificial intelligence programs such as ChatGPT to draft responses to their complaints.”

The organisation says doctors have been “attracted” by the opportunity to “make everyday tasks easier.”

But not only could this put sensitive patient information at risk by handing it over to an AI, it could also lead to inaccuracies and further upset patients, the MDU warns.

The group told MailOnline it had seen a “small number of cases” where doctors were using AI in this way and was issuing a “proactive” general warning to its members.


It said doctors should be particularly wary of AI offering “false apologies” that address a complaint only in generic terms, such as “I’m sorry you feel your care was poor,” rather than the specific points a patient raises.

Dr Ellie Mein, a medico-legal adviser at the MDU, said it was understandable why doctors, such as GPs, were turning to AI as a potential time-saving tool.

“With complaints increasing and the health service under immense pressure, it is only natural that healthcare professionals want to find ways to work smarter,” she said.

“There are many ways in which AI technology is being used to improve the quality of patient care, such as in health screening.

“But when it comes to responding to patients’ concerns, there’s no substitute for the human touch.”

She said the MDU was aware of cases where patients had discovered their doctor had used AI to respond to their complaint.

Before using these tools, doctors should consider how they would feel if a patient confronted them about it, she added.

“There have been cases where patients who were suspicious of the wording of a complaint response were able to reproduce the same text by asking an AI to draft a similar letter,” she said.

“Would you feel comfortable in this situation, and would the patient feel that you have taken their complaint seriously?”

Dr Mein added that while using AI as a prompt to begin drafting a response to a complaint was acceptable, there were a number of pitfalls to avoid, and the text it produced should not be relied upon uncritically.

She said doctors should be wary of the inaccuracies or American spellings produced by many US-based AI tools, which could clearly give away that a doctor was not responding to a complaint in an authentic manner.

Furthermore, doctors should not, under any circumstances, feed sensitive medical information about a patient into an AI, as this could breach UK data protection laws.

Dr Mein also warned that AI-generated responses may omit information a doctor is required to provide, such as details of who a patient can contact, for example a regulator, if they feel their complaint has not been addressed.

This comes as the number of official complaints from patients to NHS doctors has reached record levels.

In 2022-23, the latest year for which data is available, almost 229,500 written complaints were received by the NHS.

This represents an increase of 41 per cent on the roughly 162,000 recorded a decade earlier.

Of the 229,500 patient complaints recorded in 2022-23, the majority (around 126,000) were directed at NHS GPs or dentists.
