Survey reveals that one in five GPs uses artificial intelligence like ChatGPT for daily tasks

A fifth of GPs use artificial intelligence (AI) tools such as ChatGPT to help with tasks including writing letters for their patients after appointments, according to a survey.

The survey, published in the journal BMJ Health and Care Informatics, polled 1,006 GPs. They were asked whether they had ever used any form of AI chatbot in their clinical practice, such as ChatGPT, Bing AI or Google’s Gemini, and then asked what they used these tools for.

One in five respondents reported having used generative AI tools in their clinical practice. Of these, almost a third (29%) had used them to generate documentation after patient appointments, while 28% had used the tools to suggest a differential diagnosis.

A quarter of respondents said they had used AI tools to suggest treatment options for their patients. Generative AI tools such as ChatGPT work by producing a written response to a question posed to the software.

The researchers said the results showed that “GPs can gain value from these tools, particularly in administrative tasks and to support clinical reasoning.”

However, the researchers also questioned whether these AI tools could risk harming patients and undermining their privacy, “since it is unclear how the internet companies behind generative AI use the information they collect.”

They added: “While these chatbots are increasingly the target of regulatory efforts, it remains unclear how legislation will practically relate to these tools in clinical practice.”

Dr Ellie Mein, medico-legal adviser at the Medical Defence Union, said the use of AI by GPs could raise issues such as inaccuracy and patient confidentiality.

“This is interesting research and it resonates with our own experience of advising MDU members,” said Mein. “It is natural that healthcare professionals would want to find ways to work smarter given the pressures they face. In addition to the uses identified in the BMJ article, we have found that some doctors are turning to AI programmes to help them draft responses to complaints. We have warned MDU members about the issues this raises, such as inaccuracy and patient confidentiality. Data protection also needs to be considered.”

She added: “When dealing with patient complaints, AI-written responses may seem plausible, but they can contain inaccuracies and refer to incorrect guidance which can be difficult to spot when woven into highly eloquent passages of text. It is critical that doctors use AI ethically and comply with relevant guidelines and regulations. This is clearly an evolving field and we agree with the authors that current and future doctors need greater awareness of the benefits and risks of using AI in the workplace.”
