According to a survey published in the journal BMJ Health &amp; Care Informatics, one in five GPs is using artificial intelligence (AI) tools such as ChatGPT to assist with tasks in clinical practice. The survey, which drew responses from 1,006 GPs, found that almost a third of those using generative AI tools were using them to generate documentation after patient appointments. A further 28% were using the tools to suggest alternative diagnoses, and a quarter to recommend treatment options.
While the researchers noted that GPs may find value in using these AI tools for administrative tasks and clinical reasoning, they also raised concerns about patient privacy and the potential for inaccuracies in AI-generated responses. Dr. Ellie Mein, a medico-legal adviser at the Medical Defence Union, highlighted the importance of ethical use of AI and compliance with relevant guidelines and regulations when using these tools in clinical practice.
Dr. Mein also noted that some doctors are using AI programs to help draft complaint responses, which could introduce inaccuracies and breach patient confidentiality if not properly reviewed. The researchers emphasized that doctors need to understand both the benefits and the risks of AI tools in healthcare settings, especially as the intersection of AI technology and patient care continues to evolve.