Doctors believe artificial intelligence has a role in healthcare but urge caution about its use as a chatbot, citing instances of inaccurate medical advice. Dr. Sina Bari, a practicing surgeon and AI healthcare leader at data company iMerit, recounted an experience in which a patient challenged a medication he had prescribed, presenting information from ChatGPT that claimed a 45% risk of pulmonary embolism.
Dr. Bari discovered that the statistic came from a study of a specific subgroup of tuberculosis patients, making it irrelevant to his patient's case. Despite that experience, Dr. Bari expressed optimism about OpenAI's recent announcement of ChatGPT Health, a dedicated chatbot designed for health-related conversations with enhanced privacy.
ChatGPT Health aims to provide a more secure environment for users to discuss their health concerns, ensuring their messages are not used to train the underlying AI model. "I think it's great," Dr. Bari said. "It is something that's already happening, so formalizing it so as to protect patient information and put some safeguards around it is going to make it all the more powerful for patients to use."
The medical community acknowledges the potential benefits of AI in healthcare, such as personalized guidance and improved access to information. However, experts emphasize the importance of verifying AI-generated information with qualified healthcare professionals. The concern stems from the possibility of AI models misinterpreting data or providing advice that is not applicable to an individual's specific medical condition.
ChatGPT Health, expected to roll out in the coming weeks, represents an effort to address these concerns by creating a more controlled and private environment for health-related AI interactions. The platform is designed to offer personalized guidance while safeguarding patient information. The development marks a step toward integrating AI into healthcare responsibly, with a focus on accuracy and patient safety.