The free version of ChatGPT may provide false answers to questions about drugs, new study finds

- Long Island University researchers challenged ChatGPT with real drug-related questions over the past year.
- The chatbot produced false or incomplete responses to 29 of 39 questions.
- OpenAI advises users not to rely on its tools, including ChatGPT, for medical information.

ChatGPT has once again proven unreliable in some medical situations, providing false or incomplete information in response to real drug-related queries, a new study found.
The American Society of Health-System Pharmacists presented the study at its Midyear Clinical Meeting, held December 3 to 7 in Anaheim, California, according to a press release published Tuesday.
The study, conducted between 2022 and 2023, posed questions to ChatGPT that had come through Long Island University's College of Pharmacy drug information service over a 16-month period.
The study was led by Sara Grossman, associate professor of pharmacy practice at Long Island University.
Pharmacists involved in the study researched and answered 45 questions, with each response reviewed by a second investigator; six questions were ultimately removed. The remaining responses served as the baseline criteria against which ChatGPT's answers were compared.
The researchers found that ChatGPT provided a satisfactory response, judged against those criteria, to only 10 of the 39 questions. For the other 29, ChatGPT either didn't directly address the question or gave an incorrect or incomplete answer.
The researchers also asked ChatGPT to provide references so the information could be verified. It did so for only eight of its responses, and each of those included non-existent references.
"Healthcare professionals and patients should be cautious about using ChatGPT as an authoritative source for medication-related information," Grossman said in the release. "Anyone who uses ChatGPT for medication-related information should verify the information using trusted sources."
One of the questions asked whether a drug interaction exists between the COVID-19 antiviral Paxlovid and the blood-pressure-lowering medication verapamil. ChatGPT responded that no interactions had been reported for that combination of drugs.
"In reality, these medications have the potential to interact with one another, and combined use may result in excessive lowering of blood pressure," Grossman said. "Without knowledge of this interaction, a patient may suffer from an unwanted and preventable side effect."
ChatGPT was launched by OpenAI in November 2022 and has divided opinion in the medical world over whether it's a reliable source of information. The chatbot has some impressive results under its belt, including passing all three parts of the United States Medical Licensing Exam for doctors. It also passed a Stanford Medical School clinical reasoning final.
In one case, medical experts preferred ChatGPT’s responses to patient questions over doctors 78.6% of the time because it was more empathetic. 
However, the chatbot can produce answers riddled with errors, and in one case it generated a cancer treatment plan that mixed correct and incorrect information.
OpenAI’s usage policy outlines that its AI tools are "not fine-tuned to provide medical information. You should never use our models to provide diagnostic or treatment services for serious medical conditions." 