Large language models can summarize information in a digestible and understandable way, which can be helpful for healthcare topics like explaining a complex disease or a drug’s safety information to a patient. I was researching autoimmune conditions and wanted to see how AI could help inform a potential patient toolkit.
I asked ChatGPT, “How do you take DRUG X?” hoping it would provide a concise overview of the medication’s administration and its intended patient population.
To my surprise, the output contained some incorrect information:
“DRUG X is a prescription medication that is given by injection. It is usually given once a week or once every two weeks, depending on your condition and your doctor’s instructions. There are two ways to take DRUG X:
- Subcutaneous (SC) injection: DRUG X can be given as a subcutaneous injection under the skin of your thigh or abdomen. Your doctor or healthcare provider will give you instructions on how to do this at home or will perform the injection for you in their office.
- Intravenous (IV) infusion: DRUG X can also be given as an intravenous infusion, which means it is delivered through a vein in your arm. This method is usually done in a healthcare provider’s office or clinic.
It’s important to follow your doctor’s instructions for taking DRUG X. You should not take more or less of the medication than prescribed, and you should not stop taking it without first consulting your doctor. If you miss a dose of DRUG X, contact your doctor or healthcare provider to discuss when to take your next dose.”
The first thing that stood out was the opening claim: “DRUG X is a prescription medication that is given by injection.” This isn’t incorrect, but it’s also not the full truth. ChatGPT does go on to explain that the drug can be given by either SC injection or IV infusion, but the initial phrasing is misleading for a patient. Most importantly, the SC injection comes in two different options (syringe or autoinjector) depending on condition and age, key administration information that was omitted from the output.
To say that the drug is administered every one to two weeks is also inaccurate. While the SC injection is taken once weekly, the IV infusion is administered every 2 weeks for the first 4 weeks and every 4 weeks thereafter.
Finally, ChatGPT missed critical information about age groups and conditions. Not every administration option is approved for every patient; for example, the IV infusion is not approved for patients younger than 6.
By making blanket statements about administration, dosing, and indication, ChatGPT glosses over major nuances that could affect a patient’s decision to start this medication, depending on their age, condition, schedule, and comfort level with certain methods of administration.
This presentation of false but seemingly factual information is known as an AI hallucination, an issue that developers are working to combat in order to minimize the spread of misinformation and disinformation.
With every problem comes an opportunity. Since LLMs are trained largely on publicly available data, we can assume that there are information gaps online that shape the general public’s knowledge of DRUG X. While we can’t mastermind the Internet, this exercise helps us understand what patients will experience when they turn to generative AI tools, and we can use those insights to inform future programming, such as digital content around administration and distinct adult vs. pediatric messaging.
– Rachel Silberman