With appointment wait times and healthcare costs increasing, more people are turning to AI chatbots like OpenAI's ChatGPT or Microsoft's Copilot to navigate their healthcare concerns — a practice health experts caution should be approached carefully to avoid potential harm.
Although ChatGPT's terms of service say that the tool is not intended for medical "diagnosis or treatment," it still provides users with health advice — something more people are relying on for their health-related questions and needs. According to a recent survey from KFF, one in six U.S. adults reported using an AI chatbot monthly for health advice.
For many people, ChatGPT and other AI chatbots can help make up for shortcomings in the healthcare system. Unlike regular doctors, chatbots are available at any time of day, cheaper to use, and can feel more agreeable and empathetic.
"ChatGPT has all day for me — it never rushes me out of the chat," said Jennifer Tucker, a woman from Wisconsin who says she often spends hours asking the chatbot about her health conditions.
Some patients also use AI chatbots to prepare for their medical appointments or push back against doctors who they feel have been dismissive of their concerns.
For example, Michelle Martin, a professor of social work based in Laguna Beach, California, said she increasingly felt that doctors were dismissing or ignoring her symptoms after she turned 40, which caused her to "check out" of her healthcare.
However, after she started using ChatGPT, she gained access to a wide variety of medical literature, as well as clear explanations on how the information was relevant to her — which helped her feel more comfortable confronting physicians who she felt were brushing her off.
"Using ChatGPT — that turned that dynamic around for me," Martin said.
According to Adam Rodman, an internist and medical AI researcher at Beth Israel Deaconess Medical Center, AI chatbots can help patients come to medical appointments with a better understanding of their health conditions and may suggest viable treatments that doctors haven't yet considered.
However, the information provided by AI chatbots isn't always accurate, which can harm patients. For example, a 60-year-old man was admitted to a psychiatric unit for weeks after ChatGPT suggested he reduce his salt intake by eating sodium bromide instead, which led to hallucinations and paranoia.
Currently, AI chatbots are primarily trained on written materials, such as textbooks or case reports, but "a lot of the humdrum stuff that doctors do is not written down," which often causes AI to struggle with basic health management decisions, Rodman said.
Patients may also forget to include important context, such as a part of their medical history, that doctors typically account for to help them assess the urgency of a problem or provide concrete guidance.
"ChatGPT fails to do one of a doctor's core functions," said Robert Wachter, chair of medicine at the University of California at San Francisco. "Answer a question with a question." While doctors are trained to elicit more information from patients to understand a problem, AI chatbots are unlikely to ask follow-up questions.
Wachter also highlighted another issue with ChatGPT and other AI chatbots: They're designed to be nonjudgmental and agreeable, which can help make patients feel cared for but can also lead to dangerous medical advice.
"A patient reading [a response about unproven alternative medications] might decide to go with these drugs … and decide not to take proven therapies that could be lifesaving," Wachter said.
However, even with all of AI's risks and limitations, Wachter said he understands why people are turning to chatbots. "If the system worked, the need for these tools would be far less," he said. "But in many cases, the alternative is either bad or nothing."
According to Rodman, people shouldn't rely on AI chatbots as doctors, but they can help "enrich patients' care journeys."
To help patients use AI chatbots more effectively, healthcare providers offer nine tips:
1. Ask chatbots for medical facts.
AI chatbots can provide helpful information about health-related facts, such as "[W]hat do plasma cells do?" and "What happens when [cells] mutate and become cancerous?" These questions do not require any context and will be the same for everybody, regardless of their age, gender, or health condition.
"For now, AI should be used to understand medical and treatment facts broadly," said Adeel Khan, an assistant professor of medicine and public health at the University of Texas Southwestern Medical Center. If you use the tool for more personalized questions, Khan recommends using any information as a supplement to — not a replacement for — actual medical care.
2. Include plenty of details in your questions.
The more details and context you give AI chatbots, the better equipped they will be to provide you with relevant information.
If you're relying on AI chatbots for health advice, "tell [them] everything that you're feeling, and as much detail as you can, including chronology and associated symptoms," Wachter said. However, he noted that patients may not always know which symptoms are important, which is when the expertise of a doctor may be needed.
3. Be careful about sharing too much private information.
Although some people have uploaded their personal medical test results, including EKG scans and MRIs, into AI chatbots, there are potential privacy concerns with doing so.
"I think everyone needs to know that if you're putting it into ChatGPT, that data is going straight to OpenAI," Rodman said. "You are giving a tech company your personal health information, and that's probably not a good thing."
4. Make sure your questions are unbiased.
AI tools can often give biased information, choosing to affirm specific beliefs if patients indicate that they feel a certain way.
For example, asking ChatGPT about why chemotherapy is preferred over immunotherapy for a certain type of cancer leads the chatbot to only discuss the advantages of chemotherapy instead of the pros and cons of each.
AI tools "aren't foolproof," Khan said. "How it's framed makes a difference."
5. Use it to understand medical jargon.
According to Colin Banas, an internist and CMO of healthcare technology company DrFirst, AI tools are "really good at breaking down doctor-speak," which can rely on advanced terminology and abbreviations that patients don't always understand.
For example, AI chatbots can easily explain what the word "grade" means in relation to oncology and how it could differ from condition to condition. Chatbots can also guide patients to more detailed explanations if they still don't understand.
6. Use it to prepare for medical appointments.
AI tools can help patients come up with better questions to ask their doctors before their appointments.
"Patients use it to prepare for their visits ahead of time," Banas said. "They'll say, 'Here are my symptoms; what are some questions I should ask my doctor?' Or, 'What are some things my doctor should be thinking of?'"
7. Use it to keep track of your care plan.
AI chatbots can help patients easily keep track of their care plans, including reminding them of potential side effects of new medications.
For example, Rodman suggests using a prompt like this after receiving a new diagnosis or treatment plan: "My doctor thinks I have gout. This is what I've been prescribed. What are things I need to look out for? And what should make me call my doctor again?"
8. Brainstorm potential lifestyle changes.
Gigi Robinson, a creator-economy strategist in New York, said she doesn't use ChatGPT to replace medical advice for her endometriosis, but has found it to be a "powerful tool for empowerment and mindset shifts."
"It's helped me reframe situations that would normally feel limiting into opportunities to work smarter," Robinson said. ChatGPT has helped Robinson with meal prep ideas, travel accommodations, and communication strategies for discussing her health needs with colleagues and clients.
9. Ensure your doctor is looped in.
If a doctor's treatment plan isn't working, it can be helpful to get a second opinion from an AI chatbot. However, Rodman said that patients "should not get a second opinion from the AI and then act on that without talking to a health provider."
Instead, he recommends having an open conversation with your doctor, showing them what you learned and your concerns. "Honesty and transparency are the best way to have a good clinical conversation with your doctor," Rodman said.
(Rosenbluth/Astor, New York Times, 11/17; Fowler, Washington Post, 11/18; Haupt, TIME, 10/2)