Dive Brief:
- Misuse of artificial intelligence-powered chatbots in healthcare has topped ECRI’s annual list of health technology hazards.
- The nonprofit ECRI, which shared its list Wednesday, said chatbots built on large language models, such as ChatGPT, can provide false or misleading information that could result in significant patient harm.
- ECRI ranked chatbot misuse ahead of sudden loss of access to electronic systems and the availability of substandard and falsified medical products on its list of the biggest hazards for 2026.
Dive Insight:
AI is a long-standing concern for ECRI. Insufficient governance of AI used in medical technologies placed fifth on the nonprofit’s rankings of the top hazards in 2024, and risks associated with AI topped its list last year.
The nonprofit shifted its focus to AI chatbots in its 2026 report. LLMs readily answer users’ health questions, and while their responses sound plausible, ECRI said the tools have suggested incorrect diagnoses, recommended unnecessary testing, promoted subpar medical supplies and invented body parts. One response seen by ECRI would have put a patient at risk of burns from incorrect electrode placement.
While AI chatbots are not validated for healthcare purposes, ECRI said clinicians, patients and other healthcare personnel are increasingly using the tools in that context. OpenAI said this month that more than 5% of all messages sent to ChatGPT are about healthcare and that a quarter of the chatbot’s 800 million regular users ask healthcare questions every week.
ECRI said users must recognize the limitations of the models and carefully scrutinize responses whenever using an LLM in a way that could influence patient care. While warning that chatbots are not substitutes for qualified medical advice or professional judgment, ECRI said higher healthcare costs and hospital or clinic closures could drive more people to rely on the tools.
The nonprofit named healthcare facilities’ lack of preparation for a sudden loss of access to electronic systems and patient information as the second-biggest hazard for 2026. Substandard and falsified medical products ranked third on ECRI’s list.
Most of ECRI’s seven other hazards relate to medical devices. The nonprofit is concerned that details of recalls and updates for diabetes technologies such as insulin pumps and continuous glucose monitors are taking too long to reach patients and caregivers. ECRI said manufacturers should provide product safety information in a clear, easy-to-understand form.
The nonprofit flagged cybersecurity risks from legacy medical devices as another hazard. Because legacy devices are widespread and few organizations can afford to replace them, ECRI recommended mitigations such as disconnecting the products from the network.