ChatGPT’s ‘hallucinations’ make it difficult to trust for advice on cancer care

Patients and doctors should not rely too heavily on OpenAI's ChatGPT for cancer treatment advice: a new study finds that the popular artificial intelligence (AI) chatbot often weaves incorrect and correct information together, rendering its recommendations unreliable.

Researchers from Brigham and Women's Hospital, part of the Mass General Brigham healthcare system, set out to shed light on ChatGPT's limitations when it comes to recommending cancer treatments. Their findings, published in JAMA Oncology, show that the chatbot often provides recommendations that do not align with established guidelines, raising concerns about the reliability of its advice not only for cancer treatment but potentially for other medical questions as well.
