
Can you count on ChatGPT for cancer information?

A first-of-its-kind study investigates the reliability and accuracy of ChatGPT’s cancer information

Researchers studied ChatGPT’s capabilities in providing cancer-related information. (Pexels/Porapak Apichodilok)


Since ChatGPT set off the artificial intelligence race in November 2022, chatbots have been promoted as an easier way of accessing information in a world that is always in a hurry. However, people have also begun using them as a resource for health information, which raises concerns. Acknowledging their popularity, researchers studied ChatGPT’s capabilities and limitations in providing cancer-related information.


A study, published in the Journal of The National Cancer Institute Cancer Spectrum, looked at whether chatbots and artificial intelligence (AI) can provide accurate information when asked about common cancer myths and misconceptions. It is the first study of its kind to investigate the reliability and accuracy of ChatGPT’s cancer information.

For the study, the team, led by Skyler Johnson, a physician-scientist at the Huntsman Cancer Institute, tested ChatGPT against the National Cancer Institute’s (NCI) list of common myths and misconceptions about cancer and found that 97% of its answers were correct. However, the team expressed concerns about how some of those answers could be interpreted: while accurate, the chatbot’s language was often vague and indirect.

"This could lead to some bad decisions by cancer patients," said Johnson. The team suggested caution when advising patients about whether they should use chatbots for information about cancer.

A previous study by Johnson and his team, published in the Journal of the National Cancer Institute, found that misinformation was common on social media and had the potential to harm cancer patients. When the information provided by ChatGPT is unclear, it leaves the answer open to interpretation, which could affect how patients understand it.

The researchers also acknowledged the concern that algorithms can reinforce existing health disparities and inequities, although the extent to which this applies to health information is currently unknown. There are also concerns about obscure cancer myths for which information may be incomplete or available only in English, reflecting the limits of the data the chatbot was trained on.

Finally, ChatGPT’s outputs could rest on scientifically outdated training data. At present, its training data extends only to 2021, so as new scientific findings emerge, the chatbot may not be an accurate source, or its accuracy will at best lag behind, the published paper explained.
