Using an AI Chatbot for Therapy or Health Advice? Experts Want You to Know These 4 Things

August 21, 2025 | Source: PBS News | by Laura Santhanam

As chatbots powered by artificial intelligence explode in popularity, experts are warning people against turning to the technology for medical or mental health advice instead of relying upon human health care providers.

Recent weeks have brought several examples of chatbot advice misleading people in harmful ways. A 60-year-old man accidentally poisoned himself and entered a psychotic state after ChatGPT suggested he eliminate salt, or sodium chloride, from his diet and replace it with sodium bromide, a toxic compound used for purposes such as treating wastewater. Earlier this month, a study from the Center for Countering Digital Hate revealed that ChatGPT gave teens dangerous advice about drugs, alcohol and suicide.

The technology can be tempting, especially given barriers to accessing health care such as cost, long waits to talk to a provider and lack of insurance coverage. But experts told PBS News that chatbots cannot offer advice tailored to a patient's specific needs and medical history, and that they are prone to "hallucinations," or giving outright incorrect information.

Here is what to know about the use of AI chatbots for health advice, according to mental health and medical professionals who spoke to PBS News.

