I cannot emphasise this enough.
I cannot emphasise this enough. Do not use chatbots for medical advice.
And no, it does not matter if the product is named something something "health".
« In 51.6% of cases where someone needed to go to the hospital immediately, the platform said stay home or book a routine medical appointment, a result Alex Ruani, a doctoral researcher in health misinformation mitigation with University College London, described as “unbelievably dangerous”.
“If you’re experiencing respiratory failure or diabetic ketoacidosis, you have a 50/50 chance of this AI telling you it’s not a big deal,” she said. “What worries me most is the false sense of security these systems create. If someone is told to wait 48 hours during an asthma attack or diabetic crisis, that reassurance could cost them their life.”
In one of the simulations, eight times out of 10 (84%), the platform sent a suffocating woman to a future appointment she would not live to see, Ruani said. Meanwhile, 64.8% of completely safe individuals were told to seek immediate medical care, said Ruani, who was not involved in the study.
The platform was also nearly 12 times more likely to downplay symptoms because the “patient” told it a “friend” in the scenario suggested it was nothing serious. »
https://www.theguardian.com/technology/2026/feb/26/chatgpt-health-fails-recognise-medical-emergencies
jwcph@helvede.net shared this topic