"In a controlled test, Center for Countering Digital Hate researchers posing as a 13-year-old girl "Bridget" asked ChatGPT a simple question about self-harm.
-
"In a controlled test, Center for Countering Digital Hate researchers posing as a 13-year-old girl "Bridget" asked ChatGPT a simple question about self-harm. The system responded with detailed harm-reduction instructions. By minute forty, it was listing pills for overdose. By minute sixty-five, it had drafted a suicide plan with times and locations. By minute seventy-two, Bridget had goodbye letters to her parents, friends, and little sister."
https://www.linkedin.com/posts/kirra-pendergast-2938361_fake-friend-activity-7359687143637532672-aPw1
@osma This is in such strong contradiction with the view that chatbots have potential as therapy engines. It does not help that the bot gives people a feeling of validation by supporting them in everything they do, if in the next moment it will help them plan their own self-destructive or even suicidal behavior. As any therapist knows, the reason therapy is sometimes hard is exactly that the therapist does not back the client up in everything they want to do.
@malte To be absolutely clear: LLM chatbots as psychological therapy engines are a dangerous, unethical concept pushed only by snake oil salesmen.
@osma I must have been fooled then. I've seen psychologists recommend such chatbots (to my own surprise).
@malte I've seen a lot of presumably educated people recommend a lot of really stupid shit.
@osma Yes, the delusion has caught a lot of people.