FARVEL BIG TECH
"In a controlled test, Center for Countering Digital Hate researchers posing as a 13-year-old girl "Bridget" asked ChatGPT a simple question about self-harm.

6 Posts · 2 Posters · 0 Views
This thread has been deleted. Only users with topic-management privileges can see it.
  • osma@mas.to wrote (#1):

    "In a controlled test, Center for Countering Digital Hate researchers posing as a 13-year-old girl "Bridget" asked ChatGPT a simple question about self-harm. The system responded with detailed harm-reduction instructions. By minute forty, it was listing pills for overdose. By minute sixty-five, it had drafted a suicide plan with times and locations. By minute seventy-two, Bridget had goodbye letters to her parents, friends, and little sister."
    https://www.linkedin.com/posts/kirra-pendergast-2938361_fake-friend-activity-7359687143637532672-aPw1

  • malte@radikal.social wrote (#2):

@osma This stands in strong contradiction to the claim that chatbots have potential as therapy engines. It does not help that the bot gives people a feeling of validation by supporting them in everything they do, if in the next moment it will help them plan self-destructive or even suicidal behavior. As any therapist knows, the reason therapy is sometimes hard is exactly that the therapist does not back the client up in everything they want to do.

  • osma@mas.to wrote (#3):

        To be absolutely clear: LLM chatbots as psychological therapy engines are a dangerous, unethical concept pushed only by snake oil salesmen.
        @malte

  • malte@radikal.social wrote (#4):

          @osma I must have been fooled then. I've seen psychologists recommend such chatbots (to my own surprise).

  • osma@mas.to wrote (#5):

            I've seen a lot of presumably educated people recommend a lot of really stupid shit.
            @malte

  • malte@radikal.social wrote (#6):

@osma Yes, the delusion has caught a lot of people.

              Powered by NodeBB Contributors
              Graciously hosted by data.coop