FARVEL BIG TECH
Something a bit worrying to note about using AI in healthcare.

Uncategorized
29 Posts, 12 Posters, 0 Views
This thread has been deleted. Only users with topic-management privileges can see it.
  • bloodflowersburning@mastodon.nz wrote:

    Something a bit worrying to note about using AI in healthcare.

    I’ve had two specialist appointments recently, both using AI to transcribe. Both sent report letters with inaccuracies about my diagnoses and past medical history. Even my GP was like, “huh, that directly contradicts what I put in the referrals.”

    I have followed up on both and requested amendments (which were done), but if I hadn’t, these inaccuracies could have significantly damaged ongoing care, further treatment, or insurance claims.

    Human error has always been a factor, but both doctors were clearly using the AI software and assuming what it spat out was correct. They made no other notes during the appointments to cross-reference and double-check. This is how Very Bad Things can happen.

    nzjayzee@mastodon.nz wrote:
    #3

    @bloodflowersburning Ask GPT to write code for you. Same problem: it forgets or invents context, addresses a different problem, gets fixated on minutiae when the problem is structural. And if you even hint at what you think the Delphi 2.0 will fart out, you get "CONGRATULATIONS YOUR A GENIUS" 😕

    bloodflowersburning@mastodon.nz wrote:
    #4

    @nzJayZee Sadly these are all phrases I don’t have much frame of reference for, but I’m assuming the TL;DR is basically “AI hallucinates most of the content it spews out and generally makes things worse”?


    nzjayzee@mastodon.nz wrote:
    #5

    @bloodflowersburning Yes. In my experience it does make things worse by inventing and obsessing over bad solutions (it keeps returning to the same shitty solution to a programming problem). What keeps me up at night is algorithmic cruelty when it's used in things like job applications or negotiating social services: "Computer Says No".

  • bloodflowersburning@mastodon.nz wrote:

    Two friends have told me in the last week that they’ve had similar issues. One had an incorrect diagnosis listed before a procedure. The other’s letter noted a viral rather than bacterial infection (although they did at least get the medication they needed). I feel like I’m being a pain in the bum going over everything and requesting corrections, but I’m seeing so many mistakes, to the point where any human reading them would immediately say “that doesn’t even make sense”. I worry for those who don’t check these things, or aren’t capable of it. Sure, using AI might save the docs 10 minutes per patient in the ER, but is that really worth the risks?

    theron29@witter.cz wrote:
    #6

    @bloodflowersburning Where are doctors using AI like this? And how did you check the transcripts? 🤔


    jd38@mastodon.social wrote:
    #7

    @bloodflowersburning "Your future doctors are studying using ChatGPT." I don't think I've ever wanted more for something to collapse than AI.


    bloodflowersburning@mastodon.nz wrote:
    #8

    @theron29 Genuine question or scepticism? I’m in Aotearoa NZ. The doctors were from two different specialist medical departments. Both used AI software to record the consultation and take notes. The report letters sent to my GP contained multiple discrepancies about conditions discussed and referenced in the GP referrals. If they had checked before sending, they would have realised mistakes had been made. My GP questioned the content, which is how I became aware. I can provide several specific examples but would rather not on a public forum to a stranger. However, both letters were reassessed and sent again with corrections on request. Hope this helps.


    bloodflowersburning@mastodon.nz wrote:
    #9

                @nzJayZee quoted from this article on RNZ: “He said jobseekers were using AI to generate their applications, while employers were using AI to read them.”
                The snake is eating its own tail.

                https://www.rnz.co.nz/news/business/590746/jobseekers-and-advocates-disturbed-as-companies-screen-applications-with-ai


    bloodflowersburning@mastodon.nz wrote:
    #10

                  @JD38 I think most students of all disciplines are now.


    theron29@witter.cz wrote:
    #11

    @bloodflowersburning Genuine question (from central EU). (AI scepticism is expected to arrive a bit later... 🙂)
    Doctors are not using AI here yet. I'd guess this tech had to be certified and tested before being admitted into the doctors' realm?

    Although this does not seem like the worst use case for AI, your detailed explanation raises doubts about whether the tech is actually ready for anything *this* important... 😏


    bloodflowersburning@mastodon.nz wrote:
    #12

    @theron29 Agreed. Not the worst-case scenario. For me personally it could have caused issues with further treatment, getting reimbursed by insurance, and confusion when needing ongoing care with other providers. So more an avoidable inconvenience and extra paperwork than a dangerous outcome in this example. I hope that’s the worst possibility across the board, and that people check their notes carefully to catch any inconsistencies.

    Mistakes in medical notes have always happened; unfortunately they’re inevitable. Only time will tell whether this becomes more of an issue if/when AI transcription is used in medical settings more frequently, and whether it generates a higher number of errors than human note-taking. What I think is essential is that we retain a human buffer to assess factual accuracy, rather than simply assuming (hoping?) the software can do it better.

    For more info, the software Heidi AI Scribes has been endorsed for use within Health NZ. https://www.tewhatuora.govt.nz/health-services-and-programmes/digital-health/generative-ai-and-large-language-models#naiaeag-endorsed-tools


    jd38@mastodon.social wrote:
    #13

    @bloodflowersburning Yeah, I was just too lazy to write “multiple disciplines” 😅


    nzjayzee@mastodon.nz wrote:
    #14

    @bloodflowersburning I like the idea of applicant pushback. For something like NZ$40/month you can have all the "job application agents" via Claude. Totally agree about the snake eating its tail. We should be building community resilience instead of data centres, IMO.


    bloodflowersburning@mastodon.nz wrote:
    #15

                            @nzJayZee careful now, “community” seems to be a dirty word in some circles. Don’t be that radical lefty reminding people to be kind and care for others. 😉


    nzjayzee@mastodon.nz wrote:
    #16

                              @bloodflowersburning I'd never! The market knows best.


    nzjayzee@mastodon.nz wrote:
    #17

    @bloodflowersburning When you help someone with their groceries/stairs/anything, or call an ambulance when someone's hurt, the most important thing shouldn't be "how am I compensated?" David Graeber called this (deliberately provocatively) "baseline communism": it's why, when two people working in a repair shop go "pass me the wrench" / "OK", they aren't entering into a wrench contract.


    nzjayzee@mastodon.nz wrote:
    #18

                                  @bloodflowersburning (I know that's not how NZ works, and I feel sad about it)


    mamalake@beige.party wrote:
    #19

                                    @bloodflowersburning I also see this as a HIPAA violation


    bloodflowersburning@mastodon.nz wrote:
    #20

    @MamaLake Unfortunately HIPAA doesn't apply under New Zealand law. But I think it's covered under the Health Information Privacy Code 2020 (HIPC), as Health NZ has authorised the use of specific tools (Heidi AI Scribe) in healthcare.


    kimcrawley@zeroes.ca wrote:
    #21

                                        @bloodflowersburning

                                        Please check out https://stopgenai.com


    bloodflowersburning@mastodon.nz wrote:
    #22

    @kimcrawley Interesting initiative. Is there any section in particular you’d like me to focus on?

    My plan going forward is to refuse the use of AI when recording medical consultations, to take my own notes (as a disability accessibility need), and to keep checking everything for inconsistencies/mistakes.

                                          Powered by NodeBB Contributors
                                          Graciously hosted by data.coop