FARVEL BIG TECH
Something a bit worrying to note about using AI in healthcare.

29 posts · 12 posters · 0 views

This thread has been deleted. Only users with topic management privileges can see it.
  • bloodflowersburning@mastodon.nz:

    @nzJayZee quoted from this article on RNZ: “He said jobseekers were using AI to generate their applications, while employers were using AI to read them.”
    The snake is eating its own tail.

    https://www.rnz.co.nz/news/business/590746/jobseekers-and-advocates-disturbed-as-companies-screen-applications-with-ai

    nzjayzee@mastodon.nz (#14):

    @bloodflowersburning I like the idea of applicant pushback. For something like $40 NZD/mth you can have all the "job application agents" via Claude. Totally agree about the snake eating its tail. We should be building community resilience instead of data centers, IMO.


      bloodflowersburning@mastodon.nz (#15):

      @nzJayZee careful now, “community” seems to be a dirty word in some circles. Don’t be that radical lefty reminding people to be kind and care for others. 😉


        nzjayzee@mastodon.nz (#16):

        @bloodflowersburning I'd never! The market knows best.


          nzjayzee@mastodon.nz (#17):

          @bloodflowersburning When you help someone with their groceries/stairs/anything, or call an ambulance when someone's hurt, the most important thing shouldn't be "How am I compensated?" David Graeber called this (deliberately provocatively) "baseline communism"; it's why, when two people working in a repair shop say "pass me the wrench," the answer is "ok" instead of them entering into a wrench contract.


            nzjayzee@mastodon.nz (#18):

            @bloodflowersburning (I know that's not how NZ works, and I feel sad about it)

            • bloodflowersburning@mastodon.nz (original post):

              Something a bit worrying to note about using AI in healthcare.

              I’ve had two specialist appointments recently, both using AI to transcribe. Both sent report letters with inaccuracies about my diagnoses and past medical history. Even my GP was like, “huh, that directly contradicts what I put in the referrals.”

              I have followed up on both and requested amendments (which were done), but if I hadn’t, these inaccuracies could have significantly damaged ongoing care, further treatment, or insurance claims.

              Human error has always been a factor, but both doctors were clearly using the AI software and assuming what it spat out was correct. They made no other notes during the appointments to cross-reference and double-check. This is how Very Bad Things can happen.

              mamalake@beige.party (#19):

              @bloodflowersburning I also see this as a HIPAA violation


                bloodflowersburning@mastodon.nz (#20):

                @MamaLake Unfortunately HIPAA doesn't apply in New Zealand law. But I think it's covered under the Health Information Privacy Code 2020 (HIPC), as Health NZ have authorised the use of specific tools (Heidi AI Scribe) in healthcare.


                  kimcrawley@zeroes.ca (#21):

                  @bloodflowersburning

                  Please check out https://stopgenai.com


                    bloodflowersburning@mastodon.nz (#22):

                    @kimcrawley Interesting initiative. Is there any section in particular you’d like me to focus on?

                    My plan going forward is to refuse the use of AI when recording medical consultations, to record my own notes (as a disability accessibility need), and to keep checking everything for inconsistencies/mistakes.


                      kimcrawley@zeroes.ca (#23):

                      @bloodflowersburning

                      We have a mutual aid fund for people who lost their livelihoods, guides to avoiding Gen AI, upcoming support groups for chatbot addicts, all kinds of stuff.

                      Share our website. Join us. There's lots of things you can do.

                      Why just let Gen AI's horrors happen, when you can join forces with us and push back?

                      https://stopgenai.com


                        fylgja@mastodon.social (#24):

                        @bloodflowersburning thanks for the warning. My last couple appointments have used it too and I assumed providers would be double checking for errors, but maybe not. I'll be on the lookout. 🙃


                          medeavanamonde@beige.party (#25):

                          @bloodflowersburning this has happened to me recently

                          • bloodflowersburning@mastodon.nz (earlier post):

                            @JD38 I think most students of all disciplines are now.

                            redrobyn@mastodon.nz (#26):

                            @bloodflowersburning
                            Some of their lecturers are too.
                            The pharmacy school offers free medication reviews. The lecturer I saw used ChatGPT to summarise a paper. Isn't that what the abstract is for?
                            @JD38


                              connychiwa@mastodon.nz (#27):

                              @bloodflowersburning Yikes! That's really bad.
                              It's a good reminder to always check the notes on record after every appointment.
                              I think our GP gave us the option to decline use of the AI scribe. That should be the standard for everyone and part of the normal consent process.


                                silversnapples@mastodonapp.uk (#28):

                                @bloodflowersburning God, this was inevitable. It can't even narrate a reel on IG without entirely misreading whole words for others, even on official international accounts.
                                This is terribly dangerous. I hope you email your local government representative about this (cc in 'other' party representation in your area so they don't ignore it) and also file your concern with your medical ombudsman.

                                Thank you for sharing this.

                                • bloodflowersburning@mastodon.nz:

                                  Two friends have told me in the last week that they've had similar issues. One had an incorrect diagnosis listed before they had a procedure done. The other's letter noted a viral rather than a bacterial infection (although they did at least get the medication they needed). I feel like I'm being a pain in the bum going over everything and requesting corrections, but I'm seeing so many mistakes, to the point where any human reading them would immediately say "that doesn't even make sense". I worry for those who don't check these things, or aren't capable of checking them. Sure, using AI might save the docs 10 minutes per patient in the ER, but is that really worth the risks?

                                  cass_m@socialbc.ca (#29):

                                  @bloodflowersburning It's amazing NZ would authorise something worse than simple voice-to-text transcription for doctor notes. But I'm old school; I still do searches and visit sites like Cleveland for medical guidance.

                                  • jwcph@helvede.net shared this topic