FARVEL BIG TECH

Amazon have reported "hundreds of thousands" of pictures of child sexual abuse material found in shared AI training data... but is refusing to tell regulators which data sets.

Uncategorized · 40 Posts · 32 Posters · 35 Views
• wall_e@ioc.exchange

    @troed @GossiTheDog plot twist of the year would be if the "dataset" they're talking about turned out to be "any image file uploaded to an S3 bucket between 2022 and today" 😬

troed@swecyb.com wrote (#26):

    @wall_e

    _That_ I could believe!

    @GossiTheDog

• gossithedog@cyberplace.social

      Amazon have reported "hundreds of thousands" of pictures of child sexual abuse material found in shared AI training data... but is refusing to tell regulators which data sets.

      If you're using generative AI tools, there's a pretty good chance you're generating imagery with child porn training data behind the scenes.
      https://www.bloomberg.com/news/features/2026-01-29/amazon-found-child-sex-abuse-in-ai-training-data

tehstu@hachyderm.io wrote (#27):

      @GossiTheDog I didn't have "CSAM at scale is unavoidable" on my 2026 bingo card.

• imbrium_photography@mastodon.social

@masek @GossiTheDog But have they plundered Amazon S3 customer data that the customers had set as private?

atlovato@mastodon.social wrote (#28):

@imbrium_photography @masek @GossiTheDog - I like the word that you have used: "Plundered". Private data that was set to private.

• drhyde@fosstodon.org

          @GossiTheDog @scottgal they say they're not training on it, it was detected before training. But that's not the point. Amazon got the stuff from somewhere, and a decent person would report where it came from so that the rozzers can trace it back upstream. I flat out don't believe Amazon's claim to not know where it came from, they must know, because they must have got copyright clearance for making a derivative work from all that content 😉

atlovato@mastodon.social wrote (#29):

          @DrHyde @GossiTheDog @scottgal - Or Plundered Data.

• scottgal@hachyderm.io

@GossiTheDog BUT for certain types of AI it obviously would be. THOSE need to exist in a regulated way and be made open source. Like current PII scrubbing models, it's a public good, but I don't know any commercial company who COULD do it. Orthogonal, sorry, but it just occurred to me... how do you get those models?

atlovato@mastodon.social wrote (#30):

            @scottgal @GossiTheDog 👍

• gossithedog@cyberplace.social

cnx@awkward.place wrote (#31):

If you’re using ~~generative AI tools~~ applied statistics, there’s a pretty good chance you’re ~~generating imagery with~~ supporting the distribution of child porn training data behind the scenes.

              FTFY, @GossiTheDog@cyberplace.social

• gossithedog@cyberplace.social

ralph@hear-me.social wrote (#32):

                @GossiTheDog

                ALT TEXT:

                Bloomberg
                Amazon Found 'High Volume' Of Child Sex Abuse Material in AI Training Data.
                The tech giant reported hundreds of thousands of cases of Child Sex Abuse Material but won’t say where it came from.

• gossithedog@cyberplace.social

jer@chirp.enworld.org wrote (#33):

                  @GossiTheDog That article is full of red flags from Amazon. They claim they have a "lower threshold" so they're "overreporting" but not providing info on the source of the images?

                  That sounds like they're trying to break NCMEC's reporting system either through malice or incompetence.

                  Also it sounds like they're not keeping the provenance of the data they're using - which strongly suggests that they're not obtaining that data in a legal manner

• gossithedog@cyberplace.social

nxskok@cupoftea.social wrote (#34):

                    @GossiTheDog and, every one of those pictures has been seen and classified by a minimum-wage worker in the third world so that the user doesn't get to see it (at a predictable cost to said third-world worker's mental health).

• gossithedog@cyberplace.social

syllopsium@peoplemaking.games wrote (#35):

                      @GossiTheDog 'is refusing to tell regulators'?

                      Good luck with that if there are any datasets in the UK. Time for arrests and seizure of machines.

                      It should be the same in the US, but of course nothing comes before the 'mighty' dollar

• gossithedog@cyberplace.social

ggmcbg@mstdn.plus wrote (#36):

                        @GossiTheDog

                        The sets are what they stole from billionaires and senators' sons. Even themselves.

                        What the fuck is wrong with people? No one gets to convince me we ain't the worst disease this planet must suffer.

• gossithedog@cyberplace.social

gkrnours@mastodon.gamedev.place wrote (#37):

@GossiTheDog I wonder if they found the data crawling their user storage and they don't want to talk about it, to keep the patient money.

• gossithedog@cyberplace.social

landelare@mastodon.gamedev.place wrote (#38):

                            @GossiTheDog "refusing to tell regulators which data sets"

                            In what world is this not criminal, and why are we living in that one?

• sassinake@mastodon.social

                              @GossiTheDog

                              well there's your Epstein files right there!

corax42@mastodon.social wrote (#39):

                              @Sassinake @GossiTheDog Scraped from a DoJ server left unsecured by DOGE? Everything's possible with these people

• gossithedog@cyberplace.social

technicaladept@techhub.social wrote (#40):

@GossiTheDog Famously, generative AI has been hilariously bad at producing a picture of a glass of wine that's anything other than about half full. Ask for one that's full or nearly empty and it can only show you ones that match its training data, where all the glasses show a tasteful measure. And good luck asking for a clock face that doesn't show seven minutes past ten. It just can't extrapolate.

However, ask it what a naked child looks like and it's remarkably good at it. Why? Well, ask the people who tripped CSAM filters by downloading image training data. Dear Elon, why is Grok so good at making child porn? Did you train it on your own kids or ours?

And telling the interface not to show you the filthy kiddie pics that it's gathered is a bit like selling a porn magazine and asking customers not to look at pages 12-27 because you accidentally abused some kids when you made it.

• pelle@veganism.social shared this topic