FARVEL BIG TECH

"On the acceptance of GenAI"
https://smallsheds.garden/blog/2026/on-the-acceptance-of-genai/

78 posts · 44 posters · 0 views

This thread has been deleted. Only users with topic management privileges can see it.
  • #12 · mrbase@phpc.social wrote:

    In reply to buckfiftyseven@mastodon.social: "@ai6yr @tante it seems pretty similar, doesn't it? Taking what you want from a website, regardless of the host's intentions?"

    @buckfiftyseven @ai6yr @tante I think most people running an adblocker are doing so to block data brokers, not the ad itself. Privacy is as much a part of the equation here as the actual ad.
  • #13 · buckfiftyseven@mastodon.social wrote:

    @mrbase @ai6yr @tante we definitely ended up in an unsatisfactory situation with respect to ads, brokers, and blockers. There's no denying that.

    It's interesting that no matter what your website license says, the courts say that blockers are legal, filtering available content under some concept of fair use.

    So we are back to: what exactly are AIs doing that is stealing? We can give public domain data a clean pass. I think they honor most open source and Creative Commons licenses. 1/2
  • #14 · buckfiftyseven@mastodon.social wrote:

    @mrbase @ai6yr @tante so we are on muddy legal ground that will probably have to be battled out in the actual courts, about how a fair use doctrine invented in 1741 for copyrighted works applies now.

    That's just the input side, of course. On the output side it seems clear that too closely reproducing an existing work would be a violation as well.

    2/2
  • #15 · ozzelot@mstdn.social wrote:

    @mrbase I'd allow an ad that's a static image. Ads as they come now are full untrusted pieces of code running on my machine without me inviting them. Blocking them is a security measure. @buckfiftyseven @ai6yr @tante
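The blocking idea above can be sketched in a few lines: check each outgoing request's host against a blocklist of tracker and data-broker domains. A minimal, hypothetical Python sketch (the `BLOCKLIST` domains and the `should_block` helper are invented for illustration; real blockers such as uBlock Origin use a far richer filter syntax):

```python
from urllib.parse import urlparse

# Hypothetical blocklist of tracker/data-broker domains (illustrative only).
BLOCKLIST = {"tracker.example", "broker.example"}

def should_block(request_url: str) -> bool:
    """Block a request if its host, or any parent domain, is blocklisted."""
    host = urlparse(request_url).hostname or ""
    parts = host.split(".")
    # Check the host itself and every parent domain, e.g. a.b.c -> a.b.c, b.c, c
    return any(".".join(parts[i:]) in BLOCKLIST for i in range(len(parts)))

print(should_block("https://cdn.tracker.example/pixel.gif"))  # True
print(should_block("https://news.example.org/article"))       # False
```

Matching on parent domains is what lets one list entry cover every subdomain a broker serves from, which is why domain blocklists stay manageably small.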

  • #16 · gbsills@social.vivaldi.net wrote:

    @buckfiftyseven @ai6yr @tante Actually, sites that don't want you to see them while using an ad blocker can easily block you.
  • #17 · netraven@hear-me.social wrote:

    @tante I don't use GenAI, I just try to find new and creative ways to break it.
  • #18 · crankylinuxuser@infosec.exchange wrote:

    @tante

    None of these are true if you run your own LLMs on your own hardware, using FLOSS models.

    But the #MastodonHOA has deemed all AI to be abhorrent as a blanket decision.

    And frankly, if you exist in a capitalist society and you're not an owner, there is a 100% chance you are exploited. The capitalist system requires it.
  • #19 · tante@tldr.nettime.org wrote:

    @crankylinuxuser FLOSS models (which are only freeware) tick most of those boxes: trained on stolen data, massaged by people in global-majority countries, trained in environmentally harmful data centers, outsourcing skills to a freeware product a company dumped on me, using a tool that is imbued and trained with how big tech wants to see the world, when the effort could have gone to something meaningful. So yeah, nope.
  • #20 · kfort@social.sciences.re wrote:

    @tante I looove this! Thanks!
  • #21 · crankylinuxuser@infosec.exchange wrote:

    @tante

    "Trained on stolen data": it's at best a copyright violation. And I view things like Anna's Archive and Libgen as internationally renowned public libraries.

    "Massaged by people in global majority countries": yes, people work under capitalism. And guess what... you're exploited.

    "Trained in environmentally harmful data centers": this assumes that training is always needed, and it's not. You can train once and run X times. Again, you're stretching to make local LLMs look horrible.

    And really, the rest of these are poor excuses. I won't use poop smear (Anthropic), or OpenAI, or other SaaS token companies. I run locally, and that does not have the problems you claim.

    Except for the copyright issue. But again, I don't have that much respect for current US copyright.
  • #22 · orange_lux@eldritch.cafe wrote:

    In reply to crazyeddie@mastodon.social: "@tante Bad framing. There's no such thing as GenAI. That's some lofty goal they're supposedly going to reach by investing the entire world economy into it."

    @crazyeddie @tante GenAI as in Generative AI, not Artificial General Intelligence (AGI).
  • #23 · lemgandi@mastodon.social wrote:

    @tante

    [x] I accept that using this tool will make me measurably stupider
  • #24 · mitsosimo@mastodon.social wrote:

    @tante There should be an "I accept that all of my data will be used against me at some point" option.
  • #25 · feld@friedcheese.us wrote:

    @tante even Claude would have added a Select All option
  • #26 · mischievoustomato@tsundere.love wrote:

    @feld @tante and I would've clicked it after reading the first two lines and realizing it was written by some snob I would laugh at
  • #27 · epic_null@infosec.exchange wrote:

    @crankylinuxuser @tante

    "It's at best a copyright violation"

    This may be true for published and public data... but that's not the only data that goes into these things. Any data that comes from breaches, users' private cameras, and anything else stored with an expectation of privacy is much worse than a copyright violation.
  • #28 · synthyx@social.vivaldi.net wrote:

    @tante

    AI in the modern age is not going away. You shouldn't be shamed for using it, and at this point you should expect it.

    Even when the bubble goes pop we are still going to have AI in some form. AI is a useful tool for many people, and it's great when you self-host it.

    Also, most things AI "steals" aren't really stolen if they're free and public on the internet.

    The only thing I can really agree with is the environmental impact. At this point, though, we muck up the environment so much with plastics, overuse of gas, mass deforestation, etc. that I don't know how big an impact it really has. Ideally we would use green forms of energy for everything, and new tech innovation would reduce the absurd amounts of power required to run these supercomputers. Hopefully the ARM architecture is that light in the dark.
  • #29 · qgustavor@urusai.social wrote:

    @tante @crankylinuxuser I guess some people have zero idea of how AI model training works. They have the impression that "if I run this HuggingFace model on my hardware, it's ethical" but somehow think those models got uploaded there out of thin air, without any implications.
  • #30 · komali_2@mastodon.social wrote:

    @tante a lot of this applies to basically all participation in capitalism.
  • #31 · crankylinuxuser@infosec.exchange wrote:

    @Epic_Null @tante

    And yes, that is a big issue with the SaaS token vendors. Claude, OpenAI, MS, and the rest do use whatever user data they can get. I am not disputing their horrific behavior.

    I'm talking about locally running Qwen, or Deepseek, or other FLOSS models.

    A local LLM running on my machine only sees and uses the data I provide. And a Ctrl-C in the relevant console window kills the LLM.

    What folks do not realize is that this is #Leibniz's ultimate dream: being able to do #calculus with words, sentences, and more. He tried to do single word-vectors, but even that had to wait for Word2Vec in 2012.
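The "calculus with words" idea mentioned above can be shown with toy numbers: word embeddings turn semantic relationships into vector arithmetic, as in the classic king - man + woman ≈ queen analogy from Word2Vec. A minimal sketch with invented 3-dimensional vectors (real embeddings have hundreds of dimensions learned from text; these values are made up for illustration):

```python
import numpy as np

# Toy 3-dimensional "embeddings"; real models learn these from large corpora.
vectors = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "man":   np.array([0.5, 0.1, 0.1]),
    "woman": np.array([0.5, 0.1, 0.9]),
    "queen": np.array([0.9, 0.8, 0.9]),
}

def nearest(vec):
    """Return the vocabulary word whose vector is most cosine-similar to vec."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(vectors, key=lambda w: cos(vectors[w], vec))

# king - man + woman lands nearest to queen
result = nearest(vectors["king"] - vectors["man"] + vectors["woman"])
print(result)  # queen
```

In practice the query words themselves are excluded from the nearest-neighbor search, since the result vector often sits closest to one of its own inputs; the toy vectors here are chosen so the analogy works without that step.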

                                          Powered by NodeBB Contributors
                                          Graciously hosted by data.coop