FARVEL BIG TECH
@technicaladept@techhub.social
About
Posts: 3 · Topics: 0 · Highlights: 0 · Groups: 0 · Followers: 0 · Following: 0


Posts


  • You don't use open source software because it's better (it usually isn't).
    technicaladept@techhub.social

    @coldfish @mcc Sounds like inbuilt obsolescence. Can you remember when we used to pay to replace our software because the latest version was actually better? It still happens with games, I think, and of course with security being a shit show we're extorted into paying for security updates. (Unlike with any other unsafe product, where the manufacturer is on the hook for a recall.) But outside of those two, when was the last time you upgraded software because they'd added some functionality that you wanted?

    Uncategorized

  • You don't use open source software because it's better (it usually isn't).
    technicaladept@techhub.social

    @mcc Any software that needs to communicate with a license server has a single point of failure. If someone's writing code to deliberately stop their product working in a given circumstance, then they are baking in unreliability and increasing support costs. There are few tasks less rewarding than convincing software you've paid for that you're not actually a thief.

    Uncategorized

  • Amazon have reported "hundreds of thousands" of pictures of child sexual abuse material found in shared AI training data... but are refusing to tell regulators which data sets.
    technicaladept@techhub.social

    @GossiTheDog Famously, generative AI has been hilariously bad at producing a picture of a glass of wine that's anything other than about half full. Ask for one that's full or nearly empty and it can only show you ones that match its training data, where all the glasses show a tasteful measure. And good luck asking for a clock face that doesn't show seven minutes past ten. It just can't extrapolate. However, ask it what a naked child looks like and it's remarkably good at it. Why? Well, ask the people who tripped CSAM filters by downloading image training data. Dear Elon, why is Grok so good at making child porn? Did you train it on your own kids or ours? And telling the interface not to show you the filthy kiddie pics it's gathered is a bit like selling a porn magazine and asking customers not to look at pages 12-27 because you accidentally abused some kids when you made it.

    Uncategorized