I try to keep track of what’s going on among those who use LLMs for coding but they all keep linking Steve Yegge and I just can’t take anybody who links to Steve “gas town” Yegge seriously.
He’s the opposite of convincing.
@baldur well, I’ve linked to it seriously a couple of times to illustrate the mental health toll of LLM addiction

-
To those who aren’t “AI”-pilled, Steve Yegge on anything related to LLM coding makes about as much sense as the Roko’s Basilisk nonsense. If you want to be convincing to outsiders, you need to stop citing what are effectively “AI” catechisms.
Steve Yegge is so bad that whenever I want to convince somebody on the fence about “AI” that the biggest LLM boosters all seem to be having serious mental health episodes, I send them a link to one of his posts. Works every time.
-
@baldur it’s too much even for Armin Ronacher https://lucumr.pocoo.org/2026/1/18/agent-psychosis/
(not that that made him stop using LLMs anyway, apparently…)
-
@ced Haha. Just posted pretty much exactly the same thing

-
@baldur I remember how my mind was blown the first time I read one of those – like “but what is he on?”
-
@baldur I cannot figure out Steve Yegge. He can't even figure out himself. He's said some interesting and useful things but also whoa nelly is there a bunch of weird in there too. Definitely not a source to cite without care and context.
-
@aredridel Yeah, he's become the oddest of ducks these days.
-
@baldur Steve Yegge is an industry plant to get everyone to use up all their tokens in 10s
-
@aredridel @baldur weird isn’t the problem, I like weird. Morally or intellectually inconsistent, that I have trouble with.
-
@trisweb @baldur Honestly both of those are fine, if good to know when citing. Especially if the moral angle is something that's unsettled and people are casting about for better stances.
I'm starting to think, though, that a hidden piece of AI discourse is whether people can tolerate epistemically questionable sources well. Lots of people can't, it seems, and that explains why we were in such a misinformation mess even before LLMs; now we're seeing the seams in places we used to be able to pretend weren't suspect.
I guess it should come as no surprise to me that people who were drawn to computers as deterministic objects would struggle with the absolute probabilistic buffoonery that LLMs generate (and that people have always generated). Those of us drawn to computers as communication objects, in a time of a sometimes-hostile internet public, have a different base sentiment about information. (And not to commit an "of course I'm in the sweet spot" fallacy, but the people who grew up on the chan-pilled internet after my time seem even far _too_ comfortable with hostile information spaces, to the point of nihilism about it.)
-
@baldur I hadn't encountered that concept before.
https://en.wikipedia.org/wiki/Roko's_basilisk
Strikes me as being equivalent to the argument that we're living in a simulation.
All of this *is* a religion to the ideas' adherents.