This was a new thought: that LLM bots come across as convincing in a way similar to horoscopes. In social psychology, this is called the Barnum effect: vague language, presented as a highly particular description (of someone's personality), gives people the experience that it is highly accurate. LLM bots have found a way to generalize something like the Barnum effect to everything we prompt them for.
@malte along the same lines:
https://www.ocrampal.com/why-criticizing-ai-feels-like-criticizing-someones-child/
-
For reference: https://en.m.wikipedia.org/wiki/Barnum_effect
-
@ocrampal I read your blog post and can see that it is also about conversations with LLM bots. Is there something specific you have in mind as being along the same lines?
-
@malte yes, people tend to attribute meaning to LLM output. You mentioned horoscopes as an example.
-
@ocrampal OK got you, we tend to find meaning in things, even when it's not really there.
-
@malte
@baldur has written about something similar: https://softwarecrisis.dev/letters/llmentalist/
-
I don't find LLMs nearly as reliable as my horoscope.
-
@johnzajac Watch out though. It might just be a matter of time before that is LLM-generated. In fact, that's probably one of the few use cases I can think of that makes sense, besides marketing and other "impression management" tasks.
-
Could this be an opportunity for an "anti-fascist tech skeptic" moment? We've had years of scientific skeptics poking fun at superstition on television, with people like Michael Shermer, James Randi etc. exposing the follies of spoon-bending "geniuses" live on air. Today, many of these "skeptics", Shermer among them, are themselves caught up in paranoid fantasies about an oppressive LGBTQ left and don't seem to see the superstition involved in our LLM delusions. The floor is open!