I continue to be negative about generative AI assistants insofar as every time someone has written a piece of any length "co-written by an AI," it has all sorts of errors that, when I point them out, the author hadn't even noticed were there
-
Most irritating is that MULTIPLE TIMES people have written drafts about my work that simply injected made-up history. If they had just published them, that history would be cyclically referenced in the future as if it were fact
That kind of thing just happens all the time now
-
Study after study also shows that AI assistants erode the development of critical thinking skills and knowledge *retention*. People: finding information isn't the biggest missing skillset in our population; it's CRITICAL THINKING, so this is fucked up
AI assistants also introduce more errors, at a higher volume, and ones that are harder to spot too
https://www.microsoft.com/en-us/research/uploads/prod/2025/01/lee_2025_ai_critical_thinking_survey.pdf
https://slejournal.springeropen.com/articles/10.1186/s40561-024-00316-7
https://resources.uplevelteam.com/gen-ai-for-coding
https://www.techrepublic.com/article/ai-generated-code-outages/
https://arxiv.org/abs/2211.03622
https://pmc.ncbi.nlm.nih.gov/articles/PMC11128619/
-
@cwebber "and harder to spot too"
This is a thing that I feel is underestimated by so many people: these tools are trained to generate output that is as *convincing* as possible, regardless of whether the output is *correct*.