Had a thought
Topic: https://forum.fedi.dk/topic/2c91aeb9-52f4-4b3b-b5ed-54e92ff792a9/had-a-thought
Posted: Tue, 12 May 2026 14:42:51 GMT

Had a thought

"AI requires suspension of curiosity"

Gonna mull that one over

----

Reply from violet@corteximplant.com (Tue, 12 May 2026 14:46:05 GMT)
https://corteximplant.com/users/violet/statuses/116562143825415343

@astraluma Oh I like where your head is at. I remember using some LLMs before I got better, and the curiosity WAS there, but the cognitive load of trying to verify the firehose of fake shit it kept throwing at me was too much.

If I were coining this, I might say something closer to:

"LLMs will drown the curiosity out of even the most skeptical users."