Monday morning pet peeve: using verbs that imply a mind to describe LLM output. It didn’t “write”, “think”, “chat”, “teach”, or “draw”. It *generated*, because that’s all it can do: generate stuff based on the inputs it gets from users and the stats of the stuff it was trained on.
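For the curious: here's a toy sketch of what "generate based on stats" means. This is a made-up bigram example with an invented corpus, nowhere near a real transformer, but it shows the core move: no intent, just sampling the next token from frequencies observed in training data.

```python
import random
from collections import defaultdict

# Made-up "training data" for illustration only.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Bigram stats: which token has been seen following which.
follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

def generate(start, n, seed=0):
    """Emit up to n more tokens by repeatedly sampling the observed stats."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(n):
        options = follows.get(out[-1])
        if not options:  # dead end: token never seen mid-corpus
            break
        out.append(rng.choice(options))
    return " ".join(out)

print(generate("the", 5))
```

Every word it emits was statistically plausible given the last one; none of it was "thought". Real models use vastly bigger context and learned weights instead of raw counts, but the verb is the same: generate.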
Using words that imply sentience is a marketing gimmick and y’all need to stop amplifying it.