@cwebber even if it was set to be deterministic, it still wouldn’t reliably produce correct output.
m_22@universeodon.com
I keep seeing lots of people saying "LLMs are like compilers/assemblers for prompts"

@cwebber the methods used to prepare the data are similar (preprocessing, encoding, tokenization). If you turn the temperature on an LLM down to 0, it deterministically outputs the word with the highest probability at every step. People aren't talking about that in this case, though.
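For readers unfamiliar with the mechanics: temperature scales a model's logits before sampling, and in the limit of temperature 0 the choice collapses to an argmax over the next-token distribution, which is deterministic. A minimal sketch of that idea, using a toy vocabulary and made-up logits rather than a real model:

```python
import math

def softmax(logits, temperature=1.0):
    # Scale logits by temperature; as temperature -> 0, the
    # distribution collapses onto the highest-logit token.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def greedy_next_token(vocab, logits):
    # Temperature-0 decoding degenerates to argmax: always pick
    # the single highest-probability token, so the output is the
    # same on every run.
    best = max(range(len(logits)), key=lambda i: logits[i])
    return vocab[best]

# Hypothetical next-token logits from one decoding step.
vocab = ["cat", "dog", "fish"]
logits = [2.0, 3.5, 0.1]

print(greedy_next_token(vocab, logits))  # always "dog"
```

Note that this only makes the *sampling* deterministic; it says nothing about whether the most probable token is the *correct* one, which is the point of the reply above.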