@thomasjwebb @cwebber @joeyh
Local models like llama could be reworked to accept a seed for their RNG. There'd be less risk of them becoming unavailable, and they'd be both deterministic and reproducible, but they'd still be terrible for all the other reasons that LLMs are terrible.
"Sovereign" and reproducible slop is still just slop
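(For what it's worth, llama.cpp already exposes a `--seed` flag for exactly this. The underlying idea is just that a seeded RNG fixes the sampling path. A toy sketch with a made-up three-word vocabulary and weights, not a real model:)

```python
import random

def sample_tokens(seed, weights, n=5):
    """Draw n tokens from a toy next-token distribution using a
    seeded RNG, so repeated runs produce identical output."""
    rng = random.Random(seed)
    vocab = list(weights.keys())
    probs = list(weights.values())
    return [rng.choices(vocab, weights=probs, k=1)[0] for _ in range(n)]

# Hypothetical "model" distribution over a tiny vocabulary.
weights = {"the": 0.5, "cat": 0.3, "sat": 0.2}

run_a = sample_tokens(seed=42, weights=weights)
run_b = sample_tokens(seed=42, weights=weights)
assert run_a == run_b  # same seed -> identical sampling path
```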
ansuz@gts.cryptography.dog
I keep seeing lots of people saying "LLMs are like compilers/assemblers for prompts"