@Razemix @Linux_in_a_Bit Yes, it does hallucinate, but it's not «often», and most of the time it does so because the answer isn't documented.
And if it does... well, it simply won't work.
An LLM is a (biased) tool with a _few_ use cases; to me, documentation is one of them.