To be fair, any human you ask a question will give you a wrong answer some of the time, often with complete certainty. Expecting an LLM to always get it right is unrealistic and misguided. Trust, but verify when the answer matters. Not sure why we'd expect any computer to come up with the right answer every time.