A lot of people are talking about "hallucinations" and how often LLMs are right or wrong today. I wish you'd stop.
LLMs can't be right or wrong. They don't hallucinate, because they don't experience any reality whatsoever. There's no understanding, or intent, or knowledge, or learning, or truth in there. The best mental model for what an LLM is doing is something like mad-libs: it's a mad-lib solver. If the end result is factual, that's a coincidence, and it happened by accident.
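If you want the mad-lib idea in miniature, here's a toy sketch (my own illustration, not anything from an actual LLM, and vastly simpler than a real model): a bigram sampler that picks each next word purely from word-pair statistics in its training text. Nothing in the loop ever checks whether the output is true.

```python
import random
from collections import defaultdict

# Toy "mad-lib" text generator: each next word is chosen only from
# which words followed which in the training text. Truth never enters into it.

corpus = "the sky is blue the sky is green the grass is green".split()

# Count which words follow which.
following = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev].append(nxt)

def fill_in_blanks(start, length=6):
    words = [start]
    for _ in range(length):
        options = following.get(words[-1])
        if not options:
            break
        # Pick the next word by frequency alone.
        words.append(random.choice(options))
    return " ".join(words)

# Will happily print "the sky is green ..." as readily as "the sky is blue ..."
print(fill_in_blanks("the"))
```

A real LLM uses a far bigger statistical machine than this, but the point of the analogy stands: whether the blank gets filled with something factual is not something the process checks for.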