Im Gipper said:
Stmichael said:
LLMs to this day haven't demonstrated the ability to get basic true statements right 100% of the time. They will confidently tell you wrong information as if it were the truth.
So essentially public school teachers
This joke could be seen coming from over the horizon.
But just to expand a bit, LLMs don't "think" the way a human brain does. They encode a very complicated set of statistical relationships between small chunks of text called tokens. When you ask a question, the model generates the sequence of tokens that is statistically most likely to fulfill your request, one token at a time. But there's no check against known facts.
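To make that concrete, here's a toy sketch (not any real model's code; the vocabulary and probabilities are made up for illustration) of what "pick the most likely next token" looks like. The point is that the sampling step only knows likelihoods, not truth:

```python
import random

# Toy illustration only -- not any real LLM. The vocabulary and probabilities
# below are invented to show the basic idea: the next token is chosen by
# likelihood, with no lookup against known facts.
next_token_probs = {
    ("the", "capital", "of"): [("France", 0.40), ("Texas", 0.25),
                               ("Narnia", 0.20), ("Mars", 0.15)],
}

def sample_next_token(context):
    """Pick the next token weighted by probability -- nothing here checks truth."""
    candidates = next_token_probs.get(context, [("<unknown>", 1.0)])
    tokens = [t for t, _ in candidates]
    weights = [p for _, p in candidates]
    return random.choices(tokens, weights=weights, k=1)[0]

print(sample_next_token(("the", "capital", "of")))
# Usually prints "France", but can just as confidently print "Narnia" --
# the model has no concept of true or false, only of likely.
```

A real model's probability table is learned from enormous amounts of text and is far better calibrated than this, but the failure mode is the same: a fluent, confident answer that was never checked against anything.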
The best real-life example I can think of to illustrate this is the idiot lawyer who thought it would be a great idea to have ChatGPT write his legal brief. When the judge asked for more information on a case the lawyer had cited as precedent to support his argument, the lawyer couldn't find any trace of it. ChatGPT had made the whole thing up just to sound like a real court case with the exact precedent he needed.
That's why LLMs aren't worth even half the hype they get.