Correct. Because there is no "pursuit of untruth". There is no pursuit, period. It's putting words together that statistically match up based on the input it receives. The output can be wrong, but it's not ever "lying", even if the words it puts together resemble that.
ech
Demanding the algorithm apologize is off the charts unhinged. It's amazing that people this stupid have achieved enough to fail this badly.
They're trying to insinuate that the accusations on them are of the same legitimacy of those they've been throwing at everyone for decades.
If their claims are rebuked as baseless, they'll call the claims against them baseless as well. If the claims against them are said to be substantiated, they'll point to their claims and suggest the same. It's the "tails I win, heads you lose" of a political scheme.
Step 1. Input code/feed into context/prompt
Step 2. Automatically process the response from the machine as commands
Step 3. Lose your entire database
Both require intent, which these do not have.
Hey dumbass (not OP), it didn't "lie" or "hide it". It doesn't have a mind, let alone the capability of choosing to mislead someone. Stop personifying this shit and maybe you won't trust it to manage crucial infrastructure like that and then suffer the entirely predictable consequences.
Because the greater populace has fallen for their aim to redefine the word. It used to be used to declare awareness of oppression and prejudice. Now it's a word to avoid or denounce, even by those that should be embracing it.
"Probably"?
For whatever reason, our brains developed the inclination/compulsion to imagine scenarios which greatly improved our odds of survival in the wild. In a biological sense, I figure stories and the like trigger that reward center in a way. Why people like different stories would come down to brain chemistry.
I'm sorry for your loss.
I'm not asserting anything or criticizing anyone. You're taking this much more personally than it's intended. All I'm doing here is pointing out the problematic origins of the comic. You asked me to explain why that matters and I did. You're not going to convince me otherwise, and I'm not interested in convincing you either. If anything, it's something for others to consider. Have a good day.
It very much doesn't because it enforces the idea that these algorithms know anything a or plan for anything. It is entirely inefficient to treat an llm like a person, as the clown in the screenshots demonstrated.