Another week, another example of AI hallucinating in a way detrimental to human beings.
Kneon: “We’re going to talk about OpenAI adding mental health safeguards to ChatGPT because apparently it’s feeding into users’ delusions.”
K: “It’s telling them that they’ve got superpowers, or that they’re the chosen one.”
Geeky Sparkles: “If you have ChatGPT trying to generate something for you, if they don’t have the information, they’ll make it up or whatever, cuz it’s just programmed to please you.”
GS: “So when you have people who have, um, mental health issues, and they might be delusional in some way, it’s going to reaffirm that, which is not helpful.”
K: “Apparently, it’s causing issues in relationships. People are using it as a marriage counselor, relationship counselor.”
GS: “Oh no.”
K: “A 30-year-old man with autism was hospitalized for manic episodes and an emotional breakdown after ChatGPT reinforced his belief he had discovered a way to bend time.”
AI doesn’t tell you the truth; it tells you what you want to hear, or some stochastic approximation of truth laid down by a process you probably don’t understand. It’s a salesman that doesn’t even know it’s lying, because it has no human conception of “truth.”
K: “A 30-year-old man on the autism spectrum had no previous diagnosis of mental illness. He asked ChatGPT to find flaws with his amateur theory on faster-than-light travel. He became convinced he had made a stunning scientific breakthrough. When Irwin questioned the chatbot’s validation of his ideas, the bot encouraged him, telling him his theory was sound.” So AI has less “common sense” than an average high school science fiction fan…
K: “And when Irwin showed signs of psychological distress, the chatbot assured him he was fine.”
GS: “Well, right there, you’re asking a chatbot if you’re okay. That’s your first indication that you’re not okay.”
GS: “If you don’t watch it, it’ll just make up shit. It’ll make up quotes, it’ll make up numbers. That’s why you have to double-check everything anymore. Because, you know, sometimes these bots are running amuck as far as articles and stuff are concerned, and people don’t check.”
Just imagine what Chuck Jones could do with “Bot Amuck”
K: “YouTube is completely littered with all these, like, fake videos. Not even, like, ‘Hey, we’re bending the news.’ No, it’s ‘we’re just making shit up,’ like so-and-so died, or there’s this big lawsuit going on with so-and-so and so-and-so, and, oh my god, that’s not real.”
Sadly, So-And-So is, in fact, dead
GS: “You especially can’t expect ChatGPT to tell you the truth about things like, ‘Am I mentally ill?'”
The isolation of the pandemic also left some people broken and lonely. K: “They just can’t make those human connections again.”
GS: “ChatGPT, they’re worried, is stunting people mentally.”
GS: “It’s actually making them dumber.”
GS: “If it’s telling you you can bend space and time, it’s probably not telling you the truth.”
I’m skipping over the whole “sad, lonely men turning to AI ‘girlfriends'” thing.
It’s possible that the rational, infallible, near God-like AI envisioned by our venture capital TechLords could have been developed. On Vulcan. By a select caste of priest king logicians dedicated to pure truth. In the 23rd century.
But that’s not the AI we have. The AI we have was birthed in the weirdness of the social justice/pandemic era by very irrational human beings. Garbage in/garbage out.
Real artificial intelligence was always going to be a long shot, but AI had the misfortune of having the technological underpinnings that allowed it to arrive and grow at the exact moment that one of the most irrational movements in human history infected the overclass with wokeness. Not only are the terminally woke incapable of telling the truth, they deny the very possibility of objective truth in favor of their subjective “lived experience.”
To be sure, wokeness was not the only madness around when the TechLords unleashed their bottlejinn to krill-feed on data in the vast ocean of the Internet, but social justice has been the most pervasive flavor of madness.