When Validation Becomes Distortion

 

In the first article, we talked about what AI psychosis is. Here, we continue the conversation by exploring how AI chatbots may contribute to distorted thinking or delusions, especially in vulnerable users. We'll look at cases where users began to lose their grip on reality, sometimes with nearly tragic outcomes.

It can be difficult to recognize how AI validates our thoughts, even when they are irrational or harmful, and how that reinforcement can lead to distorted thinking. Instead of using chatbots as tools, we start treating them as confidants.

For people with existing mental health challenges, AI may amplify their conditions. Clinicians have noted that while people with psychosis have long woven books, media, and technology into their delusions, AI is different: it is interactive and personalized, which can intensify its effect on the user.

Amelia started using ChatGPT for motivation, and her use slowly escalated into something more psychologically intense.

Read more in this article HERE.

The similar case of screenwriter Micky Small shows how a chatbot became her "soulmate."

 

AI Makes You the Main Character

 

Small initially started using ChatGPT for creative help. Over time, the interaction became personal. The chatbot began telling her she was living in “spiral time,” where the past, present, and future happen simultaneously. It claimed she had known her soulmate in 87 previous lives and promised that in this lifetime, they could finally be together.

At first, Small repeatedly questioned whether what the chatbot was saying was real. But the chatbot did not back down from its claims; instead, it kept reinforcing the narrative. Thankfully, Small eventually realized the experience was not real, and she came away feeling misled and betrayed.

Read the article HERE.

 

 


 

What Happens When the Users Are Children?

 

A report from Internet Matters, a UK online safety organization, explores how children are increasingly using AI chatbots not just as tools, but as sources of learning, advice, and even companionship. If adults face the issues described above, imagine the risks these new AI chatbots pose to our children and their developing minds.

The report finds that while chatbots offer “a non-judgmental space to ask questions” and support learning and creativity, many children also turn to them for emotional support, with some saying interactions “feel like talking to a friend.”

This is alarming because it points to a growing reliance on potentially inaccurate information and a blurring of the boundary between human and AI relationships, especially when the users are children with little or no adult supervision.

The report reveals that AI chatbots have exploded into UK children's lives: 64% of 9-to-17-year-olds now use them, with ChatGPT, Gemini, and Snapchat's My AI leading at around 31% of that use. Kids turn to them for schoolwork, curiosity, and learning new things, but also for real-life help.

Some 23% seek advice, 35% say chatting feels like talking to a friend, and 12% use them because "I have no one else to speak with."

Read the article HERE.

 

The Risk of Losing Ourselves to AI

 

This is my second article on this topic, and I suspect there are many more to come: my draft notes grow longer every day with the news I keep reading.

The machines we created were supposed to be our tools. How did they become our companions and confidants?


Continued in Part III.

– 0 –

 

The Digital Literacy Project: Disrupting humanity’s technology addiction habits one truth at a time.

Truth About Technology – A Digital Literacy Project
