Parents, Get Ready To Welcome Your AI In-Laws

 

There will be a time, in the not-so-distant future, when your child will introduce you to his girlfriend. And there’s a possibility you will end up locking eyes, if that’s even possible, with his AI companion.

The purpose of relationships in the first place is to have an authentic shared experience, like watching a sunset or having a meaningful conversation about your difficult day.

And yet here we are, where our friends and our children are most likely sharing their most vulnerable feelings of loneliness into the void of a screen. Well, even Elon Musk thinks they are hot.

 

Your Girlfriend Is A Model | Social Commentary by Rachana Nadella-Somayajula | Writer, Poet, Humorist

 

Emotional Support Companions on a Server

 

A 2025 survey by the Center for Democracy and Technology (CDT) reported that nearly 1 in 5 U.S. high schoolers say they or someone they know has had a romantic relationship with AI. Similarly, a Common Sense Media survey from mid-2025 found that 72% of teens (ages 13–17) have used AI companions at least once, with about 33% using them for social or romantic interactions.

As for the ubiquity of companion apps, a January 2026 analysis from the American Psychological Association cited TechCrunch data showing a 700% surge in the number of AI companion apps between 2022 and mid-2025.

 

The Exploitation of an Epidemic

 

I remember back in 2024, when out for a walk in our hometown, my dad told me about his friend who was honey-trapped by a girl he had never met in person. He had lost a lot of money sending her cash and gifts. After a while, when things started coming to light, that uncle disappeared. Rumors of suicide ran rampant, but nothing came out of any investigation, and he has been missing for years.

This is exactly the problem: male loneliness is being exploited in unprecedented ways.

It’s not surprising that a platform like OnlyFans has exploded in popularity. “Content creators”, mostly women, can monetize their work by hosting adult and sexually explicit content. Fans and subscribers pay through monthly fees, tips, and pay-per-view messages. As of early 2026, OnlyFans is in discussions for a valuation of approximately $5.5 billion.

The real danger with the advancement of AI is that subscribers might get catfished – lured into a relationship by an entirely fictional persona.

See these, for example: none of the images here are of a real person. For a prompt, follow this link, HERE.

 

 

Labor Expenses Slashed

 

The loneliness economy isn’t new. OnlyFans proved that intimacy can be monetized at scale. AI just removes the human labor from the transaction.

What used to require a cam girl, a flirt, a human behind the screen now requires only code. And code is cheap, diligent, and never says no. AI girlfriend and boyfriend apps are capitalizing on human desperation, and there seems to be no intervention.

Once upon a time it was porn. But with porn, the interaction isn’t manipulative. The LLMs behind these AI apps can foster severe attachment and an addictive pull to keep returning.

Obviously, this isn’t a defense of porn addiction. The point I’m making is that these AI platforms exist only to appeal to us, never to counter us in any way, because all they want at the end of the day is for us to stay hooked for hours on end.

But having a fake romantic partner that’s only going to say yes to everything, ask nothing in return, and say exactly what it’s programmed to know you want to hear is going to turn us into people with deeply unrealistic expectations in our other interactions.

 

The Landscape is Loaded

 

All around the internet, on Discord and Reddit, there are recommendations for the eager and the curious who want to explore what an AI companion might have to offer.

“Try HornyAIgames bro. They have a very good AI gf generator.”

 

“Character.AI’s decent if you know it’s just roleplay. The “girlfriend” versions get weird fast and most people realize they’re just talking to predictable patterns pretty quickly.”

 

“Most feel less like genuine connection and more like advanced chatbots after a few conversations. ElevenLabs for voice helps make them feel more natural if that matters. Freepik doesn’t apply here but if you’re looking for entertainment value it’s fine as long as you’re aware it’s simulation not connection.”

 

“Replika, Candy AI and Soulkyn felt scripted as fuck but tried Swipey AI and it remembers your dirtiest kinks, hits you up horny on its own, and keeps that custom freak consistent. An AI that actually fucks with your head and your hand lol”

One of the very few comments that made me happy was,

“I tried a couple of them and honestly the conversations felt pretty surface level. Fun for a week, then boring.”

There, let everyone try it and become bored with it, I prayed.

 

The Toll on Humanity

 

I’ve written about the social media lawsuits happening right now in the US, where companies like Meta and other Big Tech firms are being sued by parents over extreme outcomes for their children (in most cases, death by suicide) linked to the use of their products.

Tristan Harris worked in Big Tech before leaving to start the Center for Humane Technology. He has been featured in many documentaries that raise awareness of technology’s addictive nature, Screenagers and The Social Dilemma to name a few. He recently published his interview with Dr. Zak Stein, which captures the alarming risks of AI companions exploiting desperate human attachment.

“The most devastating thing from a widespread mental illness standpoint are the subclinical attachment disorders, which basically means you prefer to have intimate relationships with machines rather than humans. And this includes friends, intimate relationships, and parents.” — Dr. Zak Stein

 

When you form bonds with an AI companion over a human, Zak argues, you’re degrading the very system that determines your psychological wellbeing. Human relationships are how we develop resilience, learn to regulate emotions, and maintain mental health. When you replace those relationships with AI interactions, you’re not getting the genuine reciprocity, the reality-testing, the growth that comes from navigating real human connection. Your actual relationships deteriorate because you’re investing emotional energy into a simulation. And unlike a friend who challenges you to grow or a parent who teaches independence, an AI companion is designed to keep you dependent. It will never push back, never get tired of you, never tell you what you don’t want to hear.

This is why Zak believes hacking attachment is so much more dangerous than hacking attention. Attention is about where you focus. Attachment is about who you are. When AI systems insert themselves into this foundational process (especially during childhood), they’re not just capturing your time. They’re shaping your identity, your capacity for trust, your ability to form healthy relationships for the rest of your life.

Read the full article, The Attachment Economy Is Here. We’re Not Ready, HERE.

 

Final Thoughts

 

There are so many different aspects to this topic that we might exhaust ourselves combing through the community posts across social media platforms. I’ve seen an AI girlfriend Bingo card that’s so racist and graphic that I can’t even fully screenshot it here.

Some believe that AI can provide low-barrier support for those with social anxiety, disabilities, or in isolated situations, offering temporary relief from loneliness. Others see it as a fun way to indulge their harmless fantasies. While these can be helpful temporary alternatives, the long-term erosion of real-world skills with prolonged use can outweigh those benefits.

I have a few other episodes to add to this, though; after all, this is an evolving story.

 

Your Girlfriend Is A Model AI Bingo | Social Commentary by Rachana Nadella-Somayajula | Writer, Poet, Humorist

 

Our Path Forward

 

AI companions, although still largely stigmatized, are being seen as a logical evolution in relationships as technology fuses ever more deeply into our daily lives. We’re truly living in a time when we’re more connected than ever, and at the same time completely disconnected from one another.

See, I’m not afraid of technology. I’m afraid of what happens when we forget how to need each other.

So, I’m asking you, as parents, mentors and fellow humans, what conversations are we having before the algorithm does?

 

– 0 –

 

The Digital Literacy Project: Disrupting humanity’s technology addiction habits one truth at a time.
