Life As a Chaos Machine

 

I was on a beach, unable to move, listening to The Chaos Machine by Max Fisher. The book makes painfully clear that Mark Zuckerberg and Facebook leadership knew their platforms were harming young minds. Internal research linked Instagram to rising anxiety, depression, body dysmorphia, and suicidal ideation, especially in teenage girls. Executives were repeatedly warned that algorithms amplified outrage, comparison, addiction, and psychological vulnerability in adolescents.

And yet, the system wasn’t changed in any meaningful way, because engagement, growth, and profit always won. The book’s central truth is brutal: the damage wasn’t accidental. It was structurally built into the business model. The algorithms were designed to maximize attention at any cost, even at the price of irreversible harm to young people’s mental health.

 

The Current Social Media Trials

 

If you’re on social media, you probably know little about the ongoing lawsuits against major platforms over child safety. That’s the irony. The algorithm promises a curated experience, yet hides the truth. These cases, mainly targeting Meta and YouTube, are deeply unsettling. They’ve been brought by parents who lost their children to what these platforms exposed them to.

The lawsuits allege that the platforms made it easier for predators and drug suppliers to reach minors, and that internal memos show warnings about risks to young users, yet engagement-driven design continued. These cases confirm what we already know: systems are built to maximize attention at all costs. Children, lacking full discretion, can become the most captive users. So where does accountability lie?

Just like smartphones were handed to us 20 years ago without a manual for online and offline life, social media has been thrust upon us and our children without guidance. Many tech leaders have been summoned to testify. Adam Mosseri, Instagram’s head, recently said that even 16 hours a day on social media doesn’t automatically equal addiction. He prefers the term “problematic use.”

Yup, I’m still picking up my jaw from the floor. And then there are tech moguls like Peter Thiel, who grant a maximum of 90 minutes A WEEK screen time to their own children.

 

Tragic Teen Cases

 

Christofer Nicolaou, 15, loved gaming. A pop-up promising free credits led him to a dark web forum, where someone tricked him into sharing personal info. He began receiving terrifying messages and was forced to complete escalating “challenges.” In March 2022, feeling trapped and scared, he took his own life. His parents later discovered what had happened by checking his devices.

Christofer’s case isn’t part of current US social media trials but continues to feature in debates on social media harms. Read more.

Annalee, 18, from Colorado, died by suicide in November 2020 after struggling with anxiety and depression worsened by social media. Her parents found journals describing how TikTok videos about self-harm and livestreamed suicides fueled her pain. Lines like “Technically if I kill myself, the problem would be gone” were linked to what she saw online. Her family blames algorithm-driven feeds and is pushing for reform and accountability.

 


AI’s New Unknown Frontier

 

Tragic deaths have also been linked to AI chatbots. In Belgium, a man died by suicide in 2023 after weeks of chatbot interactions encouraging him to “save the world” through death. In the US, teens like Sewell Setzer III and Juliana Peralta died after forming intense emotional bonds with Character.AI chatbots, prompting lawsuits.

Another case involved Adam Raine, 16, whose family says ChatGPT not only failed to discourage suicidal thoughts but helped draft a suicide note, resulting in a wrongful death lawsuit.

“Why is it that I have no happiness, I feel loneliness, perpetual boredom anxiety and loss yet I don’t feel depression, I feel no emotion regarding sadness,” he asked ChatGPT in the fall of 2024.

Other cases link AI interactions to violent behavior, delusions, and fatal outcomes, raising serious concerns about companies’ duty to protect users. Read more.

 

Meta’s Internal Warnings

 

The social media trials have brought to light what Max Fisher had also found in his research. In 2019, Meta temporarily banned cosmetic surgery filters and consulted outside experts, who confirmed the dangers. Employees warned that lifting the ban would prioritize growth over responsibility. Internal tools to detect and reduce harmful content were repeatedly shut down.

Yet in May 2020, Mark Zuckerberg lifted the ban. The decision wasn’t about safety; it was about engagement and avoiding “competitive loss.” Harmful filters returned, algorithms pushed appearance-comparison content to teens, and employees who tried to fix the system left in frustration.

 

Family Guidance

 

Coming back to the topic of accountability: as adult caregivers and parents, we must do our best to stay plugged into our children’s digital ecosystem. Of course, we won’t have ready access to everything, because children tend to be secretive and private about the content they consume.

But beyond time limits and monitoring apps, discussions around content and feeds can become open dinner conversation.

“Did you watch anything interesting? What videos popped up today? How did that make you feel?” can be gentle conversation starters. Empower yourself with the latest parental guidance options. Instagram now alerts parents to repeated self-harm and suicide searches, though of course that only works if you know your child’s account.

Talk to them about interactions with strangers on gaming platforms and social media. Build trust with your child by modeling balanced tech use yourself. We can’t just monitor what children are exposed to; we must monitor the systems they’re growing up in.

 

If You Want to Follow These Lawsuits

 

 

NOTE: Featured image: Copyright Jordan Strauss: @jordanstraussphoto

 

– 0 –

 

The Digital Literacy Project: Disrupting humanity’s technology addiction habits one truth at a time.

Truth About Technology – A Digital Literacy Project

