Social Media Killed Her Teen Son, Mother Alleges in Lawsuit

Donna Dawley joined a growing group of parents suing social media platforms over deliberately addictive algorithms they say can lead to depression and suicide.

David Charbonneau, Ph.D. 



A mother whose teenage son died by suicide is suing Meta Platforms Inc., owner of Facebook and Instagram, and Snap, Inc., owner of Snapchat, for the wrongful death of her son.


In a complaint filed last week, Donna Dawley of Kenosha, Wisconsin, alleged the social media companies deliberately designed their algorithms to addict people, particularly minors and children, to their platforms and to limit parents’ ability to monitor and protect their children online.


Christopher James Dawley, known as CJ to his friends and family, was 14 when he signed up for Facebook, Instagram and Snapchat. Like many teenagers, he documented his life on those platforms.


CJ worked as a busboy at Texas Roadhouse in Kenosha. He loved playing golf and watching “Doctor Who” and was highly sought after by top-tier colleges.


“His counselor said he could get a free ride anywhere he wanted to go,” his mother told CNN Business.


During high school, CJ developed what his parents described as “an addiction to social media.”


By his senior year, “he couldn’t stop looking at his phone,” Dawley said.


“He often stayed up until 3 a.m. on Instagram messaging with others, sometimes swapping nude photos. He became sleep-deprived and obsessed with his body image,” she said.


According to his family’s lawsuit, CJ’s addiction spiraled until the day he took his own life:

“On January 4, 2015, while his family was taking down their Christmas tree and decorations, CJ went to his room. He sent a text message to his best friend — ‘God’s speed’ — and posted an update to his Facebook page: ‘Who turned out the light?’


“CJ held a .22-caliber rifle in one hand, his smartphone in the other and fatally shot himself. He was 17. His parents said he never showed outward signs of depression or suicidal ideation.


“‘When we found him, his phone was still on, still in his hand, with blood on it,’ Donna Dawley said. ‘He was so addicted to it that even his last moments of his life were about posting on social media.’”


Dawley told CNN she and her husband, Chris, believe CJ’s mental health suffered as a direct result of the addictive nature of the platforms.


They said they were motivated to file the lawsuit against Meta and Snap after Facebook whistleblower Frances Haugen leaked hundreds of internal documents — including some showing the company was aware of the ways Instagram can damage mental health and body image.


Algorithms designed to addict

The Dawley suit alleges the algorithms of Facebook and Snapchat are “deliberately designed to get people, particularly minors and children, addicted, and they fail to warn parents of the dangers while limiting parents’ ability to monitor and protect their children online.”


The complaint states:

“The algorithms in Defendants’ social media products exploit minor users’ diminished decision-making capacity, impulse control, emotional maturity, and psychological resiliency caused by users’ incomplete brain development.”


The complaint asserts the social media giants failed to design their products with any protections to “account for and ameliorate the psychosocial immaturity of their minor users.”


This latest lawsuit joins a growing number of suits against social media companies alleging harm to minors, filed in the wake of Haugen’s revelations that Facebook deliberately manipulated user content to encourage addiction to its platform.

Inspired by Haugen’s new evidence, Matthew Bergman, the Dawleys’ lawyer, formed the Social Media Victims Law Center. He now represents 20 families who filed wrongful death lawsuits against social media companies.


According to the Center’s website:

“[These cases] center around strict product liability and design defects by the manufacturers. Specifically, Meta Platforms and Snap failed to: provide adequate safeguards from harmful and exploitive content; verify minor users’ age and identity; [provide] adequate parental control and monitoring; protect minor users from intentionally being directed to harmful and exploitive content; offer protection for minor users from being sexually exploited and abused; design non-addictive social media products; and provide adequate notification to parents about the dangerous and problematic usage of social media by minor users.”


In the video accompanying CNN’s coverage, Bergman said, “This is very much cutting-edge litigation as we are only now learning exactly what these algorithms are.”


A dangerous product, not free speech


The Dawley suit explicitly seeks to head off any free-speech defense by the platforms, asserting the legal issue is not the content posted by third parties on the platforms, but the products themselves, specifically their addiction-generating algorithms:


“Defendants’ product features are designed to be and are addictive and harmful in themselves, without regard to any content that may exist on Defendants’ platform. For example, Meta’s ‘like’ feature and Snapchat’s ‘Snapstreaks’ [which facilitate ‘endless scrolling’] are content neutral.


“None of Plaintiff’s claims rely on treating Defendants as the publisher or speaker of any third party’s words or content. Plaintiff’s claims seek to hold Defendants accountable for their own allegedly wrongful acts and omissions, not for the speech of others or for Defendants’ good faith attempts to restrict access to objectionable content.”


The suit also alleges Facebook and Snapchat deliberately designed their platforms to make parental oversight more difficult and failed to adequately warn parents of the dangers.


According to the filing, Facebook and Snapchat “stated in public comments that their products are not addictive and were not designed to be addictive,” even though they knew, or should have known, the statements to be untrue.


In addition, “neither Meta [nor] Snap warned users or their parents of the addictive and mentally harmful effects that the use of their products was known to cause amongst minor users, like Decedent CJ Dawley.”


On the contrary, the companies went to significant lengths “to conceal and/or avoid disclosure as to the true nature of their products.”


146% increase in suicide among teens correlated with social media use


Citing data from the Centers for Disease Control and Prevention, Bergman’s Center points out that between 2007 and 2018, there was a 146% increase in suicide among children between the ages of 12 and 16.


During Senate hearings in October 2021, lawmakers from both parties expressed concern about the risks social media algorithms pose to minors.


Sen. Marsha Blackburn (R-Tenn.) accused Facebook of intentionally targeting children under 13 with an “addictive” product — despite the app requiring that users be 13 years or older.


“It is clear that Facebook prioritizes profit over the well-being of children and all users,” Blackburn said.

Subcommittee Chair Richard Blumenthal (D-Conn.) echoed Blackburn’s criticisms:


“Facebook exploited teens using powerful algorithms that amplified their insecurities. I hope we will discuss whether there is such a thing as a safe algorithm.”


John Handley, a political scientist and blogger at Medium.com, said last year: “The direction of causality has not been firmly established yet, but the fact that a huge spike in teen suicide and depression coincides with the expansion of social media, and that social media use correlates strongly with depression (especially for girls) is extremely concerning.”


Not about the money


“Money is not what is driving Donna and Chris Dawley to file this case and re-live the unimaginable loss they sustained,” Bergman told CNN.


“The only way,” he said, “to force [social media companies] to change their dangerous-but-highly-profitable algorithms is to change their economic calculus by making them pay the true costs their dangerous products have inflicted on families such as the Dawleys.”


He added:

“When faced with similar instances of outrageous misconduct by product manufacturers, juries have awarded tens of millions of dollars in compensatory damages and imposed billion-dollar punitive damage awards. I have every reason to anticipate a jury, after fairly evaluating all the evidence, could render a similar judgment in this case.”


According to the Dawley court filing, just before he shot himself, CJ handwrote the following message to his family on the envelope that contained his college acceptance letter:


“I don’t want you to think this is at all your fault. It’s not. I’m f****d up. You showed me love and family. I wish I didn’t have to do this to you guys. I love you all more than the world. It’s hard to be a person right now. And I wish I believed in God. If God does exist, he will have to beg for my forgiveness.


“There are a lot of things you don’t know about me. What goes on inside my head scares me. I tried to be a good person. It’s just as I am yelling in a dark tunnel running after the light so I can be happy.


“But my legs are tired and what’s a man to do when the lights go out. Tell my friends thank you for the friendship and support and I love them with all my being. I tried.”
