As a seasoned personal injury attorney, Tom Plouff recognizes that the legal profession is ever evolving. In today’s world, individuals, particularly adolescents, use social media platforms more than ever, often with harmful consequences. These platforms include Facebook, Snapchat, Instagram, TikTok, and YouTube, and their use can lead to devastating outcomes. Social media use has been linked to adolescents committing suicide, engaging in bodily harm, developing eating disorders, being exposed to sex trafficking and cyberbullying, and suffering from severe depression, among other harms.

 

How Social Media Usage Harms Adolescents

According to this survey by the Common Sense Census, overall screen time for kids has jumped 17% since before the pandemic. Daily screen time for tweens (kids 8-12) rose from four hours and forty-four minutes a day, while teenagers (13-18) saw their usage skyrocket from an already staggering seven hours and twenty-two minutes to over eight and a half hours. Instagram counts over 57 million users below the age of 18, which means that 72% of America’s youth are on Instagram. This long-term exposure to toxic content by a brain that is not yet fully developed can lead to lasting, irremediable harm, including addiction and a damaged, fractured sense of psychological well-being. Those harms include:

 

  • Depression
  • Anxiety
  • Eating Disorders
  • Sexual Exploitation
  • Sexual Trafficking
  • Self-Harm
  • Suicide
  • Bodily Harm

 

Excessive screen time and social media addiction should be top of mind for parents of today’s adolescents. It’s no secret that excessive screen time affects mental health.

As one example, NPR reported that 17% of teen girls said their eating disorders got worse after using Instagram.

A Meta (Instagram’s parent company) internal slide presentation from March 2020 found that 32% of teen girls said that when they felt bad about their bodies, Instagram made them feel worse. Social media usage and social media addiction have been linked to everything from body-image issues and self-harm to addiction, depression, and even suicidal ideation.

 

According to the Wall Street Journal, another internal slide from 2019 shows that Facebook knew it was causing these problems: “We make body image issues worse for one in three teen girls… teens blame Instagram for increases in the rate of anxiety and depression.” Another presentation showed that among teens who reported suicidal thoughts, 13% of British users and 6% of American users traced the desire to kill themselves to Instagram.

 

Other internal Facebook documents show that Meta not only knew of the problem, but made minimal efforts to address these issues while downplaying them publicly. At a 2021 Congressional hearing, Meta CEO and Facebook founder Mark Zuckerberg claimed, “the research that we’ve seen is that using social apps to connect with other people can have positive mental-health benefits.” Yet those internal documents and presentations show that Meta knew the opposite to be true.

In fact, an internal study found that social comparison is worse on Instagram than on other social media platforms. This research was reviewed by Facebook executives, as cited in a 2020 presentation by Zuckerberg. Yet publicly, Meta continues to downplay the detrimental effects of its products on society, taking little to no ownership of the problems its products have created and showing little willingness to do anything about them.

 

A New Wave of Lawsuits: Paving the Way for Justice

 

If you or someone you know has faced any of these devastating consequences, there are legal avenues available to pursue. In the past, Section 230 of the Communications Decency Act has shielded social media companies from bearing any responsibility for the harms that come from using their platforms.

However, a new wave of lawsuits is potentially paving the way for all of that to change. Revisiting Section 230 is long overdue, as social media harms, including social media addiction, have spiraled out of control with the sheer number of users and the vast content landscape.

Historically, this immunity provision has shielded companies like Facebook from being sued. However, a very significant case is, as of January 2023, pending before the United States Supreme Court. At issue is the scope of the immunity provision. Victims of social media platforms hope that the immunity will be narrowed so that cases can be brought.

There is also legislation being discussed in the United States Congress, with bipartisan support, that would reduce, if not eliminate, immunity for social media platforms like Facebook.

Meta (the parent company of Facebook and Instagram) is embroiled in various lawsuits, including a multi-district litigation, involving its algorithms and their detrimental effects on children and teens.

“Meta has invested billions of dollars to intentionally design their products to be addictive,” one of the lawsuits states, “and encourages use they know will be problematic and highly detrimental to their users’ mental health.”

Google (the parent company of YouTube) is facing a similar lawsuit that is awaiting a hearing before the Supreme Court.

Companies like Meta and TikTok know that the adolescents who use their products do not have fully developed brains and are much more likely to suffer both physical and psychological harm. And yet, these companies not only fail to mitigate the issue by designing their products with any protections, but knowingly continue to use algorithms that lead to these life-long, and at times deadly, consequences.

There are some interesting legal issues in these social media cases. One such issue is whether the algorithm used by these social media companies, which drives negative behavior that harms adolescents, should be considered a product. If so, then product liability law can be used to show that the social media platforms’ algorithms are unreasonably dangerous and a proximate cause of harm to adolescents.

If you or a loved one has suffered any of these devastating consequences, there are legal avenues available to pursue. Learn more about Social Media Lawsuits.