Revisiting Section 230

In February 2023, the Supreme Court is scheduled to hear what will be a historic case regarding the liability of social media platforms for the content they host.

When the case is argued, the Court will decide whether Section 230, a provision of the Communications Decency Act of 1996, immunizes interactive computer services when they make targeted recommendations of information provided by another information content provider, or only limits their liability when they engage in traditional editorial functions (such as deciding whether to display or withdraw content) with regard to such information.

 

Section 230

 

The case itself is both groundbreaking and heartbreaking.

In November 2015, Nohemi Gonzalez, a 23-year-old American student studying in Paris, was killed when individuals affiliated with the terrorist group ISIS opened fire on a Paris café.

That month, ISIS claimed responsibility for attacks that killed over 100 people in Paris. The Gonzalez family and estate sued Google as the owner of YouTube.

Their claim is that ISIS posted hundreds of videos to YouTube specifically meant to incite violence and recruit potential supporters. They take it a step further by alleging that YouTube’s algorithms promoted this radicalized content to users whose characteristics suggested an interest in such videos.

How the Supreme Court rules on this case will guide the types of protections that users have from social media harms and addiction. Because the basis of the suit is about promoting content rather than regulating it, this lawsuit and others like it have teeth, and social media companies should be running scared.

 


Social Media Companies’ and Other Organizations’ Opposition to the Suit

 

Ahead of the Supreme Court hearing, over three dozen amicus curiae briefs have been submitted for the Court’s consideration. An amicus curiae brief is submitted by a third party with a strong interest in, or opinion on, the matter before the court.

Gonzalez has the potential to change the entire social media landscape if the Court chooses to recognize that Section 230 does not provide full immunity to social media platforms (in this case, YouTube). Interested parties ranging from social media platforms such as Reddit and Meta (Facebook’s parent company), to tech companies such as Microsoft, to other interest groups such as the American Civil Liberties Union (ACLU), national security experts, and the Product Liability Advisory Council, Inc. have submitted briefs opposing liability for social media companies. Their reasoning varies, but all of them stop short of exposing social media companies and their platforms to liability.

Social media platforms such as Reddit and Twitter are nervous about what increased liability under Gonzalez would mean for their companies and for the content being distributed via their websites and apps.

Reddit argues that Section 230 protects its community-based approach to content moderation, as the “subreddits” and their content are moderated by users of the site, not Reddit itself. They state that “without robust Section 230 protection, Internet users - not just companies - would face more lawsuits from plaintiffs claiming to be aggrieved by everyday content-moderation decisions.”

Twitter makes a similar argument, stating in their brief that “Section 230 ensures that websites like Twitter and YouTube can function notwithstanding the unfathomably large amounts of information they make available and the potential liability that could result from doing so.”


Here we have social media companies and platforms unwilling to take responsibility, not for the content of their users, but for the crux of the lawsuit: whether these platforms have purposefully designed their algorithms to push content, often of a harmful nature, to certain users because it gets clicks and, therefore, ad dollars.

Of course social media companies are running scared: Gonzalez could change their legal liability for content and for how that content is disseminated.

Interestingly, the ACLU, while still supporting the Respondent in its amicus curiae brief, takes a slightly different approach.

They contend that Section 230 does not grant broad immunity to platforms, arguing that laws regarding discriminatory ad targeting, antitrust laws, and privacy laws are all still applicable to social media companies and their platforms.

They further state that “[w]hether a platform publishes third-party content to everyone, to a small group, or an individual, does not matter under Section 230, so long as the plaintiff seeks to hold the platform liable for the third-party content published.”

This is an oversimplification and, again, fails to acknowledge the problem of harmful content being purposefully pushed by algorithms designed by social media companies.

 

A brief submitted by a group of economists with expertise in the digital economy looks to Section 230 as a source of net-positive economic value.

They argue that providing targeted content via algorithms benefits both users and producers, saying “The targeting of search results, advertising, and other content to users fuels the overall digital economy and generates massive benefits for all consumers of information, goods, and services and for the firms that use the platform to reach them.”

What this fails to recognize is the harm that algorithmically targeted content can create.

Social media algorithms are responsible for pointing individuals, particularly minors, in the direction of harmful content that causes social media addiction, anxiety, depression, self-harm, eating disorders, and even suicidal ideation, among other things.

For too long, social media companies have been allowed to operate in ways that knowingly harm users while stuffing their pockets with outrageous and ever-increasing revenue generated from that harmful content.

 

Looking to the Future: Section 230

Revisiting Section 230 is long overdue, as social media harms, including social media addiction, have spiraled out of control with the sheer number of users and the vast content landscape.

Cases like Gonzalez have the potential to shape the future of the internet by creating a space where social media companies would be liable for how their algorithms operate. Rather than regulating user content, this type of accountability would simply stop platforms from purposefully promoting harmful content.
