In the fall of 2021, former Facebook employee Frances Haugen testified before Congress on the detrimental effects of Facebook. Drawing on firsthand knowledge, Haugen testified to the lack of action being taken by Facebook and its parent company Meta. Haugen didn’t hold back, calling out Facebook’s products for harming children, stoking division, and weakening our democracy, going so far as to call the problem “one of the most urgent threats to the American people, to our children and our country’s well-being, as well as to people and nations across the globe.”
Haugen joined Facebook in 2019 as the lead product manager for Civic Misinformation and, later, Counter-Espionage. To call her testimony damning would be an understatement. Haugen claims that Facebook frequently encountered conflicts between its own profits and the safety of its users, and “consistently” resolved those conflicts in favor of profits. Further, she provided documents to Congress showing that Facebook has repeatedly misled the public, the US government, its shareholders, and governments around the world about its role in spreading hateful and polarizing messages, the safety of children, and other matters.
Haugen also called out the secrecy at Facebook. No one, not researchers, not regulators, not even Facebook’s own Oversight Board, knows how Facebook’s algorithms personalize the news feed. She likened the situation to a Department of Transportation that could only watch cars drive down the road, without being able to ride in a car, conduct safety tests, or even know how or why a seat belt should exist. Like actual highways, the information superhighway is traveled by over 80% of Americans, and as Haugen pointed out, that many people riding in a largely unregulated vehicle has proved dangerous.
Among the most vulnerable groups of social media users are children and teens. Their brains are still developing, soaking up the images and words they see on sites such as Facebook, Instagram (both owned by Meta), and other social media platforms. Excessive screen time and social media addiction are on the tips of the tongues of parents of today’s adolescents, and it’s no secret that excessive screen time affects mental health.
As one example, NPR reported that 17% of teen girls say their eating disorders got worse after using Instagram. An internal slide presentation from March 2020 found that 32% of teen girls say that when they felt bad about their bodies, Instagram made them feel worse. Social media usage and addiction have been linked to everything from body image issues and self-harm to depression and even suicidal ideation.
According to the Wall Street Journal, another internal slide, from 2019, shows that Facebook knew it was the cause of these problems: “We make body image issues worse for one in three teen girls… teens blame Instagram for increases in the rate of anxiety and depression.” Another presentation showed that among teens who reported suicidal thoughts, 13% of British users and 6% of American users traced the desire to kill themselves to Instagram.
Other internal Facebook documents show that Meta not only knew of the problem but made minimal efforts to address it while downplaying it publicly. At a 2021 Congressional hearing, Meta CEO and Facebook founder Mark Zuckerberg claimed, “the research that we’ve seen is that using social apps to connect with other people can have positive mental-health benefits.” Yet those internal documents and presentations show that Meta knew the opposite to be true.
In fact, an internal study found that social comparison is worse on Instagram than on other social media platforms. This research was reviewed by Facebook executives and was even cited in a 2020 presentation by Zuckerberg. Yet publicly, Meta continues to downplay the detrimental effects of its products on society, taking little to no ownership of the problems those products have created and showing little willingness to do anything about them.
And the detrimental effects on children are not the only problems riddling Meta’s products and contributing to societal decay. Disinformation and misinformation on Facebook have been a growing problem for years. In the hours following the January 6th insurrection at the U.S. Capitol, Facebook’s Chief Technology Officer posted on an internal message board, “Hang in there everyone,” saying that Facebook should allow peaceful discussion of the riot but not calls for violence. Even among internal employees, however, the response was one of distrust. One employee commented, “How are we expected to ignore when leadership overrides research-based policy decisions to better serve people like the groups inciting violence today?”
According to ProPublica, between Election Day 2020 and the January 6th siege of the U.S. Capitol, Facebook groups hosted at least 650,000 posts attacking the legitimacy of the election, some 10,000 posts a day. Facebook became an unchecked incubator for baseless claims, including many that called for physical violence. Notably, Facebook had set up an internal task force to regulate groups with violent or hateful content, yet shortly after the election it dissolved the task force and rolled back other intensive enforcement measures. Haugen had just a few choice words for Facebook groups: “Groups are a disaster.”
Haugen also had this to say about Facebook’s impact on our global society: “Facebook did not invent partisanship. They did not invent polarization. They didn’t invent ethnic violence. But the thing that I think we should be discussing is what role, what choices did Facebook make to expose the public to greater risk than was necessary?” These are questions that still need answers, and Meta needs to be held accountable for the detrimental effects it is having on individuals and on society as a whole.