Facebook whistleblower advocates for mandated transparency and accountability in social media

Frances Haugen, the Facebook data engineer who gained international prominence in 2021 as the whistleblower who disclosed tens of thousands of the social media platform's internal documents to the Securities and Exchange Commission and The Wall Street Journal, was a recent guest speaker at Fairfield University's Open Visions Forum lecture titled "Ethics, the Public Good, and the Challenge of Social Media."

Haugen spoke on the societal danger posed by social media platforms, particularly Facebook and Instagram, which she said combine a tight grip on online interaction with a singular focus on profiteering, all with little to no government oversight. Haugen had been the lead product manager of Facebook's civic integrity team when she became aware of how Facebook (now Meta Platforms) purposefully pushes its userbase toward divisiveness and extremism in order to ramp up user engagement and data collection, and thus maximize profits.

Frances Haugen during her 2021 congressional testimony on Facebook's business practices. Photo courtesy Fairfield University.

Haugen noted Facebook did not always encourage polarization and extremism. In 2008, she stated, "no one thought Facebook was going to destroy democracy – it was about our friends and our family. But the problem was Facebook needed to have us consume more and more content every single quarter. Facebook makes its money based on ads. The more content you consume, the more ads you see and the more ads you click on."

In its current iteration, Meta's platforms attract users with the prospect of connecting with friends and loved ones; those users, according to Haugen, are then inundated with and pushed toward extreme content that incites strong emotional responses, generating enough engagement and activity to satisfy Meta's financial goals.

Haugen gave examples of how this system works and its consequences, including instances when Facebook suggested users in Germany join Neo-Nazi groups. She also described how a newly created Instagram account – one that clicked daily on the first five pieces of content related to healthy eating – was soon steered toward the opposite.
“Within two weeks, we’re starting to see pro-anorexia content and pro-self-harm content,” Haugen revealed.

According to Haugen, Meta's reach and the harm it causes are even more pronounced in developing or "fragile" parts of the world. In such countries, prospective users are lured not only by the opportunity to connect with others through Facebook, but by access to subsidized internet. This tactic, Haugen submitted, allows the company to entrench itself in those societies even more deeply than in the U.S.

"They went into countries where almost no one was online and said, 'If you use our products, the internet is free. If you use anything on the open web, you're going to pay for it,'" Haugen said. "You and I can make choices to opt out of Facebook's products. But for those billion people, they don't get to opt out."

Meta's actions in creating a divisive environment were in great part responsible for causing and encouraging societal strife, Haugen argued, from ethnic violence in Ethiopia to the Jan. 6 insurrection on Capitol Hill. While figures at Meta such as Mark Zuckerberg and Nick Clegg, president of global affairs, claim they are aware of the divisions the company's current algorithms and systems cause, no real solutions are put forward because of the loss of profits that would result, according to Haugen. Instead, impractical solutions are offered, such as the use of AI.

"They said, 'Don't worry, we've invented this magical AI. The magical AI is going to take down all the bad things. It's going to take down hate speech, it's going to take down misinformation,'" Haugen said.

"The problem is when we ask, 'Is this hate speech, is this violence-inciting?' those things require context, they require understanding who is the speaker, who is the audience. AI is decades away, at the minimum, from being able to solve those problems."

Haugen believed that despite a culture which puts profits over public safety, Meta can change, or be made to change, through government-mandated transparency. Such standards, she argued, would protect citizens as well as companies, which would spend more time fixing issues and feel freer to invite outside parties and organizations to assist them, instead of allowing a volatile situation to deteriorate in secrecy.

"In a world where we have mandated transparency, there is less incentive to lie," Haugen said. "And that means we're actually going to have more effective companies."

During the lecture's discussion panel, entrepreneur Candice Peterkin, CEO and founder of SheIsArt Media, asked Haugen how one might remind users of Facebook and other social media that such platforms are concerned primarily with monetizing their personal connections. Haugen responded that users should ask themselves how they feel about the amount of time they devote to social media and how they use it.

David Schmidt, director of applied ethics and associate professor of business ethics at the Charles F. Dolan School of Business at Fairfield University, asked Haugen how she remains hopeful in the face of a venture as extremely difficult as reining in social media companies.

“Fatalism is a sign that someone is trying to steal your power,” Haugen responded. “Fatalism steals from us a chance to act, because you’re never going to actually inspire change if you don’t believe change is possible.”

Philip Eliasoph, professor of art history and visual culture in the university's College of Arts and Sciences and director of the Open Visions Forum, brought up remarks by Elon Musk after his recent acquisition of Twitter envisioning a "digital town square" where users are welcome to exercise their freedom of speech within legal limits, while promising the platform will not become a "free for all." Haugen answered that Musk represents an undercurrent of disaffected users who feel past moderation has been too heavy-handed.

“I have a feeling he’s going to run up against a point of friction,” she said. “Advertisers will not spend money on systems that are too hostile, and users don’t spend time on places that make them feel bad.”