The whistle-blower who revealed thousands of internal Facebook documents urged Congress on Wednesday to change social media’s business incentives so platforms are forced to consider long-term harms to users and society, rather than just short-term profits.
Speaking before the House subcommittee on technology, Frances Haugen, a former Facebook product manager, said the best strategy for legislation would focus on product design and transparency to “open up the black box at Facebook.” She warned against rules that target specific kinds of content.
But the urgency with which she and other witnesses called for tech industry guardrails was met with partisan interpretations of the nature of the problem. Despite bipartisan support for tougher regulation, there was little agreement about how -- or if -- Congress should change the legal liability protections for online platforms under Section 230 of the 1996 Communications Decency Act.
Most Republicans focused their questions on content moderation decisions that they claim silence or downplay conservative viewpoints, while Democrats criticized the way online platforms enable discrimination against and harassment of marginalized groups.
“Facebook wants you to get caught up in a long, drawn-out debate over the minutiae of different legislative approaches,” Haugen said in her opening statement. “Please don’t fall into that trap. Time is of the essence.”
Wednesday’s discussion included four Democratic bills that would chip away at Section 230 liability protections for online platforms. Energy and Commerce Chair Frank Pallone, a New Jersey Democrat, said the goal of that legislation is to clarify this statute, written in 1996 for a much earlier version of the internet.
“These targeted proposals for reform are intended to balance the benefits of vibrant free expression online while ensuring that platforms cannot hide behind Section 230 when their business practices meaningfully contribute to real harm,” Pallone said.
After the hearing, a Meta spokesman said the company is constantly balancing free expression and limiting harmful content. Facebook has been asking Congress to update tech regulation for three years, the spokesman said, adding that it’s no surprise that Republicans and Democrats disagree with the company’s decisions, when they often disagree with each other.
| Bill | Title | Summary |
| --- | --- | --- |
| H.R. 2154 | “Protecting Americans from Dangerous Algorithms Act” | Would remove Sec. 230 protections if a platform’s algorithms amplify certain content involving terrorism or violating civil rights |
| H.R. 3184 | “Civil Rights Modernization Act of 2021” | Would amend Sec. 230 to ensure that civil rights protections apply to targeted ads |
| H.R. 3421 | “SAFE TECH Act” | Would remove Sec. 230 protections for paid content, civil rights violations and cyber-stalking; would shift the burden of proof for claiming Sec. 230 protections |
| H.R. 5596 | “Justice Against Malicious Algorithms Act of 2021” | Would remove Sec. 230 protections when a platform “knowingly or recklessly uses an algorithm” to spread content that causes serious injury |
Rashad Robinson, president of online racial justice organization Color of Change, said all four of the bills are “essential for reducing the tech industry’s harmful effects on our lives,” and he urged Congress to not leave tech companies to regulate themselves.
“Congress is rightly called to major action when an industry’s business model is at odds with the public interest -- when it generates the greatest profits only by causing the greatest harm,” Robinson said in his prepared remarks.
While Haugen didn’t weigh in on specific bills, she warned against proposals to regulate content on a platform like Facebook that has nearly 3 billion users in roughly 140 languages. She said rules would be more effective if they address a platform’s design and business incentives.
“I do not support removing 230 protections from individual pieces of content because it’s functionally impossible to do and have products like we have today,” Haugen said.
Despite the technical challenges and partisan differences, James Steyer, head of the family-focused nonprofit Common Sense Media, who also testified Wednesday, said he expects Congress to coalesce around measures to improve privacy, hold platforms accountable and address the concentration of industry power in the hands of companies like Facebook, Amazon.com Inc., Apple Inc. and Alphabet Inc.’s Google.
Next year “is going to be the year of tech legislation and regulation,” Steyer said in an interview before the hearing. “It’s not about Democrats and Republicans. It’s about America’s kids and families and our democracy and our sense of right and wrong.”
The internal documents that Haugen shared with journalists, Congress and securities regulators over the past several months suggest that leaders of Facebook and its parent company, now known as Meta Platforms Inc., were aware of the mental health and societal risks posed by its platforms. The revelations have sparked renewed outrage in Washington, where Facebook was already encountering antitrust scrutiny and criticism over the spread of misinformation concerning the coronavirus pandemic and last year’s election.
Lawmakers on Wednesday said those documents contradict what Meta chief executive Mark Zuckerberg told the very same subcommittee in a March hearing about the risks his company’s platforms pose to young people and users around the world.
Facebook has dismissed Haugen’s allegations, saying she didn’t work directly on some of the issues she raised, even though she provided Congress with relevant documents. The company has also pushed back on claims that its platforms drive polarization and has pointed to investment in content moderation and automated systems to identify harmful information.
Haugen has met with other congressional committees behind closed doors, including the House Intelligence panel, according to a committee aide. Other panels also have access to the trove of documents she copied before leaving Facebook.
Haugen, who worked on Facebook’s Civic Integrity and Threat Intelligence teams, has promised to be a resource for lawmakers investigating how Meta’s platforms could have been used to undermine national security and public confidence in democratic elections.
Responding to questions from Illinois Republican Adam Kinzinger, Haugen said Facebook has “chronically underinvested” in counter-terrorism measures, allowing the platform to be used as a tool for illicit activity. She also repeatedly pointed out that the vast majority of Facebook’s investment in content moderation goes to English-language material, leaving users who speak other languages more vulnerable to harmful information.
Facebook wasn’t the only company criticized during the hearing. Haugen and Kara Frederick, a Heritage Foundation research fellow and former Facebook employee, both warned about the dangers of TikTok, whose parent company, ByteDance Ltd., is based in Beijing. Haugen said TikTok “is designed to be censored” because the platform automatically serves up short videos, a model that gives users far less choice about what they see. TikTok, which is popular even with pre-teens and children, could be more addictive than Instagram, Haugen said, adding that there’s “nowhere near enough transparency about how TikTok operates.”
Lawmakers also raised concerns about how social media algorithms, including those used by Snap Inc., connect young people with sexual predators and illegal drugs. Both Democrats and Republicans said Section 230 shouldn’t protect tech companies from people seeking legal redress for real-world harms caused by the platforms’ design.
But even members of the subcommittee recognized the daunting task of turning that shared outrage into legislation.
“We definitely have to figure out what to do, and you all gave us food for thought,” Oregon Democrat Kurt Schrader told the witnesses. “The hard part is getting to the solution at the end of the day.”
To contact the reporter on this story:
Anna Edgerton in Washington at email@example.com
To contact the editors responsible for this story:
Sara Forden at firstname.lastname@example.org