- Facebook whistleblower Frances Haugen on Tuesday described research she says shows the company prioritised profit while stoking division, undermining democracy and harming the mental health of its youngest users.
- Facebook has pushed back on some of Haugen's allegations and disputes that the research she shared from Instagram shows that the app is toxic to teens.
- There are several proposals for legislation that would address the kinds of harms revealed in the Facebook documents shared by Haugen.
The Facebook whistleblower who revealed internal research documenting societal and mental-health risks told senators the company is aware of the problems with its platforms but wants Congress to believe that they are too difficult to fix.
Frances Haugen, 37, a former product manager at Facebook, testified Tuesday before a panel of the Senate Commerce Committee, describing research she says shows the company prioritised profit while stoking division, undermining democracy and harming the mental health of its youngest users. Haugen shared Facebook's internal studies with the Securities and Exchange Commission as well as the Wall Street Journal.
"I saw Facebook repeatedly encounter conflicts between its own profits and our safety," Haugen said. "Facebook consistently resolved those conflicts in favour of its own profits. The result has been more division, more harm, more lies, more threats, and more combat."
Haugen described her work on the social media company's civic misinformation team and her growing disillusionment with its unwillingness to address the problems its own employees identified. She urged Congress to think outside "previous regulatory frames" and focus on requiring greater transparency from the social media platforms that hold tremendous power over public discourse.
Senator Richard Blumenthal, the chairman of the subcommittee holding the hearing, described the Menlo Park, California-based company as "morally bankrupt" and said the impact of its platforms will "haunt a generation." He sharply criticised Facebook Chief Executive Officer Mark Zuckerberg for not taking responsibility for the harm his company caused and said Zuckerberg needs to appear before the committee to answer for the conclusions from his company's internal research.
"Facebook knows its products can be addictive and toxic for children," Blumenthal said. "They value their profit more than the pain they caused children and families."
Senator Marsha Blackburn of Tennessee, the subcommittee's top Republican, sought to identify discrepancies between Haugen's testimony and that of Antigone Davis, Facebook's global head of safety, who appeared before the same Senate panel last week. Blackburn highlighted the risks for children and the way human traffickers are able to use Facebook's platforms.
"Big tech companies have gotten away with abusing consumers for too long," Blackburn said.
Facebook has pushed back on some of Haugen's allegations and disputes that the research she shared from Instagram shows that the app is toxic to teens. Facebook has argued that while some teens say the app makes mental health issues worse, many others say the app makes those same issues better, and that the company is doing the right thing by researching these issues.
Facebook has pointed to its investment in safety and security as proof that the company takes its largest problems seriously, such as fighting misinformation and hate speech. The company says it has 40,000 employees working on safety and security teams, and claims it has spent more than $13 billion on safety and security efforts since 2016.
The hearing came a day after Facebook, its photo-sharing platform Instagram and its messaging service WhatsApp experienced rare global outages, and two days after Haugen revealed her identity in an interview with CBS's "60 Minutes". The company's shares rebounded Tuesday after falling 4.9% Monday.
There are several proposals for legislation that would address the kinds of harms revealed in the Facebook documents shared by Haugen. Lawmakers are considering bills to increase transparency in social media algorithms, change liability protections for online platforms, strengthen privacy protections and even break up the biggest tech companies.
While there is bipartisan outrage about Facebook's alleged abuses, there has been little movement on legislation to tighten tech regulations.
- With assistance from Kurt Wagner.