Frances Haugen, a Facebook whistleblower whose revelations have pitched the social network into its biggest crisis since the Cambridge Analytica furore, will make her first appearance before US senators in Washington on Tuesday.
The hearing comes after she unmasked herself on US television on Sunday night as the source who provided thousands of pages of internal documents to the Wall Street Journal that revealed frequent inconsistencies between Facebook’s public statements and its private research.
Haugen said the documents show the Silicon Valley company repeatedly prioritised “profit over safety”. She has also filed complaints to the US Securities and Exchange Commission over whether Facebook misled investors.
The company’s shares have lost some 13 per cent of their value since the WSJ began publishing its series in mid-September, including a further fall of nearly 5 per cent on Monday morning on Wall Street.
Haugen’s 60 Minutes interview on Sunday and her testimony before the Senate subcommittee are only the beginning of a roadshow that will see the 37-year-old data scientist appear before UK politicians and at the Web Summit event in Lisbon over the coming weeks.
Here are five questions that will be top of mind as she makes those public appearances.
1. What is Facebook doing to children?
The Senate hearing on Tuesday will focus on protecting children online, and politicians will want to hear more about Facebook’s efforts to tap younger users.
Facebook has argued that it is trying, through controversial projects such as Instagram Kids, to help make it safer for children to take their first steps online.
But Haugen’s disclosures suggested that Facebook viewed younger users as an untapped source of growth, envisaging ways to “leverage play dates” to hook a new generation of users, while glossing over its own findings about the impact of its platforms, such as Instagram, on teenagers’ mental health.
“Facebook’s own research says it is not just [that] Instagram is dangerous for teenagers, that it harms teenagers, it’s that it is distinctly worse than other forms of social media,” Haugen told 60 Minutes.
Facebook has disputed her presentation of its research but last week “paused” development on Instagram Kids to incorporate feedback from policymakers and parents.
2. What happened to Facebook’s ‘civic integrity’ unit?
Before Haugen left Facebook in May, she was working on “civic integrity”, a team dedicated to fighting disinformation and other attacks on democratic elections. There were fears in particular that the fraught political climate around last year’s US elections could spill over into real-world violence.
But after Joe Biden was elected, Haugen said, Facebook executives decided it could “get rid of civic integrity now” — a mistake that she argued paved the way for the Capitol Hill riots in January.
That decision, she told 60 Minutes, was the moment she decided she could not “trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous”.
Facebook executives say the civic integrity team’s work continues under a new guise, as part of a wider strategy to contain misinformation, while disputing that the social network causes polarisation.
3. How can Facebook improve content moderation?
Facebook has invested billions of dollars over the past five years to tackle abuse across its apps, recruiting some 40,000 content moderators working to police its vast network.
When Mark Zuckerberg, Facebook’s chief executive, appeared before Congress earlier this year, he also revealed its reliance on automation. Machine learning, a form of artificial intelligence, is responsible for spotting more than 95 per cent of the hate speech that the company takes down, he said.
But even this may not be enough.
According to Haugen, Facebook was unable to contain anti-vaccination content despite Zuckerberg making it a priority. Internal Facebook memos earlier this year described a “huge problem” and “cesspools of anti-vaccine comments”, according to the WSJ. The company has argued the selected documents mischaracterise its “routine process for dealing with difficult problems”.
4. Are Facebook’s algorithms making things worse?
Even when Facebook claims its technology is helping, it can have the opposite effect. Haugen argued that many of the group’s challenges with polarisation stem from changes to its news feed algorithm in 2018 that prioritised posts from friends and family and played down professional publishers’ posts. Zuckerberg promised at the time that the changes would create “more meaningful interactions”.
Haugen told 60 Minutes that Facebook’s own research shows divisive content is more engaging. “Facebook has realised that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money,” she said, adding that after the 2020 US election the company chose to “prioritise growth over safety”. The company has denied the allegations, saying it continues to improve its algorithms to reduce “click bait”.
5. Can anything be done to fix Facebook?
Haugen told the WSJ: “If people just hate Facebook more because of what I’ve done, then I’ve failed.” Her personal website declares that, with “public oversight . . . we can have social media we enjoy that brings out the best in humanity”.
Facebook launched its own Oversight Board a year ago, which the company says is an independent group acting as a “supreme court” for speech. But many politicians in the US and Europe believe self-regulation does not go far enough.
Haugen’s disclosures have pinpointed problems that Facebook’s critics have complained about for years. After diagnosing the challenges, her opportunity in the coming weeks is to offer solutions.