It goes against the unwritten rules of the columnists’ guild to admit that there are some complex problems that defy simple solutions. But dealing with the toxic exhaust from Facebook’s social networks counts as one. With 2.8bn users, accounting for about 60 per cent of the world’s internet-connected population, the company has arguably become too big to run, let alone regulate. Yet a mass of messy half-measures can still help push social media in a better direction.
The damning testimony to the US Senate this week of Frances Haugen, the former Facebook product manager, provided further evidence that the company is damaging society and society needs to respond. “I believe Facebook’s products harm children, stoke division and weaken our democracy,” she told the hearing.
In spite of Facebook’s attempts to trash her credibility, Haugen made a powerful case. A computer scientist by training, she has worked at Facebook, Google, Pinterest and Yelp since 2006, and had access to reams of internal Facebook research, which she leaked to the Wall Street Journal.
Her most damaging charge was that the company’s leadership knew about the problems that Facebook and its photo-sharing app Instagram caused but chose to put their “astronomical profits before people”. In so doing, they misled both users and shareholders. Haugen is urging the Securities and Exchange Commission to investigate.
There appeared to be rare bipartisan consensus among senators at the hearing about the urgency of the issue and the necessity to intervene. Some drew comparisons between Facebook and tobacco and car companies, which all denied their products did serious harm until legislators concluded otherwise.
The senators should summon Facebook’s chief executive Mark Zuckerberg to respond to Haugen’s testimony and encourage more independent research into the impact of the company’s services and the ways in which its algorithms work. They should also support legislation to protect children, defend privacy and review free speech and antitrust laws.
Even Facebook accepts it is time to rewrite the rules of the internet in the US. “Instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act,” Facebook said.
The company may well be counting on its muscular lobbying team to steer legislation in a favourable direction. It may also believe that compliance costs will fall more heavily on nascent competitors, enabling Facebook to dig an even deeper moat around its business.
This latest row, and the outspokenness of its own employees, will surely intensify pressure on Facebook to reform itself from within. To its credit, Facebook conducted the research into the potential harms of its services — even if, in Haugen’s view, it did not act on it with sufficient vigour.
In recent years, the company has invested heavily in AI-powered systems to flag harmful content and it has hired 40,000 content moderators to take down offensive posts. Its creation of an independent oversight board also attempts to introduce some accountability into its decision-making process around content — even if the board’s remit is too narrow. It has suspended its plans to extend Instagram to younger children.
Yet Facebook is like a baby trying to grasp a soapy basketball. The company has no chance of ever fully getting to grips with toxic content given its monetisation model and global scale. How can Facebook monitor the deluge of noxious content in dozens of languages and cultures it does not understand? In Myanmar, and elsewhere, the company stands accused of allowing its services to be used to incite ethnic violence.
The best hope of constraining the company may lie in more imaginative competition and more local networks. Breaking up the Facebook parent might solve nothing if the “baby Facebooks” operated in the same way.
But, as Haugen suggested, it is possible to design a more responsible social network that treats users as co-creators rather than products. Jaron Lanier, the maverick technologist, argues this could be done by giving users more control over the content they produce and a greater financial stake in the game.
If Facebook cannot build a more trustworthy social network then it is a fair bet that someone else will figure out how to do so. A huge market opportunity has opened up to build one that prioritises users rather than advertisers. To invert Silicon Valley speak, Facebook has become a maximum viable product. It is time to invent new, and better, social networks.