Facebook’s moment of regulatory reckoning


“It’s been quite a week,” wrote Mark Zuckerberg, Facebook’s chief executive. This is an understatement. On Monday, there was an outage of the company’s three main apps, freezing 3.5bn users out of their accounts for over six hours. On Tuesday, a former employee turned whistleblower alleged to Congress that Facebook knowingly prioritises profits from online engagement over harm to users. As embarrassing as the first event was for Facebook, it is the second that will prove the more consequential.

Frances Haugen, who worked at Facebook’s “civic integrity unit”, detailed to Congress how the company pushes content designed to elicit strong responses, enticing users to share posts and to linger on the platform, where they can be bombarded with advertising. That social media exploits division for clicks and makes teenagers feel worse about themselves are not new claims. What feels novel is the bipartisan consensus, which should not be squandered, that something ought to be done to curb the power of social media. The question is what.

Lawmakers have dubbed it Facebook’s “Big Tobacco” moment, akin to when the US decided to rein in cigarette manufacturers, despite the industry’s claim that its products did no serious harm. Rather than tobacco, it may be more apposite to think of sugar; indeed, that is a comparison one Facebook executive made in 2019. Sugar is known to be unhealthy, addictive and readily available. But it is not banned, even if America’s children consume far too much of it.

Tighter oversight now seems inevitable. Congress is considering options, from tightening privacy laws to strengthening the hand of antitrust watchdogs. Facebook — with evident self-interest — has tried to position itself to help shape new rules. Zuckerberg has broken ranks with his peers to back (partial) reform of Section 230 of the 1996 Communications Decency Act, which currently gives tech platforms immunity from being sued over user-generated content.

But the bigger issue is whether shoehorning the social media world into existing laws and regulations suffices. Antitrust and markets regulators may feel they have enough fertile ground: Haugen also complained to the securities watchdog, accusing Facebook of misstating key user metrics. She advocates forcing Facebook to turn over its algorithms that determine what content users see. That is a meritorious proposal that neatly sidesteps arguments over policing of free speech that bog down debates on whether content should be moderated more closely.

But with whom should Facebook be forced to share algorithms? A new sectoral watchdog ought to be considered. Social media are no longer disrupters to be shielded from regulation. They are part of a digital economy where personal data is traded for convenience. Yet they have no sectoral oversight, as applies to, for example, banking. It is notable that in the highly regulated area of finance that Facebook tried to enter, a co-ordinated response from policymakers and watchdogs around the world rightly forced it to rethink plans for a digital currency.

After Haugen’s testimony, Zuckerberg used a Facebook post to hit back at the “false picture” he claims was painted. He called for Congress to determine whether there should be a legal age for using the internet. Age limits, and how they are policed, are legitimate points for Congress to debate. But Haugen has raised a broader range of issues on which lawmakers should consider legislating. She is right that Facebook will not change without it.