Ofcom sets out rules for TikTok, Snap and OnlyFans


Video streaming platforms such as TikTok, Snap and OnlyFans must verify the age of their users and take down harmful and illegal content, including hate speech and material promoting terrorism, or face hefty fines, the UK media regulator Ofcom has warned.

The watchdog’s new guidelines offer clarity on the Audiovisual Media Services Regulations, which came into effect last November and allow Ofcom to levy penalties of up to 5 per cent of relevant turnover, or £250,000.

But the rules will apply only to video streaming services with a regional headquarters in the UK, and therefore exclude YouTube, Netflix and Disney Plus. Regulation of YouTube content, for instance, falls to the Irish authorities, where its European headquarters are based, Ofcom said.

The regulator said that over the next 12 months, it would focus on regulating highly sensitive areas such as child sexual abuse, online hate and terror, under-18 protections on adult sites, and processes for users to report harmful content on platforms. It will not adjudicate individual pieces of problematic content, but will ensure that the companies themselves are transparent and consistent in how they deal with illegal content on their platforms.

The guidance comes ahead of the UK’s Online Safety Bill, currently being debated in Parliament, which aims to crack down on hate speech and bullying. Ofcom said it is hiring staff across its technology and policy teams to cope with its new responsibilities.

Dame Melanie Dawes, Ofcom chief executive, said: “Online videos play a huge role in our lives now, particularly for children. But many people see hateful, violent or inappropriate material while using them. The platforms where these videos are shared now have a legal duty to take steps to protect their users. So we’re stepping up our oversight of these tech companies, while also gearing up for the task of tackling a much wider range of online harms in the future.”

The guidance does not press video platforms to roll out major product changes, such as new age verification technology, as Ofcom concedes consumers could simply switch to services headquartered outside the UK. The regulator said it hopes to co-ordinate these requirements with counterparts in Europe and elsewhere, so that a globally consistent regulatory framework emerges in future.

Analysts said the regulation and Ofcom guidelines were not “meaningful” and were “inherently set up to fail” because they do not address the biggest player in the video streaming industry, YouTube.

“One thing we know about how national governments manage the digital environment is they are invariably late to the party, they are trying to solve yesterday’s problems. But regulation needs to be fit for purpose,” said Tim Mulligan, lead video analyst at digital media research company Midia Research. “All those players are acutely aware of the negative impacts of illegal content and a lot of work is already being done to regulate this.”

He added: “The bigger question is how much time young people should be spending on these platforms and what is an ethical way to monetise this.”
