UK’s internet watchdog unveils crackdown on online crime
The UK’s communications regulator, Ofcom, has given social media giants such as Facebook parent Meta and TikTok owner ByteDance a three-month deadline to address illegal activity on their platforms.
The regulator said it will leverage powers granted to it under the UK’s Online Safety Act to introduce rules to combat criminal harms, including terrorism, fraud, hate speech, child sexual abuse, and the encouragement of suicide.
The new safety requirements will apply to various types of online services, including social media platforms, search engines, messaging apps, gaming and dating platforms, as well as pornography and file-sharing sites.
Companies have until March 17, 2025, to implement the safety measures.
Among the required changes, firms must designate a senior leader within their top governance team who is responsible for compliance with the illegal-content rules and for the reporting and handling of complaints.
The codes also require tech firms to ensure their moderation teams are appropriately resourced and trained. That means setting performance targets so illegal material is removed swiftly, making reporting and complaints functions easier for users to find and use, and adjusting algorithms so that illegal content is harder to spread.
Child safety online
The new codes also aim to enforce measures to protect children from sexual abuse and exploitation online.
This will mean platforms should ensure children’s accounts and locations are not visible to users other than their friends by default.
Children must also receive information from the platforms to educate them on the risk of sharing personal information, and children’s accounts should not be suggested as connections.
The online watchdog quotes children aged 14 to 17 who say they have received messages asking for bikini photos in exchange for money, as well as other unwanted approaches.
“I don’t want my siblings to go through what I did on social media. I feel happy about these measures because I know that my sisters and siblings would feel safe,” said one girl, aged 14.
Another 14-year-old added: “[This will be] effective because no more strangers can be added, there are no more creeps sending things, and it will decrease grooming.”
According to an Ofcom study, many young people felt that interactions with strangers, including adults or users perceived to be adults, were an inevitable part of being online; they described becoming ‘desensitised’ to receiving sexualised messages.
Fraud and terrorism
Ofcom also aims to tackle fraud by ensuring sites and apps establish a dedicated reporting channel for organisations with fraud expertise.
The regulator said this would allow those organisations to flag known scams to platforms in real time so that action can be taken.
The codes also require sites to remove users and accounts that spread terrorist content.
“For too long, sites and apps have been unregulated, unaccountable and unwilling to prioritise people’s safety over profits,” said Melanie Dawes, Ofcom’s chief executive.
“The safety spotlight is now firmly on tech firms and it’s time for them to act. We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year,” she added.
The UK Parliament gave Ofcom 18 months from the passage of the Online Safety Act on October 26, 2023, to finalise its illegal harms and children’s safety codes of practice and guidance.