The Government has announced today that it plans to extend the powers of the existing communications regulator, Ofcom, to oversee new legislation against online harms. On the basis of its "proven track record", Ofcom will have new – but as yet undefined – powers, including the power to enforce a statutory 'duty of care' on global media companies such as Facebook and Twitter which host user-generated content. The stated aim is to promote online safety without placing a "disproportionate" burden on businesses. Platforms will need to "minimise the risks" of illegal content appearing in the first place, as well as ensure that content which does appear is removed quickly. They will also be expected to take particularly robust action in relation to terrorist content and online child sexual abuse.
Although the Government will "set the direction" through legislation, the regime will be flexible enough to allow Ofcom to monitor and tailor its enforcement to emerging harms. The Government has also promised to protect freedom of expression by allowing adults to access and post lawful content, including content which some might find offensive. Importantly, the regulator will not investigate or adjudicate on individual complaints. Instead, companies will be required to state explicitly what content and behaviour is acceptable on their sites, and to enforce those terms and conditions "effectively, consistently and transparently".
The announcement marks the Government's first response to the consultation launched in 2019 alongside the draft White Paper on Online Harms, which envisaged an escalating range of powers for a new regulator against companies which breach their duty of care, including powers to issue "substantial" fines, disrupt the business activities of non-compliant companies, and impose liability on individual members of senior management. The Government's full response is due in the spring, including further details of Ofcom's powers.
Emma Woollcott, Head of Reputation Protection, commented: "Regulating the behaviour of global corporations will always be challenging, but essential if we are to ensure that platforms take greater responsibility in exercising the enormous power they wield. The prospect of meaningful sanctions when platforms fail to properly protect users should drive greater investment in transparent complaints processes and shorter response times."
Alexandra Whiston-Dew, Managing Associate, added: "Tougher obligations on companies will be welcomed by users who have for a long time struggled to secure a quick and effective response to serious complaints, particularly from the major platforms that have been largely shielded from liability by being based abroad."