The Online Safety Bill: will it make the UK the "safest place in the world to be online"?

Posted on 12 April 2022

On 30 March 2022, we hosted an in-person event to discuss the Online Safety Bill. The panel comprised Dame Margaret Hodge MP, Labour MP for Barking; Professor Lorna Woods, Professor of Internet Law at the University of Essex; Mark Bunting, Director of Online Safety Policy at Ofcom; and Sanjay Bhandari, Chair of Kick It Out. The discussion was chaired by Harry Eccles-Williams, Managing Associate at Mishcon de Reya.

To review the key insights from the event, please watch the film.

We also set out below the key takeaways from the discussion.

The need for regulation

All of our panellists agreed that there was a need for regulation. Margaret spoke powerfully about her own experiences of horrific online abuse. Sanjay spoke to the real-world effects of online abuse, comparing it to "a mob of people marching into your living room and hurling abuse at you".

An overview of the Bill

Lorna, who devised the systems-based approach to content regulation that has been adopted by the Government, provided a concise overview of the Bill. Both Lorna and Mark explained that the Bill was a "framework" bill, which would need to be fleshed out by secondary legislation and codes of practice. This structure should build in agility – allowing the Bill to develop alongside the technology it is designed to regulate.

Lorna explained the Bill's systems-based approach: one that focusses on features and functionalities rather than playing "whack-a-mole" with individual items of content. Mark welcomed this approach, noting that the purpose of the Bill was to inculcate a proportionate, risk-based approach to regulation rather than to impose a standard of perfection on individual items of content.

Sanjay stressed the need to tackle the "all-engagement-is-good-engagement" system that propagates hate, noting that we must focus on the root causes of online hate so that the regulatory system is "curative" and not merely "palliative".

Key areas of tension

Freedom of speech

Margaret and Sanjay both challenged the idea that unregulated online platforms equate to freedom of speech, explaining that unregulated online abuse meant that people were often bullied and intimidated off online platforms, thereby reducing the plurality of voices online. As Sanjay put it: "freedom of speech is not freedom from reach and freedom from consequence".

Lorna noted that the new draft of the Bill builds in a countervailing obligation to protect freedom of expression, and doubted whether fears of platforms overzealously removing content would be borne out.

"Lawful but awful" content

Lorna, Margaret and Sanjay all expressed concerns about the vague category of "legal but harmful" content – criticising it as "quite problematic" and "unclear", "ridiculously complicated" and an example of "unhelpful nomenclature". On the final point, Sanjay suggested it would be better described as "harmful behaviour that had not yet met the criminal threshold".

Anonymity

Margaret made clear that she was focussed on stopping anonymous online abuse and not on stopping anonymity online (which can be vital for whistle-blowers and victims of child and/or domestic abuse). Margaret and Sanjay agreed that third parties, and not the online platforms themselves, should hold user information. However, they disagreed on whether the suggested optional verification system worked. Sanjay praised it as a "neat balancing of competing interests" and helpful for his constituents, who had no need to see online abuse. Margaret explained that it did not go far enough: she, and others in public life, wanted to know whether people were telling lies about them and thereby undermining their ability to participate.

The burden on smaller platforms

Mark stressed that a proportionate approach was appropriate and that smaller platforms would be under less pressure than the biggest and best-funded ones.

Margaret and Sanjay suggested that smaller platforms should not see regulation as an "optional extra" and should welcome the Bill as a new opportunity to build in safety by design.

The long road ahead

All our panellists were clear that, before the UK becomes "the safest place in the world to be online", there is much still to be debated, and that any legislation will need to be backed up with proper funding for the regulator.

Mark explained that achieving a safe online environment would not happen in two, or even five, years. Instead, this should be seen as the beginning of a generational change: one in which people begin to conceive of safety in a virtual environment in the same way as they do in a physical environment.

In conjunction with the event we have launched a webpage dedicated to the Bill, where we have uploaded key legislative documents and Mishcon de Reya commentary.

The Mishcon Academy Digital Sessions
Harry Eccles-Williams, Managing Associate, Mishcon de Reya

We’re having a debate about the Government’s recently published Online Safety Bill. We have four leading thinkers on the Bill, and on issues around online harms more generally, and we’re going to discuss whether the Bill fulfils the Government’s aim of making the UK the safest place in the world to be online and, if so, at what cost to business and to free speech.

There is little doubt that something needed to be done about online harms, and I think the individual and collective suffering they cause is becoming ever clearer; we’ll hear more about that later. The question really is whether the Government’s approach achieves these lofty goals, and what the unintended consequences might be for business, for freedom of expression and for society more broadly.

Professor Lorna Woods, Professor of Internet Law at the University of Essex

There is a main core of the Bill and then some bits that have got added on. So, looking at the main core, we’re looking at two types of services, user-to-user and search, and they have slightly different obligations; search is more lightly treated than user-to-user. The other two bits, before I forget, are fraudulent ads and, if I can call it that, professional pornography rather than user-generated. There is some specificity in relation to the duties as to what’s expected, so it’s not just down to the platforms, and the mitigation duty is called the safety duty. There are three types of safety duty, where what the Government has chosen to do is to identify three types of content and use them as a proxy for a measurement of the type of harm that’s being suffered by victims. So we’ve got criminal content, where there are obligations on all of the services; there’s the children’s safety duty, which relates to content that is harmful to children and applies to services that are likely to be accessed by children; and then the most contested category is content that is harmful to adults.

Dame Margaret Hodge MP, Labour MP for Barking

When we started with social media, I was a fantastically strong advocate of it, because I did think it could democratise our society and give access to people who had never participated in democratic debate in any way. One in five women in the UK have now been subject to online abuse, with Black women 84% more likely to be victims of online abuse than White women. Whether it’s what you learn about at the Children’s Trust, suicide rates or indeed self-harming among children, all that has made me think again. It is ridiculously complicated. I want anonymity online. I want people who are victims of domestic violence, whistleblowers, victims of child abuse – I want them all to be able to go online. What I want to try and stop is online abuse. This whole thing is going to blow up and be a nothing if there isn’t the money and the resources and the expertise to implement it. All my experience of this Government is that they focus on the legislation – I think again of all my experience on tax avoidance, tax evasion and money laundering and all those areas – but they just do not properly resource the enforcement bodies, as you will need to be, and the Police as well will have to enforce this. I go back to my own experience: in one two-week period I received 90,000 pieces of really offensive material – and I have it all the time; that was just one little period – and not one prosecution? There’s something wrong in the system, and part of that is the strength of the enforcement agencies.

Sanjay Bhandari, Chair of Kick It Out 

Technology is the most underregulated industry around. Of course the techies won’t tell you that, because no one likes to be regulated and they all bleat and moan, but people were doing that in financial services thirty, forty years ago. I come to it from football, focussed on discriminatory abuse aimed at the people who play, watch and work in the industry. Very often the high-profile stuff is towards Black players, but let’s be really honest: there’s so much misogynistic abuse, rape threats, that goes to female pundits and female players; there is tons of this stuff. The two things you don’t want to be on social media are a Black footballer or a female politician, or indeed any high-profile female at all. I hate the nomenclature: it’s not "legal but harmful", it’s "harmful not otherwise illegal". Right? Now I know it’s clunky, but as soon as you say "legal but harmful", that colours the debate, which is why it gets advocates of freedom of speech, quite rightly, irritated, because you assume, well, if it’s legal, why am I prevented from saying it? But it’s not; this is about stuff that’s harmful but hasn’t met a criminal threshold. There are lots of things that we do in life that create a legal consequence but are not criminal. People are creative, and if your focus is on whack-a-mole and industrialising the whack-a-mole – because we’ll appreciate that if it’s priority illegal content, you have to create systems that will minimise the potential for it to be viewed – then that’s saying we will industrialise or mechanise the whack-a-mole, but it’s still whack-a-mole; we’re not looking at the underlying systems.

Mark Bunting, Director of Online Safety Policy at Ofcom

I think this is a framework Bill. It’s a framework that’s going to take time to flesh out. I think the core pillars of the framework are robust and are very much aligned with the guidance that we’ve given to Government as we’ve gone along about how this regime can be made effective. I would pick out the duties on companies to comprehensively assess risks of harm, particularly for the biggest services but not only the biggest services; every service will have a duty to assess the risk of illegal content, notwithstanding the points that you have made about quite what illegal content means, and that’s a new obligation on companies that doesn’t exist today, backed up by Ofcom’s ability to enforce those duties and to carry out investigations into the effectiveness of services’ risk assessments. So that, I think, is powerful. I think the powers that are given to the regulator to assess and oversee the effectiveness of companies’ protective systems are new and welcome: both the ability to gather information ourselves – the duties and powers in the Bill about gathering information are very broad – and also to require the biggest services to be transparent about both what they are doing to keep users safe and the effectiveness of that action. I think those are strong. I think the emphasis on proportionality in the Bill is very welcome; it runs all the way through the safety duties, the emphasis on proportionate systems and processes. We haven’t talked about these concepts of journalistic content and content of democratic importance, which are called out in the Bill as something that platforms have to have particular regard to. It’s not very obvious what those terms mean, and I don’t think an independent regulator is the right body to define what constitutes journalistic content. So, there are some issues to work through there. There is, I think, a bit of an expectation that the Bill will pass and these duties will immediately bite and platforms will have to start doing things, and that’s not the way the Bill is set up. Until our codes of practice are finalised and laid before Parliament, the duties themselves don’t become enforceable, so we’re still probably two years away from being in a position where Ofcom has the powers to enforce against these things, and probably a year or more before the Bill passes and we’re empowered to start consulting on codes of practice.

Sanjay Bhandari, Chair of Kick It Out 

I think when we talk about freedom of speech, it’s because we have a model in our heads that social media is like Speaker’s Corner: that I’m just ranting into the ether, so it doesn’t matter. But if you’re on the receiving end of abuse, it’s not like that; it’s like having a mob of 400 people marching into your living room and hurling abuse at you while your family is next door, unable to do anything about it. The reality is that social media already is a place of public, semi-public and private spaces, and we should be regulating it in that way, but the freedom-of-speechers would have you believe that it’s just like Hyde Park Corner, and that’s just not right.

Mark Bunting, Director of Online Safety Policy at Ofcom

One of the areas where we would most expect to focus, and where the Bill does make reasonably clear what the expectations of platforms are, is precisely on this point about the gap between what platforms say they are doing – what’s in their terms and conditions about these things – and the reality of the experience of users on the ground. The Bill points to a number of ways in which platforms should be thinking about that better: the use of design, systems and processes which try to identify and prevent that abuse, not necessarily monitoring every item of content that goes up but, where accounts have been identified as perpetrating repeated abuse, there are means to take action on those. So there are those design choices, but also the mechanisms of complaint and the consideration by companies of user complaints. We know from our own research that there’s a very big gap between what platforms say they do in their terms and what recourse individuals have when they’ve had these experiences of abuse, to the extent that actually a lot of people don’t bother reporting abuse anymore, because the expectation is that nothing happens. So, those are things that platforms say they are doing, and yet there’s clearly robust evidence that it’s not working, it’s not effective. That, I think, is one of the areas where regulation can lean in very early on and say, well look, there’s clearly a gap here: what are you doing about that? Talk to us about what your process is for considering these complaints, and tell us what you are going to do to improve them. That’s not going to solve these problems overnight, but I think it does give us that leverage early on to start having those conversations.

Harry Eccles-Williams, Managing Associate, Mishcon de Reya

Do you think that you have the powers that you need to obtain information or should they be wider?

Mark Bunting, Director of Online Safety Policy at Ofcom

They are very broad, and the new Bill, as revised after legislative scrutiny, has made clearer some of those duties – or some of those powers, I should say – about our ability to carry out audits of services.

Professor Lorna Woods, Professor of Internet Law at the University of Essex

Some people are concerned that this is going to lead to companies taking content down right, left and centre. I’m not convinced about that. There are safeguards built in and there are also appeal mechanisms. I think improving the complaints mechanism, and an appeal mechanism on the back of that, would, I hope, help mitigate that risk.

Harry Eccles-Williams, Managing Associate, Mishcon de Reya

Continuing on the freedom of expression argument – I hinted at this in my opener – it seems that this is going to be where the pressure comes. Do you think the Bill is likely to pass in a form similar to the one it is in?

Dame Margaret Hodge MP, Labour MP for Barking

The minister who is now taking this through is very open and, in some ways, in listening mode. I think the Government are trying to satisfy both David Davis and Margaret Hodge, if you like, and I think they’ve run into real difficulty in trying to square that circle.

Harry Eccles-Williams, Managing Associate, Mishcon de Reya

Do we have real concerns on the panel as to the impact of this on business?

Sanjay Bhandari, Chair of Kick It Out 

Every fast-growing business and fast-growing industry experiences compliance lag, and they are reactive to bad press. After one of the particularly bad incidents of online abuse of footballers, someone from the Facebook compliance team talked about their quarterly transparency report, so I did an extrapolation from the data they provided. This was just abusive messages that were taken down, only on Facebook – so it didn’t include DMs on Instagram, which is the worst source within Facebook, and of course didn’t include Twitter. If I just took their data, there’s a piece of abuse taken down every second of every minute of every day, 365 days a year. Every second, at least one. That’s an industrial-scale technology problem; you are not going to answer that with whack-a-mole. It’s like the M1, right? When that was first built, we didn’t have lights, we had soft verges, we didn’t have hard shoulders, there were no crash barriers on the motorways, there were no speed limits. Did we say, oh well, you can’t have those safety features, and if you don’t like it, take the long road? We didn’t. We built the safety features in. This is about building those basic safety mechanisms into the way in which we communicate on the great information communication medium of our age. It’s just like putting in hard shoulders, crash barriers, lights so that we don’t have crashes in fog; that’s all we’re asking for, and that’s what this legislation does. Every country in the world is looking at this, trying to regulate the kind of abuse and content that we see on social media, and no one else is doing it yet; these companies have had a free pass for too long, so it’s good that we’re doing it.
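
As a rough sense-check of that one-per-second figure (an illustrative extrapolation from the claim above, not data given on the panel), the implied scale would be:

1 takedown/second × 86,400 seconds/day ≈ 86,400 takedowns/day
86,400 takedowns/day × 91 days/quarter ≈ 7.9 million takedowns/quarter
86,400 takedowns/day × 365 days/year ≈ 31.5 million takedowns/year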

Mark Bunting, Director of Online Safety Policy at Ofcom

Well, I think it’s a real milestone now that the Bill has been launched into Parliament. What we’re really pleased about is that it’s the first time that services will be subject to a comprehensive duty to assess the risks of harm arising from their platform, and that we as the regulator will be empowered to gather information from them and to understand how they’re keeping their users safe and how they think about managing the risks that they face. That’s the first time, really, that this industry has been subject to that comprehensive scrutiny and oversight, and we think that’s tremendously important in helping get to grips with these difficult issues.

Professor Lorna Woods, Professor of Internet Law at the University of Essex

If I’m taking a glass-half-full approach, I would say that it is based on a systems approach, a risk-assessment approach, and that is a sensible approach to take with regards to social media. If I’m going glass half empty, it’s very, very complex and I think there are a lot of questions still to be sorted out.

Sanjay Bhandari, Chair of Kick It Out 

I think the big risk or danger is that it still seems very focussed on content rather than systems. The whole of social media is built on algorithmic amplification; it monetises engagement, and it doesn’t care whether that engagement is good or bad, whether it’s positive or hateful. My worry is that if you’re focussed on the symptoms, which is content, and you’re not really focussed on the root causes, then the danger is that it’s going to be more palliative than curative.


The Mishcon Academy Digital Sessions.  To access advice for businesses that is regularly updated, please visit mishcon.com.

The Mishcon Academy offers outstanding legal, leadership and skills development for legal professionals, business leaders and individuals. Our learning experts create industry-leading experiences that deliver long-lasting change through live events, courses and bespoke learning.
