
The latest News Session: fake news, online hate speech & the regulation of social media

Posted on 31 July 2019


 

Voice over
Welcome to the News Sessions from Mishcon de Reya, hosted by Hayley Geffin – a conversation on key legal matters that affect you and your business.

The News Sessions Podcast with Hayley Geffin

 

Hayley Geffin
Head of Communications

Hi, I'm Hayley Geffin, Head of Communications at Mishcon de Reya, and you're listening to the News Sessions, where we look at key areas of law that are hitting the headlines. Today we are talking about social media – a pretty broad topic, but we are focussing on the regulation of social media and what we can all do to protect ourselves online.

Here with me today are Emma Woollcott, Head of our Reputation Protection Department, and Reputation Lawyer Alexandra Whiston-Dew. So, social media. I think that anybody reading the news – whether you are an individual, a concerned parent or someone in a business – thinks, 'oh my gosh, this is the wild west. There are no rules, this is a nightmare.' People are looking for law. Emma, is there law when it comes to social media?

 

Emma Woollcott
Head of our Reputation Protection

Yes, there is a plethora of laws and regulations that apply to social media. The tricky thing is that social media operates across various jurisdictions and in such a broad range of situations that it is about trying to work out which rules apply in which situations. Social media companies have their own terms and conditions and their own user rules, so in effect they get to write their own rule book but also act as judge and jury. That is potentially the issue, and that is why in April the Government published an Online Harms White Paper setting out an ambitious vision for improving online safety and trying to provide a new regulatory framework to tackle a broad range of online harms.

 

Hayley Geffin 
Head of Communications

So Alex, do we think the problem is that there are too many rules that don't all speak to each other?


Alexandra Whiston-Dew
Managing Associate

That's part of it. I also think that it is really difficult for both sides – users and the social media platforms themselves – to work out what to apply and when, and to draw that line, as Emma was saying, as judge and jury: between freedom of expression, which the platforms obviously want to cultivate and which is what their users go there for, and looking after their users by dealing with what is unlawful, but also with content that isn't appropriate for a platform's various users. So, for example, abusive content, or something that is not appropriate for the particular age range of the audience.

 

Emma Woollcott 
Head of our Reputation Protection

They are in a very difficult position, because we are asking commercial businesses to make quite sophisticated judgment calls about what's harmful, what's lawful and what's unlawful. If you were concerned about a newspaper article being defamatory, you would sue in defamation – you would issue proceedings. The decisions around the meaning of defamatory content would be determined by judges, and there is a passage of time. In social media terms, everything happens really quickly. We expect social media platforms to act on and respond to complaints in real time, so the onus is on platforms to have appropriate abuse channels to respond to complaints really quickly – but also to make judgment calls around not just what's unlawful but what's harmful, and there is no universal definition of what can cause harm.

 

Alexandra Whiston-Dew
Managing Associate

Yeah, I think that's absolutely right. What is also interesting is whether what is deemed to be harmful offline is the same as what is deemed harmful online. That kind of cultural relativism, or cultural palatability, of what is acceptable changes and shifts. We look for certainty in our law, but do we actually need something more fluid when we are dealing with fast-paced cultural developments online, in the media and in the schoolroom?

 

Hayley Geffin 
Head of Communications

Do you think that some social media companies, when faced with the challenges we are discussing, are saying it is not our problem, or are they prepared to accept that some responsibility does sit at their door? We are talking about Facebook, Twitter, Instagram, Snapchat and so on.

 

Emma Woollcott 
Head of our Reputation Protection

They want a bit of both worlds. Social media platforms have had to accept that they are responsible – that with this great power comes great responsibility. Traditionally they have been defined in law as hosts of content rather than publishers of content, because they are the notice board to which notices are pinned rather than the authors of them. But they appreciate that they have the ability to unplug, to deactivate, to close accounts. There has traditionally been a reluctance to be actively involved in the policing of what happens on platforms, but over time the platforms have had to accept that it is in their interests to make sure that harmful content isn't hosted by them. There has been a shift, I think, in the last couple of years towards that responsibility being taken seriously, because users have become more concerned about how their data is being shared and how responsive the platforms are to reports of abuse.

 

Hayley Geffin 
Head of Communications

So Emma, what kind of advice would you give to clients to protect themselves online?
 

Emma Woollcott 
Head of our Reputation Protection

I think it pays for everyone to be quite mindful about what they share, the information they hold dear and the impression they want to leave online. Sometimes the tone of conversations makes people put their guard down, and they say things in the heat of the moment that they wouldn't if they were facing a real human being, or if they realised they were leaving an indelible mark on their reputation forever. So, before something kicks off, the advice tends to be: be mindful about what you are putting on social media, what sort of stories you are engaging in, and the images and information you are sharing; think through the worst-case scenario and try to avoid it happening. When tricky situations blow up, the mob starts to circulate and stories get quite heated, the advice tends to be to listen and then react decisively rather than just to react; to make sure that if there is concerning harmful content you report it promptly; and to think about the channels on which you're providing information. I think there is often a misunderstanding that the internet is the wild west and social media companies won't respond to concerns. Actually, we have managed to get Twitter accounts deactivated quite quickly, and Facebook can be very responsive when we need to shut down discussions. They tend to respond much more swiftly if they understand that there are breaches of their own terms and conditions taking place, or if the content is actually criminal. What is also overlooked and not appreciated is that these social media platforms harvest quite a lot of data about their users, so a client who has had confidential data misused, or has been attacked or bullied online, is able – if they act quickly enough – to go to the social media platforms and, through the Court, apply for disclosure of information that can help them track down their attackers.

 

Hayley Geffin 
Head of Communications

So it sounds, from what you are saying, that when it comes to abuse and online hate the companies can potentially be quite fleet-footed in dealing with it. When we are talking about fake news, is that still the case, or is that much murkier water?

 

Emma Woollcott 
Head of our Reputation Protection

The difficulty with fake news is that you've got to demonstrate both that it's false and the harm in it being circulated. I think most readers know that a lot of what they read online is not as credible as what they read in the broadsheets, and so fake news circulates and is sometimes relied on, sometimes damaging, but often dismissed. The real challenge is being able to quickly articulate not only that something is false but also that it is going to cause exponential damage if left circulating. The social media platforms are already dealing with damaging false stories, but understandably, in most cases, they require evidence that what's being peddled is false, and they often need convincing that it is their problem and something that they should deal with.

 

Alexandra Whiston-Dew
Managing Associate

I think you can't lose sight of the fact that social media platforms are businesses. They make money out of tracing lines of argument, or trends, online, and out of people being able to access content that they want to read. Being able to funnel that content to individuals in a way that keeps them clicking on links, images or articles is something we have to recognise makes them money. This is where the amplification of false stories, or the channelling of bias or extreme views to individuals who then get funnelled towards a different viewpoint, is something that parents, lawmakers and we as lawyers worry about, because it is very hard to challenge, and the algorithms that are very carefully and cleverly devised by social media platforms are deployed in a way that enhances that experience. So although I agree that it is difficult for social media platforms to address fake news, or the disproportionate focus on bad news or good news that creates that bias or echo chamber, they also have to recognise that they are making money out of that commercial focus, and it is something they have to be accountable for if it is creating a disproportionate effect on their user base.

 

Hayley Geffin 
Head of Communications

So lots of reports are circulating that people are actually leaving these platforms in droves – Facebook particularly, I think – and maybe that is because they are reading the terms and conditions and thinking, actually, no, I am not prepared to opt into that. But are there other reasons people are leaving?

 

Emma Woollcott 
Head of our Reputation Protection

I suspect that, of the quarter of the planet that is on Facebook, very few have read the terms and conditions; I doubt that's the reason they are leaving. There is research suggesting that women and non-white people are harassed on social media disproportionately and extremely. In fact, I think Amnesty International said that a woman is harassed on Twitter every five seconds, or something similarly awful – a terrible stat. I am concerned that, because social media platforms have to police abusive content, they focus first on racist and homophobic content – stuff that is easier to filter out in terms of the language. There are studies suggesting that a lot of the people who are leaving social media are women and people of colour, because they have found it an uncomfortable and harmful place to be. I hope there will be an increased focus on social media companies and platforms in the next little while to really address those issues and what they can be doing to react more positively and constructively to reports of abuse – so that rather than coming off social media, women are empowered to engage, empowered to be part of those conversations and to stay on social media, knowing that when they feel uncomfortable the abuse they report will be understood and taken seriously. Facebook have been quite responsive in changing their definitions of what they understand as abusive. For a while, I think, there were certain things that they classified as patriotic that were actually neo-Nazi; they reflected on the terms that were banned and the discussions they deemed harmful, and they came forward and said, actually, no, we've had a real look at this, we've closed a load of accounts, and we don't want to be a platform that hosts this sort of material.
There is a wealth of evidence suggesting that women are being abused online and are coming off social media because they feel they don't want to be in it any more. That abuse seeps into the female psyche early, and if it is dissuading women from taking positions, stretching out, putting their heads above the parapet, then that is damaging more generally. It is a big ask of social media companies, I understand that, but some focus on how you protect not just minorities but half the population who are being targeted on social media would be a really positive – perhaps ambitious, but why not – focus for these discussions.

 

Hayley Geffin 
Head of Communications

So you've both used the phrases 'lawful' and 'harmful'. There have got to be situations, I'd imagine, where something is not lawful and it's harmful – but sometimes is it harmful when actually there aren't laws being broken, and if so, how do you deal with that?

 

Alexandra Whiston-Dew
Managing Associate

That's absolutely right. There has been recent media attention on various issues relating to that, and it is also very present in the Online Harms White Paper. One issue we have been discussing in the office is the anti-vaccination campaigns: how harmful they can be and what a detrimental effect they can have on communities, but also how important it is for people to be able to choose what is right for their children, to read and research what they want in terms of how they are going to parent, and to be clear about what is real news and what is fake news in that debate. That is particularly interesting, but also potentially harmful yet lawful. And if you look at platforms like Mumsnet, how are the moderators of those conversations going to be able to draw a line when the guidance is not able to give clarity on what is harmful to that extent? So we look to the terms and conditions of each of the social media platforms when the laws don't go as far as dealing with that kind of harmful content. And when a new regulator is established, if they are going to deal with content which is not just unlawful but harmful as well, they need to bring clarity as to how users are going to be able to complain or assert their rights.

 

Hayley Geffin 
Head of Communications

So we have talked in broad terms about law makers, regulators. Who is actually the regulator when it comes to this stuff?

 

Emma Woollcott 
Head of our Reputation Protection

At the moment there is no specific regulator, so depending on the content and how it is broadcast, published and disseminated, different regulations and different laws apply. The Government's proposal in the Online Harms White Paper is to create a new regulatory framework to tackle all of these harms – to set up potentially a new regulator, or to establish a more joined-up system of self-regulation. If there is a new regulator, that social media regulator will have a suite of powers to take action against the platforms that don't adhere to the common set of rules and ethics they sign up to. The carrot is that social media companies then have input in agreeing the rules and know what is expected of them, and users are potentially better able to know what they can expect of the platforms in terms of dealing with abuse. The stick is that the social media companies who sign up to be regulated are then subject to substantial fines if they don't adhere to the rules. These are commercial businesses: they care about their reputations, they care about traffic and clicks, but they also care about profit and the bottom line, and the threat of substantial fines is going to be persuasive. For the ICO, after the GDPR came in, one of the biggest drivers of compliance within businesses was the threat of substantial fines that can be a percentage of turnover – not something that even the biggest companies want to ignore.

 

Hayley Geffin 
Head of Communications

That said, you mentioned the businesses that agree to be regulated. Are we talking about having to opt in or opt out? In which case, how much impact can a regulator have if you are deciding whether to be regulated or not?

 

Emma Woollcott 
Head of our Reputation Protection

It may very well be that users, who are becoming more and more mindful of online harm and fake news, vote with their feet on this issue and choose to participate only on the platforms that they know have signed up to treat them properly if they are abused. I think it has become a commercial imperative for the social media platforms, and that is why I think they want to be part of drafting the rules and signing up to them.

 

Hayley Geffin 
Head of Communications

Yeah – to get that kitemark that says: look, here is a safe place for you to interact with each other, for your children to use; you don't have to worry, because we've got that tick – a verified tick, even.

 

Alexandra Whiston-Dew
Managing Associate

Yes, and they are not just signing up potentially to big fines when they say that they will join or participate in a regulator; they are also potentially signing up to personal liability for senior management. That is something that has been bandied about not only in the Online Harms White Paper but also in the media globally: whether founders or senior management of these social media platforms will be personally responsible for breaches of their duty of care, or breaches of regulations or legislation.

 

Emma Woollcott 
Head of our Reputation Protection

And they can see this happening. After the Christchurch shootings in New Zealand, Australia passed a new law, the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act, which introduced criminal penalties for social media companies, including possible jail sentences for executives and fines of up to 10% of global turnover. These are the penalties that social media platforms are seeing in different jurisdictions, and I suspect they will be wanting to avoid them in Europe.

 

Alexandra Whiston-Dew
Managing Associate

The downside of making the penalties so severe is that you might cause social media platforms to migrate from certain jurisdictions, and that obviously affects the users and also the global nature of the internet and social media platforms, which is so cherished by quite a lot of us. You have seen, for example, that various options are switched on and off in social media platforms in different jurisdictions – or, for example, with Google listings in Spain – and that is something that really impacts the end user and should be part of the decision-making process when you are looking at legislation or regulation.

 

Hayley Geffin 
Head of Communications

So a lot of us around here, I think, didn't grow up with social media; it came of age when we had come of age. How do we talk to our children about this?

 

Alexandra Whiston-Dew
Managing Associate

I think this question is really interesting, and it is something you hear from parents all the time. We at Mishcon do a lot of pro bono work with legal advice centres where parents are asking how to educate their children about these things. We also understand the importance of going into schools and educating children about how social media platforms work: what information they gather, who can see what, and why it is important to understand the repercussions of your actions, not only in the playground but also online. That education piece is part of the Online Harms White Paper, but it is also a story that really needs to be heard at school. I think there has been a huge development in the information provided to children and in how they know how to protect themselves online. But there is a slight contradiction with them finding themselves – trying out new ways of being themselves, accessing information and new areas where they can learn what it is to be part of a global community – which is something parents might be scared of, but which is still an amazing thing about being online.

 

Hayley Geffin 
Head of Communications

So we have talked about grown-ups and we've talked about children. We haven't talked much about businesses, and whether, with all of this talk of harms, of what's lawful and of people who might come after you, businesses should be avoiding social media or learning how to grapple with it in their own way.

 

Emma Woollcott 
Head of our Reputation Protection

The latter. Social media is not going anywhere. It is increasingly where people receive their news; there are shifts over time in terms of what is in favour, but this is now how the next generations are interacting, receiving news, learning about products and buying products. So brands do need to understand social media: how to present the best version of themselves on it, how to react when things go badly, how to close down conversations that are harmful and take some conversations offline, and how to deal with IP abuses, copycat stories and fake news. The main advice to brands and business owners is to crisis plan – think things through, be ahead and practise. In the bad old days, or the good old days, when there was a story or an issue affecting a business, there would be one story. Now, the way that a business deals with a crisis situation becomes story two, three, four – and if those who are monitoring social media accounts for businesses aren't mindful of how they are perceived, or don't get the right tone, that often becomes the subject of a second, third or fourth story.
Some brands have done really well out of tricky issues on social media: they have owned them and turned them around. But those who are caught unaware, or who handle situations badly, end up getting two doses of egg on their face. So social media is here to stay. There is a growing trend, I think, for the public to feel they can jump into any debate and have their say, and to complain visibly on social media, knowing that this is actually the Achilles heel – that is what brands are concerned about, and that is the quickest way of getting a complaint heard and dealt with. I think the only way that businesses who pride themselves on their profile will continue to thrive is to be ever more mindful of their reputations and their brand: how they curate it, where they place it, who they speak to and how they police attacks on their brand.

 

Hayley Geffin 
Head of Communications

And also, as you mentioned, there are opportunities there. Alex, you talked a lot about the positives of being on social media – of course we know about the negatives – but I think what we are perhaps saying to businesses is: don't be afraid.

 

Alexandra Whiston-Dew
Managing Associate

Don't be afraid, but take responsibility for what you are publishing, and make sure that the people publishing for you are serious people who are careful about what they say and write in the right voice for your brand. People mistake corporate social media platforms for their own, and it is important to recognise that being the voice of a brand is a serious and important role – not to be confused with what you might write on a Saturday night.

 

Emma Woollcott 
Head of our Reputation Protection

I think, as Alex was saying, brands seem to be increasingly aware of the responsibility of those who engage on social media on their behalf. It's the shopfront, so they need clear policies about who can say what and when, and clear training on how those who manage social media accounts respond, when they need to escalate and when they need to slow conversations down. Actually planning through different scenarios allows those who control social media channels for brands to be more confident in the moment – more natural, more authentic. I think the brands that do well in a crisis are those that have practised.

 

Hayley Geffin 
Head of Communications

Well, we could talk for hours, but we won't – lots to think about. Thank you Emma, thank you Alexandra.

I am Hayley Geffin and you’ve been listening to the News Sessions.

 

Voice over
The News Sessions, in partnership with Mishcon de Reya. Find more of the News Sessions podcasts dealing with key legal matters on iTunes.

The News Sessions is a Jazz FM production for Mishcon de Reya.

How can we enjoy the benefits of social media whilst protecting our privacy and reputation online? Whose responsibility is it to prevent the spread of fake news and online hate speech?

Mishcon de Reya's Head of Communications Hayley Geffin hosts the latest News Session with Head of Reputation Protection Emma Woollcott and Managing Associate Alexandra Whiston-Dew. They discuss how the use of social media is regulated, and what businesses and individuals can do to avoid reputational fallout online.

The News Sessions look at the latest legal headlines, providing insight into the most pressing matters when they are most relevant. Short segments of this recording will be featured on our Jazz Shapers programme, broadcast at 9am every Saturday on Jazz FM throughout the summer.
