
Mishcon Academy: Digital Sessions - Facebook, Donald Trump and Free Speech

Posted on 28 May 2021

The Mishcon Academy Digital Sessions.  Conversations on the legal topics affecting businesses and individuals today. 

Isabella Piasecka

In this episode: what is the Facebook Oversight Board? What did it decide in relation to Facebook's decision earlier this year to impose an indefinite ban on former President Donald Trump? And what is the significance of that decision for Facebook, its users and free speech?

Hello and welcome to the Mishcon Academy Digital Sessions podcast. I am Isabella Piasecka, a Managing Associate and Member of the Reputation Protection and Crisis Management Team at Mishcon de Reya. I'm joined remotely by my colleague, Harry Eccles-Williams, also a Managing Associate within the Reputation Team. So Harry, the Facebook Oversight Board is less than a year old. What is it exactly, and why was it set up?

Harry Eccles-Williams

Hi Isabella, thanks. The Oversight Board is a body set up by Facebook in 2018 to review decisions regarding content on Facebook and Instagram. The Oversight Board says it was created to help Facebook answer some of the most difficult questions around freedom of expression online: what to take down, what to leave up, and why. Facebook has reportedly put about 130 million dollars into an independent trust to fund the Board, which is made up of academics, journalists and politicians from around the world; there are currently 20 members, with plans to increase this to 40. They choose cases of interest and review them; alternatively, Facebook can ask them to look at cases. There are four panels of five, each with a Chair; a panel reaches a decision, which then has to be approved by a majority of the Board itself. Originally they could only decide whether a decision to take down content was correct. This has since been expanded to whether a decision to keep up or take down content was correct, so Facebook can say 'we decided to take this content down' or 'we decided to leave it up', and either way the Board can rule on it. When making their decision they consider three things: content policies, values and human rights standards. Facebook and Instagram have their own content policies as to what content should and should not include. Facebook then has its rather broad values, and finally there is the generic concept of human rights standards, which are not defined. We don't know what would happen if the Board decided that a post complied with the content policies and values but not with human rights standards; that is, if there were a contradiction. To date there has not been such an instance. None of these decisions are legally binding, as there is no law that this is based on; it is, in a sense, Facebook law. However, Facebook says it will be bound by the Board's decisions on whether content stays up or comes down, but not by the broader recommendations the Board is able to make, although it will respond to those recommendations within 30 days. The Board has, perhaps unsurprisingly, come in for significant criticism; I think everything Facebook does is criticised by many and probably praised by others, but in this instance there are allegations that it is simply a PR stunt to dissuade the US authorities from properly regulating Facebook, a sort of diversion or distraction. Also that it has been created to take the difficult decisions so that Facebook doesn't have to, so that Facebook can continue to appease all of its users, and people can be angry with the Oversight Board rather than with Facebook if they don't like a decision. However, the first set of decisions, made earlier this year, suggest that the Board isn't simply going to do as Facebook wishes and will seek to stand up to it. It refused Facebook's request not to consider one appeal; it made far-reaching recommendations in relation to content moderation processes despite Facebook's specific request that it not do so; and it has unearthed a lot of information about Facebook's processes that was not previously known. So it is in its early stages, and it has just made its tenth decision, the now infamous decision in relation to the suspension of Donald Trump.

Isabella Piasecka

Right, and can you tell us a little more about the background to that case? What did Facebook originally decide and why, and why did they end up referring it to the Oversight Board?

Harry Eccles-Williams

Yeah, sure. I think we all know that on 5 and 6 January supporters of then President Trump gathered in Washington to protest against the election being, as they saw it, stolen. This gathering was encouraged by Mr Trump, who was the headline speaker at the Save America Rally and encouraged his supporters to march on the Capitol. That day, which was the day the counting of the electoral votes was taking place, a mob forcibly entered the Capitol building in Washington DC; five people died and many more were injured. There were threats to members of Congress, and Vice President Pence was at one point in danger. During these events Trump posted two pieces of content. At 4.21pm Washington time, as the riot continued, Trump posted a video on Facebook and Instagram. I won't read out all the words, but they are interesting, because he said things like 'we had an election that was stolen from us', but he also said things like 'you have to go home now, we have to have peace, we have to have law and order, we have to respect our great people in law and order, we don't want anybody hurt, go home and go home in peace'. Just over an hour later, Facebook removed this for violating its community standards on dangerous individuals and organisations. Then a couple of hours later, Trump posted a written statement on Facebook that simply said, 'These are the things and events that happen when a sacred landslide election victory is so unceremoniously and viciously stripped away from great patriots who have been badly and unfairly treated for so long. Go home with love and peace. Remember this day forever.' Less than ten minutes later, Facebook removed this too for violating its standards on dangerous individuals and organisations, and it blocked Trump from posting on Facebook or Instagram for 24 hours. The next day it extended this block indefinitely, and at least until the transition of power was complete. Then on 21 January, the day after Biden's inauguration, Facebook announced that it had referred the case to the Oversight Board. On 5 May, after deliberating for nearly four months and reviewing nearly 10,000 responses to its call for comments, the Board finally announced its decision. Isabella, what were the headline findings?

Isabella Piasecka

Well, the Board upheld the ban, that is, Facebook's decision to suspend Trump's account based on those two posts. But it rejected the idea of an indefinite ban and said it was unfair, both because of the uncertainty of having that hanging over you and because it was standardless: it wasn't based on any existing policy, and it was really the first ruling of its kind. Facebook now has six months to review its decision and issue 'a proportionate response consistent with the rules applied to other users'. This distinction between Trump and average users is something I'm sure we'll come back to, and it is really the thrust of the ruling: that Facebook made a vague decision, then referred it to the Oversight Board and in doing so was seeking to avoid its responsibilities, which is actually the phrase the Board used. That is a real indictment of the opaque state of Facebook's policies, its lack of transparency and the risk of arbitrary penalties. In terms of the Board's non-binding recommendations, and again we'll look at these in more detail, a lot of interesting details emerged; this is what you were saying about one of the benefits of the Board being that it unearths these little-known details. One concerns the newsworthiness allowance, the greater latitude Facebook affords to content it deems important, including political speech; Facebook says it didn't apply that allowance to Trump's posts. We'll maybe talk about that a little later, because it seems questionable, but the Board has called for more guidance on the allowance and exactly when it applies. Another is that, we now know, Facebook operates a cross-check policy for some of its high-profile accounts to minimise the risk of enforcement errors, so when it makes a decision relating to controversial content it can escalate and potentially review it. Again, the Board asked for more details, and one thing it would be really fascinating to hear about is the relative error rates, so how often they get it right. The Board also asked Facebook to clarify its policies on strikes and penalties for restricting profiles, which is essentially a warning system. In short, the Board said very firmly that Facebook needs to be more open for the sake of all users, not just the highly influential ones. And then, in possibly the most inflammatory passage, the Board recommended that Facebook conduct and publicise a review of its own potential contribution, through its design and policy choices, to the narrative of electoral fraud and the exacerbated tensions. What they are really asking is: is Facebook engineered to fan the flames where there is heated debate, or did the platform itself serve to amplify Trump's rhetoric? That is obviously something Facebook doesn't want to answer, but the Board has really put it on the spot.

Harry Eccles-Williams

And in the dialogue that the Board seems to have had with Facebook prior to the decision, were there any questions that Facebook failed to answer?

Isabella Piasecka

Quite a few, and quite a few significant ones. The Board asked Facebook 46 questions, of which it declined to answer seven fully and two partially, including, and this goes to the point I made about its design policies, how the News Feed affected the visibility of Trump's content. It also didn't answer whether it has researched how its design decisions contributed to the events of 6 January. And on another very important question, the platform completely ducked how other political leaders have been treated on its platform to date, and how much contact it has had with other Governments.

Harry Eccles-Williams

And this is one of my pet fascinations, but we'll come to that in a second. Anyway, what happens next? They made this ruling, or decision as they call it. What's the next step? Who needs to do what?

Isabella Piasecka

So first of all, and it's not that far away, Facebook needs to respond to the policy recommendations by 4 June. Then it has six months, which some people have criticised as far too long, to review its decision and issue a justified decision. That may be, although it seems unlikely, to reinstate Trump's account; but if it wants to ban him permanently, it has to explain why.

Harry Eccles-Williams

And this is very interesting, of course, because they've used words like 'must'. Facebook doesn't have to do this, it is not required to, but the Board seems to be telling Facebook it has to and basically daring it not to. As you alluded to, the response to the policy recommendations should be very interesting. Previous responses have been mixed: some information has been provided, but the really difficult questions, especially those that go to Facebook's bottom line and the algorithm, have always been put off indefinitely, with promises to consider them carefully and, to be honest, the hope that everyone forgets about them. But let's wait and see. I think it will become increasingly difficult for Facebook if it is seen to ignore every difficult recommendation it receives from the Board, and a refusal to engage properly with the real questions may well be used against it in the argument about regulation. But let's let them come back first. So, Trump aside, what do you think are the key takeaways from this decision? To start, maybe we could think about whether the Board's decision not to make a decision, and to send it back to Facebook, was the right one, or whether the Board itself was ducking its responsibilities. There is an argument that it's the Board's role to make these difficult decisions and to give Facebook clear policy advice, and that it has failed in that function; that it is not a Supreme Court, as Zuckerberg once called it, but more of an advisory body. I wonder what you think of that.

Isabella Piasecka

Well, I actually think it's right that they threw the ball back to Facebook. Zuckerberg agreed the remit of this Board, and ultimately it's for Facebook to come up with consistent policies and enforcement mechanisms and to resource them properly; the word 'resourcing' came up prominently in the decision. I also think this is much bigger than Trump. He is of course a high-profile, probably the highest profile, example of a content review decision, but there are many other influential users, to use the Board's term. Trump is not the first and he won't be the last populist leader, and he won't be the last user with huge reach. Facebook has to grapple with the potential for those influential users, and all users, to cause harm. I don't think it is an enviable task, but we are talking about a multibillion-dollar company, arguably with more power than many Governments, and I think it's time for it to step up.

Harry Eccles-Williams

I agree, and I think this issue of how they deal with political leaders is a fascinating one, and one where the Board is very critical, because Facebook doesn't, on the face of it, seem to have any policies in place to deal with what the Board terms, as you say, influential people. There is a real tension here. People need to hear from political leaders, but at the same time that cannot give political leaders carte blanche to say and do as they wish without being subject to any form of moderation. Should political leaders be held to the same standards as normal people? Facebook seems to take the rather Orwellian approach that, in respect of content moderation, we are all equal but some are more equal than others: there's the newsworthiness issue, the cross-checking issue. It is all quite fluid and seems not to be based on any one set of rules or regulations.

Isabella Piasecka

No I think that’s right and I think Facebook has really tied itself in knots with the newsworthiness allowance.  I mean I personally don’t disagree that certain content is more important, I think that voters need to hear their candidates and then leaders in order to evaluate them but Facebook has been inconsistent so it has behaved one way in Myanmar for example and we know and Facebook admitted that it fermented violence and then it banned the military there, it’s behaving differently in India so it’s facing allegations that it is sort of selectively taking down content.

Harry Eccles-Williams

And what was interesting in Myanmar was, firstly, that the decision came after the Trump decision, so with all of that context; and their ban was also indefinite as opposed to permanent, which is precisely the criticism the Board raised about the Trump decision. Also, if you look at the reasoning in Myanmar, given what's been going on there, it is essentially that the Army has now broken the content moderation rules and that there are human rights issues. But the Army has been breaking content rules for years, not least in relation to the Rohingya. So why now? What's new now? There is no policy, no statement of 'we followed this policy: they did this, this and this, and so ultimately we removed them'. It seems, frankly, arbitrary. They have also been breaking all sorts of human rights laws for however many years, and only now, obviously because there has been a coup, has Facebook acted. I support the decision, but why now? There is no clear basis, no set of rules. And there are similarly, as you said, fluid decisions in India, where there are allegations that the ruling party is allowed to post hate speech while legitimate criticism of it is taken down, and there is no rule book as to how this is dealt with. My view is that that's by design. Facebook is obviously a smart organisation, and as soon as you have a set of policies you have to stick to them, and people can hold you to them.

Isabella Piasecka

Absolutely, and for me the two words that jumped out from the ruling are 'unfettered discretion'. Perhaps that's overstating it, but that is what Facebook has had until this point, which, as you say, allows it to act in different ways depending on the context and depending on who's in power. That's just not sustainable, and it's not good enough, and then it justifies its decisions retrospectively. That's why I think it's right for the Board to call it out, not just for the sake of these influential users but for all users. People need to understand why you remove content, and when, and when you take into account background context. So in Trump's case, and I talked about the newsworthiness allowance earlier, you could say that the two posts on which the decision was based were not, on their face, inciting violence, but add in the context of the rally, of some of the words that came out and some of the previous posts he made that were found…

Harry Eccles-Williams

Which weren’t taken down often.

Isabella Piasecka

…right, which weren’t taken down.  So it felt more like a tipping point decision to me that suddenly violence exploded and they thought, right now we have to act.  But arguably they should have intervened sooner.

Harry Eccles-Williams

Yeah, and when they did act it was not based on some policy that says, at a certain point we will act. I am not sure 'most people' is quite fair, but a number of people would agree that they should have taken action when they did, because there was real danger. But the basis on which they did it just seems, frankly, to be 'Mark decided'.

Isabella Piasecka

Right, and we can't have a platform with so much power. I think it was Alan Rusbridger who said in a recent interview, and it's quite extraordinary but I think it's true, that you can't divorce Facebook from the political process; it is the political process.

Harry Eccles-Williams

Yeah.

Isabella Piasecka

That’s how much power they have.

Harry Eccles-Williams

And just for everyone's knowledge, Rusbridger is one of the members of the Oversight Board.

Isabella Piasecka

Right, sorry, yes, I should have mentioned that. So we can't have a platform with so much power being so secretive about how it makes such far-reaching decisions. There was another line in the ruling that I thought was really interesting, a not so thinly veiled warning from the Board that Facebook's lack of transparency is leading to perceptions that it acts out of political or commercial considerations rather than on the basis of policy. That is the conclusion people will draw if the policies aren't clear.

Harry Eccles-Williams

It’s the only conclusion you can draw at the moment.

Isabella Piasecka

Right.

Harry Eccles-Williams

And what this starts to draw out is the real difficulty Facebook has: it has these content policies; it now also says that the Board should consider human rights law as a broad concept; but it also says that it follows local law.

Isabella Piasecka

Mm.

Harry Eccles-Williams

And there will be many times when there is a conflict between these three, and they don't say which trumps which, to use the word. They always say 'we respect local law', but they also say their decisions are based on their content policies, and at some point they are going to have to resolve that, though they will try to avoid it. As a user I would want to understand: which is it? If I am a user in India, and India has just brought in a new internet law, which it has, one that some say is designed to make anti-Government speech unlawful, I want to know whether my post is being decided on the basis of Indian law, or Facebook's rules, or the broad concept of human rights law, or all of them; and if it is all of them, in which order. Obviously they don't want to answer that, because if Facebook said, to use India as the example, 'well, actually we are going to ignore Indian law', then the Indian Government may say, 'that's fine, Facebook, you ignore our law, so you can't operate here', and things become difficult.

Isabella Piasecka

Right, and for me there are two issues. One goes back to this question of context: is Facebook evaluating content, or is it evaluating users? In Trump's case I think it was more evaluating a pattern of behaviour by a user, one much broader than just his behaviour on that particular platform.

Harry Eccles-Williams

Yeah, absolutely. 

Isabella Piasecka

Which may be right or wrong but there needs to be clarity for users about, well how will my posts be judged?  As you say, what will the decisions be based on?

Harry Eccles-Williams

And who by?

Isabella Piasecka

And who by?

Harry Eccles-Williams

I mean, as you said, if you are a so-called influential user, then what you say is cross-checked by someone. We know that twenty times content posted by Trump was flagged as being against the content policies, i.e. it should have been taken down, and on each of those twenty occasions it was decided, notwithstanding that newsworthiness was apparently not taken into account, that it didn't actually breach the content moderation rules. And we are only just learning this. It's all so opaque; I don't think any other judicial or quasi-judicial system would get away with it, and as you say, Facebook is a political power at this stage.

Isabella Piasecka

Well, it is, and the other point I was going to make is that we know it is from its recent tussle, if you can put it that way, with the Australian Government. We know that when it is unhappy with legislative or Government decisions, it threatened to, and briefly did, withhold its services from all Australian users. That's a pretty bold move, so it is willing to act when it wants to force policy changes; it has shown its hand in that sense. But it goes back to the point that if it has that kind of power, it needs to wield it responsibly and transparently.

Harry Eccles-Williams

Transparency is the key, I think. Facebook is renowned, and maybe it would contest this, for being a very secretive organisation, and it has done very well out of the way it operates its business. One of the things the Board can do, and let's see how successfully, is start to shine a light on all of this. These are very difficult decisions Facebook has to make, no one is saying they are easy, but as you say, it is an incredibly powerful and well-resourced company.

Isabella Piasecka

No, absolutely, and I think it's clear. We keep using the term 'platform', but if they ever were a platform at arm's length from what's posted and its consequences, they no longer are; they are making editorial publishing decisions every second of every day.

Harry Eccles-Williams

Every second of every day, yes. Although under Section 230 they are still treated as a platform, at least if you follow the case law; I am not sure whether they are under the statute itself.

Isabella Piasecka

Right.

Harry Eccles-Williams

But that is how it has been interpreted, and their main audience in relation to regulation will, I think, remain America. On the face of it, though, it is hard to see how they are not a publisher.

Isabella Piasecka

I think that front line is shifting. In the UK, with the Online Safety Bill, and also in Europe, we are shifting the dial on liability, and Facebook and all of these big tech companies will be forced to take more responsibility for third-party content. The only way for that to be workable is if they have clear policies on what is acceptable and what is not, overseen if necessary by an independent regulator, which is the route we are going down in the UK and seems to be the right one. Either these decisions are made by Facebook behind closed doors, or they are made by Governments, which is also less than ideal, especially depending on the regime, or they are made by regulators. That is the only way you can police that volume of content.

Harry Eccles-Williams

Depending on how Facebook engages with it, I actually think the Oversight Board is an interesting experiment, but it depends on the level of engagement it gets. It seems to me that the information the Board most wants from Facebook concerns content moderation: how decisions are made, on what basis and by whom. Facebook has initially pushed back on that, but I don't see the Board letting up in its requests for this information, and it may become increasingly difficult for Facebook not to give it. The other thing the Board would like to get hold of is the algorithm, and Alan Rusbridger has also said that. Again, Facebook will not want to do this, because these issues all go to the bottom line. A number of the recommendations Facebook can address by giving a bit more information than usual and promising to look into things, but those are all around the edges. The issue of how content moderation is done, and the issue of the algorithm and, as you say in relation to Trump, how it may have aided or played a causal role in the Capitol riots, and therefore how Facebook's commercial model may be causing disunity within society and internationally, and how much of that is essentially being commercialised: these are not issues they are going to want to get to, but I think they are issues we have to get to.

Isabella Piasecka

I couldn't agree more, and the other thing I thought was interesting in the ruling is that the Board seems to agree that banning, which is effectively no-platforming, is a penalty of last resort. They have said they would prefer Facebook to have what they called effective mechanisms to stop speech being amplified while still allowing the speech. I think that's really interesting, and I would agree with it; it goes back to the point that we want even bad or irrational speech to be heard so that it can be criticised, by journalists, by users, by voters.

Harry Eccles-Williams

But the amplification point is the algorithm, I mean…

Isabella Piasecka

Right.

Harry Eccles-Williams

…as Alan Rusbridger has said, he has no real idea what is meant by 'the algorithm', and I similarly have absolutely no idea. Maybe there is more than one; maybe there are millions of algorithms working together. But ultimately this amplification point is really important, isn't it. And to your point that banning somebody is so draconian: an outright, permanent ban is a big deal on Facebook.

Isabella Piasecka

It is, and it's not necessarily the answer to much bigger societal tensions. What I think we want our digital platforms to do is encourage responsible debate, so you want as many participants as possible, rather than rules that tend towards bans. I am imagining Facebook behind a bunch of dials, dialling things up and dialling things down. That's what we need to understand, because that's how I think of the algorithms working, and only once we understand that can we try to engineer debate in a positive way, so that it doesn't suddenly escalate, as it did on 6 January, into something catastrophic.

Harry Eccles-Williams

Yeah, I think that's right, and this doesn't need to be an anti-Facebook point. There are obviously other social media companies, but Facebook, owning Instagram as well, is the biggest. It doesn't mean that everything they do is negative; this isn't pure criticism, or to say that their business model is simply incompatible with a decent society, rather…

Isabella Piasecka

Absolutely.

Harry Eccles-Williams

…just things need to change about how they moderate content.

Isabella Piasecka

Exactly, it's about how they moderate content, what kind of dialogue they foster, and whether that dialogue is useful or divisive. One of the biggest criticisms levelled at Facebook is that these algorithms, however exactly they operate, fuel discord, and if that's right, it is obviously not a state of affairs we want. So ultimately this ruling is only the beginning, but it will be very interesting to see what Facebook comes back with and how substantive it will be.

Harry Eccles-Williams

Yeah, I think that's right to say it's the beginning, even though it is their tenth judgment. This is a big issue for the world that will have to be addressed in a number of different ways, but their response to this will be interesting, not least because it is so high profile. It's probably only people like you and me who were interested in the first nine, but the tenth has had some real coverage, and the spotlight is on Facebook now. Maybe that will make it harder for them not to provide the information requested, but we'll see.

Isabella Piasecka

So finally, in tribute to The West Wing, what's your ten-word summary of the decision?

Harry Eccles-Williams

I can give you four: Not What Facebook Wanted. How about you?

Isabella Piasecka

I think mine don't quite add up to ten, so I might take some of yours, but I would say it's a long overdue call for transparency, and accountability over profit. Well, for now let's wrap up there. I'd like to say thanks so much to Harry Eccles-Williams for joining me, and do watch out for the next episode.

The Digital Sessions are a series of online events, videos and podcasts, all available at Mishcon.com. If you have any questions you'd like answered or suggestions for what you'd like us to cover, do let us know at digitalsessions@mishcon.com.

The Mishcon Academy Digital Sessions.  To access advice for businesses that is regularly updated, please visit mishcon.com.

Join Managing Associates Isabella Piasecka and Harry Eccles-Williams from our Reputation Protection team as they discuss the Facebook Oversight Board's ruling on Facebook's indefinite ban of former President Donald Trump, and examine the significance of the decision for Facebook, its users and free speech more widely.

Mishcon Academy: Digital Sessions are a series of online events, videos and podcasts looking at the biggest issues faced by businesses and individuals today.

The Mishcon Academy offers outstanding legal, leadership and skills development for legal professionals, business leaders and individuals. Our learning experts create industry-leading experiences that drive long-lasting change, delivered through live events, courses and bespoke learning.
