
Mishcon Academy: Digital Sessions - Bloody Difficult Women, At Large: Fighting Misogyny Online

Posted on 03 November 2020

Mishcon Academy: Digital Sessions are a series of online events, videos and podcasts looking at the biggest issues faced by businesses and individuals today.

This session was recorded on 15 October 2020. The information in the film is correct at the time of recording.

To review the key insights from the event, please view the film or read the write-up below.

Context of the event

Emma Woollcott, Partner and Head of Reputation Protection at Mishcon de Reya, explained that women are consistently undermined in the media and disproportionately attacked online, and that this is one of the factors which holds us back: the fear of being criticised or subjected to oversexualised and often violent vitriol can dissuade us from engaging in public debate, from taking high office, and from putting our heads above the parapet. The event aimed to acknowledge – again – the scale and impact of online misogyny, but also to focus on what politicians, campaigners and the big social media platforms are doing to mitigate it; and to consider the solutions and technologies which already exist, or are in the pipeline, to help us engage confidently online. It also explored what we can all do as digital citizens and responsible bystanders to lessen the impact of misogyny when we see it.

COVID-19, minoritised women and intersectionality

Seyi Akiwowo, CEO and Founder of Glitch, explained how COVID-19 restrictions have moved more of our social and professional interactions online, and exacerbated online misogyny and abuse. According to Glitch's recent survey, those who had experienced online abuse prior to the pandemic reported an increase since March, even more so for black and minoritised women. At the same time, the Government and employers have not stepped up with digital education and guidance to help us stay safe online.

She also called for the law to recognise the intersectionality between, for example, race and gender: currently, a black woman reporting online abuse is forced to frame her complaint as linked to either her race or her gender, but not both.

The progress made, and challenges faced, by Twitter and Facebook

Katy Minshall, Head of UK Government, Public Policy & Philanthropy at Twitter, outlined the platform's work to reduce the reporting burden on victims and make the platform safer by design. For example, users can now moderate replies (choose who can reply to their Tweets) and hide replies as and when it suits them, as well as muting and blocking accounts. Twitter has also provided safety training and resources for those in the spotlight, including UK political parties and candidates. 

Rebecca Stimson, Head of Public Policy for Facebook in the UK, distinguished between Facebook's similar drive to give users more control over their online experience (and to raise awareness of those features), and the harder task of controlling content centrally, at scale. Language is evolving and subjective, more so for misogyny than, for example, terrorism-related posts, and content that is mobilising for some will be traumatising for others.

Both Katy and Rebecca backed a "whole-of-society" approach, given that online discourse reflects offline bias, as well as more education on digital safety and self-care, such as ensuring password security. They also encourage victims to report abuse, partly to feed the systems used to detect it with as complete and sophisticated data as possible.

However, echoing Seyi's calls for platforms to improve – and clarify – their content moderation procedures, activist and audience member Gina Miller (dialling in) argued that the overarching burden should not be on women to report abuse, and women should not face a disproportionate burden when complaining about online, as opposed to offline, abuse. Gina also stressed the need for the Government to press ahead with its plans for an internet regulator and to impose a duty of care on tech companies (as set out in its Online Harms White Paper in April 2019): just as in law and finance, abiding by rules and regulations is the cost of doing business and the right thing to do. Katy and Rebecca reiterated their support for a new regulator, although Rebecca cautioned that the devil will be in the detail of the legislation, once published, and that moderating your way out of societal problems is impossible.

Progress in the dating and gaming industries

Anno Mitchell, Strategy Director of the Internet Commission, discussed the ways in which the gaming and dating industries have addressed online misogyny, including by rewarding good behaviour. For example, a dating app user who is verified and labelled as behaving well online will be made more prominent to other users.

Anno also emphasised the need to record misogyny as a specific form of abuse rather than as part of a larger data set, to provide concrete examples for human moderators, as well as algorithms, and improve detection. She added that, as companies increasingly buy in artificial intelligence to help with content moderation, we need to ensure there is oversight and accountability in terms of how these programmes are trained and tested.

Making misogyny a hate crime

Stella Creasy, the Labour and Cooperative MP for Walthamstow, talked about her long-running campaign, recently backed by the Law Commission, to make misogyny a hate crime. She wants all police forces to track misogyny so that we can encourage reporting, spot patterns of abuse, and punish offenders. At the moment, "misogyny is so ingrained that it is almost accepted". 

You can support Stella's campaign by signing up here for updates. If you wish to respond to the Law Commission's consultation the deadline is 24 December 2020.

The Mishcon Academy Digital Sessions

Emma Woollcott

Hello, I’m Emma Woollcott, I am Head of Reputation Protection and a Partner at Mishcon de Reya and I am delighted to welcome you to this Digital Academy event.  I have been a media litigator for 15 years and I am deeply concerned by the way that women are portrayed in the media and the abuse we suffer online.  The businesswoman and campaigner Gina Miller, who many of you will know is a client and a friend of the firm and who is with us this afternoon, recently said that there has been an avalanche of abuse against women in recent years.  I believe that this abuse is one of the factors that holds us back.  The fear of being criticised or subjected to oversexualised and often violent vitriol can dissuade us from engaging in public debate and from taking high office.  Now I am not alone in believing that our democratic systems are weaker without bloody difficult women in politics – in fact, in any sphere of life.  We cannot afford as a society for women and girls to withdraw from controversial debates, nor from taking high-profile roles, for fear of abuse.  We can’t afford for them to censor or limit themselves, or decide not to engage at all.  Where any group of people is silenced, we all suffer.  So today I want not just to acknowledge again the scale and impact of online misogyny but also to focus on what politicians, campaigners and the big social media companies themselves are already doing to mitigate it.  I am delighted to have with us this afternoon five – it appears four at the moment, but it will be five – accomplished women, each ready to share their own perspectives on online misogyny and how to challenge it.

Stella Creasy has been the Labour and Cooperative MP for Walthamstow since 2010.  She served on the front bench teams of Ed Miliband and Harriet Harman, first as a Shadow Minister for Crime Prevention and then as Shadow Minister for Business, Innovation and Skills.  Seyi Akiwowo is the Founder and CEO of Glitch, an advocacy, campaigning and training organisation dedicated to ending online abuse.  Katy Minshall is the Head of UK Government, Public Policy and Philanthropy at Twitter.  Anno Mitchell is the Strategy Director of the Internet Commission, which aims to work with industries, governments and citizens to make the digital transformation of society responsible and sustainable.  Last, but by no means least, Rebecca Stimson, who is the Head of Public Policy for Facebook and Instagram in the UK.  First, maybe we could talk about the scale and nature of online misogyny, but also how misogyny focuses particularly on minority – or, as you say, minoritised – women, and how intersectionality really does seem to ignite the trolls.

Seyi Akiwowo

Women are disproportionately impacted by online abuse.  Women are twenty-seven times more likely to be harassed online, and when you layer race onto that, black women are eighty-four percent more likely to be harassed online.  COVID, lockdown, being at home so much more and relying on the internet so much more have exacerbated all of that, and we found that out in our report last month.  Again, intersectionality allows us to know how those communities are also able to access justice.  When I tried to take a case against my trolls across Twitter and on YouTube, I had to pick one area of law.  I had to pick either being a woman – and there were very few legal protections around me being a woman – or just being black and having protections around that.

Emma Woollcott

Do you want to talk a little bit more about the report, and the issues that were highlighted during lockdown and as a result of COVID-19?

Seyi Akiwowo

Forty-six percent of women experienced some form of online abuse during lockdown, and when you looked at black and minoritised women and non-binary people, that went up to fifty percent.  So we made kitchen tables, living rooms and gardens the new workplace for some, without any support, guidance or training around staying safe online.  What are the duties of care for employees in relation to their employer if harassment online does take place?  In our report we recommended that employers look at having a digital health and safety strategy; we are calling on tech companies to be a lot more transparent around content moderation and promoting that; and then our final call is on governments – we need some form of consistent framework for how all of these tech companies respond to online abuse and hate speech.

Emma Woollcott

If AI and the algorithms which feed it learn from our misogynistic society and misogynistic data, can social media platforms realistically ever solve these problems?

Katy Minshall

Twitter holds a mirror up to society, and our society around the world is far from perfect.  We’ve gone from being, just a few years ago, wholly reliant on people reporting these issues to us – victims having to let us know when they’ve been abused online – to now, where one in two of the Tweets we take down for abuse we have detected ourselves, proactively, using technology.  When it comes to reviewing and changing the fundamentals of Twitter to try and make it safer by design, since August we have given all users the ability to choose who can reply to their Tweets, or to turn off replies completely.  Has it worked?  Has it made a difference?  Initial data would suggest yes, it has had a positive impact: on average these settings have prevented three potentially abusive replies per Tweet, and the data suggests that those who face abuse are more likely to find these settings helpful.  You will see much more from Twitter – and I am sure from other technology companies as well – in terms of thinking about safety by design, and how you can organise and change your platform so the incentives are right and you are encouraging the behaviours you want to see.

Rebecca Stimson

So obviously Facebook has had rules for what you can and can’t do on it since its inception.  We are constantly developing and evolving our policies with at-risk groups and representation groups from all kinds of different spheres of life.  We have a very broad spectrum of particular characteristics and traits of people under our hate speech policy, and I think that speaks to the intersectionality point – yes, it is much, much worse if you hit on a number of those different characteristics.  You mentioned the algorithm in your opening as driving some of this behaviour.  A couple of years ago we entirely overhauled the algorithm that underpins what you see in your news feed, so that it isn’t just a case of ‘I’m a white supremacist, feed me a constant diet of white supremacy content’ – it is harder to go into those kinds of bubbles, although that’s not perfect.  In the end an algorithm is just a decision-making process; it’s the data you put in at the front end and the decisions it makes at the back end where you want to focus your time and effort – asking, is that right, are women and all forms of minorities accurately reflected in those data sets?  It’s not the algorithm itself that’s necessarily the problem.

Emma Woollcott

So, Anno, the Internet Commission has been working with technology companies across a range of industries to encourage positive and responsible online behaviour.  Can you tell us what’s been successful around misogyny in other sectors?

Anno Mitchell

Everybody we’ve talked to across the piece is introducing more and higher levels of automation, and particularly the dating platforms we’ve talked to are very interested in starting to automate the detection of misogyny, sexist behaviour and intimations of harassment.  There is a limit in interpersonal discourse to how much you can presume somebody finds something offensive, and so augmenting that with prompts for reporting and prompts for access to safety features seems to be really working.  One of the things that we have found with some of the people we’ve talked to is that because they don’t necessarily define misogyny as a thing, it makes it very hard to train people to take action.

Emma Woollcott

Stella, it seems apt to be talking in Hate Crime Awareness Week about your push to specifically name misogyny as a hate crime. You’ve campaigned for a long time for misogyny to be recorded and treated as a hate crime and now those proposals have been picked up by the Law Commission, albeit are now part of a review of hate crime legislation.  What are the new proposals, how will they be implemented and what do you think the practical impact will be?

Stella Creasy

As with the other forms of hate crime, we think recording the motivation behind it is part of being able to detect it, prevent it and challenge the culture in which it happens in the first place.  Women face hostility because of their gender in the same way that people of colour face hostility because of the colour of their skin, and yet our hate crime system doesn’t actually recognise that.  On a practical basis, there are now seven police forces across the country who have taken this approach, and what they are finding is that it does two things: one, it improves their intelligence about where crime is taking place, but more importantly it gives victims the confidence to come forward and report in the first place, because they know it is going to be taken seriously.  So we are also trying to amend the legislation in Parliament to make all police forces adopt what is good and best practice.

Emma Woollcott

I wonder if I can bring in Gina Miller, who has asked a question in our panellist chat.  I wonder if you could make the point to the room, and maybe pose the question, so that we can discuss it?

Gina Miller

All the way through the abuse that I have suffered, the response I have got is about what, practically, I should do to keep myself safe online.  If I walk down the street and somebody screams abuse at me, it is not up to me to take myself off the street – and yet somehow there is this assertion that in the online medium there is a different weight of responsibility, a further responsibility, on women to be safe than if they are offline.  My view is that we should go back to thinking about the content, the practicalities of that, and where the burden lies.  And I put a second question on there, which is: if companies are setting up an environment and inviting women to be on it, as we are part of society, then it is their duty of care, as far as I am concerned, to actually keep us safe in that environment.  Why should it be that we have to do more online than we would do offline?  That’s my argument about this – this duty of care is really, really important to me.  For the lawyers in the audience, and myself working in finance: when we set our businesses up, when we set our stalls out, we have a series of rules and regulations and a duty of care on us, on the way we behave and our responsibility to our clients and our customers.  Why do these online companies believe that they are above us, that they don’t have to operate with the same duty of care?

Emma Woollcott

Rebecca, Katy, I know that you are both involved in the consultation on the Online Harms White Paper.  Do you want to talk about those proposals and the implementation of a duty of care?

Katy Minshall

I agree with Gina – there is way too much of a burden on victims of abuse.  We spoke out, quite a long time ago now, in support of how a regulator could be a really positive thing for the UK.

Rebecca Stimson

What’s going to be interesting about it is, as you say, that balance between what is reasonable, proportionate, right and expected of companies, and a recognition that you won’t be able to regulate or content-moderate your way out of some of these societal problems – and where that balance lies.  One of the reasons it lies where it does at the moment is simply the capacity of algorithms and machine learning to find and remove abuse: while machine learning and AI are not able to find and detect it perfectly, we are going to have to rely on users and reporting to an extent for a while.

Gina Miller

What I have found with algorithms in my world, in finance, is that they tend to be based on backward-looking data, so they have inbuilt biases and inbuilt social discrimination.  We have to be very mindful that the answer for online duties of care is not just AI – it is also about employing individuals who do some of that work as well.

Emma Woollcott

One of the questions that’s come through in the Q&A is whether the panellists feel confident there are enough women in tech to ensure that Facebook and Twitter have looked at those algorithms in a kind of unconscious bias way.

Katy Minshall

There are so many spaces in our society still where women are the exception rather than the norm that we don’t recognise how unusual it is, in a society that is fifty-one percent women and forty-nine percent men, that women are so rarely at the top.  All of us have a vested interest in joining the dots – in whether the tech companies recognise their responsibilities, and whether the Government and the Police recognise theirs – because ultimately all of us want to live in a society where everyone is free to be who they are without that form of harassment.

Rebecca Stimson

Outside of the large platforms, lots and lots of other social spaces are introducing AI.  There needs to be a kind of supply-chain accountability for how AI and machine learning is trained and how it is continually tested for bias, as well as a recognition that the starting point for lots of machine learning and artificial intelligence is far from neutral.  Consequently, auditing how it is put into effect needs to be done to understand not only whether the starting point is neutral but also whether the outputs are actually generating more inequalities.

Katy Minshall

There are so many decisions and so many systems that are set up with men as the default that we don’t even realise and the AI is a classic example of that in the decision making and the logic behind it.

Emma Woollcott

I know that when we have spoken before this event, Rebecca and Katy, you talked quite a lot about the real eyes on abusive or flagged messages focusing on what the motivation is, and how tricky that can be.  Are you confident there is a broad range of eyes on those decisions, and that the policies are as up to date as they can be?

Rebecca Stimson

You are in a dangerous situation where content moderators are trying to read too much into a situation.  We tend to lean more on giving people the tools to do it themselves.  If you get a comment that one person thinks is flirting and another doesn’t, you can control that.

Emma Woollcott

When people do report abuse, there is a lack of faith in that report being dealt with.  For us, there are still a lot of things that are clearly red lines being broken and not being responded to, and there is not enough transparency around that reporting process.  And I am going to end on this point: COVID-19 is going to be with us for a lot longer than we want it to be, so if we are getting responses that content moderation is reducing because there is now a lot more to deal with, there need to be clearer communications to women around what the current duty of care is and what can be done.  At the moment it does seem to be always reactive.  I think we have to recognise, online as well as offline, that women are under assault.  When women do come forward we need to be much better at recognising and believing them and looking into it, and I agree with Seyi – I have yet to see from many of the social media platforms, frankly, the feedback that explains why something has been taken down or why somebody has faced censure.

Rebecca Stimson

We need people to be transparent about the processes they undertake and about how their decisions happen.  There is a lot of informal work that organisations can do to be clear on how they are accounting for their own decision making.

Emma Woollcott

I feel like we could talk about this forever and I think that we should.  Thank you so much for joining us.  I hope that you found this debate interesting and invigorating and we will all go off now and think about how we can all try to maintain this conversation and to help together to stop misogyny.  Thank you very much, take care.


To access advice for businesses that is regularly updated, please visit mishcon.com.
