
Online safety: Ofcom publishes guidance to tech firms to tackle online harms against women and girls

Posted on 28 November 2025

In Brief  

  • Women and girls face serious, distinct online harms including misogynistic abuse, coordinated reputational attacks, stalking, coercive control and intimate image abuse. The Online Safety Act 2023 (OSA) compels online platforms to protect users based in the UK from illegal and harmful content - this includes tackling harms which disproportionately affect women and girls.  
  • As the independent regulator of online safety, Ofcom is required to provide guidance to online platforms on how to mitigate these risks. On 25 November, Ofcom published its Guidance relating to online harms against women and girls, urging its immediate adoption across a range of regulated user-to-user and search services including social media, gaming, discussion forums, pornography, dating services and online marketplaces. 
  • The Guidance goes beyond what is strictly required to comply with the OSA and sets a new, ambitious standard for online safety for women and girls. It was developed with insights from victims and survivors, safety experts, women’s advocacy groups and organisations working with men and boys and seeks to set a clear benchmark for safer product design and operations.  

A harmful online reality: how abuse is perpetuated and normalised 

Women and girls from all walks of life face disproportionate harm online, limiting their ability to participate safely, express themselves freely and, in many cases, to work. Online harms also contribute to the spread and normalisation of harmful attitudes and behaviours towards women and girls, both on and offline. This is highlighted by the fact that every single case reported to the National Stalking Helpline includes online contact and monitoring. 

AI-enabled abuse is becoming increasingly widespread and, concerningly, accepted, as highlighted by a recent report and survey commissioned by the Office of the Police Chief Scientific Adviser. The survey revealed that: 

  • Among 1,700 people aged 16 and over in England and Wales, around 25% agreed with or felt neutral about the legal and moral acceptability of viewing, sharing, creating or selling a sexual or intimate deepfake - even when the person depicted has not consented;  
  • Those who found sexual or intimate deepfakes acceptable were more likely to be men under 45, to view online pornography, to agree with misogynistic views and to feel positively about AI;   
  • One in 20 respondents admitted creating deepfakes; and more than one in 10 said that they would do so in future.  

The survey's author cautioned that deepfake creation is “becoming increasingly normalised as the technology to make them becomes cheaper and more accessible”. Of the individuals surveyed (in April of this year), only 14% were aware of the current legislation relating to deepfakes, notwithstanding key recent legislative developments in this area (including the OSA, and more recently the Data (Use and Access) Act 2025, which contains provisions criminalising the creation, and requesting of the creation, of intimate deepfakes without consent). The DUAA received Royal Assent in June and the provisions will be enacted within the coming months. 

Ofcom's Guidance to tackle gender-based harms 

The Guidance explains where services should go further than the provisions of the OSA to address gender‑based harms, balancing safety with freedom of expression and privacy. It expects platforms to design and test with safety in mind, to improve reporting and support for women and girls, and to adapt these measures to their platform's design and risk profile. The Guidance outlines actions in four key areas:  

1. Misogynistic abuse and sexual violence 
  • Introduce prompts asking users to reconsider before posting harmful content; 
  • Impose timeouts for users who repeatedly misuse features to target victims;  
  • Promote diverse content and perspectives in “for you” recommender systems to avoid toxic echo chambers; and 
  • De‑monetise posts or videos which promote misogynistic abuse and sexual violence. 
2. Pile‑ons and coordinated harassment 
  • Set volume limits on posts ("rate limiting") to curb mass‑posting of abuse; 
  • Allow users to quickly block or mute multiple accounts at once; and 
  • Provide more sophisticated tools to make multiple reports and track their progress. 
3. Stalking and coercive control 
  • Bundle safety features to simplify setting accounts to private; 
  • Enhance visibility controls over who can see past and present content; and 
  • Strengthen account security and remove geolocation by default. 
4. Image‑based sexual abuse (including cyberflashing) 
  • Use "hash‑matching" technology to detect and remove non‑consensual intimate images; 
  • Blur nudity by default, with an adult override option; and 
  • Signpost users to support including how to report a potential crime. 

More broadly, Ofcom expects “abusability” testing of new services and features before launch, so that misuse risks are identified early. Moderation teams should receive specialised training on online gender‑based harms and companies should consult experts and listen to victims’ and survivors’ lived experience (including through user surveys) to ensure policies and safety features work effectively for women and girls. 

Next steps 

Ofcom has pledged to enforce services’ legal requirements under the OSA, using its full powers to ensure that platforms tackle illegal content including intimate image abuse and material encouraging unlawful hate and violence. It will strengthen its industry Codes as the law evolves, consulting on mandating hash‑matching technology to detect intimate image abuse and updating the Codes to reflect cyberflashing becoming a priority offence next year.  

Ofcom has written an open letter to tech firms regarding the Guidance and is planning meetings with companies over the course of the coming months, as well as convening an industry roundtable later this year. In summer 2027, it will publicly report on progress by individual providers and the sector following the publication of the Guidance to ensure accountability, and, if action falls short, it will consider making formal recommendations to Government on where the OSA may need to be strengthened.  

Mishcon de Reya's specialist team of lawyers advise victims of digital/online abuse and harassment, as well as publishers and platforms on their complex obligations in moderating online content. If you would like to discuss or might require assistance in respect of these matters, please do contact a member of the team.  
