The recent conclusion of the inquest into the death of 14-year-old Molly Russell in November 2017 has brought internet safety and the potential exposure to risk when using social media back to the fore.
Senior Coroner Andrew Walker ruled that due to Molly's exposure to harmful online content, "it would not be safe to leave suicide as a conclusion".
The Molly Russell inquest has highlighted the need for social media platforms to do more to manage the risk of harmful content being posted on their platforms and to prevent children and young people from accessing such content. This would make online operators just as accountable as other service providers that fall within the scope of the Health and Safety at Work etc Act 1974, under which companies have a duty to secure the health and safety of the people to whom they provide services.
The inquest, which lasted for two weeks, focused on Molly's use of the social media platforms Pinterest and Instagram. The inquest heard that Molly viewed 2,100 pieces of content on Instagram which related to suicide, depression and self-harm in the six months which preceded her death. The inquest also heard that Molly compiled a digital board on Pinterest that was comprised of 469 images that related to depression and suicide.
The Global Head of Communications at Pinterest expressed his regret that Molly was able to view the images relating to suicide, self-harm and depression on their online application. Pinterest relies on human moderators and artificial intelligence to hide potentially harmful content, but some material of the kind Molly viewed was still available on the website. Pinterest accepted that the site was not safe when Molly accessed it and that it was "likely" harmful content remained available on the platform.
By contrast, the Head of Health and Wellbeing at Meta, the parent company of Instagram, argued that the posts relating to suicide and self-harm that Molly viewed on Instagram prior to her death "were safe". It was accepted that two of the posts Molly had viewed had violated the platform's policies, but they were still deemed to be safe. This evidence directly contradicted that of child psychiatrist Dr Navin Venugopal, who told the inquest that Molly had been "placed at risk" by the content she viewed.
The inquest's conclusion left no doubt about the Coroner's view of the role that social media content played in Molly's death. He stated that some of the sites viewed by Molly "were not safe as they allowed access to adult content that should not have been available for a 14-year-old child to see". He went on to discuss how the algorithms used by these sites created "binge periods of images, video-clips and text" that were in some instances sent to Molly without her requesting them.
The Coroner has sent a prevention of future deaths report to the Government and a number of social media firms including Meta, Snapchat and Pinterest urging them to take action on the issue of online safety. He identified the following points for the Government and social media firms to consider:
- Separate platforms for adults and children;
- Age verification before joining a platform;
- The provision of age specific content;
- Reviewing the use of algorithms to provide content;
- A Government review of the use of advertising; and
- Parental, guardian or carer control including access to material viewed by children, and retention of material viewed by children.
He also recommended that consideration be given to an independent regulatory body that would be responsible for monitoring the content of online platforms.
The Prime Minister has confirmed that the Government will be proceeding with the passage of the Online Safety Bill ("the Bill"), albeit with some "tweaks". It remains to be seen what these tweaks will be, but there will no doubt be greater pressure on the Government to ensure that social media companies are held more accountable for the content that is accessible on their platforms, particularly by children and young people. The problem is that large amounts of harmful content can currently be found on social media, as platforms struggle to balance moderating content posted at enormous scale, affording users a level of freedom to express themselves, and keeping their online spaces safe.
The passage of the Bill will see social media companies having to discharge a considerable duty of care towards their users. Those who fail to comply with this duty will face significant criminal and financial sanctions – similar to the penalties imposed on individuals and businesses for health and safety non-compliance under the existing law. The Bill emphasises the shift in the UK away from prescriptive statutory compliance and towards public accountability – for both online and physical in-person safety.
Social media platforms must be proactive in seeking expert legal advice to ensure that they put in place the appropriate preventative policies and procedures that will help them to comply with the duties that are likely to be imposed on them by the implementation of future legislation.
Our multi-disciplinary teams have significant expertise in health and safety, reputation protection and crisis management, data protection, child safety, criminal law, regulatory compliance and cyber security – many of whom are involved in providing advice and assistance to relevant organisations in respect of the Online Safety Bill. Updates on our work on the progress of the Bill can be found here.