The Online Safety Bill was published on 17 March 2022 with the aim to "protect children from harmful content such as pornography and limit people's exposure to illegal content, while protecting freedom of speech". The Bill seeks to impose a duty of care on online platform operators to achieve this aim. Platforms in scope that find themselves in breach of the Bill's provisions could face fines of up to £18 million or 10% of their annual turnover, whichever is higher.
In relation to protecting children, the Bill focuses on imposing a duty on platforms likely to be accessed by children to protect young people using their services from viewing illegal content, as well as legal but harmful material such as self-harm or eating disorder content. Any business that hosts user-to-user content and has sufficient links to the United Kingdom would be in scope and would owe such duties if its platforms are likely to be accessed by children. The Bill also requires providers who publish or place pornographic content on their services to prevent children from accessing that content.
It is clear that platform providers need to take proactive steps to prevent harms, especially in relation to children. However, it is crucial that this is done in tandem with a widespread programme of online media literacy education for children.
Children's online use
A report by the then Children's Commissioner in 2018 found that, by the time a child turns 13, parents will have posted 1,300 photos and videos of their child to social media, with this number increasing dramatically once the child engages with platforms themselves. According to the report, children post on average 26 times a day, amounting to a total of nearly 70,000 posts by age 18. Ofcom's latest media literacy report has found that one in three internet users fails to spot misinformation online.
Given the frequency of use, it is essential that children are properly educated on safe internet use, for example:
- Discussing the permanence of the digital footprint and how children's internet use may affect them in the future, as Tweets, Instagram posts and the like remain in the ether in perpetuity.
- Providing children with the tools to decipher disinformation.
Further, an Office for National Statistics report in November 2020 found that 52% of children who had experienced online bullying said they would not describe that behaviour as bullying, and indeed 26% had not reported any instances of online bullying that happened to them. This lack of understanding among children as to what constitutes cyber-bullying no doubt increases the risk that children may be bullied and not report their experiences, and may even become perpetrators of bullying themselves.
The Ofcom report also highlighted key trends in how children in the UK use and understand media online. One such trend is that children may lead secret lives on social media using private social media accounts (for example fake Instagram accounts – "Finstas") that their parents don't know about. Ofcom reports that "Two-thirds of 8- to 11-year-olds had multiple accounts or profiles, and almost half of these have an account just for their family to see." Children are also able to circumvent age restrictions on most social media platforms, with a third of parents with children aged 5-7 and two thirds of parents with children aged 8-11 saying that their children have social media profiles.
Against this backdrop, media literacy is vitally important.
In July 2021, the Department for Digital, Culture, Media & Sport published its Online Media Literacy Strategy as part of the Government's plan to help users make informed and safer decisions online "by supporting the education and empowerment of all internet users". The Strategy document set out a Media Literacy Knowledge and Skills Framework, which highlights five principles that support strong media literacy capabilities. The framework emphasises that users should understand that actions online have consequences offline, and should apply this understanding in their online interactions. The potential for historic online activity to negatively affect a child in the future is clearly a harm that needs careful consideration. Ofcom will need to continue researching the impact of the internet on children's lives, and how online media literacy in young people might be improved.
The draft version of the Bill published in May 2021 had included, at clause 103, proposed replacement wording for Ofcom's duty to promote media literacy, which is currently set out in the Communications Act 2003, and clarified what was to be expected from Ofcom in relation to that duty. However, the Joint Committee report scrutinising the draft version of the Bill in December 2021 recommended that the Bill go further in relation to media literacy. For example, the Committee stated:
"If the government wishes to improve the UK’s media literacy to reduce online harms, there must be provisions in the Bill to ensure media literacy initiatives are of a high standard. The Bill should empower Ofcom to set minimum standards for media literacy initiatives that both guide providers and ensure the information they are disseminating aligns with the goal of reducing online harm".
Specifically in relation to children and education, the Joint Committee also recommended that "the Bill reflects that media literacy should be subject to a 'whole of government' approach, involving current and future initiatives of the Department for Education in relation to the school curriculum as well as Ofcom and service providers" and that "Ofsted, in conjunction with Ofcom, update the school inspection framework to extend the safeguarding duties of schools to include making reasonable efforts to educate children to be safe online".
The Committee's recommendations reflect a concern that current media literacy programmes have a tendency to put the responsibility for online safety onto children themselves. A 5Rights Foundation report states that many programmes offered by private companies such as Google and Facebook, at little to no cost, "teach children to accept certain service design elements as 'unavoidable' risks, when in fact they could and should be tackled at a design level by those very same companies". Media literacy, 5Rights argues, is not confined to an understanding of the risks created by adult or harmful content, but also extends to those created "by design and operation of platforms and services."
Contrary to the Joint Committee's recommendations, however, the Government in fact removed clause 103 from the updated version of the Bill, stating that it was simply a clarification of Ofcom's duties under the Communications Act, did not grant any additional powers and was therefore deemed "unnecessary regulation". In its response to the Joint Committee's recommendations, the Government confirmed that it plans to undertake non-legislative measures, including working with the Department for Education to consider what the Government can do to improve media literacy education in schools. It has also said it will discuss with Ofcom options for engaging with Ofsted.
Whatever the Government's next steps, it is apparent that media literacy will be a vital tool for children in our increasingly digital world.