Right to Be Forgotten?

Posted on 27 April 2018 by Jessie Bridgett

A recent research paper from Google, coupled with a High Court judgment, has shed light on how Google is responding to right to be forgotten ("RTBF") requests four years after the right first arose.

The RTBF was established in 2014, when the Court of Justice of the European Union ruled that those with clear links to the EU (typically EU citizens or residents) can ask internet search engines to remove search results against their name, if they link to content (even if lawfully published) that is inaccurate, outdated or irrelevant. If successfully 'delisted', the material remains online but should be much less visible.

This landmark judgment places the onus on search engines to assess delisting requests and determine whether an individual's right to privacy outweighs the public interest in continued access to the information in question. Shortly after the ruling, Google and other search engines such as Bing and Yahoo! responded by creating a “take down” request form.

In response to a demand for more public data on RTBF requests, Google recently released the draft of a new research paper entitled "Three Years of Right to be Forgotten", which analyses the way in which the RTBF was exercised between 30 May 2014 and 31 December 2017. The paper provides an insight into Google's approach to delisting requests, how those requests are being used across Europe, and the categories of sites being targeted.

Google's research reveals that, during the relevant period, it received just under 400,000 requests to delist just over 2.3m URLs. Of this total, only 43% met the criteria for delisting. Requests concerning minors accounted for 5.4% of all requested URLs, and these were delisted at nearly twice the rate of requests from private individuals. Government officials and politicians generated 3.3% of requested URLs and had a lower delisting rate, highlighting the balance the company tries to strike when evaluating the public interest. No corporate entities have ever had content delisted under the RTBF.

It may surprise some to read that there is no automation; Google assigns each request to at least one reviewer for manual review. Broadly speaking, reviewers consider four factors when balancing an individual's privacy against the public interest: the validity of the request; the identity of the requester; the content referenced by the URL; and the source of the information. In the first two months after Google began delisting, it took a median of 85 days to reach a decision. By 1 January 2017, that had dropped to four days.

Requestors have sought to delist URLs relating to a wide range of personal information. Around a third of the requested URLs related to social media and directory services that contained personal information. A quarter of the URLs related to news outlets and government websites, and in the majority of those cases the content concerned the requestor's legal history.

One of the aims of the report is clearly to highlight the challenge the company faces when considering RTBF requests, and it is hard to miss its defensive tone. At the same time, Google's approach to delisting has recently been thrown into the spotlight in a High Court case concerning its refusal to delist search results regarding two individuals' spent convictions. Much of the media coverage has portrayed the decision as a big loss for Google, but in fact it was a draw: the Judge ordered the company to delist the URLs complained of by one of the claimants, but not by the other. If anything, the judgment serves to reinforce Google's argument that evaluating RTBF requests is difficult, that every request must be considered on its specific facts, and that the company takes care when doing so.

The seven-day trial resulted in a 76-page judgment, which goes into considerable detail as to the relevant law and specific facts of each case. While both Claimants sought delisting of URLs regarding spent convictions, the particular facts of each case were distinct, hence the different outcomes. The unsuccessful Claimant had been convicted of a serious dishonesty offence, continued to refuse fully to accept responsibility for his crime and was found by the Judge to be an unreliable and evasive witness. By contrast, the successful Claimant had committed a less serious crime, which was not for financial gain, and appeared repentant. Google was also vindicated to a certain extent in that the Judge refused to award any damages on the basis that the company had taken reasonable care to comply with its obligations.

This is an area of growing significance, not least because the right will be codified in the General Data Protection Regulation, which applies from 25 May 2018. As the coverage of the recent High Court judgment shows, there is widespread interest in this developing area of law, and it brings into focus key questions about privacy, free speech and Google's role as arbiter between the two.