What is the 'right to be forgotten'?


In 2014, the ruling in Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González established that individuals can ask search engines to remove links to pages that appear in searches for their name where the results are out of date, inadequate, or inaccurate.

This decision was dubbed “the right to be forgotten”, and following the ruling the principle was written into the General Data Protection Regulation (GDPR) as a right to request that personal data be deleted from an organisation's records in certain circumstances. It is also known as the right to erasure.

The case came about after Costeja González asked the Spanish newspaper La Vanguardia to remove an article relating to the forced sale of his property. When the newspaper declined, saying removal would be inappropriate, Costeja González contacted Google Spain to have the links to the information removed and lodged a complaint with the Spanish data protection authority, the Agencia Española de Protección de Datos (AEPD).

After written legal proceedings and an oral hearing, the European Court of Justice (ECJ) upheld Costeja González's request for the links to be removed from search results. This landmark case eventually led Google to draw up a set of policies for processing “right to be forgotten” requests.

Even though the essence of the “right to be forgotten” existed in a weaker form in the EU's 1995 Data Protection Directive, Article 17 of the General Data Protection Regulation (GDPR), which took effect in 2018, was considered a breakthrough: it enables EU citizens to request that their personal data be deleted and obliges the businesses holding that information to remove it as quickly as possible, or face significant fines. The maximum penalty for non-compliance is €20 million (around £18 million) or 4% of global annual turnover, whichever is greater.
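To make the "whichever is greater" cap concrete, here is a minimal sketch in Python; the turnover figure in the usage line is purely hypothetical.

```python
# Illustrative only: the "whichever is greater" cap on the higher GDPR fine tier.
def max_gdpr_fine(annual_global_turnover_eur: float) -> float:
    """Return the maximum possible fine: €20m or 4% of global turnover, whichever is greater."""
    return max(20_000_000, 0.04 * annual_global_turnover_eur)

# Hypothetical example: a firm with €1bn turnover faces a cap of €40m, not €20m.
print(max_gdpr_fine(1_000_000_000))
```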

This right, however, is not absolute and must usually be balanced against other competing rights, such as the right to freedom of expression. The ECJ said in its initial ruling that information considered to be in the public interest is unlikely to fall under the right to be forgotten principle.

Google and other search engines must consider removing links to any information that is inaccurate, inadequate, irrelevant, or excessive when an individual files a request about search results for their own name. Google provides an online form for submitting such requests.

Right to be forgotten: GDPR

The right to be forgotten ruling was based on the EU's 1995 Data Protection Directive, which stated in Article 12 that people can ask for their personal data to be deleted once it's no longer necessary. The ruling outlined when and how search engines like Google must honour such a request.

However, the General Data Protection Regulation (GDPR), which applied to all EU member states (and all organisations using EU citizens' personal data) from 25 May 2018, has taken the reins from the old 1995 Directive. Intended to update privacy and data protection rules for the digital age, GDPR also updates the definition of the right to be forgotten.

In Article 17, the GDPR legislation considers the right to be forgotten in the context of organisations collecting and processing people's personal data. It retains the 1995 Directive's intent to allow people to request their data be deleted when it's no longer relevant, but expands this right to give people more control over who can access and use their personal data.

Under GDPR, an EU citizen has the right to demand that an organisation erase their personal data if:

  • the data is no longer needed for the purpose it was collected for;
  • the person withdraws their consent for their data to be used (and the organisation has no other legal basis for processing it);
  • the person objects to their data being processed, including for marketing purposes, and their rights override the organisation's legitimate interests in processing it (for instance, where the data is sensitive or concerns a child);
  • the data was unlawfully processed;
  • the data's erasure is necessary to comply with a legal obligation;
  • the data was collected from a child in connection with the offer of "information society services".

In all these cases, the organisation must delete the data "without undue delay" - i.e. as soon as possible. If the organisation has made the data public, it must take "reasonable steps, including technical measures" to inform any other organisation processing that data that the data subject has asked for it to be removed.

However, organisations don't have to honour these requests if the data is needed to comply with a legal obligation, to exercise the right to freedom of expression and information, to perform a task in the public interest, or to establish, exercise or defend legal claims.
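To summarise how the grounds above interact with these exemptions, here is a minimal, purely illustrative sketch; the field and function names are invented for this example, and real-world assessments require case-by-case human judgment rather than boolean flags.

```python
from dataclasses import dataclass

# Hypothetical, simplified model of the grounds and exemptions described above.
# This is a triage aid for illustration only, not legal advice.

@dataclass
class ErasureRequest:
    # Grounds for erasure
    data_no_longer_needed: bool = False
    consent_withdrawn_no_other_basis: bool = False
    objection_no_overriding_grounds: bool = False
    unlawfully_processed: bool = False
    erasure_required_by_law: bool = False
    collected_from_child_for_online_services: bool = False
    # Exemptions
    needed_for_freedom_of_expression: bool = False
    needed_for_legal_obligation: bool = False
    needed_for_public_interest: bool = False
    needed_for_legal_claims: bool = False

def must_erase(req: ErasureRequest) -> bool:
    """Erase 'without undue delay' if at least one ground applies and no exemption does."""
    grounds = (
        req.data_no_longer_needed
        or req.consent_withdrawn_no_other_basis
        or req.objection_no_overriding_grounds
        or req.unlawfully_processed
        or req.erasure_required_by_law
        or req.collected_from_child_for_online_services
    )
    exemptions = (
        req.needed_for_freedom_of_expression
        or req.needed_for_legal_obligation
        or req.needed_for_public_interest
        or req.needed_for_legal_claims
    )
    return grounds and not exemptions

# Example: consent withdrawn and no exemption applies, so the data must be erased.
print(must_erase(ErasureRequest(consent_withdrawn_no_other_basis=True)))  # True
```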

Right to be forgotten: UK


The UK ceased to be a member of the EU on 31 January 2020, but provisions were made in advance: the EU GDPR was incorporated into UK law alongside the Data Protection Act (DPA) 2018 as a UK version of the GDPR, which still includes the right to erasure. In a nutshell, GDPR-style rules continue to apply to UK businesses, and the EU's "adequacy decision", which allows personal data to flow freely between the EU and the UK, is due to expire in June 2025, by which point the two sides will need to agree new data protection arrangements.

How the DPA’s right to erasure is used can vary from case to case. For example, an individual could ask a social media company like Twitter to delete posts they published earlier in their life that could hinder them personally or professionally as an adult.

Two prominent UK cases in which individuals invoked their right to be forgotten were heard on 27 February and 13 March 2018. The claimants, identified at the time only as NT1 and NT2, are both men, and both cases involved challenges against Google.

They had asked Google to remove links to articles that listed their previous convictions for crimes committed in a place of work, arguing that these reports had damaged their personal relationships and hindered their professional reputations. NT2's court filing also asserted that he had faced attempted blackmail and been threatened in public.

Google initially refused to comply with their requests, arguing the information was in the public interest. In April 2018, the High Court ruled in favour of NT2's request, finding that the conviction in his case was not relevant to his business dealings. By contrast, it ruled against NT1, whose conviction for false accounting remained relevant to those doing business with him.

What will be removed?

To qualify for removal, information must be deemed "irrelevant, outdated, or otherwise inappropriate"; it could include links to old newspaper articles, or pictures and videos from social media. A request should be accompanied by a digital copy of the requester's official identification, and content will only be delisted from searches relating to the requester's name.

Who is regulating the right to be forgotten?

How Google handles complaints and requests to remove information from its search results is overseen by European data protection watchdogs, working to guidelines drawn up by the Article 29 Working Party (now succeeded by the European Data Protection Board).

Following the flood of requests received by Google, Professor Luciano Floridi, one of the experts appointed to advise Google on how to comply with the EU court ruling, said in 2014: "People would be screaming if a powerful company suddenly decided what information could be seen by what people, when and where. That is the consequence of this decision. A private company now has to decide what is in the public interest."

Google, currently responsible for over 92% of web searches in Europe, faces the unenviable task of balancing its duty to comply with users' "right to be forgotten" requests against preserving its reputation as the go-to source for online information and content.

Peter Barron, Google's director of communications for Europe, said: "The European court of justice ruling was not something that we wanted, but it is now the law in Europe, and we are obliged to comply with that law. We are aiming to deal with it as responsibly as possible... It's a very big process, it's a learning process, we are listening to the feedback and we are working our way through that."

All applications must show that the links in question relate specifically to the applicant, unless the applicant has legal authority to act on someone else's behalf, in which case this must be proven.

Landmark cases to date

Google vs CNIL

In September 2019, the European Court of Justice ruled that Google is not required to apply the right to be forgotten globally, and that only search results shown within Europe should qualify.

The case began as a dispute between the tech giant and French data regulator CNIL, which had ordered Google to remove any results that included damaging or false information relating to an individual. Google introduced a geoblocking feature that prevented the delisted search results from appearing, but because this only applied within Europe, the regulator challenged it.

Google argued that applying such removals beyond Europe could allow rogue governments to hide criminal activity, such as human rights abuses.
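The region-scoped delisting at the heart of the dispute can be pictured with a minimal sketch; the country list, URLs, and function below are hypothetical and do not reflect Google's actual implementation.

```python
# A simplified, hypothetical sketch of region-scoped delisting: a delisted URL is
# hidden from users searching from EU member states but still appears elsewhere.

EU_COUNTRIES = {"FR", "DE", "ES", "IT", "IE"}  # abbreviated list for the example
DELISTED_URLS = {"https://example.com/old-report"}  # invented URL

def visible_results(results: list[str], requester_country: str) -> list[str]:
    """Filter out delisted URLs only for searches originating in the EU."""
    if requester_country in EU_COUNTRIES:
        return [url for url in results if url not in DELISTED_URLS]
    return results

results = ["https://example.com/old-report", "https://example.com/other"]
print(visible_results(results, "FR"))  # delisted link hidden for an EU user
print(visible_results(results, "US"))  # still visible outside Europe
```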

German double murder

In November 2019, a German man who had been convicted of two counts of murder in 1982 won a case in Germany's highest court to have his surname removed from articles referencing the initial charges and subsequent criminal proceedings.

In this instance, the court agreed that his right to privacy outweighed the public interest and press freedom arguments, and that Google needed to comply with his removal request. Publications holding archives of the articles were also required to remove them.

Is it about privacy or censorship?

The request form Google launched after the ruling in 2014 received 12,000 entries from across Europe within 24 hours, at one point receiving up to 20 requests a minute. This grew to 41,000 requests in the first four days.

Feeding into fears about the potential consequences of the ruling, almost a third of the requests related to accusations of fraud, a further 12% were linked to child pornography arrests, and 20% concerned other serious or violent crimes.

Of these first 12,000 entries, around 1,500 were said to come from people residing in the UK, among them an ex-politician, a paedophile, and a GP.

By December that year, the number of requests received by Google had grown to around 175,000 from all 28 EU countries, with 65,000 of the links coming from the UK. As of 22 January 2018, Google had complied with 43.3% of requests to remove links.

Many are concerned that the ability for users to request that information be removed from search results could see the system abused for nefarious purposes.

However, lawyers have assured those worried that politicians, celebrities, and criminals are unlikely to benefit from the ruling, as Google retains the right to reject applications requesting the removal of information deemed to be in the public interest.

It should also be noted that, while links to the objectionable information will be removed, the information will not actually be deleted from the web.

Commenting on the ruling, Baroness Prashar, chair of the Lords Home Affairs EU Sub-Committee, said: "[We] do not believe that individuals should have the right to have links to accurate and lawfully available information about them removed, simply because they do not like what is said."

This article was first published on 20/09/2019 and has since been updated.

Keumars Afifi-Sabet
Features Editor

Keumars Afifi-Sabet is a writer and editor who specialises in the public sector, cyber security, and cloud computing. He first joined ITPro as a staff writer in April 2018 and eventually became its Features Editor. Although a regular contributor to other tech sites in the past, these days you will find Keumars at LiveScience, where he runs its Technology section.