
Forgive and... Forget? Legal Insights into the Right to Be Forgotten in the Digital Milieu

Today, the Internet acts as a record of nearly everything that happens to and around everyone. This urges many to be cautious about their online behaviour. Yet once something is posted about a person, much of their control over their online identity is relinquished. In a way, everyone is an open book on the web, and personal information is often indiscriminately exposed for anyone to read. Importantly, if this content is wrong, defamatory or sensitive, it can have momentous repercussions for individuals’ social lives, employment prospects and personal well-being. But would the removal of such information protect human dignity, or would it constitute an act of censorship?

Recently, Internet users have called upon governments to protect the “Right to be Forgotten” (RTBF): the right to have information about oneself removed from the Internet under specific circumstances. An important difference that distinguishes it from the right to privacy is that the information to be “forgotten” is already publicly available.

The discussion of the RTBF was brought to the fore by the landmark 2014 case Google Spain v AEPD and Mario Costeja Gonzalez. The plaintiff, Mr Costeja Gonzalez, had long since settled his debts and therefore requested that Google remove a link to a digitised 1998 article about the auction of his foreclosed home. The case was referred to the European Court of Justice (ECJ), which delivered a pivotal ruling in favour of the plaintiff, affirming the RTBF as a human right. Specifically, the ECJ ruled that search engines are responsible for the processing of personal data and can be held liable for the ways in which they process it. By addressing Google’s accountability directly, the verdict also produced an important shift in the collective perception of large technology corporations’ responsibility for their products.

The need to establish mores on forgetting rests on the right of individuals to live with dignity and to “determine the development of their life in an autonomous way” without being haunted by past actions that no longer have any bearing on the present. This has become especially relevant in light of the expansion of the digital sphere over the past decades, which now has the scope to stunt personal development and self-expression.

However, not everyone is of this opinion, and dissent is especially pronounced in the United States, where the “right to know”, anchored in the First Amendment’s guarantees of transparency and freedom of speech, trumps considerations about removing one’s information from the Internet. The concerns voiced range from censorship to the falsification of history. When lawmakers in New York recently attempted to pass a bill that would remove “inaccurate” content from the Internet, it was rapidly thwarted. Cases such as Garcia v Google and Manchanda v Google further attest to the legal establishment’s reluctance to consider the RTBF compatible with the public’s right to the truth.

If individuals decide which details of their lives others can find on the Internet, they can obfuscate aspects of their personal lives, preventing others from accessing an honest account of who they are. As such, a ruling akin to the one made by the ECJ would not merely amount to censorship but could also represent a breach of the Constitution in the United States. Thus, reconciling the freedoms enjoyed through the RTBF with freedom of speech and self-expression seems particularly challenging.

Furthermore, many countries already have laws protecting identity and personal data in place, and these are thought to afford users adequate protection. However, it is worth considering more ambiguous cases, such as what rights an individual has when someone else posts content that they want taken down. Some instances seem clear-cut, such as those involving the victims of revenge porn. It seems farcical to assert that removing explicit content targeting these individuals constitutes censorship, or that it fabricates a “constructed” version of history in their favour, when those affected are clearly victims of criminal abuse.

It is also worth interrogating the degree of a search engine’s accountability in each case as search results are rarely “neutral” and their algorithms have been shown to consistently prioritise some results over others. A better understanding of this phenomenon within the legal establishment might help vindicate requests made to be “forgotten”.

It is also important to recognise that, despite the significance of its outcome, Google Spain v AEPD and Mario Costeja Gonzalez and cases like it have their limitations. Firstly, the Westphalian system dictates that each state has exclusive sovereignty only over its own geographical territory. This is problematic in the domain of the Internet, which users interact with in ways that often transcend geographical bounds. For Mr Costeja Gonzalez, this meant that results linking to the article on his foreclosed home would be deleted from Google domains in the European Union, but not in the United States, where the article could still be retrieved.

Crucially, what is actually eliminated from the web is the detrimental result arising from a search on the person’s name, not the content itself, which remains available if other pertinent search terms are used. Furthermore, the process of removal, carried out by a designated “removals team” at Google, is opaque: the company has not divulged details of how it operates. The thousands of requests sent in every day are handled by a team of paralegals, though it is unclear whether these are technology experts or hold other qualifications entirely. Google has declined to comment in response to requests for further information. Though initiatives such as the “Oblivion” software aim to streamline and optimise the process of de-linking, it is clear that the parameters of the RTBF need to be defined more precisely in law, and that general guidelines informing the practice of link removal need to be established.

But the tide could be shifting. A 2020 Pew Research Center survey conducted in the United States found that most Americans (74 percent) consider it more important to be able to “keep things about themselves from being searchable online”. Being able to exert control over personal information is more pressing now than ever, especially in light of growing awareness of the biased nature of search engines and their potential to favour results that distort and manipulate information. An article by Julia Powles for The Guardian summarises this aptly: data protection laws help people exercise agency over their personal information “in some circumstances because we think there are deeply-embedded values worth protecting, under appropriate guidelines, for the enrichment of society”.

Therefore, the importance of an individual’s power to control information about themselves on the Internet that is inaccurate, misleading or obsolete cannot be overstated. Securing the right to do so is essential for lives and futures to unfold freely.

