Internet Dementia – European Court Judgment and the ‘Right to be Forgotten’

Christian Möller, a Fall 2014 CGCS Visiting Scholar, provides an overview and analysis of the recent European Court judgment on the ‘right to be forgotten.’

The Spanish Case

In 1998, Mario Costeja González, a Spanish citizen from El Escorial near Madrid, faced the forced foreclosure sale of his property over social security debts, a fact that was reported on page 23 of the regional newspaper La Vanguardia. Although the proceedings were later concluded and resolved, for years afterwards a Google search of González’s name still brought up the newspaper notice of the foreclosure.

In 2009, González filed a complaint with the Spanish Data Protection Agency (Agencia Española de Protección de Datos, or AEPD) against La Vanguardia and Google Spain, asking for an injunction against both the newspaper and the search engine. The AEPD dismissed the claim against the newspaper (which had been under a legal obligation to publish the official notice), but issued an injunction against Google Spain SL and Google Inc. to delete the data from the search engine’s index. Google appealed the AEPD’s decision before Spain’s National High Court (Audiencia Nacional), which referred the case to the European Court of Justice (ECJ) in Luxembourg.

On May 13, 2014, the ECJ ruled that Google must remove links to outdated or irrelevant personal information from search results upon request. The Court found that individuals have a right to control their personal data and may request that information be ‘forgotten’ when search results link to information that is no longer accurate or relevant.

The judgment also establishes that Google’s search engine results are fully subject to European data privacy law and that, because Google has an advertising subsidiary in Spain, the company is subject to the control of the Spanish data protection authorities.

This judgment cannot be overturned; the case now returns to the national court. Theoretically, however, Google could still take the case to the European Court of Human Rights (on the basis of Article 10 ECHR) once the national court reaches a final decision, according to Jef Ausloos of the University of Leuven in his analysis of the judgment.

EU Justice Commissioner (and former Commissioner for the Information Society and Media) Viviane Reding described the judgment as ‘a clear victory for the protection of personal data of Europeans […] The data belongs to the individual, not to the company. And unless there is a good reason to retain this data, an individual should be empowered – by law – to request erasure of this data.’

Reactions to the Judgment

In an initial reaction, Google called the judgment ‘disappointing,’ saying that it ‘went too far’. Google argues that if people want links about themselves removed, they should take up the issue directly with whoever originally posted the content, under the applicable legislation. Following the ECJ ruling, however, the company has set up a form and established a committee of internet experts to help handle requests from Europeans claiming their ‘right to be forgotten’. Reportedly, Google received more than 12,000 removal requests on the first day the form was available.

In his personal blog, Google’s Global Privacy Counsel Peter Fleischer wrote that there was a need for more public debate about what the ‘right to be forgotten’ should mean since privacy rights cannot be deemed to trump freedom of expression.

The Representative on Freedom of the Media of the Organization for Security and Co-operation in Europe (OSCE), Dunja Mijatović, warned that the decision might negatively affect access to information and create content and liability regimes that differ across different parts of the world, thus fragmenting the internet and damaging its universality. In her view, no restrictions or liability should be imposed on websites or intermediaries such as search engines.

The Judgment

In its judgment of May 13, delivered on a reference for a preliminary ruling, the ECJ held that the search engine operator is responsible for the collection of content that appears after a query. According to the judgment, the activity of finding information published or placed on the internet by third parties, indexing it automatically, storing it temporarily and, finally, making it available to internet users according to a particular order of preference qualifies as ‘processing of personal data’ as laid out in the Data Protection Directive (DPD).
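To make the Court’s description more concrete, the following is a deliberately simplified sketch, in Python, of the four activities it names – ingesting third-party content, indexing it automatically, storing it temporarily, and returning results in a ranked order. It is a toy illustration only; every class, name and URL is invented for the example, and it bears no relation to how any real search engine is implemented.

```python
# Hypothetical toy sketch of the activities the Court calls 'processing of
# personal data': ingesting third-party content, indexing it automatically,
# storing it temporarily, and returning it in a ranked order of preference.
from dataclasses import dataclass, field


@dataclass
class Document:
    url: str
    text: str


@dataclass
class Index:
    # 'Indexing it automatically': an inverted index from term to URLs.
    postings: dict[str, set[str]] = field(default_factory=dict)
    # 'Storing it temporarily': a cache of the crawled documents.
    store: dict[str, Document] = field(default_factory=dict)

    def add(self, doc: Document) -> None:
        # 'Finding information published ... by third parties' happens upstream
        # (a crawler); here we only ingest whatever was found.
        self.store[doc.url] = doc
        for term in doc.text.lower().split():
            self.postings.setdefault(term, set()).add(doc.url)

    def search(self, query: str) -> list[str]:
        # 'Making it available ... according to a particular order of
        # preference': rank pages by how many query terms they match.
        scores: dict[str, int] = {}
        for term in query.lower().split():
            for url in self.postings.get(term, set()):
                scores[url] = scores.get(url, 0) + 1
        return sorted(scores, key=scores.get, reverse=True)


if __name__ == "__main__":
    index = Index()
    index.add(Document("https://news.example/notice", "foreclosure notice naming a person"))
    index.add(Document("https://news.example/other", "unrelated local news item"))
    print(index.search("person foreclosure"))  # the notice ranks first
```

The point of the sketch is simply that each of these mechanical steps, taken together, is what the Court treats as ‘processing’ under the DPD, regardless of who authored the underlying pages.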

It is important to note that the judgment does not rule on the lawfulness of the original information itself. Rather, it regards the search engine as the ‘controller’ of personal information as defined in the DPD, with the specific responsibilities that role entails. These include an obligation to keep data that permits identification of data subjects for no longer than is necessary for the purposes for which the data were collected or further processed.

The right to ask for removal from search results, according to the ECJ, applies to data that is inaccurate, inadequate, irrelevant or excessive in relation to the purposes of the processing, that is not kept up to date, or that is kept for longer than is necessary. Exceptions apply only to data that is kept for historical, statistical or scientific purposes.
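Read purely as a checklist, the Court’s grounds combine as ‘at least one ground applies and the archival exception does not’. The sketch below (in the same hypothetical Python register as above) only illustrates that structure; in practice the assessment is a case-by-case legal balancing, not a boolean test, and every field and function name is invented for the example.

```python
# Hypothetical illustration of how the ECJ's delisting grounds combine.
# The real weighing is a legal judgment, not code; all names are invented.
from dataclasses import dataclass


@dataclass
class RemovalRequest:
    inaccurate: bool = False
    inadequate: bool = False
    irrelevant_or_excessive: bool = False
    not_kept_up_to_date: bool = False
    kept_longer_than_necessary: bool = False
    # Exception named by the Court: historical, statistical or scientific purposes.
    archival_purpose: bool = False


def qualifies_for_delisting(req: RemovalRequest) -> bool:
    """True if at least one ground applies and the archival exception does not."""
    grounds = (
        req.inaccurate
        or req.inadequate
        or req.irrelevant_or_excessive
        or req.not_kept_up_to_date
        or req.kept_longer_than_necessary
    )
    return grounds and not req.archival_purpose


if __name__ == "__main__":
    # A stale notice about a long-settled debt: outdated and no longer relevant.
    print(qualifies_for_delisting(RemovalRequest(irrelevant_or_excessive=True,
                                                 kept_longer_than_necessary=True)))
```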

Because internet searches play a decisive role in the dissemination of such information, the Court regards them as a more significant interference with the data subject’s fundamental right to privacy than the initial publication on the web page itself. This means that the operator of a search engine might be obliged to remove from the results of a search on a person’s name links to web pages, published by third parties and containing information relating to that person, even when the publication on those pages is in itself lawful.

Even if the initial publication was done solely for journalistic purposes in the public interest – and thus benefits from certain privileges under the DPD – these privileges do not apply to the processing carried out by the operator of a search engine, the Court argues. Consequently, a complainant can exercise his or her right to privacy against a search engine operator even where the same is not possible against the publisher of the web page.

In the case of Mr. González, the Court saw his rights override not only the economic interest of the operator of the search engine but also the interest of the general public. Things might have been different, though, had the role he played in public life been such that the preponderant interest of the general public justified the interference with his fundamental rights.

‘Mere Conduit’ and Data Protection Directive

The ruling coincides with negotiations between the European Commission and the European Parliament to update the current Data Protection Directive (DPD) of 1995 (Directive 95/46/EC). The ‘right to be forgotten’ is also part of these negotiations – which, however, have nearly come to a standstill in recent months.

The Court decision also touches upon provisions of the 2000 E-Commerce Directive (Directive 2000/31/EC), namely the ‘mere conduit’ provision. In Article 12, the Directive states that a service provider is not liable for the information transmitted, on condition that the provider does not initiate the transmission, does not select the receiver of the transmission, and does not select or modify the information contained in the transmission.

Some stakeholders suggest that search engines should receive similar protections, since they too merely facilitate access to content created by others. Despite certain conceptual similarities, however, search engines do not fit neatly into any of the Directive’s categories. In the parlance of the E-Commerce Directive, a search engine is neither a caching nor a hosting service but a ‘location tool service’, which does not fall under the mere conduit waiver. Though this sounds like a small detail, it is of great significance for the responsibility of search engines.

June 2013 Opinion by the Advocate General

While the 1995 DPD indeed seems ripe for an overhaul to adapt it to the digital age, the ECJ judgment raised quite a few eyebrows. In June 2013, the Court’s Advocate General Niilo Jääskinen had considered that search engine service providers were not responsible, on the basis of the Data Protection Directive, for personal data appearing on the web pages they process.

In his (non-binding) opinion, the Advocate General argued that Google ‘is not generally to be considered as a “controller” of the personal data appearing on web pages it processes’ and that the provision of an information location tool therefore does not imply any control over the content included on third-party web pages.

The Advocate General also stated that the DPD ‘does not establish a general “right to be forgotten”’ and that such a right could not be invoked against search engine service providers. He considered that a ‘subjective preference alone does not amount to a compelling legitimate ground’ and does not entitle a person to restrict dissemination of personal data that he considers to be harmful or contrary to his interests.

Requesting search engine providers to suppress legitimate and legal information that has entered the public domain would entail an interference with the freedom of expression of the publisher of the web page and would amount to censorship of his published content by a private party, Advocate General Jääskinen concluded.

In its May 2014 judgment, however, the ECJ ruled differently: it a) regarded the search engine provider as the ‘controller’ of personal data and b) did not consider the newspaper website’s right to freedom of expression to be violated by removal from search engine results.

The Right to be Forgotten and Freedom of Expression

The ECJ judgment and the previous opinion by the Advocate General show the degree of uncertainty, and the breadth of debate, surrounding data protection in the digital age. What becomes clear, however, is that on public platforms such as the internet, data protection and privacy cannot be considered without also considering freedom of expression. While some degree of privacy is needed in order to exercise the individual right to freedom of expression without fear of reprisal, the right to privacy must not, in turn, be used as a pretext for censorship of inconvenient or questionable content.

In addition, there is no clear-cut definition of the ‘right to be forgotten’ yet, and it would help the debate to agree on exactly what this right should entail. If the ‘right to be forgotten’ merely enabled the author of information to ask for its removal, even across platforms, it would be little more than an extension of copyright and data protection legislation to the internet.

If, however, a ‘right to be forgotten’ allowed any individual to demand the removal of any information about him or her, even information originating from third parties such as newspapers, this might indeed lead to undue censorship and restrictions of the right to freedom of expression on the grounds of purported privacy and data protection concerns.

Search engines, as intermediaries that do not create but only retrieve data, arguably bear less responsibility for the content of websites than the authors of those websites. Whether or not ‘mere conduit’ provisions also apply to search engines, forcing them to decide whether content is inaccurate, inadequate, irrelevant or no longer necessary effectively puts decisions on the legality of content in the hands of private companies. In this regard, the ECJ judgment potentially opens a door for censorship by the subjects of (critical) reporting on the pretext of data protection concerns. It is now up to the search engine and database providers in Europe to exercise this unwanted responsibility cautiously and wisely in light of the right to freedom of expression, and to refer questionable removal requests to national and European courts.

//

Christian Möller, M.A., (@infsocblog) is a lecturer at the University of Applied Sciences in Kiel, Germany. His main areas of work range from internet governance, international media regulation and human rights to social media and journalism in the digital age.

He also regularly serves as a consultant (theinformationsociety.org) and conference speaker for various national and international corporations and institutions, including the Organization for Security and Co-operation in Europe (OSCE), the Council of Europe, the OSCE Project Co-ordinator in Ukraine, and the Chamber of Commerce in Kiel, Germany.

Featured Photo Credit: Flооd (some rights reserved, Attribution-NonCommercial-NoDerivs)
