Ioannis Iglezakis
Assistant Professor
Faculty of Law,
Aristotle University of Thessaloniki
Abstract
The right to be forgotten is a new right introduced in the
2012 Draft Proposal for a General Data Protection Regulation, and it
has been widely discussed. Critics, on the one hand, dispute its
necessity and hold the view that it represents the biggest threat to free
necessity and hold the view that it represents the biggest threat to free
speech on the Internet in the coming years. Viviane Reding, former EU Justice
Commissioner and currently Vice-President of the EU Commission, on the other
hand, described this right as a modest expansion of existing data privacy
rights. The ECJ, with its decision of 13 May 2014 in Case C-131/12, confirmed
this view, interpreting the provisions of Directive 95/46/EC in such a way as
to include a right ‘to be forgotten’ on the Net. The case referred particularly
to search engines and their obligation to remove links to web pages from their
lists of results, following requests of data subjects on the grounds that
information should no longer be linked to their name by means of such a list
and taking into account that even initially lawful processing of accurate data
may, in the course of time, become incompatible with the directive where those
data are no longer necessary in the light of the purposes for which they were
collected or processed. This ruling addresses only one aspect of the ‘right to
be forgotten’, which concerns the role of Internet Intermediaries, but has
wider implications that need to be examined.
Keywords: Freedom of speech, right to be forgotten, right to
oblivion, data protection, search engines, digital forgetting
I. Introduction
In 2012, the EU Commission presented the proposal for a
Regulation on the protection of individuals with regard to the processing of
personal data and on the free movement of such data (‘General Data Protection
Regulation’, GDPR), repealing Directive 95/46/EC, with the aim of modernizing
the legal framework for data protection in the EU. A central provision in the proposed
Regulation is Article 17 introducing the ‘right to be forgotten’ in the digital
environment, which draws its origins from the ‘right of oblivion’ – or le droit à l’oubli, recognized by
case-law in France and other countries (A. Mantelero).
The intended effect of the right to be forgotten is to
enhance users’ rights on the Internet and remedy the lack of control over their
personal data. It also represents an attempt to deal with the issue of digital
forgetting, in other words, with the privacy issues arising in a Web that never
forgets (Rosen, 2011). More particularly, in the digital age the ‘default of
forgetting’ has gradually shifted towards a ‘default of remembering’, as pointed
out by Mayer-Schönberger (Mayer-Schönberger, 2009), and this causes major
privacy risks in a world of big data (Koops, 2011). This is a world in which it
is almost impossible to escape the past, since every status update or
photograph, and every tweet may still be available online, even if it has been
deleted at its original location.
In this context, the introduction of a right to be forgotten
is a recognition of the enhanced capacity of cyberspace to disseminate and
distribute huge amounts of data, including personal data, which makes it almost
impossible to control the flow of personal information.
The provision of Article 17 GDPR essentially includes a right
to erasure, which requires the controller to delete personal data and
preclude any further dissemination of those data, and also to oblige third
parties, such as search engines, to delete any links to, or copies or
replications of, those data. This applies in four instances, which derive from
data protection principles (Costa, Poullet, 2012): a) where data are no longer
necessary in relation to the purposes for which they were collected or
otherwise processed; b) where the data subject withdraws consent on which the
processing is based or when the storage period consented to has expired and
there is no other legal ground for the processing of the data; c) where the
data subject objects to the processing of personal data; or d) where the data
has been unlawfully processed.
The right to be forgotten which is enshrined in the GDPR is
not conceived as an absolute right; thus, a number of exceptions restrict its
ambit, the most important being the freedom of expression and information.
There is consensus that such a right cannot amount to a right of erasure of
history and turn our modern society into a society of ‘lotus eaters’
(Iglezakis, 2014), which would be the case if the Internet were programmed to
forget, e.g. if Internet content were programmed to auto-expire (see Fleischer,
2011).
However, there are concerns expressed by US authors that
this right will have chilling effects on free expression, as it might force
Internet intermediaries to censor the content that they publish or to which
they link, and hence, lose their neutral status (see, e.g., Rosen, 2012,
Fleischer, 2011).
Viviane Reding, the former EU Justice Commissioner and
current Vice-President of the EU Commission, pointed out that this right builds
on already existing rules, and is not an ex novo right (Reding, 2012). Indeed,
the Court of Justice of the European Union issued a decision on 13 May 2014 in Case
C-131/12 (Google Spain SL, Google Inc. v Agencia Española de Protección de
Datos, Mario Costeja González), in which it confirmed that view, as it found
that the ‘right to be forgotten’ is rooted in the provisions of Directive
95/46/EC. Consequently, Viviane Reding referred to this decision in a post on
Facebook as a ‘clear victory for the protection of personal data of Europeans’[1].
It should be underlined that this decision comes one month
after the decision of the Court in joined Cases C-293/12 and C-594/12 (Digital Rights
Ireland and Seitlinger and Others), which declared the Data Retention Directive
to be invalid. This does not suggest that the EU Court is carrying out judicial
activism in favor of informational privacy, since the rulings in both cases are
justified. Nevertheless, the rulings send a clear message: as far as the
Google case in particular is concerned, the Court supports the reform of the
EU legal framework on data protection and the introduction of a control right,
such as the ‘right to be forgotten’.
Moreover, it is evident that the ruling of the CJEU in this case,
which recognized a right to have Google delete links to data that are
irrelevant and outdated, will have significant repercussions, particularly for
Internet companies, such as search engines. Google, shortly after the decision
was issued, received several takedown requests; more specifically, an
ex-politician seeking re-election demanded to have links to an article about
his behavior in office removed, a man convicted of possessing child abuse
images also requested that links to pages about his conviction be erased, and a
doctor asked for the removal of negative patient reviews from search
results[2].
From Google’s perspective this represents a very unwelcome development, as it
anticipates receiving a flood of takedown requests on the basis of the
CJEU decision. It has in fact announced that it will create a new process for the
erasure of data, which may take some time, as this is a complicated task[3].
Thus, it should be discussed whether this ruling is actually
a victory for data protection, or rather an obstacle for the Internet, impeding
its potential by restricting free expression and information and curtailing the
right to conduct a business.
II. The ECJ decision in the Google case
1. The dispute and the request for a preliminary ruling
The dispute in this case arose when Mario Costeja González lodged
a complaint against the publisher of a Spanish newspaper (La Vanguardia Ediciones SL) and against Google Spain
and Google Inc. on the ground that a search of his name in Google produced
announcements published in that newspaper (‘La
Vanguardia’) sixteen years earlier concerning a real-estate auction connected with attachment proceedings
for the recovery of social security debts. Mr. González maintained that the
attachment proceedings concerning him had been fully resolved a number of years
earlier and that reference to them was now entirely irrelevant.
Thus,
he requested that the relevant pages of La Vanguardia be
removed or altered so that the personal data relating to him no longer appeared,
or that the newspaper use certain tools made available by search engines in order to protect
the data. He also requested that Google Spain or Google Inc. be required to
remove or conceal the personal data relating to him so that they ceased to be
included in the search results and no longer appeared in the links to that
newspaper.
The Spanish Data Protection Agency (AEPD) rejected the complaint in so far
as it related to La Vanguardia, for it considered that the publication by it of
the information in question was legally justified; however, the complaint was
upheld in so far as it was directed against Google Spain and Google Inc. Subsequently,
Google Spain and Google Inc. brought separate actions against that decision
before the National High Court. That court issued an order for reference to the
CJEU, stating that Directive
95/46 must be interpreted in order to answer the question of what obligations
are owed by operators of search engines to protect personal data of persons
concerned who do not wish that certain information, which is published on third
parties’ websites and contains personal data relating to them that enable that
information to be linked to them, be located, indexed and made available to
internet users indefinitely.
In particular the National High Court referred nine
questions to the CJEU for a preliminary ruling, which concern: a) the
territorial application of Directive 95/46, b) the activity of search engines
as providers of content in relation to the Directive and c) the scope of the
right of erasure and the right to object in relation to the ‘right to be
forgotten’.
The Advocate General opined that the rights to erasure and
blocking of data provided for in Article 12(b) and the right to object provided for in
Article 14(a) of the Directive do not confer on the data subject a right to
address himself to a search engine service provider in order to prevent
indexing of the information relating to him personally, published legally on
third parties’ web pages, invoking his wish that such information should not be
known to internet users when he considers that it might be prejudicial to him
or he wishes it to be consigned to oblivion.
The CJEU did not adhere to this view, but essentially
granted data subjects a right to request that links to information concerning them be
removed, if they consider that information to be inadequate, irrelevant or no longer
relevant. An analysis of this decision is undertaken below.
2. The CJEU decision in more detail
The CJEU decision first addressed some preliminary issues.
It examined whether Article 2(b) of Directive 95/46 is to be interpreted
as meaning that the activity of a search engine as a provider of content which
consists in finding information published or placed on the internet by third
parties, indexing it automatically, storing it temporarily and, finally, making
it available to internet users according to a particular order of preference
must be classified as ‘processing of personal data’ within the meaning of that
provision when that information contains personal data. And further, whether
Article 2(d) of Directive 95/46 is to be interpreted as meaning that the
operator of a search engine must be regarded as the ‘controller’ in respect of
that processing of the personal data, within the meaning of that provision.
The Court made reference to previous case law in the
Lindqvist case, in which it was held that the operation of loading personal
data on an Internet page must be considered to constitute such processing. It
then stated that the data found, indexed and stored by search engines and made
available to their users include information relating to identified or
identifiable natural persons and thus, ‘personal data’. Subsequently, it found
that the operator of a search engine, which collects personal data in so far as it
explores the Internet automatically, constantly and systematically in search of
information published there, then stores this data on its servers and discloses
or makes such information available, carries out processing of personal data.
It does not make any difference that such processing concerns material already
published on the web, because exempting such cases from the scope of
application of the Directive would largely deprive the Directive of its effect.
Consequently, the CJEU held that the operator of a search
engine must be regarded as the controller in respect of that processing, pursuant to
Article 2(d), since it is the search operator which determines the purposes and
means of that activity and thus of the processing of personal data that it
itself carries out within the framework of that activity.
Next, the CJEU answered the question of the territorial
application of Directive 95/46 in the affirmative. Although Google Search is
operated and managed by Google Inc., which is established in the United States
and does not carry out any activity directly linked to the indexing or storage
of information contained on third parties’ websites, its subsidiary Google Spain
attends to the promotion and sale of advertising space in Spain and its activity
is closely linked to Google Search. The activities of the operator of the
search engine and those of its establishment situated in Spain are, according
to the Court’s decision, inextricably linked and thus, the processing of
personal data carried out for the purposes of the operation of the search
engine takes place in the context of the commercial and advertising activity of
the controller’s establishment on the territory of Spain. Subsequently, the
CJEU interpreted the provision of Article 4 (1)(a) of the Directive as meaning that
processing of personal data is carried out in the context of the activities of
an establishment of the controller on the territory of a Member State, within
the meaning of that provision, when the operator of a search engine sets up in
a Member State a branch or subsidiary which is intended to promote and sell
advertising space offered by that engine and which orientates its activity
towards the inhabitants of that Member State.
Furthermore, the CJEU addressed the issue of the role played
by an Internet Intermediary such as Google with regard to the provisions of the
Directive imposing obligations on controllers. In particular, it discussed the question
whether the operator of a search engine is obliged to remove from the list of
results following a search made on the basis of a person’s name links to web
pages, published by third parties and containing information relating to that
person, also in a case where that name or information is not erased beforehand
or simultaneously from those web pages, and even when its publication in itself
on those pages is lawful. Google Spain and Google Inc. relied on the argument
that any request for removal of information must be addressed to the publisher
of the website concerned, since it is he who is responsible for making the
information public, who can defend the publication and who has the means to
make it inaccessible.
The CJEU considered that effective and comprehensive
protection of data subjects could not be achieved if they had to obtain, first or in
parallel, the erasure of the information relating to them from the publishers of
websites, since information published on a website can be replicated very easily
on other sites and the persons responsible for its publication are not always
subject to EU legislation. It also mentioned that a publisher of a website may
rely on the exception of Article 9 of the Directive, if the publication of
information relating to an individual is carried out for journalistic purposes,
but the operator of a search engine may not rely on such an exception. And in
addition, it stressed that the activities of a search engine must be justified
under Article 7 of the Directive and a weighing of interests at issue must be
carried out under Article 7 (f) and Article 14 (a), while it should be taken
into account that the inclusion in the list of results of information relating
to a natural person, following a search, facilitates the finding of personal
information and plays a decisive role in the dissemination of such information.
Thus, this constitutes a more significant interference with the data subject’s
fundamental right to privacy than the publication on the web page.
Subsequently, the Court answered this question in the affirmative.
The main issue at stake, however, was whether the relevant
provisions of the Directive might serve as a legal basis for claims of removal
of personal data from the list of results displayed after a search is made
on the basis of the name of an individual.
The CJEU considered first the provisions of Article 12 (b)
of the Directive, which states that ‘Member States shall guarantee every data
subject the right to obtain from the controller as appropriate the rectification,
erasure or blocking of data the processing of which does not comply with the
Directive, in particular because of the incomplete or inaccurate nature of the
data’. The list of the reasons that justify such a claim is not an exhaustive
one, so the Court held that the incompatibility of the processing with the
provisions of the Directive may also result from the fact that such data are
inadequate, irrelevant or excessive in relation to the purposes of the
processing, that they are not kept up to date, or that they are kept for longer
than is necessary, unless they are required to be kept for historical,
statistical or scientific purposes. This is a particular reference to the data
quality principle as enshrined in the provisions of Article 6 (1) (c) to (e) of
the Directive.
The Court further made the argument that even initially
lawful processing of accurate data may, in the course of time, become
incompatible with the directive where those data are no longer necessary in the
light of the purposes for which they were collected or processed. It is evident
that this line of argumentation is influenced by the provisions of the Draft
Regulation establishing the right to be forgotten and shows the commitment of
the Court to the data protection reform process.
Subsequently, the Court applied this maxim to the
circumstances of the case; it stated in particular that if it is found, following a
request by the data subject in accordance with Article 12(b) of the Directive,
that the inclusion in the list of results displayed following a search made on
the basis of his name of the links to web pages published lawfully by third
parties and containing true information relating to him personally is, at this
point in time, incompatible with Article 6(1)(c) to (e) of the Directive
because that information appears, having regard to all the circumstances of the
case, to be inadequate, irrelevant or no longer relevant, or excessive in
relation to the purposes of the processing at issue carried out by the operator
of the search engine, then the information and links in the list of results
must be erased.
Further, where the data subject exercises his/her right to
object, on compelling legitimate grounds relating to his/her particular situation, to
the processing of personal data relating to him/her, according to Article 14
(a) of the Directive, the Court supported the view that where such requests are
based on alleged non-compliance with the conditions laid down in Article 7(f)
of the Directive, the processing must be authorized under Article 7 for the
entire period during which it is carried out.
The time factor appears to play a role in this case, and
thus, the Court found that in such requests it should be examined whether the
data subject has a right that the information relating to him/her personally
should, at this point in time, no longer be linked to his/her name by a list of
results displayed following a search made on the basis of his name.
The CJEU went even further; it emphasized that the right of
the data subject to request the removal of information from the search results
of search engines is based on Articles 7 and 8 of the Charter of Fundamental
Rights of the EU and these rights override not only the economic interest of
the operator of the search engine, but also the interest of the general public
in finding information concerning a data subject.
An exception to this rule is made where the data subject
is a public figure, because then the interference with his fundamental rights
is justified by the preponderant interest of the general public in having, on
account of inclusion in the list of results, access to the information in
question.
Finally, the Court made particular reference to the issue in
the main proceedings concerning the display, in the list of results that the
internet user obtains by making a search by means of Google Search on the basis
of the data subject’s name, of links to pages of the on-line archives of a
daily newspaper that contain announcements mentioning the data subject’s name
and relating to a real-estate auction connected with attachment proceedings for
the recovery of social security debts. The Court held that, taking
into account the sensitivity of this information for the data subject’s private
life and the fact that its initial publication had taken place 16 years earlier,
the data subject had established a right not to have this information linked to
his name by means of a list of search results.
III. Conclusion
The CJEU decision addresses only one aspect of the right to
be forgotten as included in Article 17 GDPR and does not amount to a comprehensive
recognition of the right to be forgotten. Moreover, it only
applies to Internet search engines, and only in so far as it concerns the right to
have links to information relating to data subjects erased from a list of results
displayed following a search made on the basis of their name.
The most important consequence of this case law is that an
Internet search service provider needs to put itself in the position of the
provider of the web page on which personal information is initially published,
and make a privacy assessment of the facts underlying the dissemination of
personal information on the Internet. The Advocate General supported the view
that this would have as a result that the service provider would need to
abandon its intermediary function between the user and the publisher and assume
responsibility for the content of the source web page, and when needed, to censor
the content by preventing or limiting access to it[4].
This argument is not convincing. An Internet search
engine provider is not responsible for the initial publication of information on
the Internet; however, it provides a service which has significant privacy
implications, as pointed out by the CJEU. Therefore, the provider of such
services needs to assume responsibility for the processing of personal data
which it undertakes. In our view, the removal of any links to websites does not
constitute censorship, if it is ordered by a court or an administrative
authority and on the basis of legitimate grounds to protect privacy.
However, the court decision did not elaborate as much as
necessary on that aspect and on the relation between the obligations of a
search engine provider as a controller and the safe harbor principles of the
e-commerce Directive (2000/31), which establish a neutral position for Internet
intermediaries.
Furthermore, the CJEU made only a brief reference to the
need to reconcile privacy rights with the right to freedom of expression and
freedom of the press. It is evident that the exceptions to the right to be
forgotten should be clearly formulated by law or, otherwise, be developed by
case law, which would evidently take considerable time.
In conclusion, it would be accurate to say that this
decision leaves open questions that should be addressed by the EU legislator in
the data protection reform process.
REFERENCES
Rosen, J., Free Speech, Privacy, and the Web that Never Forgets, 9 J. on Telecomm. and High Tech. L. 345 (2011).
Rosen, J., The Right to Be Forgotten, 64 Stan. L. Rev. Online 88 (Feb. 13, 2012), available at: http://www.stanfordlawreview.org/sites/default/files/online/topics/64-SLRO-88.pdf
Costa, L. / Poullet, Y., Privacy and the regulation of 2012, Computer Law & Security Review 2012, pp. 254-262.
Koops, B.-J., Forgetting Footprints, Shunning Shadows. A Critical Analysis of the ‘Right to Be Forgotten’ in Big Data Practice, SCRIPTed, vol. 8, issue 3, Dec. 2011.
Mayer-Schönberger, V., Delete: The Virtue of Forgetting in the Digital Age, 2009.
Mantelero, A., U.S. Concern about the European Right to Be Forgotten and Free Speech: Much Ado about Nothing?, Contratto e impresa, 2012, pp. 727-740, available at: http://porto.polito.it/2503514/
Reding, V., The EU Data Protection Reform 2012: Making Europe the Standard Setter for Modern Data Protection Rules in the Digital Age, Munich, 22 January 2012, Speech/12/26.
Iglezakis, I., The right to digital oblivion and its restrictions, Thessaloniki 2014 (in Greek).
Fleischer, P., Foggy Thinking About the Right to Oblivion, Privacy…? (blog), Mar. 9, 2011, available at: http://peterfleischer.blogspot.com/2011/03/foggy-thinking-about-right-to-oblivion.html
[2] J. Wakefield, Politician and paedophile ask Google to ‘be forgotten’, bbc.com, 15 May 2014, http://www.bbc.com/news/technology-27423527
[4] See Opinion of the Advocate General in Case C-131/12, para. 109.