The recent decision of the European Court of Justice in Google Spain SL v Agencia Española de Protección de Datos offers a good opportunity to compare the obligations of search engines with respect to search results in Europe and in the United States.
The ruling of May 13, 2014, holds that all European citizens have a “right to be forgotten”. The ECJ decision does nothing but confirm an ongoing legislative trend. Now the issue in Europe is to find the right balance in enforcing this new, important freedom.
In the US, too, individuals may object to the way certain information shows up in search results. However, no case law can be found in which US courts have required search engines to delete links in order to protect a right to be forgotten. This approach seems consistent with fundamental US values, such as the freedom of expression and of the press guaranteed by the Constitution’s First Amendment. However, another way to safeguard the precious First Amendment right to express ideas is to regulate how companies may treat personal data and keep electronic records. The protection of the First Amendment and the lack of direct privacy legislation have caused US privacy law to develop as a patchwork.
While the US feels the urge to safeguard citizens’ right to privacy, consumers’ data are transferred around the world, and the need to protect those rights across borders may shape future legislation.
Until the Google Spain decision, search engines had never been held responsible for the data they displayed. The recent decision, however, subjects search engines to a duty to delete personal information when so requested by the data subject.
With the ruling the ECJ confirms the “right to be forgotten” pertaining to all European citizens. The right is subject to limitations and cannot be enforced if the information is of public interest.
To better understand the limits of the right to be forgotten, let us look at the content of the ruling.
The ECJ granted the right to be forgotten by reasoning that a search engine’s retrieval and listing of information for the benefit of the searcher can be classified as “processing of personal data”. Consequently, if the information retrieved constitutes personal data, Google is a controller.
On that basis, the ECJ obliged the search engine to delete, when requested, results from the list:
“[I]n order to comply [with Article 12(b) and subparagraph (a) of the first paragraph of Article 14] … the operator of a search engine is obliged to remove from the list of results displayed following a search made on the basis of a person’s name links to web pages, published by third parties and containing information relating to that person, also in a case where that name or information is not erased beforehand or simultaneously from those web pages, and even, as the case may be, when its publication in itself on those pages is lawful.”
Unless they are public figures, data subjects have the right to ask that their personal information, after a certain point in time, no longer be linked to their name. This right derives from the “fundamental rights under Articles 7 and 8 of the Charter”, which “override, as a rule, not only the economic interest of the operator of the search engine but also the interest of the general public in having access to that information upon a search relating to the data subject’s name.”
Following the decision, Google made available to users a form to ask for removal of information.
The ECJ decision confirmed an ongoing legislative trend.
In 2012 the European Commission proposed a comprehensive reform of the EU’s 1995 Data Protection Directive to strengthen online privacy rights and boost Europe’s digital economy. The proposed reform specifically provides for the data subject’s right to be forgotten and to erasure. Recital 53 of the Data Protection Proposal states that
“data subjects should have the right that their personal data are erased and no longer processed, where the data are no longer necessary in relation to the purposes for which the data are collected or otherwise processed, where data subjects have withdrawn their consent for processing or where they object to the processing of personal data concerning them or where the processing of their personal data otherwise does not comply with this Regulation…However, the further retention of the data should be allowed where it is necessary for historical, statistical and scientific research purposes, for reasons of public interest in the area of public health, for exercising the right of freedom of expression, when required by law or where there is a reason to restrict the processing of the data instead of erasing them.”
Recital 54 goes even further, extending the right to erasure so that “a controller who has made the personal data public should be obliged to inform third parties which are processing such data that a data subject requests them to erase any links to, or copies or replications of that personal data”.
Article 17 of the Data Protection Proposal further elaborates and specifies the right of erasure provided for by Article 12(b) of Directive 95/46/EC. It sets out the conditions of the right to be forgotten, including the obligation of a controller that has made the personal data public to inform third parties of the data subject’s request to erase any links to, or copies or replications of, that personal data.
The first consequence of this enforcement trend has been the massive number of requests sent to Google to delete negative information from the web (more information here).
The ECJ suggested that deletion requests should be weighed by balancing the individual’s interest in erasing inadequate, irrelevant, or no longer relevant information against the public’s interest in the information. In the real world, however, companies are left to interpret the court’s ruling and to decide whether or not the public has a right to know. To avoid disputes, search engines might end up complying with all deletion requests, with the obvious consequence that search results would stop being a valid source of information. This could negatively affect public knowledge as well as the revenues of search engines, which might no longer receive the same volume of advertising requests.
On the bright side: notwithstanding the legitimate doubts and questions, public agencies and companies alike are aware of the difficulty of striking the balance the ECJ suggested and are getting organized. Google convened a series of public meetings to discuss how an individual’s right to be forgotten should be balanced with the public’s right to information (see here). On June 4, 2014, in order to tackle the difficult task of balancing these important interests, the Article 29 Working Party decided that a special taskforce of European privacy overseers should monitor Google’s attempts to respond to citizens’ requests to be forgotten (see here).
In addition, the Article 29 Working Party is developing guidelines for appeals by people whose requests to remove information from search results have been turned down by search engines (see here). The European data protection authorities agreed on a common “tool-box” to ensure a coordinated approach to handling complaints arising from search engines’ refusals to “de-list” complainants from their results. They also decided to put in place a network of dedicated contact persons to develop common criteria for handling such complaints.
Overall, it is clear that the right to be forgotten is a reality in Europe. With the advent of modern technologies it has become a fundamental right, and governments, agencies and companies are working to find the right balance in enforcing this new, important freedom: a right pertaining to European citizens “independently of the area of the world in which their data is being processed” (see Viviane Reding, Speech, Brussels, Mar. 16, 2011).
In the US
In the US, too, individuals may object to the way certain information shows up in search results. However, no case law can yet be found in which US courts have required search engines to delete links to safeguard a right to be forgotten.
This approach is consistent with fundamental US values, such as the freedom of expression and of the press guaranteed by the Constitution’s First Amendment. Freedom of speech grants Americans the right to choose what to post on or take down from the internet. The United States has traditionally held freedom of expression, over privacy, as a fundamental value. It can be argued that requiring Google to delete inadequate, irrelevant or no longer relevant data may clash with constitutional speech rights.
Section 230 of the Communications Decency Act shields Internet service providers and search engines from lawsuits seeking to hold them liable for not screening content produced by other parties that is defamatory, a breach of contract, or a violation of state privacy law.
Accordingly, courts have generally granted immunity to Internet service providers publishing third-party content where the claim would treat the provider as the “publisher or speaker” of the objectionable material. However, Section 230’s immunity applies only where the content at issue is provided by “another information content provider” (Shiamili v Real Estate Group).
Research shows that Section 230 has become one of the most important – and one of the most intensely criticized – statutes affecting speech rights on the Internet. Modern technologies allow for vast amounts of data gathering. Data subjects do not have control over the sale and dissemination of their personal data, and data controllers may easily misuse personal information. The need to safeguard privacy – which is also an important right – and the lack of direct federal legislation have caused US privacy law to develop through a number of state and federal statutes and common-law doctrines.
As a consequence, settled privacy principles (the fair information practice principles, OECD, APEC) and industry best practices (the CTIA Best Practices) have helped secure individuals’ right to privacy.
One way to advance an individual right to erasure is an obligation to minimize the amount of data held, and administrative bodies have already been called on to minimize data. Consider, for example, the data minimization requirements contained in the Guide to Implementing Privacy issued by the DHS, or the FCRA model, which provides for the prompt deletion of inaccurate or unverifiable information.
All in all, there is a growing trend towards privacy protection.
In 2012 the FTC issued a report urging individual companies to accelerate the pace of self-regulatory measures implementing the Commission’s final privacy framework. The FTC highlighted that the proposed privacy framework contains aspects of the right to be forgotten, calling on companies to (1) delete consumer data that they no longer need and (2) allow consumers to access their data and, in appropriate cases, suppress or delete it. According to the agency,
“reasonable collection limits and data disposal policies work in tandem with streamlined notices and improved consumer choice mechanisms. Together, they function to provide substantive protections by placing reasonable limits on the collection, use, and retention of consumer data to more closely align with consumer expectations, while also raising consumer awareness about the nature and extent of data collection, use, and third-party sharing, and the choices available to them”. To minimize the collection of individuals’ data “companies should implement reasonable restrictions on the retention of data and should dispose of it once the data has outlived the legitimate purpose for which it was collected.”
In order to protect individuals’ privacy rights, the FTC urged policy-making efforts on five main action items: (i) do not track (see below); (ii) mobile (companies providing mobile services should work toward improved privacy protections, including the development of short, meaningful disclosures); (iii) data brokers (addressing the invisibility of, and consumers’ lack of control over, data brokers’ collection and use of consumer information); (iv) large platform providers (such as Internet Service Providers, operating systems, browsers, and social media tracking consumers’ online activities); and (v) promoting enforceable self-regulatory codes.
Of particular interest for the right to be forgotten is the proposal of a “do not track” option. The World Wide Web Consortium has already made progress in creating an international standard for Do Not Track (see W3C page).
Many commentators have advocated mandating the ability to erase information posted by mistake or by minors (more info here and here). In the United States, legislation has been introduced that would give teens an “eraser button” allowing them to erase certain material on social networking sites (Do Not Track Kids Act of 2011).
The FTC concluded the report by stating that “although some companies have excellent privacy and data security practices, industry as a whole must do better.” And companies do seem to be devoting more attention to individuals’ privacy rights: Apple recently made the stronger privacy safeguards of its newest devices a selling point (see its privacy statement).
Google’s removal policy provides that Google will remove personal information from Google Search results when it puts the user at greater risk of identity theft, financial fraud, or other harm (e.g., a link to a page that shows somebody’s credit card, bank account or Social Security number).
Facebook took a different approach due to the nature of its service. When users choose to make information public, anyone, including people off Facebook, can see it. Users have the option to deactivate or delete their account. However, even deleting the account will not erase information about actions the user took on Facebook that is not stored in the user’s own account, such as posting to a group or sending someone a message; friends may still have messages sent from a deleted account (see Facebook policy). Even though Facebook does not guarantee its users a right to be forgotten, in 2012 it agreed to implement measures designed to prevent any third party from accessing information under Facebook’s control within a reasonable time period, not to exceed thirty days, from the time the user has deleted such information (In the matter of Facebook).
In conclusion, does the fact that a “right to erasure” is not law in the US mean that the ECJ’s Google Spain decision has no impact on search engines in the US?
Even though no law obliges service providers to erase data, and the EU approach could create a nightmare for businesses asked to exercise censorship, the US also shows a growing trend toward higher privacy standards, and companies have recognized that it is in their interest to handle erasure requests more transparently. Regulation (or self-regulation) of data processing and data storage is another way to protect the precious First Amendment right to express ideas.
At a global level, in order to enable safe data exchanges between the U.S. and Europe, the U.S. Department of Commerce, in consultation with the European Commission, drafted and approved the U.S.-EU Safe Harbor Framework in 2000. The Framework allows U.S. companies to voluntarily comply with the European standard for adequate privacy protection, facilitating international business transactions.
Currently, there is no limit on how long a U.S. organization can retain consumers’ data; but if the right to erasure is enforced through adoption of the proposed Data Protection Regulation, the right to be forgotten would have to be added to the Safe Harbor requirements.
Consumers’ data are transferred around the world, and there is value in greater interoperability among data privacy regimes. “Efforts are underway around the world to re-examine current approaches to protecting consumer privacy cross-border,” says the FTC, which also highlights how the Commission’s privacy framework is consistent with the nine APEC privacy principles.
The White House has also noted that “Meaningful protection for such data requires convergence on core principles, an ability of legal regimes to work together, and enhanced cross-border enforcement cooperation. Such interoperability is better for consumers, whose data will be subject to more consistent protection wherever it travels, and more efficient for businesses by reducing the burdens of compliance with differing, and sometimes conflicting, rules.” In short, as the Administration’s White Paper highlights, “global interoperability will provide more consistent protections for consumers and lower compliance burdens for companies.”