Letter by Google in response to Article 29WP questionnaire regarding the implementation of the ECJ judgment on the “right to be forgotten”
Google received more than 91,000 removal requests involving more than 328,000 URLs. The majority of requests were made under French law (around 17,500), followed by Germany and the UK.
Google removed around 53% of the signaled URLs.
The information is contained, along with other interesting facts, in the letter published in response to the meeting that took place on July 24, 2014, between Google representatives and the Article 29 Working Party to discuss the challenges of implementing the European Court of Justice’s recent decision in the Costeja case, C-131/12.
Below are some of the more interesting questions and answers. The full questionnaire is available at https://docs.google…
Q: “What criteria do you use to balance your economic interest and/or the interest of the general public in having access to that information versus the right of the data subject to have search results delisted?”
A: “…We must balance the privacy rights of the individual with interests that speak in favour of the accessibility of information, including the public’s interest in access to information, as well as the webmaster’s right to distribute information. When evaluating requests, we will look at whether the search results in question include outdated or irrelevant information about the data subject, as well as whether there’s a public interest in the information.
In reviewing a particular removal request, we will consider a number of specific criteria. These include the individual (for example, whether an individual is a public figure), the publisher of the information (for example, whether the link requested to be removed points to material published by a reputable news source or government website), and the nature of the information available via the link (for example, if it is political speech, if it was published by the data subject him- or herself, or if the information pertains to the data subject’s profession or a criminal conviction)…”
Q: “If you filter out some requests based on the location, nationality, or place of residence, what kind of information must be provided by the data subject in order to prove his nationality and/or place of residence?”
A: “Our webform makes it clear that individuals need to select a relevant country. Practically, individuals will need some connection to that country, which will normally, but not always, mean that they are resident in it. Individuals need to select a country so that we know which country’s law to apply.
We are not automating decisions about these removals. We have to weigh each request individually on its merits, and that is done by people. We have many people working full time on the process, and ensuring enough resources are available for the processing of requests required a significant hiring effort”.
Q: “Do you accept general claims for delisting (e.g. delist all search results linking to a news report)?”
A: “We are seeking to give effect to the CJEU’s ruling in case C-131/12. The court called for case-by-case analysis of requests for a removal of “results displayed following a search made on a person’s name”. Therefore a result such as a news report may not appear if one searches for the name of a person mentioned in that report, while a search for other terms mentioned in that report may still display a search result linking to that report”.
Q: “When you decide to accept a delisting request, what information do you actually delist? Do you ever permanently delist hyperlinks in response to a removal request, as opposed to delisting?”
A: “In the case of the removal requests at issue here, the CJEU makes it clear that a search result for which a data subject requests a removal may still be displayed for searches for terms other than the data subject’s name. Accordingly, we remove search results in as far as they are displayed for a search with the person’s name. This is important when, as often happens, valuable and lawful content happens to be on the same page as the challenged information about the data subject. For example, in the case leading to the CJEU’s decision, the webpage at issue contains information not only about the data subject but also an unrelated article about assisted suicide. In that example, a search for “assisted suicide” should still bring up that page”.
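The name-scoped removal described in these two answers can be sketched as a toy query filter. This is purely illustrative: the `delisted` mapping, the names, and the URLs below are hypothetical assumptions, not Google’s actual data structures or process, which (as the letter stresses) involves manual case-by-case review.

```python
# Hypothetical sketch of name-scoped delisting: a URL removed at a data
# subject's request is suppressed only for queries containing that person's
# name, and remains visible for all other queries.

# Maps a lowercased name to the set of URLs delisted for that name.
delisted = {
    "mario costeja": {"https://example.com/1998-auction-notice"},
}

def filter_results(query, results):
    """Drop URLs delisted for any name that appears in the query."""
    q = query.lower()
    blocked = set()
    for name, urls in delisted.items():
        if name in q:
            blocked |= urls
    return [url for url in results if url not in blocked]
```

Under this sketch, a search for the person’s name would omit the page, while a search for an unrelated term on the same page (the “assisted suicide” example above) would still return it.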
Q: “Do you notify users through the search results’ page information that some results have been removed according to EU law? In that case, which is the legal basis for this? What is the exact policy? In particular, it appears that this notice is sometimes displayed even in the absence of removal requests by data subjects. Can you confirm or exclude that this is actually the case and, if so, could you elaborate on the applicable criteria?”
A: “Historically it has been our policy to let users know when search results have been modified based on legal requirements. User trust is important to us and we want to let them know when our results have been changed.
With regards to the CJEU decision, our current approach is to show a notification at the bottom of all search result pages for queries where a name-based removal has occurred as well as for all other search result pages that appear to be for the name of a person, indicating that results may have been removed. However, most name queries are for famous people – people search disproportionately for celebrities and other public figures. As such searches are very rarely affected by a removal, due to the role played by these persons in public life, we have made a pragmatic choice not to show this notice by default for known celebrities or public figures.
The notification is intended to alert users to the possibility that their results for this kind of query may have been affected by a removal, but not to publicly reveal which queries were actually affected. But we are still building out the serving technology for the notification so the notice may sometimes not appear where it should, and vice versa.”
Q: “What statistics can you share at this stage (percentage of requests accepted / partially accepted / refused)? How many have you answered in total? How many per day?”
A: “As of 18th July, we have received more than 91,000 removal requests involving more than 328,000 URLs. The breakdown by country (for the 6 largest countries in terms of requests) was as follows:
- Around 17,500 requests have been made under French law (as chosen by the requester in the webform), involving around 58,000 URLs.
- Around 16,500 requests have been made under German law, involving around 57,000 URLs.
- Around 12,000 requests have been made under UK law, involving around 44,000 URLs.
- Around 8,000 requests have been made under Spanish law, involving around 27,000 URLs.
- Around 7,500 requests have been made under Italian law, involving around 28,000 URLs.
- Around 5,500 requests have been made under Dutch law, involving around 21,000 URLs.
For all requests we see the following trends in processing results:
- Removal of around 53% of URLs for which a removal was requested.
- More information required from requesters for around 15% of URLs for which a removal was requested.
- Non-removal of 32% of URLs for which a removal was requested.
These figures are indicative only…”
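As a quick arithmetic check on the country breakdown above (all counts are approximate, exactly as the letter gives them):

```python
# Figures quoted in Google's letter, as of 18 July 2014 (approximate).
# Each entry: country -> (requests, URLs).
requests_by_country = {
    "France": (17_500, 58_000),
    "Germany": (16_500, 57_000),
    "UK": (12_000, 44_000),
    "Spain": (8_000, 27_000),
    "Italy": (7_500, 28_000),
    "Netherlands": (5_500, 21_000),
}

total_requests = sum(r for r, _ in requests_by_country.values())  # 67,000
total_urls = sum(u for _, u in requests_by_country.values())      # 235,000

# Overall ratio from the letter's headline totals: roughly 3.6 URLs per request.
avg_urls_per_request = 328_000 / 91_000
```

So the six largest countries account for roughly 67,000 of the 91,000 total requests (about 74%), and requesters identify on average around 3.6 URLs each.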
Q: “What particular problems have you faced when implementing the Court’s ruling? Are there particular categories of requests that pose specific problems?”
A: “We have identified some challenges in evaluating the requests in our response to question 4, above. In many cases, we lack the larger factual context for a request, without which it is difficult to balance the competing interests.
…Google has formed an external Advisory Council on the Right to be Forgotten, which we hope will advise us on principles, policies and processes Google should follow to balance an individual’s right to privacy with the public’s right to information under the existing decision. We also hope the Council will contribute to the evolving debate about the appropriate solutions for addressing knowing and forgetting in the Information Age…”