Google’s “Right to be forgotten” implementation, some observations
On June 26th, 2014, Google started its implementation of the “Right to be forgotten” (RTBF) ruling in its search results. Notices saying “Some results may have been removed under data protection law in Europe” started appearing at the bottom of the SERPs for a large number of queries on Google.co.uk. Similar messages are showing in different languages across the European versions of Google. The wording of the message, and the SERPs in which it appears, might provide an interesting insight into the way Google will handle the many RTBF requests.
The message explicitly mentions that results MAY have been removed, not that any actual results have in fact been removed or that a request has been made for any of the results shown in the SERP. An insider mentioned to the WSJ that the message is “added algorithmically to searches that appear to be for a name”.
The message, however, is showing for only a subset of the searches that are for a name. It is safe to assume that Google can quite accurately predict whether or not a search is for a person’s name, so it is interesting that they have chosen not to display the message for the majority of names. A quick sample of 30 searches for friends’ names showed about 1 in 3 SERPs carrying the warning; I’m confident at most one of those friends might have filed a request.
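A sampling exercise like the one above is easy to automate once you have the SERP HTML saved locally. A minimal sketch, assuming the English notice wording quoted earlier (other Google locales use translated strings, so covering those would require adding the translated notice texts):

```python
# Sketch: tally how many saved SERP pages carry the RTBF notice.
# The notice string below is the English wording shown on Google.co.uk;
# this is an assumption that only covers the English-language notice.

NOTICE = "Some results may have been removed under data protection law in Europe"

def has_rtbf_notice(serp_html: str) -> bool:
    """Return True if a saved SERP HTML string contains the RTBF notice."""
    return NOTICE in serp_html

def notice_rate(serp_pages) -> float:
    """Fraction of SERP pages (HTML strings) that show the notice."""
    pages = list(serp_pages)
    if not pages:
        return 0.0
    return sum(has_rtbf_notice(p) for p in pages) / len(pages)

# Example with stand-in pages; real use would load saved SERP files.
sample = [
    "<html>... Some results may have been removed under data protection law in Europe ...</html>",
    "<html>... ordinary results, no notice ...</html>",
    "<html>... another plain SERP ...</html>",
]
print(round(notice_rate(sample), 2))  # 0.33, i.e. one in three pages
```

Fetching live SERPs to feed into this check is left out here, as automated querying raises its own practical and terms-of-service issues.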
Google is thus definitely showing the RTBF message for more than just queries for the names of people who have filed a request, but not for everyone. So where do they draw the line? What triggers these messages when the people in question have not filed a RTBF request?
Politicians are excluded
This might be different in other countries, but in The Netherlands none of the SERPs for members of the cabinet shows a warning. Given the general prevalence of the messages, these SERPs seem to be exempted. There can be good reasons to exempt them (people might otherwise start assuming hidden problems that are not really there), but it is an interesting observation nonetheless. The exemption, however, may have more to do with Wikipedia than with holding public office.
The Wikipedia Connection
Another sample of about 80 well-known and lesser-known Dutch people showed that those with a Wikipedia page about them have a 0% chance of a warning showing in the SERPs for their name. The group of people who have a fair amount of content written about them but who are not encyclopaedic material, however, is very likely to have the message showing. These findings heavily contradict the statement in the WSJ that the message is a “blanket” message shown algorithmically for searches for people’s names. Wikipedia is a very trusted source and uses markup specifically designed to tell Google that a page is about a person.
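As an illustration of the kind of person markup meant here, schema.org’s Person type (expressed as microdata or JSON-LD) is one standard way to tell search engines a page is about a person. Whether Wikipedia uses exactly this vocabulary is an assumption made for illustration, and the detection below is a deliberately rough sketch:

```python
# Rough sketch: detect whether a page declares schema.org's Person type,
# either as a microdata itemtype attribute or inside a JSON-LD block.
# The example page and its contents are hypothetical.
import json
import re

def declares_person(html: str) -> bool:
    """True if the page declares schema.org Person via microdata or JSON-LD."""
    if 'itemtype="http://schema.org/Person"' in html:
        return True
    for block in re.findall(
        r'<script type="application/ld\+json">(.*?)</script>', html, re.S
    ):
        try:
            data = json.loads(block)
        except ValueError:
            continue  # skip malformed JSON-LD blocks
        if isinstance(data, dict) and data.get("@type") == "Person":
            return True
    return False

page = (
    '<script type="application/ld+json">'
    '{"@type": "Person", "name": "Max Mosley"}'
    "</script>"
)
print(declares_person(page))  # True
```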
The only exception I could quickly find was Max Mosley; it is, however, very likely that a request has been made for results mentioning his name.
More than just people’s names
It was expected that Google would only censor search results for searches that are an exact match for a person’s name. If the searches for which the RTBF message is showing are anything to go by, this assumption can be binned immediately. Warnings are showing both for queries that incorporate only part of a person’s name combined with other words that identify the individual (e.g. “Johnson CEO affairs”) and for queries encompassing a person’s full name plus one or more extra keywords. If search results were in fact removed for these queries as well, this would be both a very drastic implementation of the court ruling and a dangerous development leading to an excessive degree of censorship. Whilst many agree that hiding some results for searches for a person’s name would be an acceptable measure to protect people’s personal information, this is quite different for searches that combine the person’s name with a smoking gun such as “convicted” or “fired”. This would effectively prevent the public, journalists, investors and employers from finding out more about cases that they have already heard of.
Though Google will first have to accept a request from an individual before it censors search results, it is not unimaginable that it would censor results for the CEO of a non-profit organisation (let’s call him Johnny Johnson) who has acted against the philosophy of the organisation by having affairs with multiple women whilst married. If this person lands a new position at another organisation or company, donors and investors who vaguely remember the case should have the opportunity to learn more about it and should not be frustrated by censored SERPs for “Johnny Johnson fired” or “Johnny Johnson affair”. Many, and maybe Google too, would object to censorship for “Johnny Johnson” as well, given his position in the public eye.
Right to be forgotten or Right to a spotless reputation
Censoring results for just people’s names (i.e. “Johnny Johnson”) would help the public forget about certain negatives and prevent unknowing individuals from reading about them. Censoring search results for a person’s name plus qualifiers, such as “Johnny Johnson affairs”, is just plain censorship and frustrating for those who have not forgotten. Individuals might deserve a second chance, with Google chipping in by not showing certain facts for searches for the individual’s name. But nobody deserves a spotless reputation manufactured through Google by censoring negative results for all searches encompassing an individual’s name, even searches that explicitly show the user’s prior knowledge of the censored facts.
What do you think? Have you seen anything different, and if not, what do you think Google will censor?