In a previous post, I’ve mentioned the annoying algorithmic habits of Facebook (suggesting that you reconnect with dead friends or putting up ads asking, “Do you want to get pregnant?”…um no, not right now!)
Google also uses algorithms which suggest searches related to the one you have just undertaken. A French man has successfully sued Google because he alleged that when his name was entered into the search engine, the terms “rapist” and “satanist” were suggested by Google as associated search terms. The man had in fact been convicted of corruption of a minor, receiving a three-year suspended jail sentence. He said that he had tried contacting Google to get the suggested terms removed, with no success. Seth Weintraub at CNN Fortune reports:
‘Suggest’ is a service that gives additional terms for further searching after a query is done. Those words are based on terms that are grouped on the web and Google’s PageRank algorithm.
Those results were likely a manifestation of news reports of the man’s crimes and related searches based on those terms. Google states that the results aren’t its responsibility; they are simply a manifestation of its computers reporting what’s out there on the web.
The French court concluded that the search engine’s linking of his name to such words was defamatory. Google and its CEO Eric Schmidt were ordered to pay a symbolic €1 in damages plus €5,000 towards the man’s court costs. Google was also ordered to take down the results of its algorithm, with a daily fine accruing until such action had been taken.
Interestingly, the court felt that Google France wasn’t liable, but that Eric Schmidt, in his editorial capacity at Google’s headquarters in the US, was in fact responsible.
Because Google isn’t simply pointing to search results but its algorithms are making “editorial decisions”, it may find itself in trouble in other jurisdictions around the globe, at least until the courts understand what is happening behind the scenes.
The judgment is here (en français).
The Google algorithm infers that there is a link between X’s name and the terms “rapist” and “satanist”. Would Google be protected by a defence of “truth” under our defamation laws? It probably depends very much on what the newspaper reports said. If the newspaper reports said, “X was accused of being a satanist and a rapist, but was found not guilty,” then the inference created by the search suggestion would be untrue. I suspect that this is probably what happened.
It is interesting, though, when a “ghost in the machine” creates the defamatory communication. Perhaps the position of Google is analogous to the situation where a defamatory comment is made by a commenter on a blog post. The blogger may be liable for defamation if she leaves the comment up after becoming aware of it, even if she did not make the comment herself and does not agree with it. Similarly, here the problem is not the algorithm itself, but Google’s failure to deal with the man’s complaint in a timely fashion.
I suspect organisations like Facebook and Google need better mechanisms for dealing with complaints like this. Nor do the issues arise only in defamation law. Facebook is working with Australian police to develop better protocols for dealing with illegal activities after police smashed a pedophilia ring operating on the site. Facebook repeatedly shut down the groups which exchanged indecent images, but did not report them to police, even though one of the users alerted Facebook. In another incident, a mother was worried that her 12-year-old daughter was being stalked by a pedophile on Facebook, and alleged that she was having difficulty getting the social networking site to respond. Facebook does not have an Australian office. (Grotesquely, the stalker turned out to be another 12-year-old girl masquerading as an older male pedophile.)
I suppose that we’re all still learning the pitfalls and possibilities of online communication. Companies, lawmakers and courts are all having to develop better mechanisms to deal with the legal ramifications of online communication.
(Hat tip: Heath G)