In July of this year Prime Minister David Cameron called on Google and Microsoft's Bing - which together account for 95% of internet search traffic - to do more to prevent people from gaining online access to illegal images. He said they needed to ensure that searches unambiguously aimed at finding illegal images returned no results.
The issue of online images showing the sexual abuse of children has made headlines in recent months after the convictions of Stuart Hazell and Mark Bridger for the murders of Tia Sharp and April Jones. Both Hazell and Bridger were known to have sought out and viewed child abuse images online.
About a month ago Google and Microsoft announced that they had agreed on measures to make it harder to find child abuse images online. Apparently, as many as 100,000 search terms associated with child pornography will now return no results for illegal material and will trigger warnings that child abuse images are illegal.
Google communications director Peter Barron said the changes would make it "much, much more difficult to find this content online".
Microsoft says its Bing search engine will also produce clean results. Microsoft's general manager of marketing and operations Nicola Hodson said: "Day-to-day we're fierce competitors, and we collaborate on this issue because it transcends that."
"It will be much harder to find that content on both Bing and Google. We are blocking content, removing content and helping people to find the right content or also sources of help should they need that," she said.
David Cameron welcomed the move and the National Crime Agency’s director general said that initial tests showed that the changes introduced by the search engines were working.
David Cameron’s adviser on the sexualisation and commercialisation of childhood, Tory MP Claire Perry, said that the new measures were a “great step forward”. "We're not declaring victory but this is a massive step in the right direction," she said.
That all sounds like very good news, although I was astounded to read of 100,000 potential search terms being “cleaned up”. In addition, I have grown cynical and suspicious about “good news”, especially when it involves multi-billion dollar companies and politicians of any stripe, so I decided to run a little experiment.
Not being familiar with any of the terms paedophiles use I assumed that any term I might guess at would certainly be one of the 100,000 that have been “cleaned up”. Using Google as my search engine I began with the most basic “child pornography” and sure enough, the first item in my search list was a warning from google.com.au which read “Warning: Child abuse imagery is illegal, if you see it, report it”. This was followed by pages and pages of listings covering mostly news items and scholarly articles.
This seemed like a positive result, at least if you think blocking searches for child pornography is a good idea in the first place. I decided to expand my research a little, and using a very common slang term for female genitalia combined with the word “little”, I did another search. The result was appalling: pages and pages of images of young girls involved in very explicit sexual acts. I had seen enough. Obviously the reach of Google’s new restrictions has not extended to Australia.
I discovered after conducting this experiment that the new restrictions are being launched in Britain before being expanded to other English-speaking countries and then 158 different languages in the next six months. I have written to Google asking why it is necessary to phase in these restrictions in this way, especially in regard to English language searches, but have not received an answer at the time of writing.
In addition to this obvious flaw in what has been trumpeted as a major step forward, there are also those who say that the measures introduced by Google and Microsoft, or at least the fanfare surrounding their introduction, are little more than “feel good” propaganda designed to make the public feel that something is being done.
Jim Gamble, former head of the Child Exploitation and Online Protection Centre (Ceop), said that the search engines had been blocking inappropriate content for some time and that the latest move was nothing new, merely an expansion of what was already happening. He also said that he did not think the measures, however expanded, would make much difference with regard to protecting children from paedophiles.
“They don’t go on to Google to search for images. They go on to the dark corners of the internet on peer-to-peer websites.”
The fact that distributors of child abuse images evade detection by using encrypted networks and other secure methods was highlighted in a report by Ceop released in June.
The apparent flaws in this alleged major breakthrough in the fight against online child pornography, and the claim that the restrictions (when they work) will be largely ineffectual in stopping the traffic in child abuse imagery, are disturbing enough in themselves. But they leave unexamined a number of other very important questions, which will be addressed in forthcoming pieces.
First, is attacking online child pornography an effective way of addressing the whole issue of child sexual abuse? Then there is the broader question of whether these kinds of restrictions are compatible with a free civil society. Who decides which terms should be restricted? In this case that may be an easy one – most people who are not paedophiles probably feel that restrictions surrounding the child exploitation industry are reasonable, and the search companies can be left to do the job – but what is next? Perhaps gay pornography, perhaps pornography generally, perhaps inter-racial pornography, or perhaps certain types of social or political activity? Of course, “slippery slope” arguments are possible whenever human activity is regulated. But unless the rest of us have some way to know what the search companies are regulating, they could become vehicles for backdoor censorship.
NEXT: Is focusing on online child pornography a case of whistling in the wind or fiddling while Rome burns?
FOLLOWING: Restrictions on child pornography and its implications for a civil society.