Is focusing on online child pornography a case of whistling in the wind or fiddling while Rome burns?

There has been much interest and activity surrounding the issue of online child pornography in the last few months. As we saw in the previous blog, on July 22 Prime Minister David Cameron made a speech to the National Society for the Prevention of Cruelty to Children in which he announced that there had already been a police clampdown on the uploading and hosting of child pornography images in the UK. He went on to issue a challenge to Internet Service Providers and search engine companies, demanding that they do more to deter people who search for child abuse images. He asked that these companies cooperate with the police and the Child Exploitation and Online Protection Centre (CEOP) to create a “blacklist” of internet search terms associated with child abuse, so that anyone using these terms in their searches would be presented with a warning and no results. He also issued an ultimatum: if cooperation was inadequate, the government was prepared to force action through legislation.

His demands and ultimatum seem to have borne fruit, for on November 18 Google and Microsoft, which together account for about 95% of search traffic in Britain, announced that they had created a list of over 100,000 terms used by paedophiles, which would now return a warning but no results on their search sites.

This is where I must raise the issues of “whistling in the wind” or “fiddling while Rome burns”.

A society, indeed a world, without child abuse is the wish and goal of every reasonable and decent human being, and it is clear that child abuse, especially sexual abuse, is a pandemic. It reaches into every corner, every nook and cranny, of every nation. The spread of online child abuse images is a deeply disturbing symptom of this pandemic. It appears, however, to be a symptom which will be very difficult to eradicate. There is a substantial body of opinion among those best placed to know that it will be almost impossible to eliminate online child pornography, especially by the measures announced recently.

As we saw in a previous blog, Jim Gamble, the former chief executive of the Child Exploitation and Online Protection Centre (CEOP), said: "I don't think this will make any difference with regard to protecting children from paedophiles. They don't go on to Google to search for images. They go on to the dark corners of the Internet on peer-to-peer websites."

This concern was echoed by Dr Joss Wright, a research fellow at the Oxford Internet Institute at the University of Oxford, who questioned just how many paedophiles actually use Google or Bing to look for child abuse images. If they are doing so now, he said, they can easily switch to a less regulated search engine. He also pointed out that it is important to remember that Google provides only the search engine – it does not control the Internet. “The only thing that they are going to be able to block is things coming up in their search; they can’t take it down from the Internet”, he said. So the images remain until the hosts are found and forced to remove them.

Google and Microsoft are now able to block searches for child pornography because new algorithms, sets of instructions for software, have been developed which block illegal pornography and pathways to illegal content. Alongside the 100,000 terms already blocked, the system is designed to pick up on new code words or terms that paedophiles start to use and block them too. Dr Wright, who recently wrote a report on cybercrime for the UN, says, however, that this is a process which has proved almost impossible to make airtight in other situations. “If you block a certain word people will find a way around it. If you look at China, every time they block a word a new one pops up”.

This creates an endless spiral, where the number of terms to be blocked keeps changing and increasing, so that in the end, to be effective, blocks have to be placed on innocuous things. “If paedophiles start referring to abuse images as cake you can’t block cake from Internet searches”, he said.
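To make the mechanics concrete, here is a deliberately simple sketch of how a search-term blocklist might behave. It is purely illustrative: the terms, the matching rule and the search path below are placeholders of my own, and the real systems Google and Microsoft have built, which also learn new code words, are far more sophisticated and are not public.

```python
# A toy sketch of a search-term blocklist. All terms and behaviour here are
# placeholders for illustration; this is not how the real systems work.

BLOCKED_TERMS = {"example banned phrase", "another banned phrase"}

def add_new_term(term: str) -> None:
    """New code words have to be added as they are discovered -
    the 'endless spiral' Dr Wright describes."""
    BLOCKED_TERMS.add(term.lower().strip())

def check_query(query: str) -> str:
    """Return a warning instead of results when a query matches the blocklist."""
    normalised = query.lower().strip()
    if any(term in normalised for term in BLOCKED_TERMS):
        return "Warning: this search is associated with child abuse material. No results are shown."
    return f"(ordinary search results for: {normalised})"

print(check_query("example banned phrase"))   # triggers the warning
print(check_query("chocolate cake recipe"))   # passes through - which is exactly why
                                              # an everyday word like 'cake' cannot simply be blocked
```

The weakness Dr Wright points to is visible even in this toy version: the list only ever matches what has already been put on it, so the defence is always one code word behind.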

Google has developed the technology to tag illegal images and videos so that all duplicates can be removed as well as the original. The trouble is, according to Dr Wright, that an image only has to be changed slightly, by resizing it for instance, and it becomes an entirely new file. The technology is capable of detecting this new file, but there is then a risk of over-blocking, so that people find their harmless holiday snaps being blocked. Very recently a friend of mine found her Facebook account suspended and an image removed. The offending image was a photograph of her newborn having his first bath, posted for the benefit of his grandmother, who was travelling abroad. The reverse risk is that, in order to avoid blocking innocuous family photographs, under-blocking might occur, creating a false sense of security about what is being controlled.
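The underlying idea is usually some form of perceptual fingerprinting, of which Microsoft’s PhotoDNA is the best-known example. The sketch below uses a very basic “average hash” to show why a resized copy of a photograph can still be matched to a known original, and where the over-blocking versus under-blocking trade-off comes from. It is a minimal illustration under my own assumptions: the file names and the threshold are hypothetical, and the fingerprinting the companies actually deploy is considerably more robust than this.

```python
# A minimal sketch of perceptual ("average") hashing, for illustration only.
# It shows why a byte-for-byte comparison fails on a resized copy while a
# perceptual fingerprint still matches. Requires the Pillow library; the
# file names and threshold below are hypothetical.

from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image to size x size greyscale and record which pixels are
    brighter than the mean: a 64-bit fingerprint of the picture's structure."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count the bits on which two fingerprints differ."""
    return bin(a ^ b).count("1")

# Two copies of the same photograph, one of them resized. As files they are
# completely different, but their fingerprints stay close.
h_known   = average_hash("known_image.jpg")
h_resized = average_hash("resized_copy.jpg")

THRESHOLD = 10  # the tuning knob: raise it and more altered copies are caught
                # (over-blocking risk); lower it and copies slip through
                # (under-blocking risk)

if hamming_distance(h_known, h_resized) <= THRESHOLD:
    print("Likely a copy of the known image")
```

That single threshold is where Dr Wright’s two risks live: set it generously and harmless holiday snaps get swept up; set it cautiously and altered copies slip past.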

Dr Wright did qualify his reservations by saying that when it came to images of child abuse “people agree it is best to be heavy handed”, adding that as long as the companies strike a reasonable balance the approach is unlikely to cause too much collateral damage while succeeding in cutting the availability of illegal and dreadful images, each one of which records the terrible suffering of a child.

I do not wish to argue that because the enterprise is difficult it ought to be placed in the “too hard basket” and not attempted. Anything that will reduce the amount of child pornography on the internet, and concomitantly reduce ease of access to what remains, is worth trying. Google and Microsoft are to be commended for what they have done already and for what they are continuing to do. Both are companies with almost bottomless pockets, and it might be useful to encourage them to share the technology they have developed to block, tag and track illegal images with smaller companies that might not be able to afford the large teams required for the project. (In the last three months Google has employed an extra 200 people just to work on this issue.)

My concern is that we, and especially our governments, should not now imagine that we have found a panacea for the pandemic of child abuse which besets us.

This brings me to the “fiddling while Rome burns” aspect of attempting to block access to child pornography. I do not deny that the spread of online child abuse images is a major problem but it raises two serious issues in my mind.

First, the very effort to block these images implies that the object of doing so is to prevent them from being seen. This is as it should be. The posting and exchange of child pornography is big business which operates with two types of currency. On the “dark” or “hidden” net the currency is more often than not status and kudos. On the open internet the currency is money – often big money. If the attempts to block, tag, track and remove can seriously dent the demand by making the risks too high, then it is to be hoped that this will also reduce the incentive to create a supply. The more that can be done to deprive this vile trade of oxygen, the sooner we might suffocate it.

This, however, must be only one line of attack, which brings me to the second issue. It is important to remember that every image of child abuse on the internet, and there seem to be millions of them, represents an instance when a child has been abused. This is so whether the image is blocked, removed or completely destroyed. To be sure, removing such images protects the victims from ongoing harm, but blocking or removing images is not enough to help these victims, some of whom are still children and some of whom are now extremely damaged adults.

In addition, it is my belief, based on years of clinical practice and research, that for every child who is abused in order to obtain images for the internet many, many more are being abused, silently, invisibly, in private homes and other places across the world. None of the many abuse victims I have been in contact with over the years have been filmed or photographed. Mostly they have been abused by fathers, uncles, grandfathers, brothers or trusted family friends who have sneaked into their bedrooms and abused them in the dark.

It seems that very little is being done to help these victims of abuse. The blocking of internet images won’t help them very much, although it is possible that the less paedophiles are stimulated, the less they might abuse. Nevertheless, sexual abuse of children was widespread long before the advent of the internet, and children abused in pre-digital times are still suffering.

While I do not want to discourage or discount any efforts being made to counteract any aspect of child abuse, I am concerned that an excessive focus of attention, time and money on internet images will create the illusion that we “are doing all we can” when that is clearly not the case. We can work to block and then shut down child pornography online, and we can and should also be working to render assistance to the survivors of these horrible crimes.

Of course resources are limited, or at least we are told they are limited. I would question whether this is so. For instance, Britons pour 46 billion pounds into gambling machines every year. A modest levy on these amounts, say of 1%, would yield 460 million pounds a year in the UK, which could be used to establish a fund to help all victims of child abuse. A data bank of faces could be created, and much more work could be put into tracing children whose images have been spread around the internet. Assistance could be given to them to pursue legal remedies for their suffering. In the USA, for example, several victims who are now adults have successfully sued men who downloaded their images.
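The arithmetic behind that figure is simple enough to show in a few lines, using the numbers quoted above:

```python
# Illustrative arithmetic only, using the figures quoted in the text.
annual_gambling_spend = 46_000_000_000   # £46 billion poured into gambling machines per year
levy_rate = 0.01                         # a 1% levy
fund = annual_gambling_spend * levy_rate
print(f"£{fund:,.0f} per year")          # £460,000,000
```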

The abuse of children, especially the sexual abuse of children, is one of the greatest medical and moral challenges of our age. It is time governments did more than merely set up inquiries and commissions and tell other agencies like private companies that they must act. It is time that we demanded a comprehensive and very active legislative and administrative approach aimed at helping victims.

 

NEXT: The administrative fallout and failures following my experimental Google search for online child pornography.

 

FOLLOWING: Restrictions on child pornography and its implications for a civil society.