Google is creating an image database to better control the traffic of offensive images and remove child pornography from the web.

In the past month the internet giant has been heaped with criticism from British MPs for what they view as the company's reluctance to crack down on child internet pornography.

But now Google has announced plans to create an online database of offensive images which will enable other companies, law enforcement agencies and charities to more easily detect and remove offensive content.

Professor Catherine Lumby from Macquarie University's media department says the move is a welcome sign of leadership.

"I think what's particularly important is that Google is demonstrating leadership when it comes to working with civil and government organisations on what is a really appalling abuse of human rights," Professor Lumby said.

Google says the new database will use 'hashing' technology, which enables it to identify duplicate images elsewhere online.

Offending pictures will be given a unique ID that computers can recognise without humans having to view them.

Google has been using this technique since 2008, but says it will now incorporate encrypted "fingerprints" of child sexual abuse images into a cross-industry database.
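The idea behind hash-based matching can be sketched in a few lines. The snippet below is a simplified illustration, not Google's actual system: it uses an ordinary SHA-256 cryptographic hash, whereas an industry database would typically rely on more robust fingerprinting that also catches resized or re-encoded copies. All names and byte strings here are hypothetical.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a fixed-length hex digest identifying the exact content."""
    return hashlib.sha256(data).hexdigest()

# Illustrative database of known fingerprints (placeholder bytes, not real data).
known_hashes = {fingerprint(b"example-offending-image-bytes")}

def is_known_duplicate(data: bytes) -> bool:
    """Check an image's bytes against the database.

    A computer can flag a match by comparing digests alone,
    so no human ever has to view the image itself.
    """
    return fingerprint(data) in known_hashes

# An exact copy matches; any modified copy yields a different digest,
# which is why real systems favour perceptual ("fuzzy") hashing.
print(is_known_duplicate(b"example-offending-image-bytes"))
print(is_known_duplicate(b"slightly-altered-image-bytes"))
```

Because a cryptographic hash changes completely with any edit to the file, exact-match schemes like this only catch identical copies; the "fingerprints" the article describes are shared across companies precisely so each party can run the same comparison locally.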

'Hashing' an efficient alternative to government internet filters

The chairman of Electronic Frontiers Australia, David Cake, says Google's latest initiative is a welcome development.

"The real change is Google is making available a technological solution," Mr Cake said.

"They are working towards being part of that actively, rather than being a passive consumer of someone else's list."

Professor Lumby says the initiative is an efficient alternative to governmental internet filters.

"The difference here is this hashing technology has been developed and has been shown to be effective, and the genre of the material that's being focused on is narrow and specific," she said.

"I think that's very different to a broad internet filter which trawls across a whole range of material."

Some of the $5 million that Google is committing to identifying illegal content will be provided to internet monitoring charities such as the UK-based Internet Watch Foundation (IWF), as well as other unspecified groups in Australia.

However, Mr Cake says the question of who decides what is offensive content could be problematic.

"We have had some concerns with organisations like the Internet Watch Foundation that Google is now committed to supporting, so we'll certainly be looking at this in some detail," he said.

"Famously, the IWF once ended up blocking most of Wikipedia due to one 1970s album cover, and that sort of issue may still remain."

But, providing that the technology is used appropriately and carefully, Mr Cake says the move has promise.
