Artificial intelligence programmed to detect child abuse images keeps mistaking deserts for nudity

AI may become an infallible tool in the future, but for now algorithms tend to make the most unexpected errors. The latest one happened in the United Kingdom, where a new AI programmed to detect child pornography has decided that deserts are thoroughly indecent images.


London's Metropolitan Police wants to use the AI to examine seized computers for images containing child pornography. The idea is a good one: analyzing thousands of such images is a psychologically grueling task for police technicians and consumes precious time. Unfortunately, in its current state the system is not yet ready to operate.

“Sometimes the AI finds a picture of a desert and believes it is pornography,” explains the head of the Police's digital forensics department. “For some reason, many people have pictures of deserts as their wallpaper, and the AI confuses the color of the sand with that of human skin.”
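
To illustrate the failure mode described in the quote, here is a minimal, purely illustrative sketch of a naive color-based “skin” heuristic. The thresholds and the skin_like_fraction helper are assumptions made up for this example, not part of any real forensic tool, yet a sand-colored image still scores as skin-like.

```python
# Minimal sketch of why a naive, color-based classifier can mistake sand for
# skin. The thresholds below are illustrative assumptions, not the values
# used by any real forensic system.
import numpy as np

def skin_like_fraction(image_rgb: np.ndarray) -> float:
    """Return the fraction of pixels whose color falls in a rough
    'skin tone' range. image_rgb: H x W x 3 uint8 array."""
    rgb = image_rgb.astype(np.float32) / 255.0
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    maxc = rgb.max(axis=-1)
    delta = maxc - rgb.min(axis=-1)
    # Rough heuristic: warm, moderately saturated, fairly bright pixels.
    warm = (r > g) & (g > b)
    saturation = np.where(maxc > 0, delta / np.maximum(maxc, 1e-6), 0)
    skin_mask = warm & (saturation > 0.15) & (saturation < 0.6) & (maxc > 0.35)
    return float(skin_mask.mean())

# A desert photo is dominated by warm, sand-colored pixels, so this heuristic
# rates it much like a close-up of skin -- the confusion described above.
sand = np.full((100, 100, 3), (210, 180, 140), dtype=np.uint8)  # sandy tone
print(f"'skin-like' fraction for a sand-colored image: {skin_like_fraction(sand):.2f}")
```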

The Police are working with Silicon Valley engineers to train the AI to better recognize which photos actually contain child pornography. The department expects the algorithms to be ready to work reliably within about two years.

For now, no algorithm can identify the content of photos without errors. A nudity-detection app for smartphones that we recently tested at Juicy Links registered false positives on a photo of a dog, a donut, and a fully dressed Grace Kelly. Validation of the results by a human being is still absolutely necessary.
