Artificially intelligent strategies for filtering offensive images on the Internet

Mark Wilson

B351: Introduction to AI

AI: Tei Laine

April 29, 2001

On December 21, 2000, the Children's Internet Protection Act (CHIPA) was signed into law. The act states that libraries receiving Internet access discounts through the federal e-rate program must have their technology protection measures certified by the FCC. That is to say, the law threatens certain libraries with the withdrawal of federal technology support if they do not take steps toward filtering out visual depictions deemed harmful to minors. A number of organizations, including the Multnomah County Public Library in Oregon, have filed suit claiming that CHIPA will stifle constitutionally protected speech (Sponsler 2001).

The goal of this paper is to explore the role of artificial intelligence in filtering potentially obscene images on the Internet. Current algorithms that attempt to answer the demanding query "is this image pornographic?" include the Basic Artificial Intelligence Routine, PORNsweeper, Heartsoft's Gauntlet, and the Naked People Finder. Despite the elusive definition of pornography, most of these agents rely on simple reflexive categorizations of image shapes and skin tones. As a result, they perform poorly and are especially prone to false positive errors that pigeonhole obviously harmless images as pornography.

Conventional Internet filtering software relies on a hand-built database of keywords and blocked sites. However, the exponential growth of the Internet makes such databases impossible to maintain by hand. Furthermore, these products are notorious for blocking innocuous sites such as Beaver College (Bicknell 2000) and even their own critics (Cullen 2001).
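
To make the weakness concrete, the sketch below illustrates list-based filtering in Python. The blocklist, keyword list, and URLs are hypothetical; the point is that naive substring matching is exactly what flags innocuous pages such as Beaver College.

    # Minimal sketch of conventional list-based filtering (hypothetical lists).
    BLOCKED_DOMAINS = {"example-adult-site.com"}
    BANNED_KEYWORDS = ["beaver", "xxx"]

    def is_blocked(url: str, page_text: str) -> bool:
        domain = url.split("/")[2].lower() if "://" in url else url.lower()
        if domain in BLOCKED_DOMAINS:
            return True
        # Naive substring matching flags any page that merely contains a
        # banned string, e.g. a college named "Beaver".
        return any(word in page_text.lower() for word in BANNED_KEYWORDS)

    print(is_blocked("http://www.beaver.edu/", "Welcome to Beaver College"))  # True: a false positive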

Exotrope Inc.'s Basic Artificial Intelligence Routine (BAIR) took site blocking a step further by introducing image recognition software into the filtering process. Exotrope claims that BAIR's proprietary "active information matrix," probably a neural network, "has been taught to recognize sexually explicit graphics regardless of the text on the Web page" (Fitzwater). BAIR analyzes pictures according to shapes and skin tones.

Exotrope's claims were found to be hugely exaggerated when Wired News tested the system against random photos stripped of textual clues. Wired discovered that 90 to 95 percent of all photos were blocked, and that the system's odds of correctly identifying pornography were roughly 50-50. Carnegie Mellon University researcher Dave Touretzky explained that the only difference between an image of a woman in a bikini and a naked woman is "a couple of nipples and a patch of pubic hair. You're not going to be able to find that with a neural network... If they don't disclose the training data, there's no way to figure out what's going on" (McCullagh 2000).

Baltimore Technologies' PORNsweeper is another proprietary scheme that attempts to block access to pornography. In this case, the product examines corporate email to prevent the trading of pornographic photographs between employees; companies buy the software to lessen their liability in sexual harassment lawsuits. The rules programmed into PORNsweeper are fairly simple: it measures the proportion of flesh tones in a photograph and, if that proportion is significant, uses Digitech Projects' face detection algorithms to determine whether the flesh tones are explained by an innocent headshot. Baltimore Technologies claims that the algorithm can block 85 percent of all pornography with a mere 15 percent false positive rate. An independent review by Dan's Data revealed that the software indeed catches 85 percent of all pornography, but also blocks 50 percent of "clean" images (Rutter 2000).
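
The decision procedure described above is easy to caricature in code. The sketch below assumes a crude RGB flesh-tone test and a hypothetical detect_faces() helper; it illustrates only the flesh-fraction-plus-face-exemption idea, not Baltimore Technologies' or Digitech Projects' actual algorithms.

    # Sketch: block when flesh tones dominate and faces do not explain them.
    from PIL import Image

    def is_flesh(r: int, g: int, b: int) -> bool:
        # Crude RGB rule of thumb for skin-like colors (an assumption).
        return r > 95 and g > 40 and b > 20 and r > g and r > b and (r - min(g, b)) > 15

    def flesh_fraction(img: Image.Image) -> float:
        pixels = list(img.convert("RGB").getdata())
        return sum(is_flesh(r, g, b) for r, g, b in pixels) / len(pixels)

    def detect_faces(img):
        # Hypothetical face detector returning bounding boxes (x, y, w, h).
        return []

    def should_block(img: Image.Image, flesh_threshold: float = 0.4) -> bool:
        fraction = flesh_fraction(img)
        if fraction < flesh_threshold:
            return False  # not enough skin tone to worry about
        faces = detect_faces(img)
        face_area = sum(w * h for _, _, w, h in faces)
        # Allow the image only if detected faces account for most of the
        # flesh-toned area, i.e. it is probably an innocent headshot.
        return face_area < 0.5 * fraction * img.width * img.height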

Another proprietary method for blocking potentially objectionable images is used in Heartsoft's Internet Safari web browser for children. The company claims that its algorithm, code-named "Gauntlet," uses 1980s-era AI software developed by NASA and the Strategic Defense Initiative. Heartsoft's software also examines flesh tones and curves to determine image acceptability (Kahney 1999). Heartsoft claims that the browser is 90 to 95 percent accurate (Geek.com). However, without an unprecedented breakthrough in image recognition, Heartsoft's vague accuracy claim is not believable.

The Naked People Finder is the best documented categorization strategy of this type. It was programmed from an image retrieval point of view rather than an image suppression viewpoint. The algorithm uses flesh tones to create a pixel mask of the located skin. After all non-flesh pixels are removed, the image is fed into a grouper, which categorizes objects according to a set of geometric rules that pertain to human physiology. Elongated regions of skin-colored pixels are grouped, and these individual parts are classified according to their structure, spatial relationships, and textures (Fleck 1).
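
A first-stage skin mask of the kind described above can be sketched as follows. The HSV thresholds are illustrative assumptions only and are not the color model actually used by Fleck, Forsyth, and Bregler.

    # Sketch: build a boolean skin mask from per-pixel color (assumed thresholds).
    import colorsys
    from PIL import Image

    def skin_mask(img: Image.Image) -> list[list[bool]]:
        rgb = img.convert("RGB")
        mask = []
        for y in range(rgb.height):
            row = []
            for x in range(rgb.width):
                r, g, b = rgb.getpixel((x, y))
                h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
                # Skin hues cluster in a narrow red-to-yellow band with
                # moderate saturation; the bounds here are assumptions.
                row.append(h <= 0.14 and 0.15 <= s <= 0.75 and v >= 0.35)
            mask.append(row)
        return mask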

The Naked People Finder justifies the use of skin color as a signifier of nudity because skin color is restricted to the possible hues that can be derived from a combination of blood and melanin (3). After determining regions of skin, the algorithm uses grouping rules to form complex groups from individual segments. For instance, two connected segments form a limb, and two connected limbs form a limb-limb girdle. A segment added to a limb-limb girdle may yield a girdle and trunk. The system also accounts for multiple people present in the photograph and rejects groupings that would be physically impossible for a human to perform (4). The Finder reports naked people if it can find an acceptable spine-thigh group or a girdle group in the photograph (6). Many of the Finder's false positive errors were the result of skin-colored materials containing parallel edges, such as in industrial scenes (8). The Finder returns half as many false positives as skin filtering alone (9). False negative errors were attributed to close-ups that did not include limbs, and to poses, such as profiles, that were not covered by the Finder's geometrical model (8). The second incarnation of the Finder had 44 percent recall and 74 percent precision. That is to say, 44 percent of the test images containing naked people were correctly located, and 74 percent of all retrieved images actually contained naked people.
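
For clarity, the recall and precision figures quoted above work out as follows; the counts in the sketch are hypothetical and chosen only to approximate the reported numbers.

    # Recall: fraction of all naked-people images that were retrieved.
    # Precision: fraction of retrieved images that contained naked people.
    def recall(tp: int, fn: int) -> float:
        return tp / (tp + fn)

    def precision(tp: int, fp: int) -> float:
        return tp / (tp + fp)

    # Hypothetical counts roughly consistent with the reported figures:
    print(recall(tp=44, fn=56))     # 0.44 (44 percent recall)
    print(precision(tp=44, fp=15))  # ~0.75, close to the reported 74 percent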

The universally poor performance of approaches that attempt to block offensive imagery on the Internet indicates that this particular use of artificial intelligence is misguided. Although it is generally conceded that there are images on the Internet that are harmful to children, actual agreement on what images this category entails is not possible within human reasoning, let alone machine reasoning. This trend is particularly disturbing considering the high, intangible costs associated with suppressing intellectual freedom in our society. Decency standards vary between communities and change over time; there was a time when it was illegal to send information about contraception through the mail. A task domain that cannot be clearly defined by people will be similarly difficult to define for a computer. Image recognition capable of reliably finding people and determining their state of dress would require significant advances in AI, and "Heck, pinning down a quantifiable difference between 'dirty' and 'clean' might well get you prizes in philosophy, too" (Rutter 2000).

On the other hand, there are imaginable circumstances where this type of agent would be useful. For instance, sociologists at the Kinsey Institute could use a variant of the Finding Naked People algorithm to study pornographic viewing habits. Feminist media analysts could use it to investigate the male gaze phenomenon. Furthermore, implementations of these face, curve, and girdle locating algorithms, minus the skin filters, would be highly useful for general image retrieval purposes.

A combination of techniques could be used to improve the precision of these schemes for image retrieval purposes. An agent could reduce instances of commonly confused categories by first retrieving images that fit the search request and then applying a second filter, which maintains a knowledge base of previous mistakes, to dispose of irrelevant returns, as sketched below.
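
The sketch below shows one way such a two-stage filter might be organized. The first-stage retriever and the feature signature are hypothetical placeholders, not an existing system.

    # Sketch of a two-stage retriever with a knowledge base of past mistakes.
    known_mistake_signatures: set[str] = set()

    def signature(image_path: str) -> str:
        # Placeholder: in practice this might hash color or shape features.
        return image_path.lower()

    def retrieve_candidates(query: str) -> list[str]:
        # Hypothetical first-stage retriever (e.g. a skin/shape matcher).
        return []

    def retrieve(query: str) -> list[str]:
        # Second stage: drop returns matching previously confirmed mistakes.
        return [path for path in retrieve_candidates(query)
                if signature(path) not in known_mistake_signatures]

    def record_mistake(image_path: str) -> None:
        # Called when a reviewer marks a returned image as irrelevant.
        known_mistake_signatures.add(signature(image_path))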

Another interesting avenue of research is the use of genetic algorithms to develop image recognition strategies. In Scalable Evolvable Hardware Applied to Road Image Recognition, Jim Torresen details a methodology for evolving circuitry that can steer an automobile using image recognition. Although an imperfectly evolved circuit is dangerous in a mission-critical environment such as driving, the cost of error is not as high in image retrieval. Genetic evolution produced the world's most sophisticated image recognition device, the human nervous system; it is therefore conceivable that a scheme could be developed in which visual input categorization schemes are represented in a genome and refined using a genetic algorithm. This input could then be fed into a neural network for processing.
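
As a toy illustration of that idea, the sketch below evolves the thresholds of a trivial image classifier against a labelled training set. It shows only the generic genetic-algorithm cycle of selection, crossover, and mutation; it is not Torresen's evolvable-hardware method.

    # Toy genetic algorithm: genomes are per-feature thresholds, fitness is
    # classification accuracy on a labelled dataset of feature vectors.
    import random

    def classify(genome: list[float], features: list[float]) -> bool:
        return any(f > t for f, t in zip(features, genome))

    def fitness(genome, dataset) -> float:
        return sum(classify(genome, x) == label for x, label in dataset) / len(dataset)

    def evolve(dataset, genome_len=3, pop_size=20, generations=50):
        population = [[random.random() for _ in range(genome_len)] for _ in range(pop_size)]
        for _ in range(generations):
            # Selection: keep the fitter half of the population as parents.
            population.sort(key=lambda g: fitness(g, dataset), reverse=True)
            parents = population[: pop_size // 2]
            children = []
            for _ in range(pop_size - len(parents)):
                a, b = random.sample(parents, 2)
                cut = random.randrange(genome_len)           # one-point crossover
                child = a[:cut] + b[cut:]
                i = random.randrange(genome_len)             # point mutation
                child[i] = min(1.0, max(0.0, child[i] + random.gauss(0, 0.1)))
                children.append(child)
            population = parents + children
        return max(population, key=lambda g: fitness(g, dataset))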

Bibliography:

Craig Bicknell. "Beaver College Not a Filter Fave." March 22, 2000. Wired.com. April 29, 2001. <http://www.wired.com/news/politics/0,1283,35091,00.html>.

Drew Cullen. "This story is why Cyber Patrol banned The Register." August 3, 2001. The Register. April 29, 2001. <http://www.theregister.co.uk/content/6/17420.html>.

Kevin Fitzwater. "Product Information on the BAIR." Exotrope, Inc. April 29, 2001. <http://www.thebair.com/info.htm>.

M.M. Fleck, D.A. Forsyth and C. Bregler. "Finding naked people." 1996. University of Iowa and University of California-Berkeley. April 29, 2001. <http://www.cs.hmc.edu/~fleck/naked-people.ps.Z>.

Geek.com. "New Porn Protection." June 1, 2000. Geek.com. April 29, 2001. <http://www.geek.com/news/geeknews/q22000/gee2000601001563.htm>.

Leander Kahney. "Kids' Browser to Spot Dirty Pics." June 18, 1999. Wired.com. April 29, 2001. <http://www.wired.com/news/technology/0,1282,20298,00.html>.

Declan McCullagh. "Smut Filter Blocks All But Smut." June 20, 2000. Wired.com. April 29, 2001. <http://www.wired.com/news/technology/0,1282,36923,00.html>.

Daniel Rutter. "Review: PORNsweeper." November 1, 2000. Dansdata.com. April 29, 2001. <http://www.dansdata.com/pornsweeper.htm>.

Thomas Sponsler. "Multnomah Public Library v. U.S. Complaint." April 2, 2001. Electronic Frontier Foundation. April 29, 2001. <http://www.eff.org/Legal/Cases/Multnomah_Library_v_US/20010402_multnomah_complaint.html>.

Jim Torresen. "Scalable Evolvable Hardware Applied to Road Image Recognition." September 28, 2000. University of Oslo. April 29, 2001. <http://ic-www.arc.nasa.gov/ic/eh2000/slides/torresen/sld001.htm>.
