What convolutional neural networks look at when they look at nudity

Abstract: Automating the discovery of nude pictures has been a major problem in computer vision for over two decades and, because of its rich history and straightforward goal, serves as a great example of how the field has evolved. We'll use the problem of nudity detection to illustrate how training modern convolutional neural networks (convnets) differs from research done in the past. In particular, we'll discuss deconvolutional networks and demonstrate how they can be used to visualize intermediate feature layers and the operation of a classifier. We'll cover limitations of out-of-the-box NSFW filters and show how to personalize a classifier via Clarifai's custom training API. A version of this presentation is available in blog form: http://blog.clarifai.com/what-convolutional-neural-networks-see-at-when-they-see-nudity/
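To make the deconvolutional-network idea concrete, here is a minimal NumPy sketch (not the talk's actual code) of the core operation from Zeiler & Fergus's deconvnet: run a convolution forward, keep a single strong activation, and project it back to pixel space with the transposed (flipped) filter to see which input pattern excited that unit. The toy image and edge filter are made up for illustration.

```python
import numpy as np

def conv2d(x, w):
    """Valid 2-D cross-correlation of a single-channel image with one filter."""
    H, W = x.shape
    kh, kw = w.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w)
    return out

def deconv2d(a, w):
    """Project activations back to input space with the flipped filter,
    the transposed-convolution step of a deconvnet."""
    ah, aw = a.shape
    kh, kw = w.shape
    out = np.zeros((ah + kh - 1, aw + kw - 1))
    wf = w[::-1, ::-1]  # flipping the filter implements the transpose
    for i in range(ah):
        for j in range(aw):
            out[i:i + kh, j:j + kw] += a[i, j] * wf
    return out

# Toy 6x6 "image" with a vertical edge, and a 3x3 vertical-edge filter.
img = np.zeros((6, 6))
img[:, 3:] = 1.0
filt = np.array([[-1., 0., 1.]] * 3)

acts = np.maximum(conv2d(img, filt), 0)  # ReLU'd feature map
strongest = np.unravel_index(acts.argmax(), acts.shape)
mask = np.zeros_like(acts)
mask[strongest] = acts[strongest]        # keep only the strongest activation
recon = deconv2d(mask, filt)             # pixel-space pattern that excited it
```

In a real convnet the same projection is chained back through every layer (with switches recording max-pooling locations), which is what produces the feature visualizations discussed in the talk.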

Bio: Ryan Compton currently heads data science at Clarifai. His day-to-day work involves designing datasets to train convnets and then shipping the resulting models. Ryan holds a PhD in mathematics from UCLA with a focus on sparsity-promoting optimization and was previously on staff at Howard Hughes Laboratories. Some of his research has been covered in Forbes, the New York Observer, and Business Insider, among other places.

Open Data Science Conference