A Deeper Stack For Deep Learning: Adding Visualisations And Data Abstractions to Your Workflow

Abstract: In this training session I introduce a new layer of Python software, called ConX, which sits on top of Keras, which in turn sits on a backend (like TensorFlow). Do we really need a deeper stack of software for deep learning? Backends like TensorFlow can be thought of as "assembly language" for deep learning. Keras helps, but is more like "C++" for deep learning. ConX is designed to be "Python" for deep learning. So, yes, this layer is needed.

ConX is a carefully designed library that includes tools for network, weight, and activation visualizations; data and network abstractions; and an intuitive interface for both interactive and programmatic use. Developed especially for the Jupyter notebook, ConX enhances the workflow of designing and training artificial neural networks by providing interactive visual feedback early in the process and by reducing the cognitive load of developing complex networks.

This session will start with small networks and build up to advanced recurrent networks for images, text, and other data. Participants are encouraged to bring samples of their own data so that they can explore a real and meaningful project.

All that is required is a basic understanding of Python and a laptop. Many example deep learning models will be provided in the form of Jupyter notebooks.

Documentation: https://conx.readthedocs.io/en/latest/

Bio: Doug Blank is a Senior Software Engineer at Comet.ML, a start-up in New York City. Comet.ML helps data scientists and engineers track, manage, replicate, and analyze machine learning experiments.

Doug was a professor of Computer Science for 18 years at Bryn Mawr College, a small, all-women's liberal arts college outside Philadelphia. He has been working on artificial neural networks for almost 30 years, focusing on models that make analogies and on their use in robot control systems. He is one of the core developers of ConX.