
Abstract: This tutorial introduces Gluon, a flexible new interface that pairs MXNet’s speed with a user-friendly frontend. In the past, deep learning practitioners had to trade ease of use against speed when choosing a framework. On one side were symbolic frameworks like Theano and TensorFlow. They offer speed and memory efficiency but are harder to program, can be a pain to debug, and don’t support many native language features, like basic control flow. On the other side are imperative frameworks like Chainer and PyTorch. They’re a joy to program and easy to debug, but they can seldom compete with symbolic code when it comes to speed. Gluon reconciles the two, removing a crucial pain point.
Gluon can run as a fully imperative framework. In this mode, you enjoy native language features, painless debugging, and rapid prototyping. You can also effortlessly deploy arbitrarily complex models with dynamic graphs. And if you want the story to end there, it can. But when you need real performance, Gluon can also provide the blazing speed of MXNet’s symbolic API by calling down to Gluon’s just-in-time compiler.
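To make that workflow concrete, here is a minimal sketch (assuming MXNet 1.x with the gluon package installed; the layer sizes, input shape, and initializer are illustrative, not prescriptive):

import mxnet as mx
from mxnet import nd, gluon

# Define a small network imperatively from Gluon's predefined layers.
net = gluon.nn.HybridSequential()
with net.name_scope():
    net.add(gluon.nn.Dense(64, activation='relu'))
    net.add(gluon.nn.Dense(10))
net.initialize(mx.init.Xavier())

# Fully imperative execution: feed data through and inspect the result directly.
x = nd.random.normal(shape=(4, 128))
print(net(x).shape)   # (4, 10)

# When speed matters, one call compiles the graph via MXNet's symbolic backend.
net.hybridize()
print(net(x).shape)   # same result, now served by the cached symbolic graph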
We’ll cover everything from the basic constructs of Gluon to advanced models. We’ll walk you through MXNet’s NDArray data structure and automatic differentiation. We’ll show you how to define neural networks both at the atomic level and through Gluon’s predefined layers. We’ll demonstrate how to serialize models and build dynamic graphs. Then, we’ll show you how to hybridize your networks, simultaneously enjoying the benefits of imperative and symbolic deep learning. Finally, we’ll teach you how Gluon works under the hood and show you how to hack your own layers.
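As a taste of the NDArray and automatic differentiation material, a brief sketch (again assuming MXNet 1.x; the toy function y = sum(2x^2) is chosen only to show the mechanics):

from mxnet import nd, autograd

x = nd.array([[1.0, 2.0], [3.0, 4.0]])
x.attach_grad()                  # allocate storage for the gradient of x

with autograd.record():          # record operations to build the graph on the fly
    y = (2 * x * x).sum()

y.backward()                     # backpropagate through the recorded graph
print(x.grad)                    # dy/dx = 4x  ->  [[4, 8], [12, 16]]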
Bio: Zachary Lipton is a mad scientist at Amazon AI and assistant professor at Carnegie Mellon University (2018-). He researches ML methods, applications (especially to healthcare), and social impacts. In addition to corralling deep neural networks and starting fires on Twitter (@zacharylipton), he is the editor of the Approximately Correct blog and lead author of Deep Learning - The Straight Dope, an interactive book teaching deep learning and MXNet Gluon through Jupyter notebooks.

Zachary Chase Lipton, PhD
Title: Data Scientist at Amazon AI, Contributing Editor at KDnuggets
Category: west2017trainings
