Revealing the Inner Self: Automatic Differentiation (Autodiff) Clearly Explained


Forget everything you know about calculus! That is, if you just want to compute gradients using your favorite deep learning framework, such as PyTorch or Keras. In this session, you will learn how gradients are derived in these frameworks by implementing the core automatic differentiation (autodiff) algorithm in basic Python, from first principles and without any calculus.
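To give a flavor of what "gradients without calculus" can mean, here is a minimal sketch of forward-mode autodiff using dual numbers in plain Python. The `Dual` class and `derivative` helper are illustrative names, not the session's actual code: each arithmetic operation carries its own local derivative rule, so no symbolic differentiation is ever performed.

```python
class Dual:
    """A value paired with its derivative; arithmetic propagates both."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # sum rule: (u + v)' = u' + v'
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (u * v)' = u' * v + u * v'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__


def derivative(f, x):
    """Evaluate df/dx at x by seeding the derivative slot with 1."""
    return f(Dual(x, 1.0)).dot


# f(x) = 3x^2 + 2x has derivative 6x + 2, so at x = 4 it is 26
print(derivative(lambda x: 3 * x * x + 2 * x, 4.0))  # 26.0
```

The point of the sketch is that the derivative rules live inside the overloaded operators, so any function built from them is differentiated automatically as a side effect of evaluating it.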

Why should you care about autodiff? In deep learning, you can do amazing things with pre-trained, publicly available PyTorch, TensorFlow, or JAX models. But if you want to train your own models instead of just reusing existing ones, you should have a firm understanding of the algorithm behind autodiff. It is possible to treat autodiff as a black box without fully comprehending how it works, but if you wish to develop the skills to troubleshoot it in production MLOps scenarios, you need at least a basic understanding of this critical feature of deep learning frameworks.

This live coding session introduces the ideas behind autodiff and teaches its fundamentals by walking you through a simple example of implementing autodiff using core Python language features, without PyTorch. In the process, you will gain a deeper understanding of PyTorch's autodiff functionality and develop knowledge that will help you troubleshoot PyTorch model training (for example, with Horovod) in your own projects. You will see that while the core of autodiff is straightforward, it scales to complex applications of the chain rule from calculus.
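The reverse-mode flavor of this idea, which is what PyTorch's autograd actually uses, can also be sketched in core Python. The `Value` class below is a hypothetical stand-in for the session's implementation: each operation records how to push gradients back to its inputs, and `backward()` replays those steps in reverse topological order, which is exactly repeated application of the chain rule.

```python
class Value:
    """Scalar node in a computation graph; backward() applies the chain rule."""
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None  # set by the op that produced this node

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad   # d(a+b)/da = 1
            other.grad += out.grad  # d(a+b)/db = 1
        out._backward = _backward
        return out
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad  # d(ab)/da = b
            other.grad += self.data * out.grad  # d(ab)/db = a
        out._backward = _backward
        return out
    __rmul__ = __mul__

    def backward(self):
        # topological order: visit each node only after everything it feeds
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0  # d(output)/d(output)
        for v in reversed(topo):
            v._backward()


x = Value(3.0)
y = Value(2.0)
z = x * y + x  # dz/dx = y + 1 = 3, dz/dy = x = 3
z.backward()
print(x.grad, y.grad)  # 3.0 3.0
```

Reverse mode computes gradients of one output with respect to all inputs in a single backward pass, which is why it is the mode of choice for training neural networks with many parameters.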

Join us at this session where automatic differentiation will be clearly explained.


Carl implemented his first neural net in 2000. He is a senior director of the AI/ML practice at Cognizant, focusing on communications, technology, and media customers. Previously he worked on deep learning and machine learning at Google and IBM. Carl is the author of over 20 articles in professional, trade, and academic journals, holds 6 USPTO patents, and has received 3 corporate awards from IBM for his innovative work. His machine learning book, "MLOps Engineering at Scale," continues to receive reader acclaim. You can find out more about Carl from his blog.

Open Data Science
One Broadway
Cambridge, MA 02142
