
Abstract: Deep learning is a rapidly evolving field, and models are increasingly complex. Recently, researchers have begun to explore "differentiable programming", a powerful way to combine neural networks with traditional programming. Differentiable programs may include control flow, functions and data structures, and can even incorporate ray tracers, simulations and scientific models, giving us unprecedented power to find subtle patterns in our data.
This workshop will show you how this technique, and particularly Flux – a state-of-the-art deep learning library – is impacting the machine learning world. We will show you how Flux makes it easy to create traditional deep learning models, and explain how the flexibility of the Julia language allows complex physical models to be optimised by the same machinery. We'll outline important recent work and show how Flux allows us to easily combine neural networks with tools like differential equation solvers.
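To give a flavour of what "differentiable programming" means in practice, here is a minimal sketch using Flux's `gradient` function: an ordinary Julia function containing control flow, differentiated automatically. (This is an illustrative example, not material from the workshop itself; it assumes Flux is installed.)

```julia
using Flux  # Flux re-exports `gradient` from its AD backend

# An ordinary Julia "program" with a branch -- still differentiable.
f(x) = x > 0 ? 3x^2 : -x

# Differentiate f at x = 2.0; d/dx 3x^2 = 6x, so the gradient is 12.
g, = gradient(f, 2.0)
```

The same mechanism scales from a one-line function like this to full neural networks, simulations, and ODE solvers.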
Bio: Avik Sengupta has worked on risk and trading systems in investment banking for many years, mostly using Java interspersed with snippets of the exotic R and K languages. This experience left him wondering whether there were better things out there. Avik's quest came to a happy conclusion with the appearance of Julia in 2012. He has been happily coding in Julia and contributing to it ever since.

Avik Sengupta
Title
VP of Engineering | Julia Computing
Category
beginner-europe19 | deep-learning-europe19 | intermediate-europe19 | machine-learning-europe19 | workshops-europe19
