
Abstract: Flux (http://fluxml.ai) is a new machine learning library that's easy and intuitive to use, yet scales to handle the most difficult research challenges. As machine learning models grow increasingly complex, we suggest that neural networks are best viewed as an emerging, differentiable programming paradigm, and ask what decades of research into programming languages and compilers have to offer the machine learning world.
Flux is written entirely in Julia, an easy-to-use yet high-performance programming language similar in spirit to Python. You can train models using high-level Keras-like interfaces, or drop down to the mathematics, allowing complete customisation right down to the CUDA kernels. Meanwhile, Julia's advanced compiler technology allows us to provide cutting-edge performance.
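As a rough sketch of what this looks like in practice (the layer sizes and data below are made up purely for illustration):

    using Flux

    # A small multi-layer perceptron built from Flux's Keras-like layers;
    # sizes are illustrative (e.g. flattened 28x28 images, 10 classes).
    model = Chain(
      Dense(28^2, 32, relu),
      Dense(32, 10),
      softmax)

    # Because Flux models are plain Julia code, "dropping down to the
    # mathematics" just means writing a layer yourself:
    W = randn(10, 32)
    b = randn(10)
    mylayer(x) = W * x .+ b

    # Models are ordinary callable objects:
    x = rand(28^2)
    model(x)  # a 10-element vector of class probabilities

Both styles compose freely, since a layer is simply any callable Julia object.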
This workshop will introduce Flux and its approach to building differentiable, trainable algorithms, and show simple but practical examples in image recognition, reinforcement learning and natural language processing. We'll also cover Flux's ecosystem of ready-made models, and how these can be used to get a head start on real-world problems.
Bio: Mike Innes is a software engineer at Julia Computing, where he works on, among other things, the Juno IDE and the machine learning ecosystem. He is the creator of the Flux machine learning library.

Mike Innes
Title: Software Engineer at Julia Computing
Category: europe-2018-workshops
