Opening The Black Box — Interpretability In Deep Learning

Abstract: 

The recent application of deep neural networks to long-standing problems has brought breakthroughs in performance and predictive power. However, high accuracy often comes at the price of interpretability: many of these models are black boxes that fail to provide explanations for their predictions. This tutorial illustrates some of the recent advances in the field of interpretable artificial intelligence. We will show common techniques that can be used to explain the predictions of pretrained models and to shed light on their inner mechanisms. The tutorial aims to strike the right balance between theoretical input and practical exercises, providing participants not only with the theory behind deep learning interpretability, but also with a set of frameworks, tools and real-life examples that they can apply in their own projects.
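As a taste of the kind of technique the tutorial covers, the sketch below shows occlusion-based feature attribution, a simple model-agnostic way to explain a single prediction: each feature is replaced with a baseline value and the resulting change in the model's output is taken as that feature's contribution. The `model` function here is a hypothetical toy stand-in for any pretrained black-box predictor, not one used in the tutorial.

```python
def model(x):
    """Toy black-box scorer (assumed for illustration): a weighted sum."""
    weights = [0.5, -1.0, 2.0]
    return sum(w * xi for w, xi in zip(weights, x))

def occlusion_attribution(f, x, baseline=0.0):
    """Attribute f(x) to each feature by occluding it: replace the
    feature with a baseline value and record the drop in the score."""
    base_score = f(x)
    attributions = []
    for i in range(len(x)):
        perturbed = list(x)
        perturbed[i] = baseline  # "occlude" feature i
        attributions.append(base_score - f(perturbed))
    return attributions

# For a linear model the attributions recover weight * input exactly.
print(occlusion_attribution(model, [1.0, 1.0, 1.0]))
```

Because it only requires calling the model, the same idea extends to images (occluding patches of pixels) and text (masking tokens), which is why perturbation-based methods are a common entry point to interpretability.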

Bio: 

Matteo is a Research Staff Member in Cognitive Health Care and Life Sciences at IBM Research Zürich. He is currently working on the development of multimodal deep learning models for drug discovery using chemical features and omic data. He also researches multimodal learning techniques for the analysis of pediatric cancers in an H2020 EU project, iPC, with the aim of creating treatment models for patients. He received his degree in Mathematical Engineering from Politecnico di Milano in 2013. After obtaining his MSc, he worked as a software engineer and analyst for scientific computing at the startup Moxoff spa. In 2019 he obtained his doctoral degree at the end of a joint PhD program between IBM Research and the Institute of Molecular Systems Biology, ETH Zürich, with a thesis on multimodal learning approaches for precision medicine.
