Optuna: A Define-by-Run Hyperparameter Optimization Framework

Abstract: 

In this workshop, we introduce Optuna, a next-generation hyperparameter optimization framework built around new design criteria: (1) a define-by-run API that allows users to concisely construct dynamic, nested, or conditional search spaces, (2) efficient implementations of both sampling and early-stopping strategies, and (3) an easy-to-setup, versatile architecture that can be deployed for a range of purposes, from scalable distributed computing to lightweight experiments run on a local laptop. Our software is available under the MIT license.
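To illustrate the define-by-run idea from the abstract, here is a minimal sketch of a conditional search space in Optuna. The parameter names and the toy objective are illustrative stand-ins for a real training and validation loop, not part of the workshop material.

```python
import optuna


def objective(trial):
    # Define-by-run: the search space is constructed while this function
    # executes, so ordinary control flow can make it dynamic or conditional.
    algo = trial.suggest_categorical("algo", ["sgd", "adam"])
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    if algo == "adam":
        # This parameter is only sampled on trials that take the "adam"
        # branch, i.e. the search space is conditional.
        beta1 = trial.suggest_float("beta1", 0.8, 0.999)
    else:
        beta1 = 0.0
    # Toy quadratic standing in for a real validation loss.
    return (lr - 1e-3) ** 2 + (beta1 - 0.9) ** 2


study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)
print(study.best_params)
```

Because the space is declared through suggest calls at runtime rather than up front, branches, loops, and nested structures require no special configuration syntax.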

Bio: 

Crissman has worked at Preferred Networks on the Deep Learning Open Source Software team for over two years, improving the documentation for the deep learning framework Chainer and giving presentations at Open Data Science Conferences, SciPy, PyCon, GTC, and other venues. His ODSC West workshop on Chainer was selected as one of the top 10 workshops for learning machine learning.
