Many engineering, scientific, and industrial applications, including automated machine learning (e.g., hyper-parameter tuning), involve making design choices to optimize one or more expensive-to-evaluate objectives. Some examples include tuning the knobs of a compiler to optimize the performance and efficiency of a set of software programs; designing new materials to optimize strength, elasticity, and durability; and designing hardware to optimize performance, power, and area. Bayesian Optimization (BO) is an effective framework for solving black-box optimization problems with expensive function evaluations. The key idea behind BO is to build a cheap surrogate model (e.g., a Gaussian Process) using the real experimental data, and to employ it to intelligently select the sequence of function evaluations using an acquisition function, e.g., expected improvement (EI).
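To make the surrogate-plus-acquisition loop concrete, here is a minimal sketch of BO on a toy 1D problem. It is not the tutorial's own code: the objective `f`, the kernel length-scale, and the candidate grid are illustrative assumptions, using only NumPy and SciPy (a Gaussian Process posterior computed by hand, and EI as the acquisition function).

```python
import numpy as np
from scipy.stats import norm

def f(x):
    # Hypothetical "expensive" 1D objective to maximize on [0, 1].
    return x * np.sin(6 * x)

def rbf(a, b, ls=0.15):
    # Squared-exponential (RBF) kernel; ls is an assumed length-scale.
    d = a.reshape(-1, 1) - b.reshape(1, -1)
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # GP posterior mean and std at candidates Xs, given observations (X, y).
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    var = np.diag(rbf(Xs, Xs) - Ks.T @ np.linalg.solve(K, Ks))
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    # EI for maximization: expected gain over the best value seen so far.
    z = (mu - best) / sigma
    return (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, 3)           # small initial design
y = f(X)
cand = np.linspace(0, 1, 200)      # candidate pool for the next evaluation

for _ in range(10):                # BO loop: fit surrogate, maximize EI, evaluate
    mu, sigma = gp_posterior(X, y, cand)
    ei = expected_improvement(mu, sigma, y.max())
    x_next = cand[np.argmax(ei)]
    X = np.append(X, x_next)
    y = np.append(y, f(x_next))

print(len(X), y.max() > 0.25)
```

In practice one would use a library such as BoTorch (covered in the hands-on portion of the tutorial) rather than a hand-rolled GP, but the structure is the same: the surrogate is refit after every evaluation, and the acquisition function trades off exploiting high posterior means against exploring high posterior uncertainty.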
The goal of this tutorial is to present recent advances in BO by focusing on challenges, principles, algorithmic ideas and their connections, and important real-world applications. Specifically, we will cover recent work on acquisition functions, BO methods for discrete and hybrid spaces, BO methods for high-dimensional input spaces, multi-fidelity and multi-objective BO, and key innovations in the BoTorch toolbox, along with a hands-on demonstration.
The tutorial is on Wednesday, February 8, 2023, from 2:00 p.m. to 6:00 p.m. EST.
If you are looking for the NeurIPS 2022 tutorial, the link is here.