Washington, DC

Schedule

6 June 2021 - Course hours: 1-day duration | 08:00–17:00 hrs

Preliminary agenda:

08:15–08:30 Welcome: Laura Mainini (UTRC)

08:30–09:45 Tutorial/Lecture 1: Matthias Poloczek (UBER)

Scalable Bayesian optimization for high dimensional expensive functions

Bayesian optimization has recently emerged as a powerful method for the sample-efficient optimization of expensive black-box functions. These functions do not have a closed form and are evaluated, for example, by running a complex simulation, performing a lab experiment, or solving a PDE. Use cases arise in machine learning, e.g., when optimizing a reinforcement learning policy; examples in engineering include the design of aerodynamic structures or the search for better materials. However, the application of Bayesian optimization to high-dimensional problems remains challenging, and on difficult problems Bayesian optimization is often not competitive with other paradigms. In the first part of the talk I will give a self-contained introduction to Bayesian optimization. Then I will present novel algorithms that overcome these limitations and set a new state of the art for high-dimensional problems.

Based on joint work with Alexander Munteanu and Amin Nayebi presented at ICML 2019, and on joint work with David Eriksson, Michael Pearce, Jake Gardner, and Ryan Turner that appeared in the Proceedings of NeurIPS 2019.
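For reference alongside the introductory part of this tutorial, the sketch below shows a minimal Bayesian optimization loop: a Gaussian process surrogate paired with an expected improvement acquisition. It is an illustrative toy example under assumed placeholders (the objective function, bounds, and evaluation budget), not the scalable high-dimensional algorithms presented in the talk.

```python
# Minimal Bayesian optimization sketch: GP surrogate + expected improvement.
# Illustrative only; the objective, bounds, and budget are placeholders.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):                      # placeholder expensive black-box function
    return np.sin(3 * x) + 0.1 * x**2

rng = np.random.default_rng(0)
bounds = (-2.0, 2.0)
X = rng.uniform(*bounds, size=(5, 1))  # small initial design
y = objective(X).ravel()

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

def expected_improvement(x_cand, y_best):
    mu, sigma = gp.predict(x_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (y_best - mu) / sigma          # minimization convention
    return (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

for _ in range(20):                    # placeholder evaluation budget
    gp.fit(X, y)
    x_grid = np.linspace(*bounds, 1000).reshape(-1, 1)
    ei = expected_improvement(x_grid, y.min())
    x_next = x_grid[np.argmax(ei)].reshape(1, -1)   # maximize the acquisition
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

print("best point:", X[np.argmin(y)], "best value:", y.min())
```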

09:45–11:00 Tutorial/Lecture 2: Nathalie Bartoli (ONERA/DTIS, Université de Toulouse)

Bayesian optimization via multi-fidelity surrogate modeling – method and practice

In the context of optimization with multiple information sources of varying fidelity, accuracy, and querying cost, we propose a multi-fidelity extension of Efficient Global Optimization. The method will be illustrated on a 1D example, and airfoil shape optimization results obtained with both a RANS solver and a low-fidelity approximation will be presented. For practical use, a Jupyter Python notebook based on the open-source toolbox SMT (https://github.com/SMTorg/smt) will be provided for several academic problems.

Based on joint work with Thierry Lefebvre (ONERA), Mostafa Meliani, and Joseph Morlier (ISAE-SUPAERO), and on a collaboration with the University of Michigan (J. R. R. A. Martins, M.-A. Bouhlel).
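As background for this tutorial, the sketch below illustrates one common multi-fidelity surrogate construction: a cheap low-fidelity model corrected by a Gaussian process fit to the high-fidelity/low-fidelity discrepancy at the few points where both models are run. This is a generic illustration of the idea under assumed placeholder models and sample sizes, not the specific formulation of the tutorial or the SMT API.

```python
# Multi-fidelity surrogate sketch: low-fidelity model + GP correction of the
# discrepancy, trained on the few points where both models are evaluated.
# The model functions and sample sizes are illustrative placeholders.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f_high(x):                 # placeholder expensive high-fidelity model
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

def f_low(x):                  # placeholder cheap low-fidelity approximation
    return 0.5 * f_high(x) + 10 * (x - 0.5) - 5

X_hf = np.linspace(0, 1, 6).reshape(-1, 1)      # few expensive evaluations
delta = f_high(X_hf) - f_low(X_hf)              # high/low-fidelity discrepancy

gp_delta = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), normalize_y=True)
gp_delta.fit(X_hf, delta.ravel())

def f_multifidelity(x):
    """Cheap low-fidelity prediction plus the learned correction."""
    return f_low(x).ravel() + gp_delta.predict(x)

x_test = np.linspace(0, 1, 5).reshape(-1, 1)
print(f_multifidelity(x_test))
```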

11:00–11:15 Coffee break

11:15–12:30 Tutorial/Lecture 3: Benjamin Peherstorfer (NYU)

Multifidelity Uncertainty Quantification

Uncertainty quantification with sampling-based methods such as Monte Carlo can require a large number of numerical simulations of the models describing the systems of interest to obtain estimates with acceptable accuracy. Thus, if a computationally expensive high-fidelity model is used alone, Monte-Carlo-based uncertainty quantification methods quickly become intractable. In this tutorial presentation, we survey recent advances in multifidelity methods for sampling-based uncertainty quantification. The goal of the multifidelity methods that we discuss is to significantly speed up uncertainty quantification by leveraging low-cost low-fidelity models, while establishing accuracy guarantees and unbiasedness via occasional recourse to the expensive high-fidelity models. We survey methods for (a) uncertainty propagation, (b) rare event simulation, (c) sensitivity analysis, and (d) Bayesian inverse problems. If time permits, we will (e) give an outlook on context-aware learning of data-driven low-fidelity models, where models are learned explicitly to improve the performance of multifidelity computations rather than to provide accurate approximations of high-fidelity models.

Links to implementations of multifidelity methods: https://cims.nyu.edu/~pehersto/code.html

Multifidelity Monte Carlo implementation in Matlab: https://github.com/pehersto/mfmc
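As a concrete illustration of the control-variate idea underlying multifidelity Monte Carlo, the sketch below combines a few expensive high-fidelity samples with many cheap low-fidelity samples into an unbiased mean estimator. It is a generic two-model example with placeholder models, sample counts, and a hand-picked control-variate coefficient, not the implementation linked above.

```python
# Two-model multifidelity (control-variate) Monte Carlo sketch for estimating
# E[f_high(Z)]. Models, sample sizes, and the coefficient alpha are placeholders.
import numpy as np

def f_high(z):                 # placeholder expensive high-fidelity model
    return np.exp(z) + 0.1 * z ** 3

def f_low(z):                  # placeholder cheap low-fidelity model
    return 1.0 + z + 0.5 * z ** 2

rng = np.random.default_rng(0)
n_hf, n_lf = 50, 5000                  # few expensive, many cheap evaluations
z = rng.standard_normal(n_lf)          # nested samples: the first n_hf inputs
y_lf = f_low(z)                        # are shared by both models
y_hf = f_high(z[:n_hf])

alpha = 0.9                            # control-variate coefficient (placeholder;
                                       # the optimal value uses model correlations)
# Unbiased estimator: the correction term has zero mean, and sharing the first
# n_hf inputs lets the correlated low-fidelity model cancel sampling error.
estimate = y_hf.mean() + alpha * (y_lf.mean() - y_lf[:n_hf].mean())
print("Multifidelity Monte Carlo estimate of E[f_high(Z)]:", estimate)
```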

12:30–13:45 Lunch break

13:45–14:45 Tutorial/Lecture 4: Phil Beran (AFRL)

[TITLE] - [ABSTRACT] - [MATERIAL]

14:45–15:45 Tutorial/Lecture 5: more information to come soon

[TITLE] - [ABSTRACT] - [MATERIAL]

15:45–16:00 Coffee break

16:00–17:00 Round Table and wrap up: more information to come soon

[TITLE] - [ABSTRACT] - [MATERIAL]