- Mengwu Guo, University of Twente
- Anirban Chaudhuri, University of Texas at Austin
Gaussian process (GP) regression is a non-parametric method for surrogate modeling. Owing to their Bayesian nature, GP surrogates have a rich history in machine learning and engineering applications. Recent advancements in GP surrogates, such as sparse GP, manifold GP, multifidelity GP, and deep GP, have significantly enhanced their applicability to realistic problems. A key strength of GP surrogates is their interpretability. GP surrogates also connect a variety of algorithmic methods in machine learning and uncertainty quantification, such as kernel methods, Bayesian methods, deep learning, and manifold learning, and there are opportunities to open the box further and incorporate physical constraints to formulate physics-informed GPs. The minisymposium seeks to bring together researchers from various fields who work on the theoretical advancement of GP models and those using GP surrogates in computational modeling and analysis. Areas of interest include Bayesian optimization, active learning, uncertainty quantification, Bayesian inference, and physics-constrained surrogate modeling.