Functional Gradient Descent Methods for Optimization and Sampling
Qiang Liu is an Assistant Professor of Computer Science at the University of Texas at Austin (UT Austin). Dr. Liu leads the Statistical Learning & AI Group at UT Austin and has published several recent papers in machine learning. His research group had four papers accepted at this year’s International Conference on Machine Learning (ICML) and two papers accepted at the International Conference on Learning Representations (ICLR). His research focuses on artificial intelligence and machine learning, especially statistical learning methods for high-dimensional and complex data.
Gradient descent is a fundamental optimization tool in machine learning. However, its power is not limited to the typical Euclidean setting. In this talk, I will discuss several ideas for using a generalized notion of gradient in infinite-dimensional spaces to solve challenging optimization and sampling problems, including: 1) Stein variational gradient descent (SVGD), which provides a deterministic mechanism for drawing samples from intractable distributions via a functional steepest descent w.r.t. an RKHS-Wasserstein metric (a minimal sketch of the SVGD update appears below); 2) splitting steepest descent for neural architectures, which leverages a functional steepest descent procedure for joint estimation of neural network weights and structures; and 3) extensions of these ideas for handling constrained, bilevel, and multi-objective optimization.
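For readers unfamiliar with SVGD, here is a minimal sketch of the particle update rule, assuming a standard RBF kernel with the median-heuristic bandwidth and a toy Gaussian target; the variable names, step size, and iteration count are illustrative choices, not details from the talk.

```python
# Sketch of Stein variational gradient descent (SVGD):
# each particle moves along phi(x) = E_{x'~q}[ k(x', x) grad log p(x') + grad_{x'} k(x', x) ],
# which combines a driving term (toward high density) and a repulsive term (for diversity).
import numpy as np

def rbf_kernel(X):
    """RBF kernel matrix K[i, j] = exp(-||x_i - x_j||^2 / h) and its gradient
    grad_K[i, j] = d k(x_i, x_j) / d x_i, with h set by the median heuristic."""
    diffs = X[:, None, :] - X[None, :, :]            # (n, n, d)
    sq_dists = np.sum(diffs ** 2, axis=-1)           # (n, n)
    h = np.median(sq_dists) / np.log(len(X) + 1) + 1e-8
    K = np.exp(-sq_dists / h)
    grad_K = -2.0 / h * diffs * K[:, :, None]
    return K, grad_K

def svgd_step(X, grad_log_p, step_size=0.1):
    """One SVGD update: x_j <- x_j + eps * phi(x_j), averaging over particles."""
    K, grad_K = rbf_kernel(X)
    phi = (K @ grad_log_p(X) + grad_K.sum(axis=0)) / len(X)
    return X + step_size * phi

# Toy usage: transport random particles toward N(mu, I).
rng = np.random.default_rng(0)
mu = np.array([2.0, -1.0])
X = rng.normal(size=(100, 2))               # initial particles
for _ in range(500):
    X = svgd_step(X, lambda X: mu - X)      # grad log N(mu, I) = mu - x
print(X.mean(axis=0))                       # should approach mu
```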