SEMINAR

Parameter Estimation & Interpretability in Bayesian Mixture Models

Wednesday, Aug 28 2019 - 5:40 pm (GMT+7)

Speaker: Long Nguyen
University of Michigan

Timeline: Wed, Aug 28 2019 - 10:00 am (GMT+7)
About Speaker

Long Nguyen is an associate professor in the Department of Statistics and, by courtesy, the Department of Electrical Engineering and Computer Science at the University of Michigan, Ann Arbor. He received his PhD from the University of California, Berkeley in 2007. Nguyen's interests include nonparametric Bayesian statistics, machine learning and optimization, as well as applications in signal processing and the environmental sciences. He is a recipient of the Leon O. Chua Award from UC Berkeley for his PhD research, the IEEE Signal Processing Society's Young Author Best Paper Award, the CAREER Award from the NSF's Division of Mathematical Sciences, and best paper awards from the International Conference on Machine Learning (ICML) in 2004 and 2014. Nguyen currently serves as an associate editor of Bayesian Analysis, the Journal of Machine Learning Research, the Annals of Statistics, and the SIAM Journal on Mathematics of Data Science.

Abstract

We study posterior contraction behavior for parameters of interest in the context of Bayesian mixture modeling, where the number of mixing components is unknown and the model itself may or may not be correctly specified. Two representative types of prior specification will be considered: one explicitly places a prior distribution on the number of mixture components, while the other places a nonparametric prior on the space of mixing distributions. The former is shown to yield an optimal rate of posterior contraction on the model parameters under minimal conditions, while the latter can be used to consistently recover the unknown number of mixture components, with the help of a fast probabilistic post-processing procedure. We then turn to the behavior of these Bayesian procedures in the realistic setting of model misspecification. It will be shown that the choice of kernel density function plays perhaps the most impactful role in determining the posterior contraction rates under misspecification. Drawing on the concrete posterior contraction rates established in this work, we highlight some of the interesting tradeoffs between model expressiveness and interpretability that a statistical modeler must negotiate in the rich world of mixture modeling.
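The second prior type described above can be illustrated with a minimal sketch (not the speaker's actual method): scikit-learn's variational `BayesianGaussianMixture` with a truncated Dirichlet-process prior stands in for a full posterior computation, and a crude post-processing step estimates the number of components by thresholding the fitted mixing weights. The truncation level of 10 and the 0.01 weight threshold are illustrative choices.

```python
# Hedged sketch: a truncated Dirichlet-process Gaussian mixture, followed by a
# simple post-processing step that counts components with non-negligible weight.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Synthetic data: three well-separated 1-D Gaussian components.
X = np.concatenate([
    rng.normal(-5.0, 1.0, 300),
    rng.normal(0.0, 1.0, 300),
    rng.normal(5.0, 1.0, 300),
]).reshape(-1, 1)

bgm = BayesianGaussianMixture(
    n_components=10,  # truncation level: an upper bound, not the true count
    weight_concentration_prior_type="dirichlet_process",
    max_iter=500,
    random_state=0,
).fit(X)

# Post-processing: count components whose fitted mixing weight is non-negligible.
n_effective = int(np.sum(bgm.weights_ > 0.01))
print("estimated number of components:", n_effective)
```

For well-separated data like this, the Dirichlet-process prior typically shrinks the weights of unused components toward zero, so the thresholded count recovers a number close to the true three; the talk's point is that the rate and reliability of such recovery depend delicately on the prior specification and on whether the kernel is correctly specified.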

Related seminars

Niranjan Balasubramanian (Stony Brook University)
Towards Reliable Multi-step Reasoning in Question Answering
Fri, Nov 03 2023 - 10:00 am (GMT+7)

Nghia Hoang (Washington State University)
Robust Multivariate Time-Series Forecasting: Adversarial Attacks and Defense Mechanisms
Fri, Oct 27 2023 - 10:00 am (GMT+7)

Jey Han Lau (University of Melbourne)
Rumour and Disinformation Detection in Online Conversations
Thu, Sep 14 2023 - 10:00 am (GMT+7)

Tan Nguyen (National University of Singapore)
Principled Frameworks for Designing Deep Learning Models: Efficiency, Robustness, and Expressivity
Mon, Aug 28 2023 - 10:00 am (GMT+7)