SEMINAR

Towards calibrated and flexible probabilistic deep learning

Speaker: Thang Bui
Affiliation: University of Sydney
Timeline: Fri, Jul 24 2020 - 03:00 pm (GMT + 7)
About Speaker

Thang Bui is a research scientist at Uber AI and a lecturer in Machine Learning at the University of Sydney. He holds a PhD in Machine Learning from the Department of Engineering, University of Cambridge, and a BEng from the University of Adelaide. He is broadly interested in machine learning and statistics, with a particular focus on neural networks, probabilistic models, approximate Bayesian inference, and sequential decision making under uncertainty.

Abstract

Deep learning has achieved great successes in many real-world domains, ranging from vision and language to game playing. Yet it has well-documented limitations: (i) it is not robust to out-of-distribution inputs, and (ii) it suffers from catastrophic forgetting when faced with streaming data. In this talk, I will show how we have addressed some of these limitations by combining deep learning with probabilistic modelling. This combination provides desirable test-time uncertainty estimates on out-of-distribution data and allows neural networks to be trained in an incremental way. If time permits, I will show how general distributed learning, also known as federated learning, can be handled by the same algorithmic framework.
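The incremental-training claim rests on a standard probabilistic mechanism: the posterior over model parameters after one batch of data becomes the prior for the next batch, so streaming updates can recover the same answer as training on all data at once. The sketch below is illustrative only, using a simple conjugate linear-Gaussian model rather than the neural-network methods of the talk; all function and variable names in it are our own.

import numpy as np

def update(prior_mean, prior_prec, X, y, noise_var=1.0):
    # One Bayesian update for linear regression: Gaussian prior, Gaussian likelihood.
    post_prec = prior_prec + X.T @ X / noise_var
    post_mean = np.linalg.solve(post_prec,
                                prior_prec @ prior_mean + X.T @ y / noise_var)
    return post_mean, post_prec

rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])
X = rng.normal(size=(100, 2))
y = X @ w_true + 0.1 * rng.normal(size=100)

# Stream the data in batches; each posterior is carried forward as the next prior.
mean, prec = np.zeros(2), np.eye(2)  # prior: N(0, I)
for Xb, yb in zip(np.split(X, 5), np.split(y, 5)):
    mean, prec = update(mean, prec, Xb, yb, noise_var=0.01)

print(mean)  # close to w_true, and identical to a single all-at-once update

Because the Gaussian model is conjugate, the five streamed updates are exactly equivalent to one batch update; approximate Bayesian inference extends the same recursion to neural networks, which is what makes forgetting-free incremental (and, by the same logic, federated) training possible in principle.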

Related seminars

Niranjan Balasubramanian, Stony Brook University
Towards Reliable Multi-step Reasoning in Question Answering
Fri, Nov 03 2023 - 10:00 am (GMT + 7)

Nghia Hoang, Washington State University
Robust Multivariate Time-Series Forecasting: Adversarial Attacks and Defense Mechanisms
Fri, Oct 27 2023 - 10:00 am (GMT + 7)

Jey Han Lau, University of Melbourne
Rumour and Disinformation Detection in Online Conversations
Thu, Sep 14 2023 - 10:00 am (GMT + 7)

Tan Nguyen, National University of Singapore
Principled Frameworks for Designing Deep Learning Models: Efficiency, Robustness, and Expressivity
Mon, Aug 28 2023 - 10:00 am (GMT + 7)