Scalable Inference for Bayesian Nonparametrics
- 2020-03-02 (Mon.), 10:30 AM
- Room 6005, Institute of Statistical Science, Academia Sinica (Research Center for Environmental Changes, Building A)
- Tea reception: 10:10 AM, Room 6005, Institute of Statistical Science (Research Center for Environmental Changes, Building A)
- Dr. Michael Minyi Zhang
- Department of Computer Science, Princeton University, USA.
Abstract
Bayesian nonparametric (BNP) models provide elegant methods for discovering underlying latent features within a data set, but inference in such models can be slow. To address this, we partition the latent measure into a finite measure containing only instantiated components, and an infinite measure containing all other components. We then select different inference algorithms for the two parts: uncollapsed samplers mix well on the finite measure, while collapsed samplers mix well on the infinite, sparsely occupied tail. The resulting hybrid algorithm can be applied to a wide class of models, and can be easily distributed to allow scalable inference without sacrificing asymptotic convergence guarantees.
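The hybrid scheme described in the abstract can be illustrated with a minimal sketch for a Dirichlet process mixture of one-dimensional Gaussians. This is not the speaker's implementation, only an assumed toy instance: instantiated clusters carry explicit parameters and are updated with an uncollapsed Gibbs step, while the infinite tail is handled through the collapsed CRP predictive, with the component mean marginalized under an assumed N(0, tau^2) prior.

```python
import numpy as np

def hybrid_gibbs_sweep(x, z, mu, alpha=1.0, sigma=1.0, tau=3.0, rng=None):
    """One sweep of a toy hybrid sampler for a DP mixture of 1-D Gaussians.

    Instantiated clusters are uncollapsed (explicit means `mu`); the tail
    probability marginalizes the mean under the prior N(0, tau^2), i.e. a
    collapsed predictive N(0, sigma^2 + tau^2). `alpha` is the DP concentration.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    z, mu = z.copy(), list(mu)
    for i in range(len(x)):
        # Remove point i from its current cluster; drop the cluster if empty.
        zi = z[i]
        counts = np.bincount(z, minlength=len(mu))
        counts[zi] -= 1
        if counts[zi] == 0:
            mu.pop(zi)
            counts = np.delete(counts, zi)
            z[z > zi] -= 1
        K = len(mu)
        log_p = np.empty(K + 1)
        # Uncollapsed terms: likelihood under each instantiated mean.
        for k in range(K):
            log_p[k] = np.log(counts[k]) - 0.5 * ((x[i] - mu[k]) / sigma) ** 2
        # Collapsed tail term: mean integrated out, predictive variance s2.
        s2 = sigma**2 + tau**2
        log_p[K] = np.log(alpha) - 0.5 * x[i]**2 / s2 - 0.5 * np.log(s2 / sigma**2)
        p = np.exp(log_p - log_p.max())
        k_new = rng.choice(K + 1, p=p / p.sum())
        if k_new == K:
            # A tail component is instantiated: draw its mean from the
            # conditional posterior given the single point x[i].
            v = 1.0 / (1.0 / tau**2 + 1.0 / sigma**2)
            mu.append(rng.normal(v * x[i] / sigma**2, np.sqrt(v)))
        z[i] = k_new
    # Uncollapsed parameter update for all instantiated clusters.
    for k in range(len(mu)):
        xs = x[z == k]
        v = 1.0 / (1.0 / tau**2 + len(xs) / sigma**2)
        mu[k] = rng.normal(v * xs.sum() / sigma**2, np.sqrt(v))
    return z, np.array(mu)

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-4, 1, 30), rng.normal(4, 1, 30)])
z, mu = np.zeros(len(x), dtype=int), np.array([0.0])
for _ in range(20):
    z, mu = hybrid_gibbs_sweep(x, z, mu, rng=rng)
```

The split mirrors the talk's point: the finite, heavily occupied part mixes well with explicit parameters, while opening new components uses the collapsed predictive so the infinite tail never needs to be represented.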