
Seminars

Scalable inference for Bayesian Non-parametrics

  • 2020-03-02 (Mon.), 10:30 AM
  • R6005, Research Center for Environmental Changes Building
  • Dr. Michael Minyi Zhang
  • Department of Computer Science, Princeton University, USA

Abstract

Bayesian nonparametric (BNP) models provide elegant methods for discovering underlying latent features within a data set, but inference in such models can be slow. We partition the latent measure into a finite measure containing only the instantiated components and an infinite measure containing all other components. We then select different inference algorithms for the two parts: uncollapsed samplers mix well on the finite measure, while collapsed samplers mix well on the infinite, sparsely occupied tail. The resulting hybrid algorithm can be applied to a wide class of models, and can easily be distributed to allow scalable inference without sacrificing asymptotic convergence guarantees.
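To make the finite/infinite split concrete, the following is a minimal sketch (not the speaker's algorithm) of one Gibbs sweep in a toy one-dimensional Dirichlet process Gaussian mixture. Instantiated components are scored with their explicitly sampled (uncollapsed) means, while the infinite tail is handled in collapsed form via the prior-predictive probability of opening a new component. All names (`hybrid_assignment_step`, `alpha`, `prior_sigma`) and the Gaussian likelihood are illustrative assumptions.

```python
import numpy as np

def hybrid_assignment_step(x, z, mus, alpha, sigma=1.0, prior_sigma=3.0, rng=None):
    """One Gibbs sweep over cluster assignments for a 1-D DP Gaussian mixture.

    Illustrates the hybrid split described in the abstract:
      - instantiated components use their explicit (uncollapsed) means `mus`;
      - the tail of un-instantiated components is collapsed: its parameter is
        integrated out, leaving the prior-predictive N(x; 0, sigma^2 + prior_sigma^2).
    """
    rng = np.random.default_rng() if rng is None else rng
    for i in range(len(x)):
        # Component occupancy counts with point i removed.
        counts = np.bincount(np.delete(z, i), minlength=len(mus))
        # Uncollapsed log-probabilities for instantiated components.
        log_p = np.log(np.maximum(counts, 1e-12)) - 0.5 * (x[i] - mus) ** 2 / sigma**2
        log_p[counts == 0] = -np.inf  # empty slots belong to the tail
        # Collapsed prior-predictive for opening a brand-new component.
        pred_var = sigma**2 + prior_sigma**2
        log_new = (np.log(alpha)
                   - 0.5 * x[i] ** 2 / pred_var
                   - 0.5 * np.log(pred_var / sigma**2))
        logits = np.append(log_p, log_new)
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(mus):
            # New component chosen: instantiate its mean from the posterior
            # given the single point x[i] (moving it to the finite measure).
            post_var = 1.0 / (1.0 / prior_sigma**2 + 1.0 / sigma**2)
            post_mean = post_var * x[i] / sigma**2
            mus = np.append(mus, rng.normal(post_mean, np.sqrt(post_var)))
        z[i] = k
    return z, mus
```

A production version would also resample the instantiated means each sweep and prune emptied components; the point here is only that the two parts of the latent measure can be updated by different samplers within one valid Markov chain.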
