
Seminars

Convergence Rates for Posterior Distributions

  • 2000-12-28 (Thu.), 10:30 AM
  • Recreation Hall, 2F, Institute of Statistical Science
  • Prof. Tzee-Ming Huang
  • Department of Statistics, Carnegie Mellon University

Abstract

The main goal of this thesis is to provide general theorems on convergence rates of posterior distributions that can be applied to density estimation and nonparametric regression. There have been other results on convergence rates of posterior distributions, but what is new in this thesis is adaptive estimation. The idea of adaptive estimation can be illustrated with an example: suppose the true density function or regression function lies in a Sobolev space with smoothness parameter s; then the optimal convergence rate for this space in the minimax sense is known to be n^(-2s/(1+2s)). Previous results by others include examples showing how to choose priors according to s in order to achieve this optimal convergence rate. However, when s is unknown, we would like a prior that does not depend on s, yet whose posterior distribution still achieves the optimal convergence rate simultaneously for all s. In such a case, we say that the posterior distribution is adaptive. In this thesis, examples are given to show how adaptive estimation can be achieved using the theorems provided here.
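For readers who want the notions in the abstract written out, the following is a rough sketch of the standard definitions in the posterior contraction literature; the symbols (f_0 for the true function, d for the distance, Pi for the prior, epsilon_n for the rate, M for the constant) are illustrative and not taken from the thesis itself.

  % Minimax rate over a Sobolev class with smoothness s (as quoted in the abstract)
  \varepsilon_{n,s} = n^{-2s/(1+2s)}

  % Posterior contraction at rate \varepsilon_n: for a large constant M,
  % the posterior mass outside a ball of radius M \varepsilon_n around the truth vanishes
  \Pi\!\left( f : d(f, f_0) \ge M \varepsilon_{n} \,\middle|\, X_1, \dots, X_n \right) \to 0

  % Adaptivity: a single prior \Pi, chosen without knowledge of s, for which the
  % contraction above holds with \varepsilon_n = \varepsilon_{n,s} simultaneously for all s.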
