
Seminars

Approximate Bayesian Inference of Variance Components in Generalized Linear Mixed Models

  • 2006-06-26 (Mon.), 10:30 AM
  • Recreation Hall, 2F, Institute of Statistical Science
  • Prof. Miao-Yu Tsai   
  • Graduate Institute of Statistics and Information Science, National Changhua University of Education

Abstract

Generalized linear mixed models (GLMMs) are a common class of models for correlated data. They offer many advantages, including the flexibility to account for both fixed and random effects and the ability to draw inferences about complex covariance structures. Inference on the variance components, however, can be difficult. The Bayes factor is often used in Bayesian tests of variance components, but it is sensitive to the choice of prior distributions, and its calculation is complicated by the dimensionality of the covariance matrix and the complexity of the likelihood function. In this talk, I will consider a fully Bayesian approach to the random-effects covariance matrix and discuss the relation between the Bayesian estimates and the restricted maximum likelihood (REML) estimate, showing that the posterior modes are asymptotically equivalent to the REML estimate. I will then focus on Bayesian tests in GLMMs under two reference priors, the approximate uniform shrinkage prior and the approximate Jeffreys' prior, and consider a simulated version of Laplace's method combined with importance sampling to evaluate the approximate Bayes factor under GLMMs. Finally, simulation studies and two clinical trials are presented for illustration, and all results based on our methods are compared with the frequentist approach.
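For readers unfamiliar with the computational idea behind the talk, below is a minimal, hypothetical Python sketch (not the speaker's code) of how Laplace's method, refined by importance sampling, can approximate a marginal likelihood and hence a Bayes factor. A toy normal-mean model stands in for the GLMM likelihood, and all variable and function names are illustrative assumptions.

import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)
y = rng.normal(loc=1.0, scale=1.0, size=50)           # toy data set

def log_lik(theta):                                    # log f(y | theta)
    return np.sum(stats.norm.logpdf(y, loc=theta, scale=1.0))

def log_prior(theta):                                  # pi(theta): N(0, 10^2)
    return stats.norm.logpdf(theta, loc=0.0, scale=10.0)

def neg_log_post(theta):                               # -log{ f(y|theta) pi(theta) }
    return -(log_lik(theta) + log_prior(theta))

# 1. Laplace's method: locate the posterior mode and the curvature there.
opt = optimize.minimize_scalar(neg_log_post)
mode = opt.x
eps = 1e-4                                             # finite-difference Hessian
hess = (neg_log_post(mode + eps) - 2 * neg_log_post(mode)
        + neg_log_post(mode - eps)) / eps ** 2

# log m(y) ~= (d/2) log(2*pi) - (1/2) log|H| + log f(y|mode) + log pi(mode), with d = 1
log_m_laplace = (0.5 * np.log(2 * np.pi) - 0.5 * np.log(hess)
                 + log_lik(mode) + log_prior(mode))

# 2. "Simulated" refinement: importance sampling with the Laplace Gaussian
#    N(mode, H^{-1}) as proposal; m(y) ~= average of f(y|t) pi(t) / q(t).
prop_sd = hess ** -0.5
draws = rng.normal(mode, prop_sd, size=5000)
log_w = (np.array([log_lik(t) + log_prior(t) for t in draws])
         - stats.norm.logpdf(draws, loc=mode, scale=prop_sd))
log_m_is = np.log(np.mean(np.exp(log_w - log_w.max()))) + log_w.max()

print(f"log m(y), Laplace:             {log_m_laplace:.3f}")
print(f"log m(y), importance sampling: {log_m_is:.3f}")

In a GLMM the same recipe would be applied with log_lik replaced by the likelihood with the random effects integrated out and theta by the variance components; the Bayes factor for testing a variance component is then the ratio of the two competing models' approximated marginal likelihoods.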
