
Postdoctoral Talk Announcement


Model Averaging for High-dimensional Linear Regression Models with Dependent Observations

  • 2022-07-13 (Wed.), 14:00
  • Auditorium B1, Institute of Statistical Science; tea reception at 3:00 PM.
  • The talk will be held in person and streamed online simultaneously.
  • Mr. Ting-Hung Yu (余定宏先生)
  • Department of Statistics and Actuarial Science, University of Iowa, U.S.A.

Abstract

Averaging several informative models to make a better prediction has been a long-standing research area in statistics, yet there are only a few results in the high-dimensional statistics literature. In this talk, we propose a two-stage procedure, named OGA+HDMMA, that performs model averaging to improve prediction efficiency in high-dimensional linear regression models with dependent observations. In the first stage, we use the orthogonal greedy algorithm (OGA) to screen out nested sets of signal variables from the high-dimensional data and construct the corresponding nested high-dimensional linear regression models. In the second stage, we devise the high-dimensional Mallows model averaging (HDMMA) criterion to determine the weights for averaging the nested models found in the first stage. We further analyze the rates of convergence of the prediction error of the averaged model under different sparsity conditions. Our contribution is threefold. First, we show that our procedure achieves the optimal convergence rates of prediction error discussed in Ing (AoS, 2020). Second, simulation results show that the out-of-sample prediction of OGA+HDMMA performs more favorably than the MCV method proposed in Ando and Li (JASA, 2014), especially when the covariates are highly correlated or exhibit time-series effects. Third, in finite samples, the out-of-sample prediction of OGA+HDMMA performs comparably to, or even better than, many well-known high-dimensional variable selection methods in some scenarios.
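To make the two-stage idea concrete, below is a minimal Python sketch of the procedure described in the abstract: an OGA pass that produces a nested sequence of fitted models, followed by a Mallows-type criterion that weights them. This is not the speakers' implementation; the noise-variance input `sigma2`, the penalty `2 * sigma2 * model_size`, and the simplex-constrained solver are illustrative assumptions based on the standard Mallows model averaging setup (Hansen, 2007), not details taken from the talk.

```python
import numpy as np
from scipy.optimize import minimize

def oga_path(X, y, max_steps):
    """Orthogonal greedy algorithm: at each step pick the covariate most
    correlated with the current residual, then refit OLS on the selected
    set, yielding a nested sequence of fitted models."""
    selected, fits = [], []
    resid = y.copy()
    for _ in range(max_steps):
        corr = np.abs(X.T @ resid)
        corr[selected] = -np.inf          # do not re-select a variable
        selected.append(int(np.argmax(corr)))
        Xs = X[:, selected]
        beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        yhat = Xs @ beta
        resid = y - yhat
        fits.append(yhat)
    return selected, fits

def mallows_weights(y, fits, sigma2):
    """Mallows-type weights for the nested fits: minimize
    ||y - F w||^2 + 2*sigma2*sum_k w_k*size_k over the simplex.
    sigma2 is assumed known here; in practice it is estimated."""
    F = np.column_stack(fits)             # n x K matrix of fitted values
    sizes = np.arange(1, F.shape[1] + 1)  # nested models: size k at step k
    crit = lambda w: np.sum((y - F @ w) ** 2) + 2.0 * sigma2 * (sizes @ w)
    K = F.shape[1]
    res = minimize(crit, np.ones(K) / K,
                   bounds=[(0.0, 1.0)] * K,
                   constraints={"type": "eq", "fun": lambda w: w.sum() - 1.0})
    return res.x
```

In a high-dimensional setting (p > n), `oga_path` plays the screening role and `mallows_weights` the averaging role; the averaged prediction is `np.column_stack(fits) @ w`.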

Please click the link to join the online broadcast.

Attachments

1110713 余定宏先生.pdf
Last updated: 2022-07-15 15:44