
Seminar Announcement


[Video Broadcast in the Lounge] Exploring the Second Order Sparsity in Large Scale Optimization

  • 2018-08-30 (Thu.), 16:00
  • 2F Lounge, Institute of Statistical Science, Academia Sinica (video broadcast)
  • Dr. Xu-Dong Li
  • Princeton University, U.S.A.

Abstract

In this talk, we shall demonstrate how the second order sparsity (SOS) in important optimization problems, such as sparse optimization models, semidefinite programming, and many others, can be exploited to design efficient algorithms. The SOS property allows us to incorporate semismooth Newton methods into the augmented Lagrangian method framework in such a way that the subproblems involved require only low to moderate costs; for example, for lasso problems with sparse solutions, the cost of solving the subproblems at each iteration of our second order method is comparable to, or even lower than, that of many first order methods. Consequently, with the fast convergence rate in hand, usually asymptotically superlinear, we are now able to solve many challenging large scale convex optimization problems efficiently and robustly. For the purpose of illustration, we present a highly efficient software called LassoNAL for solving the well-known Lasso-type problems.
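To illustrate the idea behind the abstract, the following is a minimal sketch (not the speaker's actual implementation) of why solution sparsity makes the semismooth Newton subproblem cheap for lasso. In a common formulation, the generalized Jacobian of the soft-thresholding prox is a 0/1 diagonal matrix, so the Newton system reduces to `I + sigma * A_J @ A_J.T`, where `J` is the small active set; all names and parameter values here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(u, t):
    """Proximal map of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

rng = np.random.default_rng(0)
m, n, sigma, lam = 40, 1000, 1.0, 0.3   # assumed toy dimensions
A = rng.standard_normal((m, n))
u = 0.1 * rng.standard_normal(n)
u[:8] += 3.0                            # a few coordinates well above the threshold

# The generalized Jacobian of the prox is diagonal with 0/1 entries;
# the active set J collects the coordinates that survive thresholding.
J = np.abs(u) > lam * sigma
A_J = A[:, J]

# Semismooth Newton matrix: I + sigma * A_J A_J^T. It is m x m, but its
# assembly cost scales with |J| rather than n -- the "second order
# sparsity" that keeps each Newton step inexpensive.
H = np.eye(m) + sigma * A_J @ A_J.T
rhs = rng.standard_normal(m)
d = np.linalg.solve(H, rhs)

print(int(J.sum()))                      # size of the active set (small vs n)
print(np.linalg.norm(H @ d - rhs))       # solve residual
```

Since `|J|` is typically far smaller than `n` at a sparse solution, forming `A_J A_J^T` costs `O(m^2 |J|)` instead of `O(m^2 n)`, which is the source of the per-iteration cost advantage claimed in the abstract.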
