
Seminars

Exploring the Second Order Sparsity in Large Scale Optimization

  • 2018-08-30 (Thu.), 16:00
  • Recreation Hall, 2F, Institute of Statistical Science
  • Dr. Xu-Dong Li
  • Princeton University, U.S.A.

Abstract

In this talk, we shall demonstrate how the second order sparsity (SOS) in important optimization problems, such as sparse optimization models, semidefinite programming, and many others, can be explored to design efficient algorithms. The SOS property allows us to incorporate semismooth Newton methods into the augmented Lagrangian method framework in such a way that the subproblems involved require only low to medium computational costs; for example, for lasso problems with sparse solutions, the cost of solving the subproblems at each iteration of our second order method is comparable to, or even lower than, that of many first order methods. Consequently, with the fast, usually asymptotically superlinear, convergence rate in hand, we are now able to solve many challenging large scale convex optimization problems efficiently and robustly. For the purpose of illustration, we present a highly efficient solver called LassoNAL for the well-known Lasso-type problems.
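To make the SOS idea concrete for the lasso case mentioned above: the proximal map of the l1 norm is piecewise linear, and a generalized Jacobian of it is a 0/1 diagonal matrix with few nonzero entries when the iterate is sparse, so the semismooth Newton systems effectively involve only the active columns of the data matrix. The following is a minimal sketch in Python/NumPy of this observation; the function names (prox_l1, prox_l1_jacobian_diag) and the toy data are illustrative assumptions, not the LassoNAL API.

```python
import numpy as np

def prox_l1(v, t):
    """Proximal map of t*||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_l1_jacobian_diag(v, t):
    """Diagonal of a generalized Jacobian of prox_l1 at v:
    1 on coordinates with |v_i| > t, 0 elsewhere."""
    return (np.abs(v) > t).astype(float)

# Toy lasso instance: min_x 0.5*||A x - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
m, n = 50, 500
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[:5] = rng.standard_normal(5)   # sparse ground truth
b = A @ x_true
lam = 0.1 * np.max(np.abs(A.T @ b))

# At a point whose prox output is sparse, the generalized Jacobian
# D of prox_l1 is a 0/1 diagonal with few ones, so Newton-type
# systems of the form I + sigma * A D A^T touch only the columns
# of A where D is 1. That is the "second order sparsity".
v = A.T @ b / m
d = prox_l1_jacobian_diag(v, lam)
print("active coordinates:", int(d.sum()), "of", n)
```

With only a handful of active coordinates, each Newton system is far smaller than the ambient dimension n, which is why, under this kind of sparsity, the per-iteration cost of the second order method can match or beat that of first order methods.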
