
Seminars

Smooth Support Vector Machines for Classification and Regression

  • 2002-08-26 (Mon.), 10:30 AM
  • Recreation Hall, 2F, Institute of Statistical Science
  • Prof. Yuh-Jye Lee
  • Department of Computer Science and Information Engineering, National Taiwan University

Abstract

Support Vector Machines (SVMs), a new generation of learning systems grounded in statistical learning theory, have seen significant advances in both theoretical understanding and algorithmic implementation over the last decade. In this talk, we employ a smoothing technique to reformulate the support vector machine for classification and regression as an unconstrained minimization problem. We call this reformulation the Smooth Support Vector Machine (SSVM). We propose a fast Newton-Armijo algorithm for solving SSVMs that converges globally and quadratically to the unique solution of the unconstrained minimization problem. To handle nonlinear classification and regression problems on massive datasets, we introduce a reduced kernel technique that speeds up computation and reduces memory usage. Numerical results and comparisons are given to demonstrate the effectiveness and speed of SSVMs.
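As a rough illustration of the ideas in the abstract (not the speaker's own code), the sketch below uses the standard SSVM smoothing of the plus function, p(x, a) = x + (1/a) log(1 + exp(-a x)), to build a twice-differentiable unconstrained objective for linear classification, and minimizes it with a plain Newton iteration plus Armijo backtracking. The parameter values, variable names, and toy data are assumptions made only for this example.

```python
import numpy as np

def smooth_plus(x, alpha):
    # p(x, alpha) = x + (1/alpha)*log(1 + exp(-alpha*x)): smooth surrogate for max(x, 0)
    return x + np.logaddexp(0.0, -alpha * x) / alpha

def ssvm_objective(z, A, d, nu, alpha):
    # z = (w, gamma); objective: (nu/2)*||p(1 - d*(Aw - gamma), alpha)||^2 + (1/2)*||z||^2
    w, gamma = z[:-1], z[-1]
    r = 1.0 - d * (A @ w - gamma)
    p = smooth_plus(r, alpha)
    return 0.5 * nu * p @ p + 0.5 * z @ z

def ssvm_grad_hess(z, A, d, nu, alpha):
    # Gradient and Hessian of the smooth objective with respect to (w, gamma)
    w, gamma = z[:-1], z[-1]
    r = 1.0 - d * (A @ w - gamma)
    p = smooth_plus(r, alpha)
    sig = 1.0 / (1.0 + np.exp(-alpha * r))           # dp/dr
    J = np.column_stack([-d[:, None] * A, d])        # Jacobian of r w.r.t. (w, gamma)
    grad = nu * (J.T @ (p * sig)) + z
    curv = sig ** 2 + alpha * p * sig * (1.0 - sig)  # d(p*sig)/dr
    hess = nu * (J.T * curv) @ J + np.eye(len(z))
    return grad, hess

def newton_armijo(A, d, nu=1.0, alpha=5.0, tol=1e-8, max_iter=50):
    z = np.zeros(A.shape[1] + 1)                     # start from (w, gamma) = 0
    for _ in range(max_iter):
        g, H = ssvm_grad_hess(z, A, d, nu, alpha)
        if np.linalg.norm(g) < tol:
            break
        step = np.linalg.solve(H, -g)                # Newton direction
        lam, f0 = 1.0, ssvm_objective(z, A, d, nu, alpha)
        # Armijo backtracking: accept the first step giving sufficient decrease
        while ssvm_objective(z + lam * step, A, d, nu, alpha) > f0 + 1e-4 * lam * (g @ step):
            lam *= 0.5
        z = z + lam * step
    return z[:-1], z[-1]

# Toy usage on separable 2-D data with labels +/-1
rng = np.random.default_rng(0)
A = np.vstack([rng.normal(2, 1, (50, 2)), rng.normal(-2, 1, (50, 2))])
d = np.concatenate([np.ones(50), -np.ones(50)])
w, gamma = newton_armijo(A, d)
print("training accuracy:", np.mean(np.sign(A @ w - gamma) == d))
```

Because the smoothed objective is strongly convex and twice differentiable, the Newton direction is always a descent direction here, which is what makes the global and quadratic convergence claimed in the abstract plausible; the nonlinear and reduced-kernel variants discussed in the talk follow the same pattern with a kernel (or reduced kernel) matrix in place of A.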
