
Seminars

Reduced Kernel on Support Vector Machines

  • 2005-05-09 (Mon.), 10:30 AM
  • Recreation Hall, 2F, Institute of Statistical Science
  • Prof. Yuh-Jye Lee
  • Department of Computer Science and Information Engineering, National Taiwan University of Science and Technology

Abstract

The reduced support vector machine (RSVM) was proposed to overcome the computational difficulties of generating a nonlinear separating surface for a massive data set, as well as to reduce the model complexity. It has also been successfully applied to other kernel-based learning algorithms, and experimental studies have demonstrated its efficiency. In this talk, we first study the RSVM from the viewpoint of robust design in model building, regarding the nonlinear separating surface as a mixture of kernels. The RSVM uses a compressed model representation instead of a saturated full model. Our main result shows that uniform random selection of a reduced set to form the compressed model in RSVM is the optimal robust selection scheme in terms of the following criteria: (1) it minimizes an intrinsic model variation measure; (2) it minimizes the maximal model bias between the compressed model and the full model; (3) it maximizes the minimal test power in distinguishing the compressed model from the full model.

In the second part of the talk, we propose a new algorithm, the Incremental Reduced Support Vector Machine (IRSVM). In contrast to the uniform random selection used in RSVM, IRSVM begins with an extremely small reduced set and incrementally expands it according to an information criterion. This information-criterion-based incremental selection can be achieved by solving a series of small least squares problems. In our approach, the size of the reduced set is determined automatically and dynamically rather than pre-specified. Experimental tests on four publicly available datasets from the University of California, Irvine (UCI) repository show that IRSVM uses a smaller reduced set than RSVM without sacrificing classification accuracy.
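The reduced-kernel idea in the first part of the abstract can be sketched roughly as follows. This is a minimal illustration, not the speaker's implementation: the toy data, RBF kernel width, reduced-set size `m`, and the regularized least-squares fit (a simplification standing in for the smooth-SVM objective of the RSVM literature) are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data: two well-separated Gaussian blobs, labels -1 / +1.
n = 200
X = np.vstack([rng.normal(-2, 1, (n // 2, 2)), rng.normal(2, 1, (n // 2, 2))])
y = np.hstack([-np.ones(n // 2), np.ones(n // 2)])

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian kernel matrix K[i, j] = exp(-gamma * ||A_i - B_j||^2)."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# RSVM idea: draw a small reduced set uniformly at random and build the
# rectangular kernel K(X, X_reduced) instead of the full n-by-n kernel.
m = 10  # reduced-set size (hypothetical choice)
idx = rng.choice(n, size=m, replace=False)
K_red = rbf_kernel(X, X[idx])          # shape (n, m): the compressed model

# Fit the compressed model by regularized least squares.
lam = 1e-2
A = np.hstack([K_red, np.ones((n, 1))])  # append a bias column
w = np.linalg.solve(A.T @ A + lam * np.eye(m + 1), A.T @ y)

pred = np.sign(A @ w)
accuracy = (pred == y).mean()
```

The point of the uniform random draw, per the abstract's result, is that it is not merely cheap: among selection schemes for the reduced set it is optimal in terms of model variation, worst-case bias against the full model, and test power.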
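The incremental selection of IRSVM can likewise be sketched. The greedy residual-reduction rule and tolerance below are hypothetical stand-ins for the information criterion mentioned in the abstract; each candidate evaluation is a small least squares problem, and the loop stops on its own, so the reduced-set size is not pre-specified.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy two-class data, as in a typical small demonstration.
n = 200
X = np.vstack([rng.normal(-2, 1, (n // 2, 2)), rng.normal(2, 1, (n // 2, 2))])
y = np.hstack([-np.ones(n // 2), np.ones(n // 2)])

def rbf_kernel(A, B, gamma=1.0):
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

K_full = rbf_kernel(X, X)  # candidate kernel columns, one per training point

def ls_fit(cols, lam=1e-2):
    """Regularized least-squares fit on the chosen kernel columns.
    Returns (residual norm, weight vector) -- one 'small LS problem'."""
    A = K_full[:, cols]
    w = np.linalg.solve(A.T @ A + lam * np.eye(len(cols)), A.T @ y)
    return np.linalg.norm(y - A @ w), w

# Start from an extremely small reduced set and grow it greedily while the
# best new column still lowers the residual by more than a tolerance
# (a stand-in for the information criterion).
selected = [0]
resid, _ = ls_fit(selected)
tol = 1e-3
candidates = rng.choice(n, size=30, replace=False)  # cheap candidate pool
improved = True
while improved:
    improved = False
    best_gain, best_j = 0.0, None
    for j in candidates:
        if j in selected:
            continue
        r, _ = ls_fit(selected + [int(j)])
        if resid - r > best_gain:
            best_gain, best_j = resid - r, int(j)
    if best_j is not None and best_gain > tol:
        selected.append(best_j)
        resid -= best_gain
        improved = True

reduced_set_size = len(selected)  # determined automatically, not pre-specified
_, w = ls_fit(selected)
pred = np.sign(K_full[:, selected] @ w)
accuracy = (pred == y).mean()
```

The contrast with the first part is that the reduced set here is chosen adaptively, column by column, instead of by a single uniform random draw.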
