Abstract
Least Squares Support Vector Machines (LSSVM) perform classification using an L2-norm penalty on the weight vector and a squared loss function with linear constraints. The major advantage over the classical L2-norm support vector machine (SVM) is that training reduces to solving a system of linear equations rather than a quadratic programming problem. The L1-norm penalty on the weight vector is known to robustly select features. The zero-norm, i.e. the number of non-zero elements in a vector, is an ideal quantity for feature selection, but direct L0-norm minimization is computationally intractable. However, a convex relaxation of direct zero-norm minimization was proposed recently. In this paper, we propose a combination of the L2-norm penalty and the convex relaxation of the L0-norm penalty for feature selection in classification problems. We propose a primal-dual framework for feature selection using the combined L2-norm and L0-norm penalty, resulting in a closed-form solution. A series of experiments on microarray data and UCI data demonstrates that the proposed method achieves better performance.
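The abstract's key computational point is that LSSVM training amounts to solving one linear system instead of a QP. A minimal sketch of the standard LSSVM dual system (linear kernel; the parameter name `gamma` for the regularization trade-off is an assumption, not taken from the abstract) might look like:

```python
import numpy as np

def lssvm_train(X, y, gamma=1.0):
    """Sketch of LSSVM training: solve the linear system
        [ 0   y^T          ] [b    ]   [0]
        [ y   Omega + I/g  ] [alpha] = [1]
    where Omega_ij = y_i * y_j * K(x_i, x_j), here with a linear kernel.
    No QP solver is needed -- just a dense linear solve."""
    n = X.shape[0]
    K = X @ X.T                                   # linear kernel matrix
    Omega = (y[:, None] * y[None, :]) * K
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y                                  # top row: y^T
    A[1:, 0] = y                                  # left column: y
    A[1:, 1:] = Omega + np.eye(n) / gamma         # regularized kernel block
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)                 # single linear solve
    return sol[1:], sol[0]                        # alpha, bias b

def lssvm_predict(X_train, y_train, alpha, b, X_new):
    """Decision rule: sign(sum_i alpha_i y_i K(x_i, x) + b)."""
    return np.sign((alpha * y_train) @ (X_train @ X_new.T) + b)

# Toy usage on a linearly separable problem.
X = np.array([[1.0, 1.0], [2.0, 2.0], [-1.0, -1.0], [-2.0, -2.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
alpha, b = lssvm_train(X, y, gamma=10.0)
preds = lssvm_predict(X, y, alpha, b, X)
```

This is only an illustration of the plain LSSVM baseline; the paper's contribution, the combined L2-norm/L0-norm penalty with its primal-dual closed-form solution, is not reproduced here.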
Original language | English |
---|---|
Number of pages | 4 |
Publication status | Published - 2013 |
Event | The 19th International Conference on Management of Data (COMAD), Ahmedabad, India; 19 Dec 2013 → 21 Dec 2013 |