Primal-Dual Framework for Feature Selection using Least Squares Support Vector Machines

Raghvendra Mall, Johan A.K. Suykens, Mohammed El Anbari, Halima Bensmail

Research output: Contribution to conference › Paper › peer-review

Abstract

Least Squares Support Vector Machines (LSSVM) perform classification using an L2-norm penalty on the weight vector and a squared loss function with linear constraints. Their major advantage over the classical L2-norm support vector machine (SVM) is that training reduces to solving a system of linear equations rather than a quadratic programming problem. The L2-norm penalty on the weight vectors is known to robustly select features. The zero-norm, i.e., the number of non-zero elements in a vector, is an ideal quantity for feature selection, but direct L0-norm minimization is a computationally intractable problem. However, a convex relaxation of the direct zero-norm minimization was proposed recently. In this paper, we propose a combination of the L2-norm penalty and the convex relaxation of the L0-norm penalty for feature selection in classification problems. We propose a primal-dual framework for feature selection using this combined L2-norm and L0-norm penalty, resulting in a closed-form solution. A series of experiments on microarray data and UCI data demonstrates that our proposed method results in better performance.
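The abstract's key computational point is that LSSVM training amounts to solving one linear system (the KKT conditions of the least-squares primal problem) instead of a quadratic program. The following is a minimal NumPy sketch of plain linear-kernel LSSVM training, illustrating that linear system only; it does not implement the paper's primal-dual L2/L0 feature-selection framework, and the function names and the regularization parameter `gamma` are illustrative choices, not taken from the paper.

```python
import numpy as np

def lssvm_train(X, y, gamma=1.0):
    """Train a linear-kernel LSSVM by solving its KKT linear system.

    Illustrative sketch only (not the paper's method). The dual
    solution [b; alpha] satisfies the block linear system
        [ 0   y^T              ] [ b     ]   [ 0 ]
        [ y   Omega + I/gamma  ] [ alpha ] = [ 1 ],
    where Omega_ij = y_i * y_j * <x_i, x_j> for a linear kernel.
    """
    n = X.shape[0]
    K = X @ X.T                                  # linear-kernel Gram matrix
    Omega = (y[:, None] * y[None, :]) * K        # label-weighted kernel
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma        # ridge term from squared loss
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)                # one linear solve, no QP
    b, alpha = sol[0], sol[1:]
    w = X.T @ (alpha * y)                        # recover primal weights
    return w, b

def lssvm_predict(X, w, b):
    return np.sign(X @ w + b)
```

In the paper's setting, the L2 penalty above would be combined with a convex L0 surrogate on the weights so that uninformative features are driven to zero, while training retains a closed-form (linear-system) solution.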
Original language: English
Number of pages: 4
Publication status: Published - 2013
Event: The 19th International Conference on Management of Data (COMAD) - Ahmedabad, India
Duration: 19 Dec 2013 - 21 Dec 2013


