A Least Square Kernel Machine With Box Constraints
Jayanta Basak
JPRR Vol 5, No 1 (2010); doi:10.13176/11.181 
Abstract
The principle of parsimony (Occam's razor) is a key principle in pattern classification: the unnecessary complexity of a classifier is curbed to improve generalization performance. In decision tree construction, complexity is often regulated by stopping the growth of the tree early at a node whenever the impurity at that node falls below a threshold. In this paper, we generalize this heuristic and express the principle in terms of constraining the outcome of a classifier, rather than explicitly regularizing the model complexity in terms of the model parameters. Using this heuristic, we construct a classifier, namely a least-squares kernel machine with box constraints (LSKMBC). In our approach, we consider uniform priors and obtain the loss functional for a given margin, which is treated as a model selection parameter. The framework not only differs from existing least-squares kernel machines, but also does not require the Mercer condition to be satisfied. We also discuss the relationship of the proposed kernel machine to several other existing kernel machines. We experimentally validate the performance of the classifier on real-life datasets and observe that LSKMBC performs competitively, in some cases even outperforming SVM.
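The abstract describes, but does not spell out, the loss: a least-squares objective in which the classifier's output is constrained through a margin parameter instead of penalizing the model weights. The sketch below illustrates one plausible reading of that idea, a margin-truncated (squared-hinge-style) least-squares loss on a kernel expansion, trained by gradient descent. It is a hypothetical illustration under that assumption, not the paper's exact algorithm; the function names and all hyperparameter values (fit_lskmbc_sketch, margin, gamma, lr, epochs) are ours, not the authors'.

import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian RBF kernel matrix between rows of A and rows of B."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def fit_lskmbc_sketch(X, y, margin=1.0, gamma=0.5, lr=0.05, epochs=1000):
    """Sketch of a margin-truncated least-squares kernel classifier.
    Samples whose score y_i * f(x_i) already exceeds `margin` incur no
    loss; the rest incur a squared error pulling the score toward the
    margin.  Hyperparameter values are illustrative, not from the paper."""
    K = rbf_kernel(X, X, gamma)          # f(x_j) = sum_i alpha_i k(x_i, x_j)
    alpha = np.zeros(X.shape[0])
    for _ in range(epochs):
        scores = K @ alpha
        # Gradient is nonzero only for samples inside the margin box.
        residual = np.where(y * scores < margin, margin * y - scores, 0.0)
        alpha += lr * (K @ residual) / X.shape[0]    # K is symmetric
    return alpha

def predict_sketch(alpha, X_train, X_new, gamma=0.5):
    """Sign of the kernel expansion evaluated at new points."""
    return np.sign(rbf_kernel(X_new, X_train, gamma) @ alpha)

if __name__ == "__main__":
    # Toy usage: two Gaussian blobs with labels in {-1, +1}.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1, 0.5, (50, 2)), rng.normal(1, 0.5, (50, 2))])
    y = np.hstack([-np.ones(50), np.ones(50)])
    alpha = fit_lskmbc_sketch(X, y)
    print((predict_sketch(alpha, X, X) == y).mean())  # training accuracy

Note that the truncation plays the role the abstract assigns to the box constraint: once an output reaches the margin it is left alone, so complexity is limited through the classifier's outcome rather than through an explicit penalty on the parameters.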