A Least Square Kernel Machine With Box Constraints
Jayanta Basak
Abstract
The principle of parsimony (Occam’s razor) is central to pattern classification: the unnecessary complexity of a classifier is regulated to improve its generalization performance. In decision tree construction, the complexity is often regulated by stopping the growth of the tree early at a node whenever the impurity at that node falls below a threshold. In this paper, we generalize this heuristic and express the principle in terms of constraining the outcome of a classifier, instead of explicitly regularizing the model complexity in terms of the model parameters. Using this heuristic, we construct a classifier, namely a least square kernel machine with box constraints (LSKMBC). In our approach, we assume uniform priors and obtain the loss functional for a given margin, where the margin is treated as a model selection parameter. The framework not only differs from the existing least-square kernel machines, but also does not require the kernel to satisfy the Mercer condition. We also discuss the relationship of the proposed kernel machine with several other existing kernel machines. We experimentally validate the performance of the classifier on real-life datasets and observe that LSKMBC performs competitively, in certain cases producing results even better than SVM.