Capped K-NN Editing in Definition-Lacking Environments
Andrew Stranieri, Andrew Yatsko, Isaac Golden, Musa Mammadov, Adil Bagirov
Abstract
Although any input may contribute noise, imprecise specification of the class labels in data subdivided into classes is identified as a particularly common source. Such misrepresentation may be inherent in the data or may arise from forcing a regression problem into a classification form. Examples of this nature are considered, and an alternative is proposed. The main part of the approach builds on the well-known k-NN technique for treating noisy data. The paper advances an editing technique designed around the idea of a variable number of authenticating instances. Test runs on publicly available and proprietary data demonstrate the high retention ability of the new procedure without loss of classification accuracy. Noise reduction methods in the broader classification context are also extensively surveyed.
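For orientation, the sketch below illustrates k-NN-based editing with a capped number of authenticating neighbours, in the spirit of the procedure the abstract describes. The function name `capped_knn_edit`, the parameter `cap`, and the exact retention rule are illustrative assumptions and do not reproduce the paper's procedure.

```python
import numpy as np

def capped_knn_edit(X, y, k=5, cap=3):
    """Edit a labelled data set by removing instances whose class is not
    authenticated by enough of their k nearest neighbours.

    An instance is retained if at least `cap` of its k nearest neighbours
    carry the same label. This criterion is an assumption made for
    illustration only.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    n = len(X)
    keep = np.ones(n, dtype=bool)

    # Pairwise Euclidean distances; an instance is never its own neighbour.
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(dists, np.inf)

    for i in range(n):
        neighbours = np.argsort(dists[i])[:k]
        agreeing = np.sum(y[neighbours] == y[i])
        # Keep the instance only if enough neighbours authenticate its label.
        keep[i] = agreeing >= cap

    return X[keep], y[keep]
```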