Ensemble of a subset of kNN classifiers

Research output: Contribution to journal › Article

Original language: English
Pages (from-to): 827-840
Number of pages: 14
Journal: Advances in Data Analysis and Classification
Volume: 12
Issue number: 4
Early online date: 22 Jan 2016
DOIs
Date accepted/in press: 10 Dec 2015
Date e-pub ahead of print: 22 Jan 2016
Date published (current): Dec 2018

Abstract

Combining multiple classifiers, known as ensemble methods, can give substantial improvements in the prediction performance of learning algorithms, especially in the presence of non-informative features in the data sets. We propose an ensemble of a subset of kNN classifiers, ESkNN, for classification tasks in two steps. First, we select classifiers based upon their individual out-of-sample accuracy. The selected classifiers are then combined sequentially, starting from the best model, and assessed for collective performance on a validation data set. We use benchmark data sets, with their original and some added non-informative features, to evaluate our method. The results are compared with the usual kNN, bagged kNN, random kNN, the multiple feature subset method, random forests and support vector machines. Our experimental comparisons on benchmark classification problems and simulated data sets reveal that the proposed ensemble gives better classification performance than the usual kNN and its ensembles, and performs comparably to random forests and support vector machines.
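The two-step procedure described in the abstract can be sketched in code. The following is a minimal, hypothetical illustration, not the authors' implementation: it assumes the base kNN learners are trained on random feature subsets (a common choice for diversifying kNN ensembles, e.g. in random kNN), ranks them by held-out ("out-of-sample") accuracy, and then adds them to the ensemble greedily from best to worst, keeping a candidate only if it improves majority-vote accuracy on a separate validation set. All data-split sizes, the number of base models, the subset size and k are illustrative choices.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Synthetic binary problem with many non-informative features,
# mimicking the evaluation setting described in the abstract.
X, y = make_classification(n_samples=600, n_features=20,
                           n_informative=5, random_state=0)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.5,
                                                    random_state=0)
X_oos, X_val, y_oos, y_val = train_test_split(X_rest, y_rest, test_size=0.5,
                                              random_state=0)

# Step 1: train kNN models on random feature subsets and score each one
# on the out-of-sample split (assumed mechanism for generating diversity).
models = []
for _ in range(50):
    feats = rng.choice(X.shape[1], size=8, replace=False)
    knn = KNeighborsClassifier(n_neighbors=5).fit(X_train[:, feats], y_train)
    acc = knn.score(X_oos[:, feats], y_oos)
    models.append((acc, feats, knn))
models.sort(key=lambda m: m[0], reverse=True)  # best individual model first

# Step 2: combine models sequentially, starting from the best, and keep a
# candidate only if majority-vote accuracy on the validation set improves.
ensemble, best_acc = [], 0.0
for acc, feats, knn in models:
    candidate = ensemble + [(feats, knn)]
    votes = np.mean([m.predict(X_val[:, f]) for f, m in candidate], axis=0)
    val_acc = np.mean((votes >= 0.5) == y_val)
    if val_acc > best_acc:
        ensemble, best_acc = candidate, val_acc

print(f"selected {len(ensemble)} of {len(models)} models, "
      f"validation accuracy {best_acc:.3f}")
```

The greedy loop is what makes this an ensemble of a *subset* of the classifiers: models that do not improve the collective validation accuracy are discarded rather than averaged in.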

    Research areas

  • Bagging, Ensemble methods, Nearest neighbour classifier, Non-informative features

Documents

  • Full-text PDF (final published version)

    Rights statement: This is the final published version of the article (version of record). It first appeared online via Springer at http://link.springer.com/article/10.1007%2Fs11634-015-0227-5. Please refer to any applicable terms of use of the publisher.

    Final published version, 844 KB, PDF document

    Licence: CC BY
