We provide a unifying perspective on two decades of work on cost-sensitive Boosting algorithms. Analyzing the literature from 1997 to 2016, we find 15 distinct cost-sensitive variants of the original algorithm, each with its own motivation and claims to superiority. So who should we believe? In this work we critique the Boosting literature using four theoretical frameworks: Bayesian decision theory, the functional gradient descent view, margin theory, and probabilistic modelling. We find that only three algorithms are fully supported, and that the probabilistic model view suggests all of them require their outputs to be calibrated for best performance. Experiments on 18 datasets across 21 degrees of imbalance support this hypothesis: once calibrated, the three algorithms perform equivalently and outperform all others. Our final recommendation, based on simplicity, flexibility and performance, is to use the original AdaBoost algorithm with a shifted decision threshold and calibrated probability estimates.
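The recommended recipe can be sketched in a few lines of scikit-learn. This is an illustrative sketch, not the authors' code: the cost values `c_fp` and `c_fn` are hypothetical, and the Bayes-optimal threshold for calibrated probabilities follows from standard decision theory (predict positive when P(y=1|x) ≥ c_fp / (c_fp + c_fn)).

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.calibration import CalibratedClassifierCV

# Hypothetical cost matrix (not from the paper): a false negative
# costs 9 times as much as a false positive.
c_fp, c_fn = 1.0, 9.0

# Bayes-optimal decision threshold on calibrated probabilities:
# predict positive when P(y=1|x) >= c_fp / (c_fp + c_fn).
threshold = c_fp / (c_fp + c_fn)  # 0.1 with the costs above

# Small imbalanced toy dataset (~10% positives).
X, y = make_classification(n_samples=500, weights=[0.9, 0.1], random_state=0)

# Plain AdaBoost wrapped in sigmoid (Platt) calibration of its scores.
clf = CalibratedClassifierCV(AdaBoostClassifier(n_estimators=50),
                             method="sigmoid", cv=3)
clf.fit(X, y)

# Shifted-threshold prediction instead of the default 0.5 cut-off.
proba = clf.predict_proba(X)[:, 1]
y_pred = (proba >= threshold).astype(int)
```

Lowering the threshold below 0.5 makes the classifier more willing to predict the costly (positive) class, which is how cost sensitivity is injected without modifying the Boosting algorithm itself.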
| Number of pages | 26 |
| Early online date | 2 Aug 2016 |
| Publication status | Published - 1 Sep 2016 |
- Class imbalance
- Classifier calibration
Professor Peter A Flach
- Intelligent Systems Laboratory
- Department of Computer Science, Professor of Artificial Intelligence