Abstract
We consider classification in the presence of class-dependent asymmetric label noise with unknown noise probabilities. In this setting, identifiability conditions are known, but additional assumptions were shown to be required for finite sample rates, and so far only the parametric rate has been obtained. Assuming these identifiability conditions, together with a measure-smoothness condition on the regression function and Tsybakov's margin condition, we show that the Robust kNN classifier of Gao et al. attains the minimax optimal rates of the noise-free setting, up to a log factor, even when trained on data with unknown asymmetric label noise. Hence, our results provide a solid theoretical backing for this empirically successful algorithm. By contrast, the standard kNN classifier is not even consistent in the setting of asymmetric label noise. A key idea in our analysis is a simple kNN-based method for estimating the maximum of a function that requires far fewer assumptions than existing mode estimators do, and which may be of independent interest for noise proportion estimation and randomised optimisation problems.
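The abstract does not spell out the estimator, but the general idea of a kNN-based estimate of the maximum of a regression function can be illustrated as in the minimal sketch below. This is only a plausible reading under stated assumptions, with hypothetical names (`knn_max_estimate`), and not the authors' exact construction: it estimates η(x) = P(Y = 1 | X = x) at each sample point by a k-nearest-neighbour label average and returns the largest such average as a plug-in estimate of the maximum.

```python
# Illustrative sketch only: kNN-based plug-in estimate of max_x eta(x),
# where eta(x) = P(Y = 1 | X = x), from (possibly label-noisy) samples.
# Names and parameter choices here are hypothetical, not from the paper.
import numpy as np
from sklearn.neighbors import NearestNeighbors


def knn_max_estimate(X, y, k=25):
    """Return the largest k-nearest-neighbour label average over the sample.

    X : (n, d) array of feature vectors.
    y : (n,) array of binary labels in {0, 1}.
    k : number of neighbours used for each local average.
    """
    nn = NearestNeighbors(n_neighbors=k).fit(X)
    _, idx = nn.kneighbors(X)          # indices of the k nearest neighbours of each sample point
    eta_hat = y[idx].mean(axis=1)      # kNN estimate of eta at each sample point
    return eta_hat.max()               # plug-in estimate of the maximum of eta


# Toy usage: eta peaks at x = 0 with value 0.9, so the estimate should be close to 0.9.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(5000, 1))
eta = 0.9 * np.exp(-X[:, 0] ** 2)
y = (rng.uniform(size=5000) < eta).astype(int)
print(knn_max_estimate(X, y, k=50))
```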
Original language | English |
---|---|
Pages (from-to) | 5401-5409 |
Number of pages | 9 |
Journal | Proceedings of Machine Learning Research |
Volume | 97 |
Publication status | Published - 15 Jun 2019 |
Event | International Conference on Machine Learning, Long Beach, United States. Duration: 9 Jun 2019 → 15 Jun 2019. Conference number: 36. https://icml.cc/Conferences/2019 |