We study the approximate nearest neighbour method for cost-sensitive classification on low-dimensional manifolds embedded within a high-dimensional feature space. We determine the minimax learning rates for distributions on a smooth manifold, in a cost-sensitive setting. This generalises a classic result of Audibert and Tsybakov. Building upon recent work of Chaudhuri and Dasgupta, we prove that these minimax rates are attained by the approximate nearest neighbour algorithm, where neighbours are computed in a randomly projected low-dimensional space. In addition, we give a bound on the number of dimensions required for the projection which depends solely upon the reach and dimension of the manifold, combined with the regularity of the marginal.
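The core idea described above can be illustrated with a minimal sketch: project high-dimensional data through a Gaussian random matrix, then classify by a k-nearest-neighbour vote in the projected space. This is an illustrative toy (the matrix construction, the choice of `k`, and the synthetic data are assumptions for demonstration), not the paper's exact algorithm or analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_projection(X, d):
    """Project rows of X into d dimensions with a scaled Gaussian random
    matrix (a standard random-projection construction; illustrative only)."""
    D = X.shape[1]
    R = rng.normal(size=(D, d)) / np.sqrt(d)
    return X @ R

def knn_predict(X_train, y_train, X_query, k=3):
    """Classify each query point by majority vote over its k nearest
    training neighbours under Euclidean distance."""
    preds = []
    for q in X_query:
        dists = np.linalg.norm(X_train - q, axis=1)
        nn = np.argsort(dists)[:k]
        preds.append(np.bincount(y_train[nn]).argmax())
    return np.array(preds)

# Toy data: a one-dimensional manifold (a line) embedded in R^50,
# with small ambient noise; the label is the sign of the manifold coordinate.
t = rng.uniform(-1, 1, size=200)
X = np.outer(t, rng.normal(size=50)) + 0.01 * rng.normal(size=(200, 50))
y = (t > 0).astype(int)

# Neighbours are computed after projecting to a much lower dimension.
Xp = random_projection(X, d=5)
preds = knn_predict(Xp[:150], y[:150], Xp[150:], k=3)
accuracy = (preds == y[150:]).mean()
```

Because the data lie near a low-dimensional manifold, a projection dimension far below the ambient dimension suffices to preserve the neighbourhood structure that the classifier relies on, which is the phenomenon the paper quantifies.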
Number of pages: 46
Journal: Proceedings of Machine Learning Research
Publication status: Published - 17 Oct 2017
Event: International Conference on Algorithmic Learning Theory (28th), Kyoto, Japan, 1–3 Oct 2017