Abstract
The minimization of the empirical risk based on an arbitrary Bregman divergence is known to provide posterior class probability estimates in classification problems, but the accuracy of the estimate for a given value of the true posterior depends on the specific choice of the divergence. Ad hoc Bregman divergences can be designed to achieve higher estimation accuracy for the posterior probability values that are most critical in a particular cost-sensitive classification scenario. Moreover, some sequences of Bregman loss functions can be constructed so that their minimization guarantees, asymptotically, a minimum number of errors in nonseparable cases and maximum-margin classifiers in separable problems. In this paper, we analyze general conditions on the Bregman generator under which this property holds, and we generalize the result to cost-sensitive classification.
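For context, a minimal worked sketch (not part of the paper) of the classical property the abstract builds on: for a strictly convex, differentiable generator, minimizing the expected Bregman loss recovers the posterior class probability. The symbols \(\phi\), \(D_\phi\), and \(f^*\) are illustrative notation, not taken from the paper.

```latex
% Bregman divergence generated by a strictly convex, differentiable \phi
D_\phi(p, q) = \phi(p) - \phi(q) - \phi'(q)\,(p - q)

% For binary labels y \in \{0, 1\}, the expected Bregman loss given x,
%   \mathbb{E}\!\left[ D_\phi(y, q) \mid x \right],
% has derivative -\phi''(q)\,(\mathbb{E}[y \mid x] - q) in q, so strict
% convexity of \phi forces the minimizer to be the conditional mean:
f^*(x) = \arg\min_{q} \mathbb{E}\!\left[ D_\phi(y, q) \mid x \right]
       = \mathbb{E}[y \mid x] = P(y = 1 \mid x)

% Familiar special cases: \phi(p) = p^2 yields the squared error, and
% \phi(p) = p \log p + (1 - p)\log(1 - p) yields the cross-entropy loss.
```

Different choices of \(\phi\) weight estimation errors differently across \([0, 1]\), which is what makes tailoring the divergence useful in the cost-sensitive setting the abstract describes.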
| Original language | English |
| --- | --- |
| Article number | 6331548 |
| Pages (from-to) | 1896-1904 |
| Number of pages | 9 |
| Journal | IEEE Transactions on Neural Networks and Learning Systems |
| Volume | 23 |
| Issue number | 12 |
| DOIs | |
| Publication status | Published - 1 Dec 2012 |
Keywords
- Bregman divergences
- cost-sensitive
- maximum margin
- posterior probability estimation