Cost-sensitive sequences of Bregman divergences

Raúl Santos-Rodriguez, Jesús Cid-Sueiro

Research output: Contribution to journal › Article (Academic Journal) › peer-review

1 Citation (Scopus)


The minimization of the empirical risk based on an arbitrary Bregman divergence is known to provide posterior class probability estimates in classification problems, but the accuracy of the estimate for a given value of the true posterior depends on the specific choice of the divergence. Ad hoc Bregman divergences can be designed to achieve higher estimation accuracy for the posterior probability values that are most critical in a particular cost-sensitive classification scenario. Moreover, some sequences of Bregman loss functions can be constructed in such a way that their minimization guarantees, asymptotically, a minimum number of errors in nonseparable cases and maximum-margin classifiers in separable problems. In this paper, we analyze general conditions on the Bregman generator that satisfy this property, and generalize the result to cost-sensitive classification.
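The core property the abstract builds on — that minimizing the expected Bregman loss recovers the true posterior probability — can be checked numerically. The sketch below uses the negative binary entropy as the generator, which yields the familiar log loss; this is an illustrative choice for exposition, not the cost-sensitive construction proposed in the paper:

```python
import numpy as np

def bregman_loss(y, q, f, fprime):
    # Pointwise Bregman divergence D_f(y, q) = f(y) - f(q) - f'(q)(y - q),
    # used as the loss for predicting probability q when the label is y.
    return f(y) - f(q) - fprime(q) * (y - q)

# Generator: negative binary entropy (an assumed example; it induces the log loss).
f = lambda p: p * np.log(p) + (1 - p) * np.log(1 - p) if 0 < p < 1 else 0.0
fprime = lambda q: np.log(q / (1 - q))

# With true posterior eta, the expected loss over labels y ~ Bernoulli(eta)
# should be minimized exactly at q = eta.
eta = 0.7
qs = np.linspace(0.01, 0.99, 99)
risk = [eta * bregman_loss(1.0, q, f, fprime)
        + (1 - eta) * bregman_loss(0.0, q, f, fprime) for q in qs]
q_star = qs[int(np.argmin(risk))]
```

Repeating this with a generator whose curvature is concentrated near a cost-sensitive decision threshold would sharpen the estimate precisely where that threshold lies, which is the design idea the abstract refers to.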

Original language: English
Article number: 6331548
Pages (from-to): 1896-1904
Number of pages: 9
Journal: IEEE Transactions on Neural Networks and Learning Systems
Issue number: 12
Publication status: Published - 1 Dec 2012


Keywords:

  • Bregman divergences
  • cost-sensitive
  • maximum margin
  • posterior probability estimation


