Proper Losses for Learning with Example-Dependent Costs

Alex Hepburn, Ryan McConville, Raul Santos-Rodriguez, Jesús Cid-Sueiro, Darío García-García

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)


Abstract

We study the design of cost-sensitive learning algorithms with example-dependent costs, where a cost matrix for each example is given both during training and test. The approach is based on the empirical risk minimization framework, where we replace the standard loss function by a combination of surrogate losses belonging to the family of proper losses. The actual contribution of each example to the risk is then given by a loss that depends on the cost matrix for that specific example. We then evaluate the use of such example-dependent loss functions on real-world binary and multiclass problems, namely credit risk assessment and musical genre classification. Using different neural network architectures, we show that with an appropriate choice of example-dependent losses we can outperform conventional cost-sensitive methods in terms of total cost, making more efficient use of cost information during training and test compared to existing discriminative approaches.
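The core idea of the abstract can be illustrated in the binary case: each example carries its own misclassification costs, and these costs weight the two branches of a proper surrogate loss (here the log loss). This is a minimal sketch under our own assumptions about the cost parameterization (`c_fn` for a missed positive, `c_fp` for a false alarm), not the authors' exact formulation.

```python
import numpy as np

def example_dependent_loss(p, y, c_fn, c_fp):
    """Cost-weighted log loss with per-example costs.

    p    : predicted probability of class 1 for each example
    y    : true label in {0, 1}
    c_fn : per-example cost of a false negative (missed positive)
    c_fp : per-example cost of a false positive (false alarm)
    """
    p = np.clip(p, 1e-12, 1 - 1e-12)  # avoid log(0)
    # Each example's contribution to the empirical risk is a
    # cost-weighted combination of the two branches of the log loss.
    return c_fn * y * (-np.log(p)) + c_fp * (1 - y) * (-np.log(1 - p))

# Two positives with identical predictions, but the second example's
# false negative is ten times costlier, so it contributes 10x the loss.
p = np.array([0.3, 0.3])
y = np.array([1, 1])
c_fn = np.array([1.0, 10.0])
c_fp = np.array([1.0, 1.0])
losses = example_dependent_loss(p, y, c_fn, c_fp)
```

Because the underlying log loss is proper, the cost-weighted risk is still minimized by probability estimates consistent with the example-dependent costs; averaging `losses` over a dataset gives the empirical risk to be minimized.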
Original language: English
Title of host publication: Second International Workshop on Learning with Imbalanced Domains
Subtitle of host publication: Theory and Applications, 10 September 2018, ECML-PKDD, Dublin, Ireland
Publisher: PMLR
Pages: 52-66
Number of pages: 15
Publication status: Published - 5 Nov 2018

Publication series

Name: Proceedings of Machine Learning Research
Publisher: PMLR
Volume: 94
ISSN (Electronic): 2640-3498

Keywords

  • Cost-sensitive
  • Proper losses
  • Bregman divergences
