We derive a generalized notion of f-divergences, called (f,l)-divergences. We show that this generalization enjoys many of the nice properties of f-divergences, although it is a richer family. It also provides alternative definitions of standard divergences in terms of surrogate risks. As a first practical application of this theory, we derive a new estimator for the Kullback-Leibler divergence that we use for clustering sets of vectors.
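As background for the abstract above (standard definitions, not taken from the paper itself): an f-divergence between distributions P and Q with densities p and q is defined by a convex function f with f(1) = 0, and the Kullback-Leibler divergence is the instance obtained with f(t) = t log t:

```latex
% Standard f-divergence definition (background, not from the abstract)
D_f(P \,\|\, Q) = \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right) dx,
\qquad f \text{ convex},\; f(1) = 0.

% Kullback-Leibler divergence as the f-divergence with f(t) = t \log t:
\mathrm{KL}(P \,\|\, Q) = \int p(x) \log \frac{p(x)}{q(x)}\, dx.
```

The (f,l)-divergences of the paper generalize this family by additionally parameterizing with a loss l, linking divergences to surrogate risks.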
|Title of host publication||Proceedings of the 28th International Conference on Machine Learning, ICML 2011|
|Number of pages||8|
|Publication status||Published - 7 Oct 2011|
|Event||28th International Conference on Machine Learning, ICML 2011 - Bellevue, WA, United States|
Duration: 28 Jun 2011 → 2 Jul 2011
|Conference||28th International Conference on Machine Learning, ICML 2011|
|Period||28/06/11 → 2/07/11|