Cost-sensitive Multiclass Classification Risk Bounds
Authors: Bernardo V. Pires, Csaba Szepesvari and Mohammad Ghavamzadeh
Conference: Proceedings of the 30th International Conference on Machine Learning (ICML-13)
Abstract: A commonly used approach to multiclass classification is to replace the $0-1$ loss with a convex surrogate so as to make empirical risk minimization computationally tractable. Previous work has uncovered necessary and sufficient conditions for the consistency of the resulting procedures. In this paper, we strengthen these results by showing how the $0-1$ excess loss of a predictor can be upper bounded as a function of the excess loss of the predictor measured using the convex surrogate. The bound is developed for the case of cost-sensitive multiclass classification and a convex surrogate loss that goes back to the work of Lee, Lin and Wahba. The bounds are as easy to calculate as in binary classification. Furthermore, we also show that our analysis extends to the analysis of the recently introduced "Simplex Coding" scheme.
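To make the abstract's setup concrete, here is a minimal sketch of the (non-cost-sensitive) Lee-Lin-Wahba multiclass hinge surrogate alongside the $0-1$ loss it stands in for. The function names are illustrative, and the score vector is assumed to satisfy the usual sum-to-zero constraint; this is not the paper's cost-sensitive formulation, only the standard surrogate it builds on.

```python
import numpy as np

def llw_surrogate_loss(f, y, num_classes):
    """Lee-Lin-Wahba multiclass hinge loss for a score vector f
    (assumed to satisfy sum_j f_j = 0) and true label y:
        sum over j != y of max(0, f_j + 1/(K-1)).
    The loss is zero when every wrong-class score is at most -1/(K-1)."""
    margin = 1.0 / (num_classes - 1)
    return sum(max(0.0, f[j] + margin)
               for j in range(num_classes) if j != y)

def zero_one_loss(f, y):
    """0-1 loss of the classifier that predicts argmax_j f_j."""
    return 0.0 if int(np.argmax(f)) == y else 1.0

# Example: scores favor class 0 with the full margin.
f = np.array([1.0, -0.5, -0.5])
print(zero_one_loss(f, 0))            # correct prediction -> 0.0
print(llw_surrogate_loss(f, 0, 3))    # surrogate also 0.0
print(llw_surrogate_loss(f, 1, 3))    # wrong label -> positive surrogate
```

An excess-risk bound of the kind the abstract describes controls how much the first quantity (in expectation, above its minimum) can exceed a function of the second.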