Near-Optimal Bounds for Cross-Validation via Loss Stability
Authors: Ravi Kumar, Daniel Lokshtanov, Sergei Vassilvitskii and Andrea Vattani
Conference: Proceedings of the 30th International Conference on Machine Learning (ICML-13)
Year: 2013
Pages: 27-35
Abstract: Multi-fold cross-validation is an established practice for estimating the error rate of a learning algorithm. Quantifying the variance reduction gained from cross-validation has been challenging because of the inherent correlations introduced by the folds. In this work we introduce a new and weak measure of stability called "loss stability" and relate cross-validation performance to loss stability; we also establish that this relationship is near-optimal. Our work thus quantitatively improves the current best bounds on cross-validation.
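For readers unfamiliar with the setup, below is a minimal sketch of the k-fold cross-validation estimate the abstract refers to. The `fit` and `loss` callables are hypothetical stand-ins for an arbitrary learning algorithm and loss function; they are not part of the paper, and this is an illustration of the general procedure, not the paper's method.

```python
import numpy as np

def cross_val_error(X, y, fit, loss, k=10, seed=0):
    """Estimate the error rate of a learner by k-fold cross-validation.

    The data is split into k folds; the learner is trained on k-1 folds
    and its loss is measured on the held-out fold; the k per-fold losses
    are averaged.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    folds = np.array_split(idx, k)
    errors = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        model = fit(X[train], y[train])               # train on k-1 folds
        errors.append(loss(model, X[test], y[test]))  # evaluate on held-out fold
    # Note: the k per-fold estimates are correlated because their training
    # sets overlap; quantifying the variance reduction despite these
    # correlations is the difficulty the paper's loss-stability bounds address.
    return float(np.mean(errors))
```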
