A proximal Newton framework for composite minimization: Graph learning without Cholesky decompositions and matrix inversions
Authors: Quoc Tran-Dinh, Anastasios Kyrillidis, and Volkan Cevher
Conference: Proceedings of the 30th International Conference on Machine Learning (ICML-13)
Abstract: We propose an algorithmic framework for convex minimization problems of composite functions with two terms: a self-concordant part and a possibly nonsmooth regularization part. Our method is a new proximal Newton algorithm with local quadratic convergence rate. As a specific problem instance, we consider sparse precision matrix estimation problems in graph learning. Via a careful dual formulation and a novel analytic step-size selection, we instantiate an algorithm within our framework for graph learning that avoids Cholesky decompositions and matrix inversions, making it attractive for parallel and distributed implementations.
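To make the proximal Newton idea in the abstract concrete, here is a minimal sketch of a generic proximal Newton iteration on a simple composite objective, the lasso f(x) = ½‖Ax − b‖² + λ‖x‖₁, rather than the paper's graph-learning instance. The function names, the choice of an inner ISTA solver, and the unit step size are illustrative assumptions, not the authors' algorithm; each outer step minimizes the local quadratic model ∇f(x)ᵀd + ½dᵀHd + λ‖x + d‖₁.

```python
import numpy as np

def soft_threshold(v, t):
    # Elementwise prox of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_newton_lasso(A, b, lam, outer_iters=20, inner_iters=200):
    """Hypothetical sketch of proximal Newton on the lasso
    0.5*||Ax - b||^2 + lam*||x||_1 (NOT the paper's graph-learning
    instance).  Each outer step minimizes the quadratic model
    grad'd + 0.5*d'Hd + lam*||x + d||_1 by proximal gradient (ISTA)."""
    n = A.shape[1]
    x = np.zeros(n)
    H = A.T @ A                       # exact Hessian of the smooth part
    L = np.linalg.norm(H, 2)          # step bound for the inner solver
    for _ in range(outer_iters):
        grad = A.T @ (A @ x - b)
        z = x.copy()                  # warm-start the subproblem at x
        for _ in range(inner_iters):  # ISTA on the quadratic model
            g_sub = grad + H @ (z - x)
            z = soft_threshold(z - g_sub / L, lam / L)
        x = z                         # full step, for simplicity
    return x
```

A sanity check is the lasso optimality condition: at a solution, the smooth gradient g = Aᵀ(Ax − b) satisfies |gᵢ| ≤ λ everywhere and gᵢ = −λ·sign(xᵢ) on the support. The paper's contribution, by contrast, is an analytic step-size rule exploiting self-concordance and a dual formulation that avoids Cholesky factorizations and matrix inversions entirely.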