Authors: Ian Goodfellow, David Warde-Farley, Mehdi Mirza, Aaron Courville, and Yoshua Bengio
Conference: Proceedings of the 30th International Conference on Machine Learning (ICML-13)
Abstract: We consider the problem of designing models to leverage a recently introduced approximate model averaging technique called dropout. We define a simple new model called maxout (so named because its output is the max of a set of inputs, and because it is a natural companion to dropout) designed to both facilitate optimization by dropout and improve the accuracy of dropout's fast approximate model averaging technique. We empirically verify that the model successfully accomplishes both of these tasks. We use maxout and dropout to demonstrate state-of-the-art classification performance on four benchmark datasets: MNIST, CIFAR-10, CIFAR-100, and SVHN.
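As a rough sketch (not the authors' code), a maxout hidden layer computes several affine projections of its input and takes their elementwise maximum; all shapes, names, and the number of linear pieces k below are illustrative assumptions:

```python
import numpy as np

def maxout_layer(x, W, b):
    """Maxout: h_j = max over k linear pieces of (x @ W[k] + b[k])_j.

    x: (d_in,) input vector
    W: (k, d_in, d_out) weights for k linear pieces per output unit
    b: (k, d_out) biases
    Returns: (d_out,) elementwise max over the k pieces.
    """
    z = np.einsum('i,kij->kj', x, W) + b  # (k, d_out): all k affine projections
    return z.max(axis=0)                  # elementwise max over the pieces

# Illustrative sizes (hypothetical, not from the paper's experiments)
rng = np.random.default_rng(0)
d_in, d_out, k = 8, 4, 5
W = rng.standard_normal((k, d_in, d_out))
b = rng.standard_normal((k, d_out))
x = rng.standard_normal(d_in)
h = maxout_layer(x, W, b)  # shape (4,)
```

Because each output is a max of affine functions, a maxout unit is a piecewise-linear convex function of the input, which lets it approximate activations such as ReLU or absolute value as special cases.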