**Nonparametric Estimation of Renyi Divergence and Friends**

**Authors:** *Akshay Krishnamurthy*, *Kirthevasan Kandasamy*, *Barnabas Poczos* and *Larry Wasserman*

**Conference:** Proceedings of the 31st International Conference on Machine Learning (ICML-14)

**Year:** 2014

**Pages:** 919-927

**Abstract:** We consider nonparametric estimation of $L_2$, Renyi-$\alpha$ and Tsallis-$\alpha$ divergences between continuous distributions. Our approach is to construct estimators for particular integral functionals of two densities and translate them into divergence estimators. For the integral functionals, our estimators are based on corrections of a preliminary plug-in estimator. We show that these estimators achieve the parametric convergence rate of $n^{-1/2}$ when both densities have smoothness $s$ at least $d/4$, where $d$ is the dimension. We also derive minimax lower bounds for this problem which confirm that $s > d/4$ is necessary to achieve the $n^{-1/2}$ rate of convergence. We validate our theoretical guarantees with a number of simulations.
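For context, the divergences named in the abstract are standard quantities; for densities $p$ and $q$ they can be written as (standard textbook definitions, stated here for reference):

$$L_2^2(p, q) = \int \left(p(x) - q(x)\right)^2 \, dx,$$

$$D_\alpha(p \,\|\, q) = \frac{1}{\alpha - 1} \log \int p(x)^\alpha \, q(x)^{1-\alpha} \, dx,$$

$$T_\alpha(p \,\|\, q) = \frac{1}{\alpha - 1} \left( \int p(x)^\alpha \, q(x)^{1-\alpha} \, dx - 1 \right).$$

All three are simple transformations of integral functionals such as $\int p^\alpha q^{1-\alpha}$, which is why the abstract frames the estimation problem in terms of such functionals.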

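A minimal sketch of the preliminary plug-in step mentioned in the abstract, not the paper's corrected estimator: it uses the identity $\int p^\alpha q^{1-\alpha}\,dx = \mathbb{E}_{X \sim p}\!\left[p(X)^{\alpha-1} q(X)^{1-\alpha}\right]$ with kernel density estimates in place of $p$ and $q$. The toy 1-D Gaussians and all variable names are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
alpha = 0.5

# Toy data: p = N(0, 1) and q = N(1, 1) in one dimension.
x = rng.normal(0.0, 1.0, size=2000)  # sample from p
y = rng.normal(1.0, 1.0, size=2000)  # sample from q

# Data splitting: fit the density estimates on one half,
# evaluate the functional on the held-out half.
p_hat = gaussian_kde(x[:1000])
q_hat = gaussian_kde(y[:1000])
x_eval = x[1000:]

# Plug-in estimate of T = ∫ p^α q^(1-α) dx via
# T = E_{X~p}[ p(X)^(α-1) q(X)^(1-α) ].
t_hat = np.mean(p_hat(x_eval) ** (alpha - 1) * q_hat(x_eval) ** (1 - alpha))

# Translate the functional estimate into divergence estimates.
renyi = np.log(t_hat) / (alpha - 1)
tsallis = (t_hat - 1) / (alpha - 1)
print(renyi, tsallis)
```

For these Gaussians the true Renyi-$\alpha$ divergence is $\alpha(\mu_1-\mu_2)^2/(2\sigma^2) = 0.125$, so the plug-in estimate should land nearby; the paper's contribution is to correct the bias of exactly this kind of preliminary estimator so that the $n^{-1/2}$ rate is attained.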


