Rényi Divergence and Kullback-Leibler Divergence
Abstract
Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibler divergence. We review and extend the most important properties of Rényi divergence and Kullback-Leibler divergence, including convexity, continuity, limits of σ-algebras, and the relation of the special order 0 to the Gaussian dichotomy and contiguity. We also extend the known equivalence between channel capacity and minimax redundancy to continuous channel inputs (for all orders), and present several other minimax results.
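For concreteness, a standard formulation of the relationship mentioned above (stated here for discrete distributions as an illustration, not quoted from the paper): for distributions $P = (p_1, \dots, p_n)$ and $Q = (q_1, \dots, q_n)$ and order $\alpha \in (0,1) \cup (1,\infty)$, the Rényi divergence is

\[
D_\alpha(P \| Q) = \frac{1}{\alpha - 1} \ln \sum_{i=1}^{n} p_i^{\alpha} \, q_i^{1-\alpha},
\]

and the order-1 case is obtained as the limit $\alpha \to 1$, which recovers the Kullback-Leibler divergence

\[
D_1(P \| Q) = \lim_{\alpha \to 1} D_\alpha(P \| Q) = \sum_{i=1}^{n} p_i \ln \frac{p_i}{q_i}.
\]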