Kullback's inequality

In information theory and statistics, Kullback's inequality is a lower bound on the Kullback–Leibler divergence expressed in terms of the large deviations rate function. If P and Q are probability distributions on the real line, such that P is absolutely continuous with respect to Q, i.e. P << Q, and whose first moments exist, then

    D_{KL}(P \parallel Q) \geq \Psi_Q^*\!\left(\mu'_1(P)\right),

where \Psi_Q^* is the rate function of Q, i.e. the convex conjugate of its cumulant-generating function, and \mu'_1(P) is the first moment of P.

The Cramér–Rao bound is a corollary of this result.
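As an illustrative special case (the choice of a Gaussian reference measure here is an assumption for illustration, not part of the statement above), take Q to be the normal distribution N(\theta, \sigma^2). Its cumulant-generating function and rate function are

    \Psi_Q(t) = \theta t + \tfrac{1}{2}\sigma^2 t^2,
    \qquad
    \Psi_Q^*(x) = \sup_t \left\{ xt - \Psi_Q(t) \right\} = \frac{(x - \theta)^2}{2\sigma^2},

so Kullback's inequality specializes to

    D_{KL}(P \parallel Q) \geq \frac{\left(\mu'_1(P) - \theta\right)^2}{2\sigma^2},

a lower bound on the divergence that grows quadratically with the gap between the mean of P and the mean of Q.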
