Find link

Find link is a tool written by Edward Betts.

searching for Kullback–Leibler divergence: 9 found (152 total)

alternate case: kullback–Leibler divergence

Normal-inverse-gamma distribution (2,039 words) [view diff] no match in snippet view article find links to article

In probability theory and statistics, the normal-inverse-gamma distribution (or Gaussian-inverse-gamma distribution) is a four-parameter family of multivariate
McCullagh's parametrization of the Cauchy distributions (577 words) [view diff] exact match in snippet view article find links to article
also lets one easily prove the invariance of f-divergences (e.g., Kullback–Leibler divergence, chi-squared divergence, etc.) with respect to real linear fractional
LogSumExp (1,152 words) [view diff] exact match in snippet view article find links to article
2020. Nielsen, Frank; Sun, Ke (2016). "Guaranteed bounds on the Kullback–Leibler divergence of univariate mixtures using piecewise log-sum-exp inequalities"
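The entry above concerns bounding the Kullback–Leibler divergence of mixtures using log-sum-exp inequalities. As a minimal illustrative sketch of the log-sum-exp operation itself (a standalone helper, not code from the cited paper), the standard trick subtracts the maximum before exponentiating so large inputs do not overflow:

```python
import math

def logsumexp(xs):
    """Numerically stable log(sum(exp(x) for x in xs)).

    Shifting by the maximum keeps every exponent <= 0, so exp() cannot
    overflow even when the inputs themselves are very large.
    """
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

# A naive math.log(sum(math.exp(x))) would overflow here; the stable
# version returns 1000 + log(2) ≈ 1000.693.
print(logsumexp([1000.0, 1000.0]))
```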
Central tendency (1,720 words) [view diff] exact match in snippet view article find links to article
MLE minimizes cross-entropy (equivalently, relative entropy, Kullback–Leibler divergence). A simple example of this is for the center of nominal data:
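The snippet above notes that MLE minimizes the Kullback–Leibler divergence. As a minimal sketch of the quantity itself for discrete distributions (an illustrative helper, not code from the listed article):

```python
import math

def kl_divergence(p, q):
    """Kullback–Leibler divergence D(p || q) in nats for two discrete
    distributions given as equal-length lists of probabilities.

    Terms with p_i = 0 contribute nothing, by the convention 0 * log 0 = 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
# D(p || q) is zero iff p == q, and is not symmetric in general.
print(kl_divergence(p, q))
```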
Truncated normal distribution (2,251 words) [view diff] exact match in snippet view article find links to article
family. Nielsen reported a closed-form formula for calculating the Kullback–Leibler divergence and the Bhattacharyya distance between two truncated normal distributions
Dirichlet distribution (6,666 words) [view diff] exact match in snippet view article find links to article
distribution to the convex conjugate of the scaled reversed Kullback–Leibler divergence: $\log \operatorname{E}\left(\exp \sum_{i=1}^{K} s_i X_i\right) \le \sup_p \sum_{i=1}^{K} (p$ …
Wasserstein metric (5,194 words) [view diff] exact match in snippet view article find links to article
Springer. ISBN 978-3-540-71050-9. "What is the advantages of Wasserstein metric compared to Kullback-Leibler divergence?". Stack Exchange. August 1, 2017.
Lambert W function (12,429 words) [view diff] exact match in snippet view article find links to article
a set of histograms defined with respect to the symmetrized Kullback–Leibler divergence (also called the Jeffreys divergence) has a closed form using
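The snippet above refers to the symmetrized Kullback–Leibler divergence, also known as the Jeffreys divergence. As a minimal sketch of that quantity for discrete distributions (illustrative only, not code from the listed article):

```python
import math

def kl(p, q):
    """Kullback–Leibler divergence D(p || q) in nats (discrete case)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jeffreys(p, q):
    """Jeffreys divergence: the symmetrized KL, D(p || q) + D(q || p).

    Unlike plain KL divergence it is symmetric in its arguments,
    though it still does not satisfy the triangle inequality.
    """
    return kl(p, q) + kl(q, p)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(jeffreys(p, q))
```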
Brahmi script (15,027 words) [view diff] exact match in snippet view article find links to article
; Panigrahi, B. K. (2009). "Multi-objective optimization of Kullback-Leibler divergence between Indus and Brahmi writing". World Congress on Nature &