Kullback–Leibler Divergence Calculator

JJ Ben-Joseph

What this calculator does

This page helps you compare two discrete probability distributions (probability vectors) P and Q defined over the same set of outcomes (categories). You enter the probabilities as comma-separated lists (for example, 0.6, 0.4). The calculator then reports common information-theoretic divergence measures: the Kullback–Leibler divergence in both directions, cross-entropy, and the Jensen–Shannon divergence.

You can also choose the log base (natural log gives results in nats; base 2 gives results in bits).
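
As a rough illustration of the input format, here is a minimal Python sketch of how such comma-separated lists could be parsed and validated before any divergence is computed. The helper name parse_distribution and the tolerance are assumptions for illustration, not the calculator's actual code.

import math

def parse_distribution(text: str) -> list[float]:
    """Turn a string like "0.6, 0.4" into a validated probability vector."""
    values = [float(part) for part in text.split(",") if part.strip()]
    if any(v < 0 for v in values):
        raise ValueError("probabilities must be non-negative")
    if not math.isclose(sum(values), 1.0, abs_tol=1e-9):
        raise ValueError("probabilities must sum to 1")
    return values

p = parse_distribution("0.6, 0.4")   # [0.6, 0.4]
q = parse_distribution("0.5, 0.5")   # [0.5, 0.5]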

Definitions and formulas (discrete case)

Assume P and Q are discrete distributions over outcomes i = 1, …, n, with P(i) ≥ 0, Q(i) ≥ 0, and ∑ P(i) = ∑ Q(i) = 1.

KL divergence

The Kullback–Leibler divergence from P to Q is:

D_KL(P‖Q) = ∑ P(i) · log(P(i) / Q(i))

If you choose the natural logarithm (ln), the result is in nats. If you choose log base 2, the result is in bits.
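
A minimal Python sketch of this sum, assuming an illustrative helper kl_divergence with a selectable log base (base math.e gives nats, base 2 gives bits); this is not the calculator's own code:

import math

def kl_divergence(p, q, base=math.e):
    """D_KL(P‖Q) = sum of P(i) * log(P(i)/Q(i)) over outcomes i."""
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0.0:
            continue              # terms with P(i) = 0 contribute 0
        if qi == 0.0:
            return math.inf       # P puts mass where Q has none
        total += pi * math.log(pi / qi, base)
    return total

print(kl_divergence([0.6, 0.4], [0.5, 0.5]))          # ≈ 0.0201 nats
print(kl_divergence([0.6, 0.4], [0.5, 0.5], base=2))  # ≈ 0.0290 bits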

Cross-entropy

Cross-entropy of P relative to Q is:

H(P, Q) = - ∑ P(i) log Q(i)

It relates to KL divergence via:

H(P, Q) = H(P) + D_KL(P‖Q), where H(P) = -∑ P(i) log P(i) is the entropy of P.
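
The identity can be checked numerically with a short Python sketch; the helper names entropy and cross_entropy are illustrative, not part of the calculator:

import math

def entropy(p, base=math.e):
    """H(P) = -sum of P(i) * log P(i)."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

def cross_entropy(p, q, base=math.e):
    """H(P, Q) = -sum of P(i) * log Q(i); infinite if Q(i) = 0 where P(i) > 0."""
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0.0:
            continue
        if qi == 0.0:
            return math.inf
        total -= pi * math.log(qi, base)
    return total

p, q = [0.6, 0.4], [0.5, 0.5]
print(cross_entropy(p, q) - entropy(p))   # ≈ 0.0201, matching D_KL(P‖Q) in nats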

Jensen–Shannon divergence (JSD)

JSD is a symmetric, smoothed divergence based on the mixture M = ½ (P + Q):

JSD(P, Q) = 1/2 · D_KL(P‖M) + 1/2 · D_KL(Q‖M)

With base-2 logs, JSD is bounded between 0 and 1 bit for discrete distributions.
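
A Python sketch of this definition, under the assumption that the two vectors are already aligned and normalized (the helper names kl and jsd are illustrative):

import math

def kl(p, q, base=2):
    """KL divergence; safe here because the mixture M is never 0 where P or Q is positive."""
    return sum(pi * math.log(pi / qi, base) for pi, qi in zip(p, q) if pi > 0)

def jsd(p, q, base=2):
    """JSD(P, Q) = ½·D_KL(P‖M) + ½·D_KL(Q‖M) with M = ½(P + Q)."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m, base) + 0.5 * kl(q, m, base)

print(jsd([0.6, 0.4], [0.5, 0.5]))   # small (≈ 0.0073 bits): the distributions are close
print(jsd([1.0, 0.0], [0.0, 1.0]))   # disjoint support: exactly 1 bit, the base-2 maximum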

How to interpret the results

A divergence of 0 means P and Q are identical. Small values (such as the 0.02 nats in the worked example below) indicate the distributions are close; large or infinite values indicate P places substantial probability on outcomes that Q treats as unlikely or impossible. Because KL divergence is not symmetric, D_KL(P‖Q) and D_KL(Q‖P) generally differ; use JSD when you want a single symmetric comparison.

Worked example

Let P = (0.6, 0.4) and Q = (0.5, 0.5).

Using natural logs:

D_KL(P‖Q) = 0.6·ln(0.6/0.5) + 0.4·ln(0.4/0.5)

= 0.6·ln(1.2) + 0.4·ln(0.8) ≈ 0.6·0.1823 + 0.4·(-0.2231) ≈ 0.0201 nats

This is small, indicating P and Q are close.
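
The same number can be reproduced with a couple of lines of Python (a sanity check, not part of the calculator):

import math

p, q = [0.6, 0.4], [0.5, 0.5]
kl_nats = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
print(round(kl_nats, 4))   # 0.0201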

Metric comparison (at a glance)

Metric | Discrete formula | Symmetric? | Range / behavior | Notes
KL(P‖Q) | ∑ P(i) log(P(i)/Q(i)) | No | ≥ 0; can be ∞ | Undefined/infinite if Q(i) = 0 where P(i) > 0
KL(Q‖P) | ∑ Q(i) log(Q(i)/P(i)) | No | ≥ 0; can be ∞ | Highlights different failure modes than KL(P‖Q)
Cross-entropy H(P,Q) | −∑ P(i) log Q(i) | No | ≥ H(P); can be ∞ | Common in classification/log-loss settings
JSD(P,Q) | ½·KL(P‖M) + ½·KL(Q‖M), M = (P+Q)/2 | Yes | Finite; bounded (≤ 1 bit with log2) | More stable and interpretable for “distance-like” comparison

Limitations and assumptions (important)

The calculator assumes P and Q have the same number of entries, refer to the same outcomes in the same order, and each sum to 1. KL divergence and cross-entropy become infinite (or undefined) whenever Q(i) = 0 for an outcome where P(i) > 0, so zero probabilities need care. The formulas above apply only to the discrete case; continuous distributions require densities and integrals rather than sums.

