In probability theory and information theory, the Kullback–Leibler divergence[1][2][3] (also information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P and Q. It measures the expected number of extra bits required to code samples from P when using a code based on Q rather than a code based on P.
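In formulas, writing $D_{\mathrm{KL}}(P\,\|\,Q)$ for the divergence (the symbols $P(i)$, $Q(i)$ and the densities $p$, $q$ below are generic notation, not fixed by the excerpt above), the discrete and continuous forms are

$$
D_{\mathrm{KL}}(P\,\|\,Q) \;=\; \sum_{i} P(i)\,\log\frac{P(i)}{Q(i)},
\qquad
D_{\mathrm{KL}}(P\,\|\,Q) \;=\; \int p(x)\,\log\frac{p(x)}{q(x)}\,dx .
$$

With base-2 logarithms the value is measured in bits, which is what gives the "expected number of extra bits" interpretation.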
Proof. Like the previous lemma, we apply the change of variable $X = \Sigma_2^{-1/2} Y$ and compute the KL-divergence between the transformed distributions. The expression for entropy (35) in the case of ZEG distributions becomes
$$
H(X) \;=\; \tfrac{1}{2}\log\bigl(|\Sigma|\bigr) \;-\; \int_0^\infty \dots
$$
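Why the change of variable is harmless can be sketched as follows; this assumes only that both distributions admit densities $p_X$, $q_X$ and that $\Sigma_2$ is positive definite (so the map is invertible), and it states the standard invariance of the KL-divergence under an invertible transformation rather than reproducing the cited equation (35). Writing $p_Y(y) = p_X(\Sigma_2^{-1/2} y)\,|\Sigma_2|^{-1/2}$ for the transformed density (and likewise $q_Y$), the Jacobian factors cancel inside the logarithm:

$$
D_{\mathrm{KL}}\bigl(p_X \,\|\, q_X\bigr)
= \int p_X(x)\,\log\frac{p_X(x)}{q_X(x)}\,dx
= \int p_Y(y)\,\log\frac{p_Y(y)\,|\Sigma_2|^{1/2}}{q_Y(y)\,|\Sigma_2|^{1/2}}\,dy
= D_{\mathrm{KL}}\bigl(p_Y \,\|\, q_Y\bigr).
$$

Hence computing the divergence between the transformed distributions gives the same value as computing it between the originals.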