Relative entropy is also called KL divergence (Kullback–Leibler divergence, KLD for short), information divergence, or information gain.
KL divergence is an asymmetric measure of the difference between two probability distributions P and Q.
KL divergence measures the average number of extra bits required to encode samples from P when using a code optimized for Q. Typically, P represents the true distribution of the data, and Q represents a theoretical or model distribution, or an approximation of P.
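For discrete distributions, the divergence is D_KL(P‖Q) = Σ_x P(x) log(P(x)/Q(x)); with a base-2 logarithm the result is in bits, matching the "extra bits" interpretation above. Below is a minimal sketch in Python (using NumPy; the example distributions p and q are hypothetical):

```python
import numpy as np

def kl_divergence(p, q):
    """KL divergence D_KL(P || Q) in bits for discrete distributions.

    p, q: probabilities over the same events; q must be nonzero
    wherever p is nonzero, otherwise the divergence is infinite.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with P(x) = 0 contribute 0 by convention
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

# Example: true distribution P vs. model distribution Q (hypothetical values)
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # extra bits per sample when coding P with Q's code
print(kl_divergence(q, p))  # generally differs: KL divergence is asymmetric
```

Note that kl_divergence(p, q) and kl_divergence(q, p) generally give different values, which is why KL divergence is not a true distance metric.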
http://zh.wikipedia.org/zh-tw/%E7%9B%B8%E5%AF%B9%E7%86%B5
http://baike.baidu.com/view/951299.htm