Relative entropy / KL divergence (Kullback–Leibler divergence, KLD)

Relative entropy, also called KL divergence (Kullback–Leibler divergence, KLD), information divergence, or information gain, is an asymmetric measure of the difference between two probability distributions P and Q.
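For discrete distributions P and Q defined over the same support, the divergence is given by

D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}

It is always non-negative, equals zero only when P = Q, and in general D_{\mathrm{KL}}(P \,\|\, Q) \neq D_{\mathrm{KL}}(Q \,\|\, P), which is why it is not a true distance metric.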

KL divergence measures the expected number of extra bits required to encode samples drawn from P when using a code optimized for Q instead of one optimized for P. Typically, P represents the true distribution of the data, while Q represents a theoretical or model distribution, i.e. an approximation of P. A minimal numerical sketch of this interpretation follows.
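The sketch below is a small Python illustration (not from the original post); the two example distributions are made up, and base-2 logarithms are used so the result is in bits.

import numpy as np

def kl_divergence(p, q):
    """Compute D_KL(P || Q) in bits for two discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Only terms with p(x) > 0 contribute; assumes q(x) > 0 wherever p(x) > 0.
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

# P: "true" distribution, Q: approximating distribution (illustrative values)
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(kl_divergence(p, q))  # D_KL(P || Q): average extra bits per sample
print(kl_divergence(q, p))  # D_KL(Q || P): generally a different value, showing the asymmetry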

http://zh.wikipedia.org/zh-tw/%E7%9B%B8%E5%AF%B9%E7%86%B5

http://baike.baidu.com/view/951299.htm
