Entropy in information theory


The average amount of information per individual message emitted by a source is known as its entropy.

It is a central concept in information theory and coding.
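As a quick sketch (not part of the original post), the entropy of a source with message probabilities p1, ..., pn is H = -Σ p_i log2(p_i), measured in bits per message. A minimal Python illustration:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: two equally likely messages carry 1 bit each on average.
print(entropy([0.5, 0.5]))   # 1.0

# A biased coin carries less information per toss than a fair one.
print(entropy([0.9, 0.1]))
```

Entropy is maximized when all messages are equally likely and drops to zero when one message is certain.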

Check the following video for a detailed explanation:


If you want the actual derivation, check this: 7, equation 1.20.
