Measures of Information
1. Shannon entropy is a measure of the uncertainty associated with a random variable.
2. Joint entropy is the total amount of information in two (or more) random variables taken together. Conditional entropy is the amount of information remaining in one random variable once we already know the other.
3. Mutual Information measures how much one random variable tells us about another. It is measured in bits (when using base-2 logarithms) and can be thought of as the reduction in uncertainty about one random variable given knowledge of the other.
4. Kullback–Leibler Divergence is a way to measure the difference between two probability distributions. It is nonnegative, but not symmetric, so it is not a true distance metric.
5. The Total Variation Distance is a measure of the distance between two probability distributions. It is also known as variational distance, statistical distance or L1-distance.
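The five measures above can be computed directly for small discrete distributions. The following is a minimal sketch, using a hypothetical joint distribution of two binary random variables chosen purely for illustration; the function names are not from any particular library.

```python
import math

# Hypothetical joint distribution p(x, y) of two binary random variables,
# indexed as p_xy[x][y]; the values are illustrative only.
p_xy = [[0.5, 0.1],
        [0.1, 0.3]]

def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

p_x = [sum(row) for row in p_xy]            # marginal distribution of X
p_y = [sum(col) for col in zip(*p_xy)]      # marginal distribution of Y
p_joint = [p for row in p_xy for p in row]  # flattened joint distribution

h_x = entropy(p_x)
h_y = entropy(p_y)
h_xy = entropy(p_joint)        # joint entropy H(X, Y)
h_y_given_x = h_xy - h_x       # conditional entropy H(Y | X)
mi = h_x + h_y - h_xy          # mutual information I(X; Y)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def total_variation(p, q):
    """Total variation distance: half the L1 distance between p and q."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

uniform = [0.5, 0.5]
print("H(X) =", round(h_x, 4))
print("H(Y|X) =", round(h_y_given_x, 4))
print("I(X;Y) =", round(mi, 4))
print("D(p_x || uniform) =", round(kl_divergence(p_x, uniform), 4))
print("TV(p_x, uniform) =", round(total_variation(p_x, uniform), 4))
```

Note how the dependence between X and Y shows up as a strictly positive mutual information, while for independent variables it would be exactly zero.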
Properties of Measures of Information
Chain rules allow the overall uncertainty in a collection of random variables to be divided into smaller components.
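The basic chain rule for entropy, H(X, Y) = H(X) + H(Y | X), can be checked numerically. The sketch below uses a hypothetical joint distribution chosen only for illustration.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Hypothetical joint distribution p(x, y); values are illustrative only.
p_xy = {(0, 0): 0.5, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.3}

# Marginal distribution of X.
p_x = {x: sum(p for (xx, _), p in p_xy.items() if xx == x) for x in (0, 1)}

# H(X, Y): entropy of the joint distribution.
h_joint = entropy(list(p_xy.values()))

# H(Y | X) = sum_x p(x) * H(Y | X = x): entropy of Y averaged over X.
h_y_given_x = sum(
    px * entropy([p_xy[(x, y)] / px for y in (0, 1)])
    for x, px in p_x.items()
)

# Chain rule: H(X, Y) = H(X) + H(Y | X).
h_x = entropy(list(p_x.values()))
assert abs(h_joint - (h_x + h_y_given_x)) < 1e-12
```

The identity holds for any joint distribution, which is what makes it useful for breaking a multi-variable entropy into a sum of conditional terms.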
Properties of Measures of Information Related to Shapes
• Log-Sum Inequality
• Convex and Concave Functions
• Nonnegativity of Measures of Information
• Boundedness of Measures of Information
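The log-sum inequality listed above states that for positive numbers a_i and b_i, sum_i a_i log(a_i / b_i) >= (sum_i a_i) log(sum_i a_i / sum_i b_i); it is the key step in proving that KL divergence is nonnegative. A quick numerical check, with hypothetical values chosen only for illustration:

```python
import math

# Hypothetical positive sequences a and b (illustrative values only).
a = [1.0, 2.0, 3.0]
b = [2.0, 1.0, 4.0]

# Left side: sum_i a_i * log2(a_i / b_i).
lhs = sum(ai * math.log2(ai / bi) for ai, bi in zip(a, b))

# Right side: (sum_i a_i) * log2(sum_i a_i / sum_i b_i).
rhs = sum(a) * math.log2(sum(a) / sum(b))

# Log-sum inequality: equality holds only when a_i / b_i is constant.
assert lhs >= rhs
```

Applying the inequality with a and b taken to be two probability distributions (so both sums equal 1) makes the right side zero, which gives D(p || q) >= 0.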