
### Properties of Measures of Information - Lesson Summary


Measures of Information
1. Shannon entropy is a measure of uncertainty associated with random variables.
2. Joint entropy is the amount of information in two (or more) random variables. Conditional entropy is the amount of information in one random variable given we already know the other.
3. Mutual Information measures how much one random variable tells us about another. Measured in units such as bits, it can be thought of as the reduction in uncertainty about one random variable given knowledge of another.
4. Kullback–Leibler Divergence is a way to measure the difference between two probability distributions.
5. The Total Variation Distance is a measure of the distance between two probability distributions. It is also known as variational distance, statistical distance, or L1-distance.
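The five measures above can be computed directly for small discrete distributions. The following is a minimal sketch using a made-up 2x2 joint distribution (the table values are illustrative assumptions, not from the lesson):

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def total_variation(p, q):
    """Total variation distance: half the L1 distance between p and q."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

# Toy joint distribution of (X, Y) as a 2x2 table (assumed example values)
joint = [[0.4, 0.1],
         [0.1, 0.4]]
px = [sum(row) for row in joint]        # marginal distribution of X
py = [sum(col) for col in zip(*joint)]  # marginal distribution of Y
flat = [p for row in joint for p in row]

H_X, H_Y = entropy(px), entropy(py)
H_XY = entropy(flat)                    # joint entropy H(X, Y)
H_Y_given_X = H_XY - H_X                # conditional entropy H(Y|X)
I_XY = H_X + H_Y - H_XY                 # mutual information I(X; Y)

print(f"H(X)={H_X:.3f}  H(X,Y)={H_XY:.3f}  "
      f"H(Y|X)={H_Y_given_X:.3f}  I(X;Y)={I_XY:.3f}")
print(f"D(px||py)={kl_divergence(px, py):.3f}  "
      f"TV(px, py)={total_variation(px, py):.3f}")
```

Note how mutual information falls out of the entropies already computed: I(X;Y) = H(X) + H(Y) - H(X,Y), the reduction in uncertainty about one variable given the other.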
Properties of Measures of Information
Chain rules allow dividing the overall uncertainty in a collection of random variables into smaller components, e.g. H(X, Y) = H(X) + H(Y|X).
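The chain rule can be checked numerically: compute H(Y|X) directly as the p(x)-weighted average of the row entropies H(Y|X=x), and compare it with H(X,Y) - H(X). A minimal sketch with an assumed toy joint distribution:

```python
import math

def H(probs):
    """Shannon entropy in bits of a sequence of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Toy joint distribution p(x, y) as a dict (assumed example values)
joint = {('a', 0): 0.3, ('a', 1): 0.2, ('b', 0): 0.1, ('b', 1): 0.4}

xs = {x for x, _ in joint}
px = {x: sum(p for (xi, _), p in joint.items() if xi == x) for x in xs}

# H(Y|X) computed directly: weight the entropy of each row p(y|x) by p(x)
H_Y_given_X = sum(
    px[x] * H([joint[(x, y)] / px[x] for (xi, y) in joint if xi == x])
    for x in xs
)

H_X = H(px.values())
H_XY = H(joint.values())

# Chain rule: the joint uncertainty splits into H(X) plus H(Y|X)
print(f"H(X,Y) = {H_XY:.4f}")
print(f"H(X) + H(Y|X) = {H_X + H_Y_given_X:.4f}")
```

The two printed values agree, which is exactly the decomposition the chain rule promises.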
Properties of Measures of Information Related to Shapes
• Log-Sum Inequality
• Convex and Concave Functions
• Nonnegative Measures of Information
• Boundedness of Measures of Information
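Two of these shape-related properties are easy to spot-check numerically: the log-sum inequality, sum a_i log(a_i/b_i) >= (sum a_i) log(sum a_i / sum b_i), and the nonnegativity and boundedness of entropy, 0 <= H(p) <= log2(|support|). A minimal sketch over random trials (the trial counts and ranges are arbitrary choices):

```python
import math
import random

random.seed(0)

def logsum_lhs(a, b):
    """Left side of the log-sum inequality: sum a_i log2(a_i / b_i)."""
    return sum(ai * math.log2(ai / bi) for ai, bi in zip(a, b))

def logsum_rhs(a, b):
    """Right side: (sum a_i) log2(sum a_i / sum b_i)."""
    A, B = sum(a), sum(b)
    return A * math.log2(A / B)

def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Log-sum inequality on random positive vectors
for _ in range(1000):
    a = [random.uniform(0.01, 1.0) for _ in range(4)]
    b = [random.uniform(0.01, 1.0) for _ in range(4)]
    assert logsum_lhs(a, b) >= logsum_rhs(a, b) - 1e-12

# Nonnegativity and boundedness: 0 <= H(p) <= log2(|support|)
for _ in range(1000):
    w = [random.random() for _ in range(8)]
    p = [wi / sum(w) for wi in w]
    assert -1e-12 <= entropy(p) <= math.log2(len(p)) + 1e-12

print("log-sum inequality and entropy bounds held on all random trials")
```

The entropy upper bound log2(|support|) is attained only by the uniform distribution, which is one standard consequence of the concavity of entropy.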