
# Understanding the Properties of Information Measures

## Learn about the properties used for measuring information and their conditional variants with this free online course.

Publisher: NPTEL
Have you ever wondered how information can be measured? This course aims to answer that question by illustrating the significant properties used to quantify information. You will discover the underlying connection between ‘Information Theory’ and ‘Measure Theory’. Learn about the mechanisms behind reconstructing a body of evidence from incomplete knowledge by registering in this course now.

1.5-3 Hours


CPD

## Description

The process of measuring information is based solely on the probabilities of the events that convey the information. This course begins by describing the various measures of information and their conditional variants. You will explore the significance of probability functions in ascertaining discrete and continuous distributions. In addition, the process of quantifying the amount of uncertainty held in a random variable is explained. Next, you will be taught about the various shapes of information measure functions, including the functional measures of the shape of univariate distributions with respect to the concave and convex transform order. Following this, you will discover the usefulness of the data processing inequality (DPI) in quantum information. How different entropy measures characterize the uncertainty about different systems is also illustrated. Further, you will explore the various reasons why information cannot be enhanced by post-processing.
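To make the idea of quantifying the uncertainty held in a random variable concrete, here is a minimal sketch of Shannon entropy for a discrete distribution. The function name `shannon_entropy` and the sample distributions are illustrative choices, not part of the course material:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum p * log2(p) of a discrete distribution, in bits."""
    # Terms with p == 0 contribute nothing (lim p*log p = 0), so they are skipped.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit per outcome.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin carries less uncertainty.
print(shannon_entropy([0.9, 0.1]))   # ≈ 0.469
# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))        # 0.0
```

Entropy is largest for the uniform distribution and shrinks as the distribution concentrates, which is the sense in which it measures uncertainty.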

Next, the course highlights the importance of inequalities in the context of information theory. You will explore a number of different contexts in which these inequalities appear. A number of new inequalities on the entropy rates of subsets and the relationship between entropy and the Lp-norm are also discussed. The intimate relationship between inequalities and Kullback–Leibler divergence is explored, culminating in a common proof of the lower bounds for the inequalities derived as linear combinations. The process of expressing the interaction information as a special case of Gibbs' inequality, applied to the joint distribution with respect to the product of the marginals, is also described. Following this, you will study a model of a communication system that relates the average information lost in a noisy channel to the probability of categorization error. In addition to this, you will discover how Fano’s inequality is used to find a lower bound on the error probability of any decoder, as well as lower bounds for minimax risks in density estimation.
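The Kullback–Leibler divergence mentioned above, and the non-negativity guaranteed by Gibbs' inequality, can be sketched in a few lines. The helper name `kl_divergence` and the example distributions are assumptions made for illustration:

```python
import math

def kl_divergence(p, q):
    """Kullback–Leibler divergence D(p || q) in bits.

    Assumes q[i] > 0 wherever p[i] > 0 (otherwise the divergence is infinite).
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [1/3, 1/3, 1/3]

# Gibbs' inequality: D(p || q) >= 0, with equality if and only if p == q.
print(kl_divergence(p, q))  # a positive number
print(kl_divergence(p, p))  # 0.0
```

Note that D(p || q) is not symmetric in its arguments, which is one reason it is a divergence rather than a true distance.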

Finally, the course explains the concept of variational formulae for information-theoretic quantities. You will be taught the procedure for measuring the reduction in uncertainty in one variable, given a known value of the other variable, using variational equations. This will include the process of measuring the distance between two partitions of elements and also expressing a quantity in terms of its minimum or maximum values. Following this, the applications of variational formulae to various information-theoretic quantities are explained. You will be taught the procedure for determining the maximum number of bits that can be transmitted error-free per channel use. Lastly, the method of bounding the total variation distance by refining its constant factors on a measurable space is explained. ‘Understanding Measures of Information and their Properties’ is an informative course that highlights the parallels between the inequalities in information theory and inequalities in other branches of mathematics such as matrix theory and probability theory. Enrol in this course now and explore the connections between different information measures.
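Two of the quantities named in this final section can be illustrated directly: the maximum number of bits per channel use is the channel capacity (shown here, as an assumed example, for the classic binary symmetric channel, where C = 1 - H(p)), and the total variation distance between two discrete distributions. The function names are illustrative choices, not the course's notation:

```python
import math

def binary_entropy(p):
    """Binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(flip_prob):
    """Capacity C = 1 - H(p) of a binary symmetric channel with crossover probability p:
    the maximum number of bits transmittable error-free per channel use."""
    return 1.0 - binary_entropy(flip_prob)

def total_variation(p, q):
    """Total variation distance 0.5 * sum |p_i - q_i| between two discrete distributions."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

print(bsc_capacity(0.0))   # 1.0 — a noiseless channel carries a full bit per use
print(bsc_capacity(0.5))   # 0.0 — a pure-noise channel carries nothing
print(total_variation([0.5, 0.5], [0.9, 0.1]))  # 0.4
```

Capacity falls to zero exactly when the channel's output is independent of its input, which is the operational meaning of "no bits get through".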
