
Understanding the Properties of Information Measures

This free online course explores the properties of the measures used to quantify information and their conditional variants.

Publisher: NPTEL
Have you ever wondered about the ways you can measure information? This course aims to answer this question by illustrating the significant properties used in quantifying information. You will discover the underlying connection between ‘Information Theory’ and ‘Measure Theory’. Learn about the mechanisms behind reconstructing a body of evidence from incomplete knowledge by registering for this course today!

  • Duration: 1.5-3 Hours
  • Students: 43
  • Accreditation: CPD

Description

The process of measuring information is based solely on the probabilities of the events that convey the information. This course begins by describing the various measures of information and their conditional variants. Investigate the significance of probability functions in characterising discrete and continuous distributions. In addition to this, we explain how the amount of information held in a random variable quantifies its uncertainty. Next, you will learn about the various shapes of information measure functions, including functional measures of the shape of univariate distributions with respect to the concave and convex transform orders. Discover the usefulness of the data processing inequality (DPI) in quantum information. We illustrate how this inequality holds for different entropy measures used to characterise the uncertainty about other systems. Further, you will explore the reasons why information cannot be enhanced by post-processing.
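To make this concrete, here is a minimal Python sketch (assuming numpy; the function name and examples are our own illustration, not course material) of the Shannon entropy, the basic measure of the information held in a discrete random variable:

    import numpy as np

    def shannon_entropy(p, base=2):
        # H(X) = -sum p(x) log p(x): the average information conveyed by X.
        p = np.asarray(p, dtype=float)
        p = p[p > 0]  # by convention, 0 * log 0 = 0
        return -np.sum(p * np.log(p)) / np.log(base)

    print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: a fair coin is maximally uncertain
    print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits: a biased coin conveys less information

The more uniform the distribution, the larger the entropy, matching the intuition that uncertainty grows as outcomes become harder to predict.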

This course highlights the importance of inequalities in the context of information theory. You will explore several different contexts in which these inequalities appear. First, we discuss some new inequalities on the entropy rates of subsets and the relationship between entropy and the Lp-norm. Then, we analyse the intimate relationship between these inequalities and the Kullback–Leibler divergence, culminating in a common proof of the lower bounds derived as linear combinations. Expressing the interaction information is presented as a special case of Gibbs' inequality, which compares the joint distribution to the product of the marginals. Following this, you will study a communication system model that relates the average information lost in a noisy channel to the probability of a categorisation error. In addition to this, you will uncover how Fano's inequality yields a lower bound on the error probability of any decoder, as well as lower bounds on minimax risks in density estimation.
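As a small illustration of Gibbs' inequality (a sketch assuming numpy, not taken from the course): the Kullback–Leibler divergence D(p || q) is never negative and vanishes exactly when the two distributions coincide:

    import numpy as np

    def kl_divergence(p, q, base=2):
        # D(p || q) = sum p(x) log(p(x) / q(x)); Gibbs' inequality says this is >= 0.
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        mask = p > 0
        return np.sum(p[mask] * np.log(p[mask] / q[mask])) / np.log(base)

    p, q = [0.4, 0.4, 0.2], [1/3, 1/3, 1/3]
    print(kl_divergence(p, q))  # ~0.063 bits: strictly positive since p != q
    print(kl_divergence(p, p))  # 0.0: equality holds exactly when p = q

Mutual information, for instance, is exactly this divergence evaluated between a joint distribution and the product of its marginals.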

Finally, the course explains the concept of variational formulae for information-theoretic quantities. Using variational equations, investigate the procedure for measuring the reduction in uncertainty about one variable given a known value of another. This will include measuring the distance between two partitions of elements and expressing a quantity in terms of its minimum or maximum values. We outline the applications of variational formulae to a range of information-theoretic quantities. You will learn the procedure for determining the maximum number of bits that can be transmitted error-free per channel use. Lastly, we describe the method of bounding the total variation distance on a measurable space by refining its constant factors. ‘Understanding the Properties of Information Measures’ is an informative course that highlights the parallels between the inequalities of information theory and inequalities in other branches of mathematics such as matrix theory and probability theory. Enrol in this course now and explore the connections between different information measures.
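For a concrete handle on the total variation bound just described, here is a short sketch (assuming numpy; we check the standard Pinsker inequality, whose constant factor is the kind of quantity such refinements improve upon):

    import numpy as np

    def total_variation(p, q):
        # TV(p, q) = half the L1 distance between the two distributions.
        return 0.5 * np.sum(np.abs(np.asarray(p, float) - np.asarray(q, float)))

    def kl_nats(p, q):
        # KL divergence in nats (natural log), as Pinsker's inequality requires.
        p, q = np.asarray(p, float), np.asarray(q, float)
        mask = p > 0
        return np.sum(p[mask] * np.log(p[mask] / q[mask]))

    p, q = [0.7, 0.2, 0.1], [0.5, 0.3, 0.2]
    # Pinsker's inequality: TV(p, q) <= sqrt(D(p || q) / 2)
    print(total_variation(p, q), "<=", np.sqrt(kl_nats(p, q) / 2))  # 0.2 <= ~0.206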

Start Course Now
