
# Understanding Information and Statistical Inferences

## Learn about the process of measuring and examining a random sample of the population using statistical methods.

Publisher: NPTEL
Have you ever wondered how to analyse data and draw valuable conclusions from it? This course answers that question by assessing the plausibility of hypotheses using different datasets. You will study a statistical method that uses concepts and graphs to make decisions about population probability distributions. Learn how a finding becomes statistically significant by enrolling in this course now.

1.5-3 Hours

28

CPD

## Description

This course aims to communicate the interplay between information theory and statistics by introducing the basic elements of statistical decision theory. It begins by describing the procedure for estimating and testing hypotheses using various statistical methods. You will discover how to cast detection problems as binary and m-ary hypothesis testing problems. Next, you will study the practice of using data analysis to infer properties of an underlying probability distribution. This will include the process of making estimates based on the best available information. Following this, you will be taught several conventional approaches to hypothesis testing. In addition, the procedure for ascertaining the statistical power of a hypothesis test using the Neyman-Pearson formulation is explained. The method of choosing the better of two competing statistical models by assessing their goodness of fit through the ratio of their likelihoods is also described.
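To make the likelihood-ratio idea concrete, here is a minimal Python sketch (an illustration for this summary, not material from the course) of a binary hypothesis test between two Gaussian hypotheses; the means `mu0` and `mu1`, the shared `sigma`, and the threshold are illustrative assumptions:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a normal distribution with mean mu and standard deviation sigma."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio_test(x, mu0, mu1, sigma, threshold):
    """Decide H1 over H0 when the likelihood ratio p1(x)/p0(x) exceeds the threshold.

    In the Neyman-Pearson formulation, the threshold is chosen so that the
    false-alarm probability under H0 stays below a target level.
    """
    ratio = gaussian_pdf(x, mu1, sigma) / gaussian_pdf(x, mu0, sigma)
    return ratio > threshold

# An observation near mu1 = 1 favours H1; one near mu0 = 0 favours H0.
print(likelihood_ratio_test(0.9, mu0=0.0, mu1=1.0, sigma=1.0, threshold=1.0))  # True
print(likelihood_ratio_test(0.2, mu0=0.0, mu1=1.0, sigma=1.0, threshold=1.0))  # False
```

Raising the threshold trades fewer false alarms for more missed detections, which is exactly the trade-off the Neyman-Pearson lemma makes optimal.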

Next, the course highlights the procedure for determining the dissimilarity between two probability distributions. You will discover the significance of the Kullback-Leibler divergence in measuring how far a given arbitrary distribution is from the true distribution. Have you ever wondered how much information is revealed by a single coin toss? You will discover the method of building a heuristic model through a Bayesian formulation to quantify the information revealed by a coin toss. Following this, procedures for testing more than one hypothesis at the same time are illustrated. You will comprehend how multiple hypothesis testing can produce discoveries from the same or dependent datasets. In addition, the method of estimating the parameters of a probability distribution by maximizing a likelihood function is also explained. Subsequently, you will study a measure of similarity between two labellings of the same data. This will include ways of relating the average information lost in a noisy channel to the probability of the categorization error.
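As an illustrative sketch (assumed for this summary, not drawn from the course), the information revealed by a coin toss and the Kullback-Leibler divergence between two coin distributions can be computed in a few lines of Python:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """D(p || q) in bits; p and q are distributions on the same support."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

fair = [0.5, 0.5]
biased = [0.9, 0.1]

print(entropy(fair))                          # 1.0: a fair coin toss reveals one bit
print(kl_divergence(fair, fair))              # 0.0: a distribution has zero divergence from itself
print(round(kl_divergence(fair, biased), 3))  # 0.737 bits
```

Note that the divergence is not symmetric: modelling a fair coin as heavily biased costs a different number of bits than the reverse.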

Finally, you will study how mutual information is used to measure the reduction in uncertainty about one random variable given knowledge of another. In addition, you will discover how Fano’s inequality gives a lower bound on the mutual information between two random variables that take values on a finite set. Lastly, you will learn about a set of probabilistic tools for characterizing the fundamental and asymptotic performance limits of binary and multiple hypothesis testing. The statistical methods of testing hypotheses and estimating the parameters of population distributions are the core components of this course. “Understanding Information and Statistical Inferences” is an information theory course that highlights quantities such as entropy, mutual information, total variation distance, and Kullback-Leibler divergence. It also explains how these quantities play a role in important problems in the fields of communication, statistics, and computer science. So, why wait? Enrol in this course now and explore how information-theoretic methods are used to predict and bound performance in statistical decision theory.
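As a final illustrative sketch (again an assumption of this summary, not course code), mutual information can be computed directly from a joint probability table: two perfectly correlated fair bits share exactly one bit of information, while independent bits share none.

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint probability table joint[x][y]."""
    px = [sum(row) for row in joint]        # marginal distribution of X
    py = [sum(col) for col in zip(*joint)]  # marginal distribution of Y
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

correlated = [[0.5, 0.0], [0.0, 0.5]]       # Y always equals X
independent = [[0.25, 0.25], [0.25, 0.25]]  # Y tells you nothing about X

print(mutual_information(correlated))   # 1.0 bit
print(mutual_information(independent))  # 0.0 bits
```

In the correlated case, observing Y removes all uncertainty about X, so the mutual information equals the full entropy of X; in the independent case, no uncertainty is removed at all.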
