Multiple Hypothesis Testing
The Multiple Hypothesis Testing problem occurs when a number of individual hypothesis tests are considered simultaneously. In this case, the significance level or error rate of an individual test no longer represents the error rate of the combined set of tests: the more tests we run, the more likely at least one of them is to produce a false positive by chance.
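As a minimal numeric sketch of this effect (the numbers here are an assumed example, not from the course): with m independent tests each run at significance level alpha, the family-wise error rate is 1 − (1 − alpha)^m, which grows well beyond alpha.

```python
# Family-wise error rate (FWER) when running m independent tests,
# each at individual significance level alpha.
alpha = 0.05
m = 20

# Probability of at least one false positive across all m tests.
fwer = 1 - (1 - alpha) ** m
print(f"FWER for {m} tests at alpha={alpha}: {fwer:.3f}")  # ~0.642, not 0.05

# One common remedy (the Bonferroni correction) runs each test at
# alpha / m, which keeps the FWER at or below alpha.
fwer_bonferroni = 1 - (1 - alpha / m) ** m
print(f"FWER with Bonferroni correction: {fwer_bonferroni:.3f}")  # ~0.049
```

With 20 tests at the conventional 0.05 level, the chance of at least one spurious "significant" result is about 64%, which is why a correction such as Bonferroni's is needed.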
Maximum Likelihood Estimation
Maximum Likelihood (ML) estimation is a parameter estimation method in which the parameters of a probability distribution are chosen to maximize the likelihood function of the observed data.
Maximum A Posteriori (MAP) estimation can be seen as a regularization of maximum likelihood estimation. It is an estimate of an unknown quantity that equals the mode of the posterior distribution.
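The contrast between the two estimators can be sketched with a coin-flipping example (the data and the Beta prior here are assumed for illustration): the MLE is just the observed frequency, while the MAP estimate is pulled toward the prior's mode, which is the regularization effect mentioned above.

```python
# Assumed data: 7 heads out of 10 coin flips; estimate p = P(heads).
heads, n = 7, 10

# MLE maximizes the binomial likelihood; for a binomial model the
# maximizer is simply the sample frequency.
p_mle = heads / n  # 0.7

# MAP with a conjugate Beta(a, b) prior: the posterior is
# Beta(heads + a, n - heads + b), whose mode is
# (heads + a - 1) / (n + a + b - 2).
a, b = 2.0, 2.0  # weak prior favoring p near 0.5 (acts as regularization)
p_map = (heads + a - 1) / (n + a + b - 2)  # 8/12, pulled toward 0.5

print(f"MLE: {p_mle:.3f}, MAP: {p_map:.3f}")
```

With a flat prior (a = b = 1) the MAP estimate coincides with the MLE, which is one way to see MAP as ML plus a prior-based penalty.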
Mutual Information
Mutual Information is one of many quantities that measure how much one random variable tells us about another. It is a dimensionless quantity, typically measured in bits (when logarithms are taken to base 2), and can be thought of as the reduction in uncertainty about one random variable given knowledge of another.
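For a discrete joint distribution, mutual information is I(X;Y) = Σ p(x,y) log2[ p(x,y) / (p(x)p(y)) ]. A minimal sketch, using an assumed joint distribution over two binary variables:

```python
import math

# Assumed joint distribution p(x, y) over binary X and Y.
# X and Y agree 80% of the time, so knowing Y reduces uncertainty about X.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions p(x) and p(y).
px = {x: sum(p for (xi, _), p in joint.items() if xi == x) for x in (0, 1)}
py = {y: sum(p for (_, yi), p in joint.items() if yi == y) for y in (0, 1)}

# I(X;Y) = sum over (x, y) of p(x,y) * log2( p(x,y) / (p(x) p(y)) ), in bits.
mi = sum(p * math.log2(p / (px[x] * py[y])) for (x, y), p in joint.items())
print(f"I(X;Y) = {mi:.4f} bits")
```

If X and Y were independent, every ratio p(x,y)/(p(x)p(y)) would equal 1 and the sum would be zero bits; here the 80% agreement yields a positive value.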
Fano's Inequality
Fano's Inequality relates the average information lost in a noisy channel to the probability of a categorization error.
More precisely, it is a result from information theory that relates the conditional entropy of a random variable X given a correlated variable Y to the probability of incorrectly estimating X from Y.
The intuition is that the probability of making a mistake when estimating X from the value of Y depends on how uncertain we are about the value of X given Y.
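This intuition can be checked numerically. Fano's inequality states H(X|Y) ≤ H_b(Pe) + Pe · log2(|X| − 1), where Pe is the error probability of the best estimator and H_b is the binary entropy function. The channel below is an assumed example: X is uniform over 4 symbols, and Y equals X with probability 0.9, otherwise landing uniformly on a wrong symbol.

```python
import math

K = 4             # alphabet size |X|
correct = 0.9     # P(Y = X) for the assumed symmetric channel
wrong = (1 - correct) / (K - 1)  # probability of each specific wrong symbol

# With X uniform and this symmetric channel, Y is also uniform, so
# H(X|Y) = H(Y|X), which we can compute directly from the channel.
h_x_given_y = -(correct * math.log2(correct)
                + (K - 1) * wrong * math.log2(wrong))

# The optimal estimator guesses X = Y, so its error probability is:
pe = 1 - correct

# Fano's bound: H_b(pe) + pe * log2(K - 1).
h_b = -(pe * math.log2(pe) + (1 - pe) * math.log2(1 - pe))
fano_bound = h_b + pe * math.log2(K - 1)

print(f"H(X|Y) = {h_x_given_y:.4f} bits, Fano bound = {fano_bound:.4f} bits")
assert h_x_given_y <= fano_bound + 1e-9  # Fano's inequality holds
```

For this symmetric channel the bound holds with equality, which is the extreme case of Fano's inequality: errors spread uniformly over the wrong symbols waste no information.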