Module 1: Multiple Hypothesis Testing

Multiple Hypothesis Testing - Lesson Summary

Multiple Hypothesis Testing

The Multiple Hypothesis Testing problem arises when a number of individual hypothesis tests are considered simultaneously. In this case, the significance level (error rate) of each individual test no longer represents the error rate of the combined set of tests: the chance of at least one false positive grows with the number of tests performed.
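
To make this concrete, here is a minimal sketch (all numbers are assumed for illustration, not taken from the lesson) of how the family-wise error rate grows across m independent tests, and how a simple Bonferroni correction brings it back near the nominal level:

```python
# Illustrative sketch: family-wise error rate (FWER) when m independent
# tests are each run at significance level alpha (values assumed).
alpha = 0.05   # per-test significance level
m = 20         # number of simultaneous tests

# Probability of at least one false positive when every null hypothesis
# is true and the tests are independent.
fwer = 1 - (1 - alpha) ** m
print(f"Per-test error rate:            {alpha:.3f}")
print(f"Family-wise error rate (m={m}): {fwer:.3f}")   # about 0.64

# Bonferroni correction: run each test at alpha / m so the
# family-wise error rate stays at or below alpha.
alpha_bonf = alpha / m
fwer_corrected = 1 - (1 - alpha_bonf) ** m
print(f"Corrected per-test level:       {alpha_bonf:.4f}")
print(f"FWER after correction:          {fwer_corrected:.3f}")  # <= 0.05
```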

Maximum Likelihood Test

Maximum Likelihood (ML) is a parameter estimation method in which the parameters of a probability distribution are chosen to maximize the likelihood function of the observed data.
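
A minimal sketch of ML estimation, assuming synthetic Gaussian data and a general-purpose numerical optimizer (the data and starting values below are made up for illustration):

```python
# Sketch: maximum likelihood estimation of a Gaussian mean and standard
# deviation by numerically maximizing the log-likelihood (i.e. minimizing
# its negative). The data are synthetic, not the lesson's example.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=500)   # assumed true parameters

def negative_log_likelihood(params):
    mu, log_sigma = params          # optimize log(sigma) so sigma stays > 0
    sigma = np.exp(log_sigma)
    return -np.sum(norm.logpdf(data, loc=mu, scale=sigma))

result = minimize(negative_log_likelihood, x0=[0.0, 0.0])
mu_hat, sigma_hat = result.x[0], np.exp(result.x[1])
print(f"ML estimates: mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")

# For a Gaussian the MLE has a closed form: the sample mean and the
# (biased) sample standard deviation. The numeric optimum should match.
print(f"Closed form:  mu = {data.mean():.3f}, sigma = {data.std():.3f}")
```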

Maximum A Posteriori Probability (MAP) estimation can be seen as a regularization of maximum likelihood estimation. It estimates an unknown quantity by the mode of the posterior distribution, which combines the likelihood with a prior.
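
The regularizing effect is easiest to see in a conjugate example. Here is a minimal sketch assuming a coin-flip (Bernoulli) likelihood and a Beta(2, 2) prior, both chosen only for illustration:

```python
# Sketch: ML vs MAP estimation of a coin's heads probability.
# The Beta(a, b) prior acts as a regularizer; the MAP estimate is the
# mode of the Beta posterior. All numbers are assumed for illustration.
heads, flips = 3, 4      # small sample, so the ML estimate is extreme
a, b = 2.0, 2.0          # Beta prior with mode at 0.5

p_ml = heads / flips                              # mode of the likelihood
p_map = (heads + a - 1) / (flips + a + b - 2)     # mode of the posterior

print(f"ML estimate:  {p_ml:.3f}")    # 0.750
print(f"MAP estimate: {p_map:.3f}")   # 0.667, pulled toward the prior mode
```

With more data the likelihood dominates the prior and the two estimates converge, which is why MAP is often described as a regularized version of ML.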

Mutual Information

Mutual Information is one of many quantities that measure how much one random variable tells us about another. It is a dimensionless quantity, typically measured in bits, and can be thought of as the reduction in uncertainty about one random variable given knowledge of the other.
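
A minimal sketch of the definition I(X; Y) = sum over x, y of p(x, y) log2[ p(x, y) / (p(x) p(y)) ], applied to an assumed joint distribution over two binary variables:

```python
# Sketch: mutual information of two binary variables, in bits, computed
# directly from a joint probability table (the table is assumed).
import numpy as np

p_xy = np.array([[0.4, 0.1],    # rows index x, columns index y
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1, keepdims=True)   # marginal p(x)
p_y = p_xy.sum(axis=0, keepdims=True)   # marginal p(y)

# I(X; Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) )
mi = np.sum(p_xy * np.log2(p_xy / (p_x * p_y)))
print(f"I(X; Y) = {mi:.3f} bits")   # about 0.28 bits of uncertainty about X
                                    # is removed by knowing Y
```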

Fano's Inequality

Fano's Inequality relates the average information lost in a noisy channel to the probability of categorization error.

More formally, it is a result from information theory that relates the conditional entropy of a random variable X, given a correlated variable Y, to the probability of incorrectly estimating X from Y.

The intuition here is that the probability of making a mistake when estimating X from the value of Y depends on how uncertain we remain about X once Y is known.
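
The inequality states that H(X | Y) <= h(P_e) + P_e * log2(|X| - 1), where P_e is the probability of estimating X incorrectly and h(.) is the binary entropy. A minimal numerical sketch, using an assumed joint distribution and the MAP decision rule as the estimator:

```python
# Sketch: checking Fano's inequality
#     H(X | Y) <= h(P_e) + P_e * log2(|X| - 1)
# on an assumed joint distribution over a 3-valued X and a 3-valued Y.
import numpy as np

p_xy = np.array([[0.30, 0.05, 0.05],    # rows index x, columns index y
                 [0.05, 0.25, 0.05],
                 [0.05, 0.05, 0.15]])

p_y = p_xy.sum(axis=0)          # marginal p(y)
p_x_given_y = p_xy / p_y        # each column is p(x | y)

# Conditional entropy H(X | Y) = -sum_{x,y} p(x,y) log2 p(x|y)
h_x_given_y = -np.sum(p_xy * np.log2(p_x_given_y))

# The best estimator of X from Y picks the most likely x for each y;
# its error probability is one minus the mass it classifies correctly.
p_e = 1 - np.sum(p_xy.max(axis=0))

def binary_entropy(p):
    return 0.0 if p <= 0 or p >= 1 else -p * np.log2(p) - (1 - p) * np.log2(1 - p)

fano_bound = binary_entropy(p_e) + p_e * np.log2(p_xy.shape[0] - 1)
print(f"H(X|Y)     = {h_x_given_y:.3f} bits")
print(f"P_e (MAP)  = {p_e:.3f}")
print(f"Fano bound = {fano_bound:.3f} bits")   # bound >= H(X|Y), so it holds
```

The larger H(X | Y) is, that is, the more uncertain X remains after observing Y, the larger the error probability must be for the inequality to hold.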