Module 1: Information Theory Lower Bounds

Information Theory Lower Bounds - Lesson Summary

Information Theory Lower Bounds
Information-theoretic lower bounds are derived to establish the minimum time required by any algorithm for distributed function computation over a network of point-to-point channels with finite capacity.
This course introduces:
• The lower bound for the compression problem, derived using Chebyshev's inequality and Fano's inequality (both are sketched after this list).
• Shannon's source coding theorem, which establishes the limits of possible data compression and gives the Shannon entropy its operational meaning (see the worked example below).
• The lower bound for random number generation, which relies on the continuity of entropy.
• The lower bound for hypothesis testing, derived using the data-processing inequality and the log-likelihood ratio test (a simulation sketch follows this list).
• Two different proofs of the strong converse for source coding: the first looks directly into the problem and bounds the cardinality of the set of high-probability sequences (sketched below), while the second uses Fano's inequality.
• The minimax lower bound for statistical estimation, derived using Markov's inequality and Fano's inequality (see the final display below).
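
Fano's inequality is the common thread through three of these bounds (compression, the second strong-converse proof, and the minimax bound). For reference, a standard statement, with logarithms in bits:

```latex
% Fano's inequality: X takes values in a finite set \mathcal{X}, \hat{X} is
% any estimate of X, P_e = \Pr[\hat{X} \neq X], and h(\cdot) is the binary
% entropy. Logarithms are base 2.
H(X \mid \hat{X}) \;\le\; h(P_e) + P_e \log\bigl(|\mathcal{X}| - 1\bigr),
\qquad\text{hence}\qquad
P_e \;\ge\; \frac{H(X \mid \hat{X}) - 1}{\log |\mathcal{X}|}.
```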
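The Chebyshev step in the compression bound is a concentration statement: for an i.i.d. source, the per-symbol information content -(1/n) log2 p(X^n) concentrates around H(X) as n grows. A minimal simulation sketch of that effect (the three-symbol distribution is a made-up example):

```python
import math
import random

# Sketch of the concentration behind the compression lower bound: for an
# i.i.d. source, -(1/n) log2 p(X^n) concentrates around the entropy H(X)
# (Chebyshev's inequality / weak law of large numbers).
# The distribution p is chosen purely for illustration.
p = {"a": 0.5, "b": 0.25, "c": 0.25}
H = -sum(q * math.log2(q) for q in p.values())  # Shannon entropy in bits

def empirical_rate(n, rng):
    """Draw X^n i.i.d. from p and return -(1/n) log2 p(X^n)."""
    xs = rng.choices(list(p), weights=list(p.values()), k=n)
    return -sum(math.log2(p[x]) for x in xs) / n

rng = random.Random(0)
for n in (10, 100, 10_000):
    rates = [empirical_rate(n, rng) for _ in range(200)]
    max_dev = max(abs(r - H) for r in rates)
    print(f"n={n:6d}  H={H:.3f}  max deviation over 200 trials = {max_dev:.3f}")
```

The printed maximum deviation shrinks as n grows, which is exactly the concentration the counting argument in the lower bound relies on.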
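The operational meaning of entropy in the source coding theorem can be checked numerically with a Shannon code, which assigns symbol x a codeword of length ceil(log2(1/p(x))); its expected length L satisfies H(X) <= L < H(X) + 1, and no prefix code can do better than H(X). A sketch under an illustrative four-symbol distribution:

```python
import math

# Shannon code sketch: symbol x gets codeword length ceil(log2(1/p(x))).
# These lengths satisfy the Kraft inequality, so a prefix code exists, and
# the expected length L obeys H(X) <= L < H(X) + 1.
# The distribution p is a made-up example.
p = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}

H = -sum(q * math.log2(q) for q in p.values())               # entropy, bits
L = sum(q * math.ceil(math.log2(1 / q)) for q in p.values()) # expected length

print(f"entropy H(X)                 = {H:.3f} bits/symbol")
print(f"Shannon code expected length = {L:.3f} bits/symbol")
assert H <= L < H + 1  # the two-sided bound from the theorem
```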
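The log-likelihood ratio test used in the hypothesis-testing bound compares the LLR of the observed sample against a threshold; by the Neyman-Pearson lemma this family of tests achieves the optimal trade-off between the two error probabilities. A Monte Carlo sketch for two made-up distributions on a binary alphabet (the threshold 0 is just a neutral choice):

```python
import math
import random

# Log-likelihood ratio test for two simple hypotheses over {0, 1}:
#   H0: X_i ~ P     H1: X_i ~ Q      (P, Q are illustrative assumptions)
# Reject H0 when log2 Q(x^n)/P(x^n) exceeds the threshold.
P = {0: 0.7, 1: 0.3}
Q = {0: 0.4, 1: 0.6}

def llr(xs):
    """log2 Q(x^n)/P(x^n) for an observed sample x^n."""
    return sum(math.log2(Q[x] / P[x]) for x in xs)

def sample(dist, n, rng):
    return rng.choices([0, 1], weights=[dist[0], dist[1]], k=n)

rng = random.Random(1)
n, trials, threshold = 100, 1000, 0.0
type1 = sum(llr(sample(P, n, rng)) > threshold for _ in range(trials)) / trials
type2 = sum(llr(sample(Q, n, rng)) <= threshold for _ in range(trials)) / trials
print(f"estimated type-I error  ~ {type1:.3f}")  # rejected H0, but H0 true
print(f"estimated type-II error ~ {type2:.3f}")  # kept H0, but H1 true
```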
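The first strong-converse proof is, at heart, a counting argument: a code of rate below the entropy can cover only a vanishing amount of probability, because every high-probability (typical) sequence costs roughly 2^{-nH(X)} of probability mass. In sketch form:

```latex
% Counting sketch of the strong converse: let B_n \subseteq \mathcal{X}^n be
% any set with |B_n| \le 2^{nR} and R < H(X), and let T_\delta be the typical
% set, with \Pr[T_\delta^c] = \varepsilon_n \to 0. Each x^n \in T_\delta has
% probability at most 2^{-n(H(X)-\delta)}, so
P^n(B_n) \;\le\; P^n(B_n \cap T_\delta) + P^n(T_\delta^c)
\;\le\; 2^{nR}\,2^{-n(H(X)-\delta)} + \varepsilon_n \;\longrightarrow\; 0
\quad\text{whenever } \delta < H(X) - R.
```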
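Finally, the minimax bound combines its two stated ingredients: Markov's inequality reduces estimation risk to a testing error over a finite, well-separated family of parameters, and Fano's inequality bounds that testing error from below. One common form of the resulting display (a sketch, with V uniform over a 2-delta-separated set of points theta_1, ..., theta_M and X the observed data):

```latex
% Markov's inequality turns squared risk into an error probability, and
% Fano's inequality lower-bounds the error of the induced M-ary test:
\inf_{\hat\theta}\,\sup_{\theta}\,
  \mathbb{E}_\theta\bigl[\|\hat\theta-\theta\|^2\bigr]
\;\ge\; \delta^2 \inf_{\hat\theta}\,\max_{j}\,
  \Pr_{\theta_j}\bigl[\|\hat\theta-\theta_j\| \ge \delta\bigr]
\;\ge\; \delta^2\Bigl(1-\frac{I(V;X)+\log 2}{\log M}\Bigr).
```

The 2-delta separation matters: it guarantees that whenever the estimate lands within delta of the true point, the nearest-point test recovers the correct hypothesis, which is what licenses the reduction in the first inequality.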