Entropic Compression - Lesson Summary


Entropic and Universal Compression
Entropic compression exploits the distribution of the source symbols to represent a sequence of symbols more compactly. The distribution is either imported as a given model for the sequence or learned from the data itself. Learning the distribution from the data is what is called universal compression.

Variable-length Code
A variable-length code is a code that maps source symbols to codewords of varying numbers of bits, so that more probable symbols can receive shorter codewords. It allows a source to be compressed and decompressed with zero error.
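A minimal sketch of such a code, using a hypothetical four-symbol alphabet and codebook (the symbols and codewords below are illustrative, not from the lesson):

```python
# A toy variable-length code: frequent symbols get shorter codewords.
code = {"a": "0", "b": "10", "c": "110", "d": "111"}

def encode(symbols, code):
    """Concatenate the codeword of each source symbol."""
    return "".join(code[s] for s in symbols)

print(encode("abad", code))  # → "0100111"
```

Note that "a", the symbol assumed most frequent here, costs one bit while "d" costs three.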

Prefix-Free Codes
Prefix-free codes are those in which no codeword is a prefix of another. They matter in practice when compressing a sequence of symbols rather than a single symbol, because the concatenated codewords can then be decoded unambiguously without separators.
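This can be sketched in code: a check of the prefix-free property, and a decoder that exploits it to split a bit string unambiguously. The codebook is the same hypothetical one as above, chosen for illustration:

```python
def is_prefix_free(codewords):
    """True if no codeword is a prefix of another."""
    for a in codewords:
        for b in codewords:
            if a != b and b.startswith(a):
                return False
    return True

def decode(bits, code):
    """Scan left to right; prefix-freeness guarantees that the first
    codeword matched is the only possible one, so no backtracking is needed."""
    inverse = {w: s for s, w in code.items()}
    out, buf = [], ""
    for bit in bits:
        buf += bit
        if buf in inverse:
            out.append(inverse[buf])
            buf = ""
    return "".join(out)

code = {"a": "0", "b": "10", "c": "110", "d": "111"}
print(is_prefix_free(code.values()))  # → True
print(decode("0100111", code))        # → "abad"
```

By contrast, a code like {"0", "01"} is not prefix-free, and a decoder reading "01" could not tell one symbol from two.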

Kraft’s Inequality
Kraft's inequality constrains the codeword lengths of a prefix-free code: lengths l_1, ..., l_n are achievable by some prefix-free code if and only if the sum of 2^(-l_i) over all codewords is at most 1. If the lengths violate the inequality, no prefix-free code with those lengths exists.
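The check itself is a one-line sum; the length sets below are illustrative examples:

```python
def kraft_sum(lengths):
    """Sum of 2^(-l) over the codeword lengths.
    A prefix-free code with these lengths exists iff the sum is <= 1."""
    return sum(2 ** -l for l in lengths)

print(kraft_sum([1, 2, 3, 3]))  # 0.5 + 0.25 + 0.125 + 0.125 = 1.0 → feasible
print(kraft_sum([1, 1, 2]))     # 0.5 + 0.5 + 0.25 = 1.25 → no prefix-free code
```

The lengths {1, 2, 3, 3} meet the bound with equality, and indeed the code {0, 10, 110, 111} realizes them.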

Shannon and Huffman Code
Shannon Code is a lossless data compression technique for constructing a prefix-free code based on a set of symbols and their probabilities.
A Huffman code is a particular type of optimal prefix-free code that is commonly used for lossless data compression.
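A minimal sketch of Huffman's construction, which repeatedly merges the two least probable nodes; the four-symbol distribution is a hypothetical example:

```python
import heapq
from itertools import count

def huffman(probs):
    """Build a Huffman code by repeatedly merging the two least
    probable nodes and prepending a bit to each side's codewords."""
    tiebreak = count()  # breaks probability ties so dicts are never compared
    heap = [(p, next(tiebreak), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

code = huffman({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print(sorted(len(w) for w in code.values()))  # → [1, 2, 3, 3]
```

For this dyadic distribution the codeword lengths equal -log2(p) exactly, so the average code length matches the entropy.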

Minimax Redundancy
The minimax redundancy is the ultimate benchmark of performance to which any code should aspire: the smallest worst-case redundancy achievable over the class of possible source distributions.

Compression Using Word Frequencies
Compression using word frequencies treats the text as a sequence of words rather than individual characters. A book is then a sequence of words, and the empirical frequencies of those words are estimated and stored so that the more frequent words can be given shorter codewords.
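The first step, counting word frequencies, can be sketched as follows (the sample text is hypothetical; the counts would then feed an entropy coder such as a Huffman code over words):

```python
from collections import Counter

text = "the cat sat on the mat the cat"  # hypothetical sample text
words = text.split()
freqs = Counter(words)

# Frequent words would receive short codewords by an entropy coder.
print(freqs.most_common(2))  # → [('the', 3), ('cat', 2)]

# Empirical probabilities, as a distribution over the word alphabet.
total = len(words)
probs = {w: n / total for w, n in freqs.items()}
```

Working at the word level typically yields a much more skewed distribution than the character level, which is exactly what entropic compression exploits.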