- Example: The binary sequence
1111001001100111000010111011000
can be decoded as
"wethepeople", i.e., "we the people" written without spaces.
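The decoding step above can be sketched in code. The code table from the preceding Huffman coding example is not reproduced on this page, so the table below is a hypothetical prefix code for the same 7 characters, used only to illustrate how unambiguous bit-by-bit decoding works.

```python
# Hypothetical prefix code for the 7 characters -- NOT the table from the
# preceding Huffman coding example, just a valid prefix code like it.
CODES = {
    'e': '00', 't': '01', 'w': '100', 'h': '101',
    'p': '110', 'o': '1110', 'l': '1111',
}
DECODE = {bits: ch for ch, bits in CODES.items()}

def encode(text):
    # Concatenate the codewords; no separators are needed.
    return ''.join(CODES[ch] for ch in text)

def decode(bits):
    # Read bits until the buffer matches a codeword; because no codeword
    # is a prefix of another, the first match is always the right one.
    out, buf = [], ''
    for b in bits:
        buf += b
        if buf in DECODE:
            out.append(DECODE[buf])
            buf = ''
    return ''.join(out)

encoded = encode('wethepeople')
print(encoded)
print(decode(encoded))
```

Because the code is prefix-free, the decoder never needs to backtrack: each codeword boundary is determined as soon as it is reached.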
- Encoding all occurrences of the 7 characters we have considered
requires, on average, 2.54 bits per character.
- Information theorists have shown that at least
H = - sum_i p_i log2(p_i)
bits per character are required on average for representing the 7 characters
in our example, where p_i is the probability of the i-th character.
- The quantity H is called the entropy; it is a fundamental limit:
no encoding method can use fewer bits per character on average.
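The character probabilities used in the lecture are not reproduced here; as a stand-in, the sketch below computes the entropy from the empirical frequencies of the 11-character string "wethepeople" itself, which comes out near 2.55 bits per character.

```python
from collections import Counter
from math import log2

text = "wethepeople"
counts = Counter(text)        # e: 4, p: 2, and w, t, h, o, l once each
n = len(text)

# Entropy: H = -sum_i p_i * log2(p_i), with p_i the relative frequency.
H = -sum((c / n) * log2(c / n) for c in counts.values())
print(round(H, 2))  # -> 2.55
```

These are stand-in frequencies, not the lecture's; with the lecture's probabilities the corresponding figure is the 2.54 bits per character quoted above.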
- Huffman coding achieves efficiencies very close to the entropy H.
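To illustrate how close Huffman coding gets to the entropy, here is a minimal sketch of the standard greedy construction (repeatedly merging the two least-frequent subtrees), again using the empirical frequencies of "wethepeople" as stand-in data. The resulting average code length lands between H and H + 1 bits per character.

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    """Build a Huffman code by repeatedly merging the two least-frequent
    subtrees (via a heap). Returns {symbol: bitstring}."""
    # Heap entries: (frequency, unique tiebreak id, {symbol: code-so-far}).
    heap = [(f, i, {s: ''}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        # Prepend a bit to every codeword in each merged subtree.
        merged = {s: '0' + b for s, b in c1.items()}
        merged.update({s: '1' + b for s, b in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

freqs = Counter("wethepeople")   # stand-in frequencies, not the lecture's
codes = huffman_code(freqs)
n = sum(freqs.values())
avg = sum(freqs[s] * len(codes[s]) for s in freqs) / n
print(avg)   # average code length in bits per character
```

For these frequencies the average comes out to 29/11, about 2.64 bits per character, just above the entropy of roughly 2.55 computed from the same frequencies.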
- The entropy of the entire Constitution is 4.1 bits per character.
- ASCII encoding uses 8 bits per character.
- Hence, when we use Huffman coding on the entire Constitution, we can expect
the compressed text to be roughly half the size of the original (4.1/8, or
approximately 50% compression).
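The 50% figure follows directly from the ratio of the entropy to the fixed ASCII cost; a quick check of the arithmetic, using the figures quoted above:

```python
entropy_bits = 4.1   # bits per character for the Constitution (quoted above)
ascii_bits = 8.0     # fixed-length ASCII cost per character

ratio = entropy_bits / ascii_bits
print(f"compressed size is about {ratio:.0%} of the original")
```

The exact ratio is 0.5125, so the compressed text is about 51% of the original size, i.e., roughly 50% compression.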
Prof. Bernd-Peter Paris
1998-12-14