TCOM 500: Modern Telecommunications
Prof. B.-P. Paris
Homework 4
Due: February 25, 2010

Reading
Review the class notes.
Problems
  1. Huffman Coding: A source produces symbols with the following probabilities:
     P[X = 1] = 0.3,  P[X = 2] = 0.2,  P[X = 3] = 0.2,
     P[X = 4] = 0.1,  P[X = 5] = 0.1,  P[X = 6] = 0.1.

    1. Compute the entropy of this source.
    2. Construct a Huffman code for this source.
    3. What is the average code length of the code? (A Python self-check sketch appears at the end of this assignment.)
  2. Huffman Coding with Multiple Symbols: A biased coin produces heads and tails with probabilities P[X = H] = 0.6 and P[X = T] = 0.4, respectively.
    1. Compute the entropy of this source.
    2. Construct a Huffman code and find the average code length when one symbol at a time is considered.
    3. Construct a Huffman code and find the average code length when two symbols at a time are considered.
    4. Repeat parts (a)-(c) when the probabilities are P[X = H] = 0.9 and P[X = T] = 0.1, respectively.
    5. Is pairwise Huffman coding more beneficial when the difference between source symbol probabilities is large or small? (A pairwise-coding sketch appears at the end of this assignment.)
  3. Lempel-Ziv Coding: Using the LZW-coding algorithm discussed in class, you are to encode the message “MISSISSIPPI” under the assumption that the initial dictionary contains:

        Index   Phrase
          1     I
          2     M
          3     P
          4     S

    Show both the transmitted sequence of indices and the contents of the dictionary. (An encoder sketch appears at the end of this assignment.)

  4. Lempel-Ziv Decoding: You receive the LZW-encoded sequence of indices “2 1 3 5 1”; the initial dictionary contains:

        Index   Phrase
          1     A
          2     B
          3     N

    Find the original message. Also determine the contents of the dictionary. (A decoder sketch appears at the end of this assignment.)
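
Self-Check Sketches (Python)
The sketches below are editorial additions for checking your answers, not part of the course material; all function names and code organization are assumptions of mine. For Problem 1, this minimal sketch computes the source entropy, H = -sum_i p_i log2(p_i), and builds a Huffman code by repeatedly merging the two least probable nodes:

    import heapq
    import math

    def entropy(probs):
        """Entropy in bits per symbol: H = -sum(p * log2(p))."""
        return -sum(p * math.log2(p) for p in probs)

    def huffman_code(probs):
        """Return {symbol: codeword}, built by repeatedly merging the
        two least probable nodes (the standard Huffman procedure)."""
        # Heap entries: (probability, tie-breaker, {symbol: partial codeword}).
        heap = [(p, i, {i + 1: ""}) for i, p in enumerate(probs)]
        heapq.heapify(heap)
        count = len(heap)
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)   # least probable node
            p2, _, c2 = heapq.heappop(heap)   # second least probable node
            # Prepend one bit to every codeword in each merged subtree.
            merged = {s: "0" + w for s, w in c1.items()}
            merged.update({s: "1" + w for s, w in c2.items()})
            count += 1
            heapq.heappush(heap, (p1 + p2, count, merged))
        return heap[0][2]

    probs = [0.3, 0.2, 0.2, 0.1, 0.1, 0.1]
    code = huffman_code(probs)
    avg_len = sum(p * len(code[s + 1]) for s, p in enumerate(probs))
    print(f"H = {entropy(probs):.4f} bits, average length = {avg_len:.2f} bits")

Note that Huffman codeword assignments are not unique (ties may be broken either way), but every valid Huffman code for a source has the same average length.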
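
For Problem 2, the average code length can be computed without tracking codewords: it equals the sum of the probabilities of the internal (merged) nodes of the Huffman tree, since each symbol contributes its probability once for every internal node above it. This sketch, which assumes independent coin flips so that pair probabilities multiply, compares single-symbol and pairwise coding for both coins:

    import heapq
    import math
    from itertools import product

    def huffman_avg_length(probs):
        """Average codeword length, computed as the sum of the
        merged-node probabilities created while building the tree."""
        heap = list(probs)
        heapq.heapify(heap)
        total = 0.0
        while len(heap) > 1:
            merged = heapq.heappop(heap) + heapq.heappop(heap)
            total += merged
            heapq.heappush(heap, merged)
        return total

    for p_heads in (0.6, 0.9):
        single = [p_heads, 1 - p_heads]
        pairs = [a * b for a, b in product(single, repeat=2)]  # i.i.d. flips
        h = -sum(p * math.log2(p) for p in single)
        l1 = huffman_avg_length(single)       # bits per source symbol
        l2 = huffman_avg_length(pairs) / 2    # bits per source symbol
        print(f"P[X = H] = {p_heads}: H = {h:.4f}, "
              f"single = {l1:.3f}, pairwise = {l2:.3f}")

Comparing the two printed lines should make the answer to part (e) apparent.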
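
For Problem 3, a minimal LZW encoder under the usual textbook conventions (which may differ in small details from the version discussed in class): grow the current phrase while it remains in the dictionary, then transmit the index of the longest match and add the extended phrase as a new entry.

    def lzw_encode(message, initial_dict):
        """LZW encoding; returns (list of indices, final dictionary)."""
        index = {phrase: i for i, phrase in initial_dict.items()}
        next_index = max(initial_dict) + 1
        output, w = [], ""
        for c in message:
            if w + c in index:
                w += c                      # keep extending the match
            else:
                output.append(index[w])     # transmit longest match
                index[w + c] = next_index   # add new dictionary entry
                next_index += 1
                w = c
        output.append(index[w])             # flush the final phrase
        return output, index

    initial = {1: "I", 2: "M", 3: "P", 4: "S"}
    indices, final_dict = lzw_encode("MISSISSIPPI", initial)
    print("indices:", indices)
    for phrase, i in sorted(final_dict.items(), key=lambda kv: kv[1]):
        print(i, phrase)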
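
For Problem 4, the matching decoder, in the same assumed LZW variant. Note that index 5 is not in the initial dictionary; it is created during decoding, one step before it is used. The else branch below covers the still trickier case where a received index refers to the entry being created by the current step (the phrase is then the previous phrase plus its own first character).

    def lzw_decode(indices, initial_dict):
        """LZW decoding; returns (message, final dictionary)."""
        dictionary = dict(initial_dict)    # do not mutate the caller's dict
        next_index = max(dictionary) + 1
        prev = dictionary[indices[0]]
        message = prev
        for i in indices[1:]:
            if i in dictionary:
                current = dictionary[i]
            else:                          # index defined by this very step
                current = prev + prev[0]
            dictionary[next_index] = prev + current[0]
            next_index += 1
            message += current
            prev = current
        return message, dictionary

    initial = {1: "A", 2: "B", 3: "N"}
    message, final_dict = lzw_decode([2, 1, 3, 5, 1], initial)
    print("message:", message)
    for i in sorted(final_dict):
        print(i, final_dict[i])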