

ECE 630: Statistical Communication Theory
Prof. B.-P. Paris
Homework 5
Due: March 5, 2019

Reading
Madhow: Section 3.2.
Problems
  1. Let x and y be elements of a normed linear vector space.
    1. Determine whether the following are valid inner products for the indicated space.
      1. ⟨x,y⟩ = xᵀAy, where A is a nonsingular N×N matrix and x, y are elements of the space of N-dimensional vectors.
      2. ⟨x,y⟩ = xyᵀ, where x and y are elements of the space of N-dimensional (column!) vectors.
      3. ⟨x,y⟩ = ∫₀ᵀ x(t)y(T−t) dt, where x and y are finite-energy signals defined over [0,T].
      4. ⟨x,y⟩ = ∫₀ᵀ w(t)x(t)y(t) dt, where x and y are finite-energy signals defined over [0,T] and w(t) is a non-negative function.
      5. ⟨X,Y⟩ = E[XY], where X and Y are real-valued random variables having finite mean-square values.
      6. ⟨X,Y⟩ = Cov(X,Y), the covariance of the real-valued random variables X and Y. Assume that X and Y have finite mean-square values.
    2. Under what conditions is

      ∫₀ᵀ ∫₀ᵀ Q(t,u) x(t) y(u) dt du

      a valid inner product for the space of finite-energy functions defined over [0,T]?
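The symmetry axiom for the first candidate, ⟨x,y⟩ = xᵀAy, can be probed numerically; since xᵀAy = yᵀAᵀx, symmetry hinges on whether A = Aᵀ. The sketch below is a hypothetical illustration (not part of the assignment); the matrix and vectors are arbitrary random choices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))   # generic nonsingular, non-symmetric A
x = rng.standard_normal(3)
y = rng.standard_normal(3)

lhs = x @ A @ y                   # <x, y>
rhs = y @ A @ x                   # <y, x>
print(np.isclose(lhs, rhs))      # generally False for non-symmetric A

# With A symmetric positive definite, symmetry and positivity both hold:
B = A @ A.T + 3 * np.eye(3)       # one way to build an SPD matrix
print(np.isclose(x @ B @ y, y @ B @ x))   # symmetry holds
print(x @ B @ x > 0)                      # positivity holds for x != 0
```

Nonsingularity alone is therefore not enough for xᵀAy to satisfy the inner-product axioms.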

  2. Let x(t) be a signal of finite energy over the interval [0,T]. In other words, x(t) is a vector in the Hilbert space L²(0,T). Signals may be complex valued, so that the appropriate inner product is

    ⟨x,y⟩ = ∫₀ᵀ x*(t)·y(t) dt.

    Consider the subspace L of L²(0,T) that consists of signals of the form

    yₙ(t) = Xₙ exp(j2πnt∕T) for 0 ≤ t ≤ T,

    where Xₙ may be complex valued.

    1. Find the signal ŷₙ(t) that best approximates the signal x(t), i.e., ŷₙ(t) minimizes ∥x − yₙ∥ among all elements of L.
      Hint: Find the best complex amplitude X̂ₙ.
    2. Now define the error signal z(t) = x(t) − ŷₙ(t). Show that z(t) is orthogonal to the subspace L, i.e., it is orthogonal to all elements of L.
    3. How do the above results illustrate the projection theorem?
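The projection in this problem can be checked numerically by approximating the inner product with a Riemann sum. The sketch below is an illustration under assumed choices (the signal x(t), T = 1, and n = 2 are all arbitrary), not the analytical solution the problem asks for:

```python
import numpy as np

# Discretize [0, T) and approximate <u, v> = integral of u*(t) v(t) dt.
T, n, N = 1.0, 2, 10000
t = np.linspace(0.0, T, N, endpoint=False)
dt = T / N

x = np.cos(2 * np.pi * 3 * t / T) + 0.5 * t        # example signal (assumed)
phi = np.exp(1j * 2 * np.pi * n * t / T)           # basis signal of L

def inner(u, v):
    # <u, v> with conjugation on the first argument, discretized
    return np.sum(np.conj(u) * v) * dt

X_hat = inner(phi, x) / inner(phi, phi)   # best complex amplitude
y_hat = X_hat * phi                        # projection of x onto L
z = x - y_hat                              # error signal

print(abs(inner(phi, z)) < 1e-9)          # True: z is orthogonal to phi
```

By construction ⟨φ, z⟩ = ⟨φ, x⟩ − X̂ₙ⟨φ, φ⟩ = 0, which is exactly the orthogonality property the projection theorem guarantees.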
  3. Linear Regression
    The elements of a vector of random variables ⃗Y follow the model

    Yₙ = a xₙ + b + Nₙ,

    where the xₙ are known and the Nₙ are zero-mean, iid Gaussian noise samples with variance σ². The parameters a and b are to be determined. We can think of the solution to this problem as the projection of ⃗Y onto the subspace spanned by ⃗x and the all-ones vector ⃗1, i.e., vectors of the form a⃗x + b⃗1.

    1. Determine the least-squares estimates for a and b, i.e., find

      â, b̂ = argmin_{a,b} ∥⃗Y − (a⃗x + b⃗1)∥².

    2. What are the expected values of these estimates, E[â] and E[b̂]?
    3. Compute â and b̂ when the data are given by the (xₙ, Yₙ) pairs

      {(xₙ, Yₙ)}ₙ₌₁⁵ = {(0, 1.3), (1, 0.2), (2, 0.1), (3, −0.4), (4, −1.2)}.

    4. Is it true that the least-squares estimates for a and b are given by the inner products

      â = ⟨⃗Y, ⃗x⟩ and b̂ = ⟨⃗Y, ⃗1⟩,

      where ⃗1 denotes a vector of 1's? Explain why or why not.
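A hand calculation for part 3 can be checked numerically: stacking ⃗x and ⃗1 as columns of a matrix H, the least-squares estimates are the solution of min ∥⃗Y − Hθ∥², which NumPy solves directly. This sketch uses the data given above:

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
Y = np.array([1.3, 0.2, 0.1, -0.4, -1.2])

# Columns of H span the subspace of vectors a*x + b*1.
H = np.column_stack([x, np.ones_like(x)])
(a_hat, b_hat), *_ = np.linalg.lstsq(H, Y, rcond=None)
print(round(a_hat, 4), round(b_hat, 4))   # -0.56 1.12

# Relevant to part 4: the raw inner product <Y, x> does not equal a-hat,
# since x and 1 are neither orthogonal nor unit-norm.
print(np.isclose(Y @ x, a_hat))           # False
```

The `lstsq` call solves the normal equations Hᵀ H θ = Hᵀ ⃗Y, which is the algebraic form of projecting ⃗Y onto the column space of H.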