
Homework 2

ECE 630: Statistical Communication Theory
Prof. B.-P. Paris
Homework 2
Due Feb. 4, 2003
Reading
Wozencraft & Jacobs: Chapter 2, pages 58-114.
Problems
  1. Wozencraft & Jacobs: Problem 2.30
  2. Let $\underline{X}$ be a zero mean Gaussian random vector with covariance matrix $K$.

    \begin{displaymath}
K = \left[ \begin{array}{ccc} 3 & 3 & 0 \\ 3 & 5 & 0 \\ 0 & 0 & 6
\end{array} \right]
\end{displaymath}

    1. Give an expression for the density function $f_{\underline{X}}(x)$.
    2. If $Y = X_1 + 2X_2 - X_3$, find $f_Y(y)$.
    3. If the vector $\underline{Z}$ has components defined by

      \begin{displaymath}
\begin{array}{ccl}
Z_1 & = & 5X_1 - 3X_2 - X_3 \\
Z_2 & = & -X_1 + 3X_2 - X_3 \\
Z_3 & = & X_1 + X_3
\end{array}\end{displaymath}

      determine $f_{\underline{Z}}(\underline{z})$. What are the properties of the new random vector?
    4. Determine $f_{X_1 \vert X_2}(x_1 \vert x_2 = \beta)$.

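The covariance algebra behind parts 2 and 3 above can be checked numerically. The following sketch (Python/NumPy; the matrix $K$ and the linear-combination weights come directly from the problem statement) computes the variance of $Y = X_1 + 2X_2 - X_3$ as $\underline{a}^T K \underline{a}$ and the covariance of $\underline{Z} = B\underline{X}$ as $B K B^T$:

```python
import numpy as np

# Covariance matrix K from the problem statement.
K = np.array([[3.0, 3.0, 0.0],
              [3.0, 5.0, 0.0],
              [0.0, 0.0, 6.0]])

# Y = X1 + 2*X2 - X3 is the linear combination a^T X of a zero-mean
# Gaussian vector, so Y is zero-mean Gaussian with variance a^T K a.
a = np.array([1.0, 2.0, -1.0])
var_Y = a @ K @ a
print(var_Y)

# Z = B X with the B given in part 3, so K_Z = B K B^T and Z is again
# a zero-mean Gaussian vector.
B = np.array([[ 5.0, -3.0, -1.0],
              [-1.0,  3.0, -1.0],
              [ 1.0,  0.0,  1.0]])
K_Z = B @ K @ B.T
print(K_Z)
```

Inspecting `K_Z` (in particular its off-diagonal entries) answers the question about the properties of $\underline{Z}$.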
  3. Characteristic Function of Gaussian Random Variables
    1. Show that the characteristic function of a Gaussian random variable $X$ having mean $m_X$ and variance $\sigma_X^2$ is

      \begin{displaymath}
M_X(j\nu) = \exp(j \nu m_X - \frac{1}{2} \nu^2 \sigma_X^2).
\end{displaymath}

    2. Find the characteristic function of a Gaussian random vector having mean $\underline{m}$ and covariance matrix $K$.
    3. Use this result to show that the components of a Gaussian random vector are Gaussian.
    4. Show that the n-th central moment of a Gaussian random variable is given by

      \begin{displaymath}
\mbox{\bf E}[(X-m_X)^n] = \left\{
\begin{array}{cl}
1 \cdot 3 \cdot 5 \cdots (n-1) \cdot \sigma_X^n & \mbox{n even} \\
0 & \mbox{n odd}
\end{array} \right.
\end{displaymath}

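The closed form in part 4 can be checked numerically. The sketch below (Python/NumPy; $\sigma_X = 2$ is an arbitrary example value) evaluates the double-factorial formula and compares it against a direct numerical integration of $x^n f_X(x)$ for a zero-mean Gaussian:

```python
import numpy as np

# E[(X - m_X)^n] = 1*3*5*...*(n-1) * sigma^n for even n, and 0 for odd n.
def gaussian_central_moment(n, sigma):
    if n % 2 == 1:
        return 0.0
    result = sigma ** n
    for k in range(1, n, 2):   # product 1 * 3 * 5 * ... * (n - 1)
        result *= k
    return result

# Check against direct numerical integration of x^n * f(x) on a fine grid
# (sigma = 2 is just an example value; the grid spans +/- 12 sigma).
sigma = 2.0
x = np.linspace(-12 * sigma, 12 * sigma, 200001)
dx = x[1] - x[0]
pdf = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
for n in range(1, 7):
    numeric = np.sum(x**n * pdf) * dx
    print(n, gaussian_central_moment(n, sigma), numeric)
```

For example, $n = 4$ gives $3\sigma_X^4$, the familiar fourth moment of a Gaussian.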
  4. We are interested in the values of two random variables $X$ and $Y$; however, we can only observe the value of $Y$. We wish to estimate (i.e., guess intelligently) the value of $X$ by using a wisely chosen function $g(Y)$ of the observed value $Y=y$. Let $\hat{X}$ denote this estimate, $\hat{X}=g(y)$.
    1. Show that the mean-square estimation error

      \begin{displaymath}
\epsilon = \mbox{\bf E}[(X-\hat{X})^2]
\end{displaymath}

      is minimized by choosing $g(y)=\mbox{\bf E}[X \vert Y=y]$, where $\mbox{\bf E}[X \vert
Y=y]$ denotes the conditional expected value of $X$ given the observation of $Y=y$,

      \begin{displaymath}
\mbox{\bf E}[X \vert Y=y] = \int_{-\infty}^{\infty} x f_{X\vert Y}(x\vert Y=y) dx.
\end{displaymath}

    2. Let $X$ and $Y$ be jointly distributed, zero mean Gaussian random variables with variances $\sigma_X^2$ and $\sigma_Y^2$ and correlation coefficient $\rho$. Show that the conditional expected value of $X$ given $Y=y$ is a Gaussian random variable with mean $\rho \cdot (\sigma_X /
\sigma_Y)\cdot y$ and variance $(1-\rho^2)\cdot \sigma_X^2$.
    3. We wish to estimate $X$ based on the observation of $Y=y$; however, we wish to restrict the estimate to be linear:

      \begin{displaymath}
\hat{X} = ay + b.
\end{displaymath}

      Find the values of $a$ and $b$ that minimize the mean-square estimation error.
    4. Of the estimators defined in part (a) and part (c) of this problem, which do you think will be ``better'' in general? Which do you think will be easier to compute? Give reasons for your answers.
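The estimators in this problem can be explored by simulation. The sketch below (Python/NumPy; $\sigma_X = 2$, $\sigma_Y = 3$, $\rho = 0.6$ are arbitrary example values) fits the linear estimator $\hat{X} = aY + b$ from samples via $a = \mathrm{Cov}(X,Y)/\mathrm{Var}(Y)$, $b = \mbox{\bf E}[X] - a\,\mbox{\bf E}[Y]$, and measures its mean-square error; for jointly Gaussian zero-mean $X$ and $Y$ this should match $a = \rho\,\sigma_X/\sigma_Y$, $b = 0$, and MSE $(1-\rho^2)\sigma_X^2$ from part 2:

```python
import numpy as np

# Example values (not from the problem statement).
sigma_X, sigma_Y, rho = 2.0, 3.0, 0.6

# Draw jointly Gaussian, zero-mean samples with correlation rho.
rng = np.random.default_rng(0)
cov = np.array([[sigma_X**2, rho * sigma_X * sigma_Y],
                [rho * sigma_X * sigma_Y, sigma_Y**2]])
X, Y = rng.multivariate_normal([0.0, 0.0], cov, size=200000).T

# Linear MMSE fit: a = Cov(X,Y)/Var(Y), b = E[X] - a*E[Y],
# here estimated from the samples.
a = np.cov(X, Y)[0, 1] / np.var(Y)
b = X.mean() - a * Y.mean()
print(a, b)   # close to rho*sigma_X/sigma_Y = 0.4 and 0

# Mean-square error of the fitted linear estimator.
mse_linear = np.mean((X - (a * Y + b))**2)
print(mse_linear)   # close to (1 - rho^2)*sigma_X^2 = 2.56
```

Since $X$ and $Y$ here are jointly Gaussian, the conditional-mean estimator of part (a) is itself linear, so the two estimators coincide in this simulation; the interesting comparison arises for non-Gaussian distributions.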


Dr. Bernd-Peter Paris
2003-05-01