Homework Assignment

HW 32 — Comparing ML and MMSE Estimators (MBIP Fig. 2.1)

📘 Related: Lesson 32 🛠 MATLAB required

📖 Background

The goal of this assignment is to compare the ML and MMSE estimators on a correlated signal buried in white Gaussian noise.

The unknown signal \(X \in \mathbb{R}^p\) is modeled as a Gaussian random vector with zero mean and covariance matrix \(R_x\), where the \((i,j)\) entry is:

\[ [R_x]_{i,j} = \frac{1}{1 + \left|\dfrac{i-j}{10}\right|^2} \]

This covariance models a smooth signal: nearby components are strongly correlated and distant components are weakly correlated. The observation is \(Y = X + W\), where \(W \sim \mathcal{N}(0, \sigma_w^2 I)\) is independent white Gaussian noise with \(\sigma_w = 0.75\) and \(p = 50\).

The two estimators are:

\[ \hat X_{\mathrm{ML}} = Y \qquad \text{(just the noisy observation)} \]

\[ \hat X_{\mathrm{MMSE}} = R_x (R_x + R_w)^{-1} Y, \qquad \text{where } R_w = \sigma_w^2 I \]

Tasks

Part 1 — Generate the Data

  1. Build the covariance matrix \(R_x\). Construct the \(50 \times 50\) matrix with entries \([R_x]_{i,j} = 1/(1 + |(i-j)/10|^2)\) for \(i, j = 1, \ldots, 50\).
  2. Generate a realization of \(X\). Draw a single random vector \(X \sim \mathcal{N}(0, R_x)\) using MATLAB's mvnrnd function. Transpose as needed so that \(X\) is a column vector.
  3. Generate noise and observation. Generate \(W \sim \mathcal{N}(0, \sigma_w^2 I)\) and compute \(Y = X + W\).
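For reference, the three steps above can be sketched in Python/NumPy (the assignment itself asks for MATLAB; `default_rng().multivariate_normal` plays the role of `mvnrnd`, and the seed and variable names are my own choices):

```python
import numpy as np

p = 50
sigma_w = 0.75

# Covariance matrix with entries [R_x]_{i,j} = 1 / (1 + |(i - j)/10|^2)
idx = np.arange(p)
i, j = np.meshgrid(idx, idx, indexing="ij")
Rx = 1.0 / (1.0 + np.abs((i - j) / 10.0) ** 2)

rng = np.random.default_rng(0)                 # fixed seed for reproducibility
X = rng.multivariate_normal(np.zeros(p), Rx)   # one realization, X ~ N(0, R_x)
W = sigma_w * rng.standard_normal(p)           # W ~ N(0, sigma_w^2 I)
Y = X + W                                      # noisy observation
```

Note that entries at index distance 10 equal \(1/(1+1) = 0.5\), which is a quick sanity check on the constructed matrix.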

Part 2 — Compute the Estimates

  1. Compute the ML estimate: \(\hat X_{\mathrm{ML}} = Y\).
  2. Compute the MMSE estimate: \(\hat X_{\mathrm{MMSE}} = R_x(R_x + \sigma_w^2 I)^{-1} Y\).
    Hint: Avoid forming the inverse explicitly with inv(). For numerical stability, use MATLAB's slash or backslash operators: Xhat_mmse = R_x / (R_x + R_w) * Y, or equivalently R_x * ((R_x + R_w) \ Y).
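An equivalent NumPy sketch of the two estimates (illustrative only; variable names are mine). `np.linalg.solve` applies \((R_x + R_w)^{-1}\) to \(Y\) via a linear solve rather than an explicit inverse, mirroring the backslash hint:

```python
import numpy as np

# Same hypothetical setup as in Part 1
p, sigma_w = 50, 0.75
i, j = np.meshgrid(np.arange(p), np.arange(p), indexing="ij")
Rx = 1.0 / (1.0 + np.abs((i - j) / 10.0) ** 2)
rng = np.random.default_rng(0)
X = rng.multivariate_normal(np.zeros(p), Rx)
Y = X + sigma_w * rng.standard_normal(p)

# ML estimate: just the noisy observation itself
X_ml = Y

# MMSE estimate: R_x (R_x + sigma_w^2 I)^{-1} Y, computed via a linear solve
Rw = sigma_w**2 * np.eye(p)
X_mmse = Rx @ np.linalg.solve(Rx + Rw, Y)
```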

Part 3 — Plot the results

Create a single figure with four subplots arranged in a 2×2 grid:

  1. Top-left: Plot of the true signal \(X\). Title: "Original Signal, X".
  2. Top-right: Plot of the noisy observation \(Y\). Title: "Signal with Noise, Y".
  3. Bottom-left: Plot of the ML estimate \(\hat X_{\mathrm{ML}}\). Title: "ML Estimate of X".
  4. Bottom-right: Plot of the MMSE estimate \(\hat X_{\mathrm{MMSE}}\). Title: "MMSE Estimate of X".

Label the \(x\)-axis as "Index" and the \(y\)-axis as "Value" on all panels. Use the same \(y\)-axis limits on all panels for easy comparison.
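A matplotlib sketch of the 2×2 layout described above, assuming the data pipeline from the earlier parts (`sharey=True` enforces common \(y\)-axis limits across panels):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend
import matplotlib.pyplot as plt

# Same hypothetical setup as in Parts 1-2
p, sigma_w = 50, 0.75
i, j = np.meshgrid(np.arange(p), np.arange(p), indexing="ij")
Rx = 1.0 / (1.0 + np.abs((i - j) / 10.0) ** 2)
rng = np.random.default_rng(0)
X = rng.multivariate_normal(np.zeros(p), Rx)
Y = X + sigma_w * rng.standard_normal(p)
X_ml = Y
X_mmse = Rx @ np.linalg.solve(Rx + sigma_w**2 * np.eye(p), Y)

panels = [(X, "Original Signal, X"), (Y, "Signal with Noise, Y"),
          (X_ml, "ML Estimate of X"), (X_mmse, "MMSE Estimate of X")]
fig, axes = plt.subplots(2, 2, sharey=True, figsize=(10, 6))
for ax, (sig, title) in zip(axes.flat, panels):
    ax.plot(sig)
    ax.set_title(title)
    ax.set_xlabel("Index")
    ax.set_ylabel("Value")
fig.tight_layout()
fig.savefig("hw32_estimates.png")
```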

Part 4 — Compute MSE

Compute the mean squared error for both estimates:

\[ \text{MSE}_{\mathrm{ML}} = \frac{1}{p}\| \hat X_{\mathrm{ML}} - X \|^2, \qquad \text{MSE}_{\mathrm{MMSE}} = \frac{1}{p}\| \hat X_{\mathrm{MMSE}} - X \|^2 \]

Report both values. Which estimator has lower MSE for your realization?
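The MSE computation in NumPy, continuing the same hypothetical pipeline (for a typical realization the MMSE error comes out noticeably smaller, but your exact numbers will depend on the random draw):

```python
import numpy as np

# Same hypothetical setup as in Parts 1-2
p, sigma_w = 50, 0.75
i, j = np.meshgrid(np.arange(p), np.arange(p), indexing="ij")
Rx = 1.0 / (1.0 + np.abs((i - j) / 10.0) ** 2)
rng = np.random.default_rng(0)
X = rng.multivariate_normal(np.zeros(p), Rx)
Y = X + sigma_w * rng.standard_normal(p)
X_ml = Y
X_mmse = Rx @ np.linalg.solve(Rx + sigma_w**2 * np.eye(p), Y)

# MSE = (1/p) ||Xhat - X||^2, i.e. the per-component mean squared error
mse_ml = np.mean((X_ml - X) ** 2)
mse_mmse = np.mean((X_mmse - X) ** 2)
print(f"MSE_ML   = {mse_ml:.4f}")
print(f"MSE_MMSE = {mse_mmse:.4f}")
```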

Part 5 — Explanation

In a few sentences, explain why the MMSE estimate is typically more accurate than the ML estimate for this problem. Your answer should address what prior information the MMSE estimator uses that the ML estimator ignores, and how the correlation structure of \(R_x\) (a smooth signal with strongly correlated nearby components) makes that information useful for suppressing the noise.

💡 Hints