2024-10-21
\[ \def\Er{{\mathrm{E}}} \def\cov{{\mathrm{Cov}}} \def\var{{\mathrm{Var}}} \def\R{{\mathbb{R}}} \newcommand\norm[1]{\left\lVert#1\right\rVert} \def\rank{{\mathrm{rank}}} \]
Theorem: Gauss-Markov
If \(\Er[u] = 0\) and \(\Er[uu'] = \sigma^2 I_n\), then the best linear unbiased estimator (BLUE) of \(a'\beta\) is \(a'\hat{\beta}\), where \(\hat{\beta} = (X'X)^{-1} X'y\)
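A minimal numerical sketch of this estimator, assuming numpy and simulated data:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 100, 3
X = rng.normal(size=(n, k))
beta = np.array([1.0, -2.0, 0.5])
y = X @ beta + rng.normal(size=n)            # E[u] = 0, E[uu'] = I_n

# beta_hat = (X'X)^{-1} X'y, computed via a linear solve
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
a = np.array([1.0, 0.0, 1.0])
print(a @ beta_hat)                          # estimate of a'beta
```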
Definition
A subset \(L \subset V\) is a subspace if \(\forall x, y \in L\) and \(\forall \alpha, \beta \in \R\), \(\alpha x + \beta y \in L\)
Definition
Given a subspace \(L \subset V\) the orthogonal complement of \(L\) is \[ L^\perp = \{x \in V: x' l = 0 \,\forall l \in L\} \]
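A small sketch of computing an orthogonal complement, assuming \(L\) is given as the column span of a matrix \(A\), so that \(L^\perp\) is the null space of \(A'\) (numpy and scipy assumed):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])      # columns span L ⊂ R^3
B = null_space(A.T)             # columns span L^⊥ = {x : A'x = 0}
print(np.allclose(A.T @ B, 0))  # every basis vector of L^⊥ is orthogonal to L
```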
Lemma 1.1
Let \(L_1\) and \(L_2\) be subspaces of \(V\), then \[ (\underbrace{L_1 + L_2}_{\{l_1 + l_2 \in V: l_1 \in L_1, l_2 \in L_2\}})^\perp = L_1^\perp \cap L_2^\perp \] and \[ (L_1 \cap L_2)^\perp = L_1^\perp + L_2^\perp \]
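A numerical check of the second identity, assuming numpy/scipy and subspaces given as column spans; the shared first column forces a one-dimensional intersection:

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(1)
A1 = rng.normal(size=(6, 3))                          # L_1 = R(A1)
A2 = np.hstack([A1[:, :1], rng.normal(size=(6, 2))])  # L_2 shares one direction

# basis of L_1 ∩ L_2: solutions of A1 b1 = A2 b2
N = null_space(np.hstack([A1, -A2]))
Q, _ = np.linalg.qr(A1 @ N[:3, :])          # orthonormal basis of L_1 ∩ L_2

lhs = np.eye(6) - Q @ Q.T                   # projection onto (L_1 ∩ L_2)^⊥
B = np.hstack([null_space(A1.T), null_space(A2.T)])   # spans L_1^⊥ + L_2^⊥
U, s, _ = np.linalg.svd(B)
Ur = U[:, :(s > 1e-10).sum()]               # orthonormal basis of L_1^⊥ + L_2^⊥
print(np.allclose(lhs, Ur @ Ur.T))          # the two projections coincide
```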
Definition
\(P_L y \in L\) is the projection of \(y\) on \(L\) if \[ \norm{y - P_L y } = \inf_{w \in L} \norm{y - w} \]
Projection Theorem
\(P_L y\) exists, is unique, and is a linear function of \(y\)
For any \(y_1^* \in L\), \(y_1^* = P_L y\) iff \(y- y_1^* \perp L\)
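A sketch of both claims, with \(L\) the column span of a matrix and \(P_L y\) computed as a least-squares fit (numpy assumed):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(6, 2))        # L = R(X)
y = rng.normal(size=6)

# P_L y solves min_{w in L} ||y - w||, i.e. least squares over w = Xb
b, *_ = np.linalg.lstsq(X, y, rcond=None)
PLy = X @ b

print(np.allclose(X.T @ (y - PLy), 0))   # residual is orthogonal to L
```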
Theorem 1.2
A linear map \(G: V \to L\) is the projection map onto \(L\) iff \(Gy = y\) \(\forall y \in L\) and \(Gy = 0\) \(\forall y \in L^\perp\)
Definition
Linear \(G: V \to V\) is
idempotent if \(G (G y) = G y\) \(\forall y \in V\)
symmetric if \(G'y = G y\) \(\forall y \in V\)
Theorem 1.3
A linear map \(G: V \to V\) is a projection map onto its range, \(\mathcal{R}(G)\), iff \(G\) is idempotent and symmetric.
Theorem 1.4
Let \(L \subset V\) and \(L_0 \subset L\) be subspaces. Then \(P_L - P_{L_0} = P_{L \cap L_0^\perp}\)
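A numerical check, using full-column-rank matrices so that \((M'M)^{-1}\) exists (numpy assumed):

```python
import numpy as np

def proj(M):
    # projection onto R(M), assuming M has full column rank
    return M @ np.linalg.solve(M.T @ M, M.T)

rng = np.random.default_rng(3)
X = rng.normal(size=(7, 3))
X0 = X[:, :1]                          # L_0 = R(X0) ⊂ L = R(X)

D = proj(X) - proj(X0)
# L ∩ L_0^⊥ is spanned by the remaining columns of X, made orthogonal to L_0
Z = (np.eye(7) - proj(X0)) @ X[:, 1:]
print(np.allclose(D, proj(Z)))         # P_L - P_{L_0} = P_{L ∩ L_0^⊥}
```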
Definition
For linear \(H: \R^s \to \R^r\), a g-inverse of \(H\) is any \(H^{-}\) s.t. \(H H^{-} H = H\)
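The Moore-Penrose pseudoinverse is one g-inverse; a quick check on a singular matrix (numpy assumed):

```python
import numpy as np

H = np.array([[1.0, 2.0],
              [2.0, 4.0]])           # rank 1, so H has no ordinary inverse
Hg = np.linalg.pinv(H)               # Moore-Penrose inverse, one choice of H^-
print(np.allclose(H @ Hg @ H, H))    # the defining property H H^- H = H
```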
Theorem 1.5
Let \(X: \R^k \to \R^n\) be linear. The projection onto \(\mathcal{R}(X)\) is \(P_X = X(X'X)^- X'\) where \((X'X)^{-}\) is any g-inverse of \(X'X\)
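A sketch with a rank-deficient \(X\), using the pseudoinverse as the g-inverse; the resulting \(P_X\) matches the projection built from the full-rank columns alone:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.normal(size=(8, 2))
X = np.hstack([A, A[:, :1] + A[:, 1:]])   # third column is redundant

PX = X @ np.linalg.pinv(X.T @ X) @ X.T    # pinv as one choice of (X'X)^-
print(np.allclose(PX @ PX, PX), np.allclose(PX, PX.T))  # idempotent, symmetric
PA = A @ np.linalg.solve(A.T @ A, A.T)    # projection onto R(A) = R(X)
print(np.allclose(PX, PA))                # same projection either way
```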
Definition
Let \(A: V \to V\) be linear. Then \(\lambda\) is an eigenvalue of \(A\) and \(v \neq 0\) is an associated eigenvector if \(A v = \lambda v\)
Lemma 1.2
The eigenvalues of a symmetric and idempotent matrix \(P\) are either \(0\) or \(1\). Furthermore, the rank of \(P\) equals the sum of its eigenvalues.
Theorem 1.6
\(\rank(P_X) = \rank(X)\)
\(\rank(I-P_X) = n - \rank(X)\)
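Both facts, together with Lemma 1.2, can be checked numerically on a rank-deficient design (numpy assumed):

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.normal(size=(8, 2))
X = np.hstack([A, A @ np.ones((2, 1))])   # rank(X) = 2 despite 3 columns
PX = X @ np.linalg.pinv(X.T @ X) @ X.T

eig = np.linalg.eigvalsh(PX)
print(np.allclose(eig, np.round(eig)))           # eigenvalues are all 0 or 1
print(np.isclose(eig.sum(), 2))                  # rank(P_X) = rank(X) = 2
print(np.isclose(np.trace(np.eye(8) - PX), 6))   # rank(I - P_X) = n - rank(X)
```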
Model
\[ y = \theta + u \]
\(\theta \in L \subset \R^n\), where \(L\) is a known subspace
\(u \in \R^n\) is unobserved
Theorem: Gauss-Markov
If \(\Er[u] = 0\) and \(\Er[uu'] = \sigma^2 I_n\), then the best linear unbiased estimator (BLUE) of \(a'\theta\) is \(a'\hat{\theta}\), where \(\hat{\theta} = P_L y\)
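A small simulation of unbiasedness, with \(L\) the span of a random design and \(\hat{\theta} = P_L y\) (numpy assumed):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 6
X = rng.normal(size=(n, 2))                        # L = R(X)
PL = X @ np.linalg.solve(X.T @ X, X.T)
theta = X @ np.array([1.0, -1.0])                  # a fixed theta in L

draws = np.array([PL @ (theta + rng.normal(size=n)) for _ in range(5000)])
print(np.abs(draws.mean(axis=0) - theta).max())    # near 0: E[theta_hat] = theta
```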
Corollary
If \[ y = X\beta + u \] with \(X \in \R^{n \times k}\) of full column rank, \(\Er[u] = 0\), and \(\Er[uu'] = \sigma^2 I_n\), then the BLUE of \(c'\beta\) is \(c'\hat{\beta}\) with \(\hat{\beta} = (X'X)^{-1} X' y\)
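A Monte Carlo sketch of the BLUE property: compare \(c'\hat{\beta}\) against another linear unbiased estimator, here the hypothetical \(\tilde{\beta} = (W'X)^{-1}W'y\) with fixed weights \(W \neq X\), which is unbiased since \(\Er[\tilde{\beta}] = (W'X)^{-1}W'X\beta = \beta\):

```python
import numpy as np

rng = np.random.default_rng(7)
n, k = 50, 2
X = rng.normal(size=(n, k))
beta = np.array([1.0, 2.0])
c = np.array([1.0, -1.0])
W = X + rng.normal(size=(n, k))   # competing weight matrix, held fixed

ols, alt = [], []
for _ in range(2000):
    y = X @ beta + rng.normal(size=n)
    ols.append(c @ np.linalg.solve(X.T @ X, X.T @ y))
    alt.append(c @ np.linalg.solve(W.T @ X, W.T @ y))
print(np.var(ols) < np.var(alt))  # True: OLS attains the smallest variance
```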