2024-09-03
\def\Er{{\mathrm{E}}} \def\cov{{\mathrm{Cov}}} \def\var{{\mathrm{Var}}} \def\R{{\mathbb{R}}}
Given a parameter of interest \theta_0, an estimator is a measurable function of an observed random vector X, i.e. \hat{\theta} = \tau(X) for some known map \tau.
An estimate given the realization X = x is the fixed value \tau(x).
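A quick numerical sketch of the estimator/estimate distinction (the Gaussian setup and the choice of \tau as the sample mean are illustrative assumptions, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)

def tau(x):
    """The known map tau defining the estimator: here, the sample mean."""
    return x.mean()

# One realization X = x drawn from a hypothetical N(2, 1) population.
x = rng.normal(loc=2.0, scale=1.0, size=1000)

# The estimator is the random variable tau(X); the estimate is the
# fixed number tau(x) once x has been observed.
estimate = tau(x)
```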
X \in \R^n with distribution P_X \in \mathcal{P} = \{P_\theta: \theta \in \Theta \subset \R^d \}
Each P_\theta is dominated by a \sigma-finite measure \mu, with density f_X(\cdot;\theta)
Likelihood \ell(\cdot\,; X): \Theta \to [0,\infty), \ell(\theta; X) = f_X(X; \theta)
Maximum likelihood estimator \hat{\theta}_{\mathrm{MLE}} = \arg\max_{\theta \in \Theta} \ell(\theta;X)
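A minimal numerical sketch of maximizing the likelihood, under an assumed Bernoulli model (the model, sample size, and crude grid search are illustrative choices, not from the notes). For Bernoulli(\theta) the MLE has the closed form \hat{\theta} = \bar{x}, which the grid maximizer should recover:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.binomial(1, 0.3, size=500)  # iid Bernoulli(theta_0 = 0.3) sample

def log_lik(theta, x):
    # log ell(theta; x) = sum_i [ x_i log theta + (1 - x_i) log(1 - theta) ]
    return np.sum(x * np.log(theta) + (1 - x) * np.log(1 - theta))

# Crude grid search over the interior of Theta = (0, 1).
grid = np.linspace(0.001, 0.999, 999)
theta_hat = grid[np.argmax([log_lik(t, x) for t in grid])]
```

In practice one maximizes the log-likelihood (same argmax, better numerics) with a proper optimizer rather than a grid.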
Theorem 1.1 (invariance of the MLE)
If \hat{\theta} is the MLE of \theta, then for any function g:\Theta \to G, the MLE of g(\theta) is g(\hat{\theta}).
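A numerical check of the invariance property, under an assumed N(0, \sigma^2) model with known mean (the model and grid search are illustrative assumptions). With \theta = \sigma^2, the MLE is \hat{\theta} = \frac{1}{n}\sum_i x_i^2; maximizing the likelihood directly over g(\theta) = \sqrt{\theta} = \sigma should land on g(\hat{\theta}) = \sqrt{\hat{\theta}}:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(0.0, 2.0, size=400)  # N(0, sigma_0^2) sample, mean known to be 0

# MLE of theta = sigma^2 in N(0, theta): hat_theta = mean(x_i^2).
theta_hat = np.mean(x ** 2)

# Maximize the log-likelihood directly over sigma = g(theta) = sqrt(theta).
def log_lik(sigma):
    return -len(x) * np.log(sigma) - np.sum(x ** 2) / (2 * sigma ** 2)

grid = np.linspace(0.5, 4.0, 3501)
sigma_hat = grid[np.argmax(log_lik(grid))]
# Invariance: sigma_hat should agree with sqrt(theta_hat) up to grid resolution.
```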
X \in \R^n with distribution P_X \in \mathcal{P} = \{P_\theta: \theta \in \Theta \subset \R^d \}, likelihood \ell(\theta;x) = f_X(x;\theta)
Question: if an estimator is unbiased, what is the smallest possible variance?
Cramér-Rao Bound
Let T = \tau(X) be an unbiased estimator of \theta, and suppose the conditions of the previous slide and the score equality hold. Then \var_\theta(\tau(X)) \equiv \int \left(\tau(x) - \int \tau(x)\, dP_\theta\right)\left(\tau(x) - \int \tau(x)\, dP_\theta\right)' dP_\theta \geq I(\theta)^{-1}, where \geq is the Loewner (positive semidefinite) order and I(\theta) is the Fisher information.
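A Monte Carlo illustration of the bound, under an assumed N(\theta, 1) model (the model and simulation sizes are illustrative choices). With n iid draws, the full-sample Fisher information is I(\theta) = n, and the sample mean is unbiased with variance exactly 1/n, so it attains the bound:

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps, theta0 = 50, 20000, 1.0

# For X_1,...,X_n iid N(theta, 1): I(theta) = n, so the CRB is 1/n.
crb = 1.0 / n

# Simulate reps independent samples and record the sample mean of each.
means = rng.normal(theta0, 1.0, size=(reps, n)).mean(axis=1)
empirical_var = means.var()  # should be close to crb = 1/n
```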
Definition
A test of H_0: \theta = 0 vs H_1: \theta = 1 with critical region C \subset \mathcal{X} (reject when x \in C) is most powerful at size \alpha if P_0(C) = \alpha and P_1(C) \geq P_1(C') for every test C' with P_0(C') = \alpha.
Lemma (Neyman-Pearson)
Let \Theta = \{0, 1\}, let f_0 and f_1 be densities of P_0 and P_1, \tau(x) = f_1(x)/f_0(x), and C^* = \{x \in \mathcal{X}: \tau(x) > c\}. Then among all tests C with P_0(C) = P_0(C^*), the test C^* is most powerful, i.e. it maximizes the power P_1(C).
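A simulation sketch of the lemma, under an assumed Gaussian shift model (the model, sample sizes, and the competing test are illustrative assumptions). For n iid draws with P_0 = N(0,1)^{\otimes n} and P_1 = N(1,1)^{\otimes n}, the likelihood ratio \tau(x) is increasing in the sample mean, so C^* rejects for large \bar{x}; a test of the same size that ignores most of the data should have strictly lower power:

```python
import numpy as np

rng = np.random.default_rng(4)
n, reps = 10, 20000

# C* rejects when xbar > c; under P_0, xbar ~ N(0, 1/n), so taking
# c = z_{0.95} / sqrt(n) gives size approximately 0.05.
c = 1.645 / np.sqrt(n)

x0 = rng.normal(0.0, 1.0, size=(reps, n))  # samples under P_0
x1 = rng.normal(1.0, 1.0, size=(reps, n))  # samples under P_1

size_star = (x0.mean(axis=1) > c).mean()    # empirical P_0(C*)
power_star = (x1.mean(axis=1) > c).mean()   # empirical P_1(C*)

# Competing test of the same size that uses only the first coordinate:
# reject when x_1 > 1.645. Neyman-Pearson says it cannot beat C*.
power_naive = (x1[:, 0] > 1.645).mean()
```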