Midterm Solutions 2023
Problem 1
Suppose
fields
For
Solution. The range of
Identification
- Let
show that is not identified by finding an observationally equivalent .
Solution. Since addition is commutative,
- Show that if
, then is identified.
Solution. In this case, the expectation of
Estimation
- Assuming
, find the maximum likelihood estimator for , and show whether it is unbiased.
Solution. The density of
This estimator is unbiased.
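The unbiasedness claim can be spot-checked by simulation. The sketch below is illustrative only: the exam's actual model is in the omitted math above, so a hypothetical normal location model (where the MLE of the mean is the sample mean) stands in for it.

```python
import numpy as np

# Illustrative stand-in model (the exam's actual model is in the omitted
# math): X_i ~ N(theta, 1), for which the MLE of theta is the sample mean.
# Monte Carlo check of unbiasedness: average the estimator over many
# simulated samples and compare with the true parameter.
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 50, 20_000
estimates = rng.normal(theta, 1.0, size=(reps, n)).mean(axis=1)
bias = estimates.mean() - theta
print(f"simulated bias: {bias:.4f}")  # should be near 0
```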
- Show whether or not the maximum likelihood estimator is consistent.
Solution. Note that
Testing
For this part, still assume
- For find the most powerful test for testing
against where .
Solution. By the Neyman-Pearson Lemma, the likelihood ratio is most powerful. Let’s describe the critical region for this test. The likelihood ratio is
- Is this test also most powerful against the alternative
? (Hint: does the critical region depend on ?)
Solution. In the previous part, we saw that the critical region is the same for any
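To make the "critical region does not depend on the alternative" point concrete, here is a hedged sketch using a stand-in testing problem (not necessarily the exam's model): i.i.d. normal data with known variance, testing a zero mean against a positive alternative. The likelihood ratio is monotone in the sample mean, so the rejection threshold is pinned down by the size alone.

```python
from statistics import NormalDist

# Stand-in testing problem (hypothetical; the exam's model is in the omitted
# math): X_1, ..., X_n i.i.d. N(mu, sigma^2) with sigma known, H0: mu = 0
# against H1: mu = mu1 > 0. The likelihood ratio is increasing in the sample
# mean, so "LR > c" is equivalent to "xbar > k", and the size-alpha threshold
#   k = z_{1 - alpha} * sigma / sqrt(n)
# does not involve mu1: the same region is most powerful for every mu1 > 0.
def critical_value(alpha: float, sigma: float, n: int) -> float:
    z = NormalDist().inv_cdf(1 - alpha)
    return z * sigma / n ** 0.5

k = critical_value(alpha=0.05, sigma=1.0, n=25)
print(f"reject H0 when xbar > {k:.3f}")
```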
Problem 2
Suppose you have a linear model with grouped data
Identification
- Show that
is identified by explicitly writing as a function of the distribution of and .
Solution. We can identify
- Suppose that instead of observing
for each and , you only observe group averages, and . Can still be identified?
Solution. Yes,
Estimation
Continue to assume that you only observe group averages. Construct a sample analogue estimator for
- Show whether your estimator is unbiased.
Solution. The estimator is
For
- Assume that
and for all . Show whether or not your estimator is consistent as .
Solution. In this question and in the distribution question, there is some complication because
These assumptions imply that the law of large numbers applies to
Therefore,
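A simulation can illustrate the consistency argument. The model below is a hypothetical stand-in (group-level regressor, homoskedastic individual errors), since the exam's exact specification is in the omitted math.

```python
import numpy as np

# Hypothetical grouped model (a stand-in for the omitted math):
#   y_ig = x_g * beta + eps_ig,  eps_ig i.i.d. N(0, 1),
# with only the group averages (ybar_g, x_g) observed. Least squares on the
# group means should approach beta as the number of groups G grows.
rng = np.random.default_rng(1)
beta, n_per_group = 1.5, 30

def estimate(G: int) -> float:
    x = rng.normal(size=G)                       # group-level regressor
    ybar = x * beta + rng.normal(size=(G, n_per_group)).mean(axis=1)
    return (x @ ybar) / (x @ x)                  # least squares, no intercept

ests = {G: estimate(G) for G in (10, 100, 10_000)}
for G, b in ests.items():
    print(G, round(b, 4))
```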
Efficiency
Assume that
- Suppose you observe
and for each and . What is the minimal variance unbiased estimator for that is a linear function of ?
Solution. In this case, all the assumptions of the Gauss-Markov theorem are met, so ordinary least squares is the best linear unbiased estimator.
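A quick Monte Carlo comparison illustrates the Gauss-Markov conclusion: any other linear unbiased estimator (here, an arbitrarily reweighted least-squares fit, used as a stand-in) has larger variance than OLS under homoskedastic errors.

```python
import numpy as np

# Monte Carlo illustration of Gauss-Markov: with fixed x and homoskedastic
# errors, OLS is BLUE. The comparison estimator reweights observations with
# arbitrary (non-constant) weights w_i; it is still linear and unbiased, but
# it should have larger variance than OLS.
rng = np.random.default_rng(2)
n, beta = 200, 2.0
x = rng.normal(size=n)
w = 1.0 + (np.arange(n) % 3)          # arbitrary weights in {1, 2, 3}

ols, alt = [], []
for _ in range(5_000):
    y = x * beta + rng.normal(size=n)
    ols.append((x @ y) / (x @ x))
    alt.append(((w * x) @ y) / ((w * x) @ x))
print(np.var(ols), np.var(alt))       # OLS variance should be smaller
```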
- Suppose you only observe group averages
and . What is the minimal variance unbiased estimator for that is a linear function of ?
Solution. Now, the Gauss-Markov theorem does not directly apply because
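One standard way this situation plays out (an assumption here, since the exact condition is in the omitted math) is that with unequal group sizes the averaged errors are heteroskedastic, with variance sigma^2 / n_g, so the BLUE is GLS, i.e. weighted least squares with weights n_g. The sketch below also checks a useful identity: when the regressor varies only at the group level, this WLS on group means coincides with pooled OLS on the individual data.

```python
import numpy as np

# Assumed setup (the exam's exact condition is in the omitted math): with
# unequal group sizes n_g, the averaged errors satisfy Var(ubar_g) =
# sigma^2 / n_g, so OLS on the group means is not BLUE; GLS = weighted least
# squares with weights n_g is. When the regressor varies only at the group
# level, this WLS on group means equals pooled OLS on the individual data.
rng = np.random.default_rng(3)
sizes = np.array([5, 20, 50, 8, 35])             # unequal group sizes n_g
x_g = rng.normal(size=sizes.size)                # group-level regressor
beta = 0.7

y_groups = [x * beta + rng.normal(size=n) for x, n in zip(x_g, sizes)]
ybar = np.array([y.mean() for y in y_groups])

# WLS on group means with weights n_g (no intercept, for brevity)
b_wls = ((sizes * x_g) @ ybar) / ((sizes * x_g) @ x_g)

# pooled OLS on the underlying individual data
x_ind = np.concatenate([np.full(n, x) for x, n in zip(x_g, sizes)])
y_ind = np.concatenate(y_groups)
b_ols = (x_ind @ y_ind) / (x_ind @ x_ind)
print(b_wls, b_ols)                              # equal up to rounding
```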
Distribution
Continue to assume that you only observe group averages. Let
Solution. As above, assuming
We already have assumptions that imply
Alternatively, we could assume
In that case, by the central limit theorem,
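The central limit theorem step can be visustrated numerically: standardized sample means of clearly non-normal i.i.d. draws are close to N(0, 1). A small illustrative sketch:

```python
import numpy as np

# CLT illustration: standardized sample means of i.i.d. exponential draws
# (mean 1, variance 1, clearly non-normal) are approximately N(0, 1).
rng = np.random.default_rng(4)
n, reps = 200, 50_000
draws = rng.exponential(scale=1.0, size=(reps, n))
z = np.sqrt(n) * (draws.mean(axis=1) - 1.0)      # standardized means
print(z.mean(), z.var())                         # near 0 and 1
```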
Definitions and Results
Measure and Probability:
A collection of subsets, $\mathscr{F}$, of $\Omega$ is a $\sigma$-field if
- $\Omega \in \mathscr{F}$
- If $A \in \mathscr{F}$, then $A^c \in \mathscr{F}$
- If $A_1, A_2, \ldots \in \mathscr{F}$, then $\cup_{i=1}^{\infty} A_i \in \mathscr{F}$
A measure, $\mu$, is a function $\mu: \mathscr{F} \to [0, \infty]$ s.t.
- $\mu(\emptyset) = 0$
- If $A_1, A_2, \ldots \in \mathscr{F}$ are pairwise disjoint, then $\mu(\cup_{i=1}^{\infty} A_i) = \sum_{i=1}^{\infty} \mu(A_i)$
The Lebesgue integral, $\int f \, d\mu$, is
- Positive: if $f \geq 0$ a.e., then $\int f \, d\mu \geq 0$
- Linear: $\int (af + bg) \, d\mu = a \int f \, d\mu + b \int g \, d\mu$
Radon-Nikodym derivative: if $\nu \ll \mu$, then there exists a nonnegative measurable function, $\frac{d\nu}{d\mu}$, s.t. $\nu(A) = \int_A \frac{d\nu}{d\mu} \, d\mu$.

Monotone convergence: If $f_n$ are measurable, $0 \leq f_n \leq f_{n+1}$, and for each $x$, $f_n(x) \to f(x)$, then $\int f_n \, d\mu \to \int f \, d\mu$ as $n \to \infty$.

Dominated convergence: If $f_n$ are measurable, and for each $x$, $f_n(x) \to f(x)$. Furthermore, $|f_n| \leq g$ for some $g$ such that $\int g \, d\mu < \infty$, for each $n$. Then $\int f_n \, d\mu \to \int f \, d\mu$.

Markov's inequality: $P(|X| \geq t) \leq \frac{E[|X|]}{t}$.
Jensen’s inequality: if
is convex, thenCauchy-Schwarz inequality:
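The three inequalities above are exact population statements, and they also hold for any empirical distribution, so they can be spot-checked numerically on an arbitrary sample:

```python
import numpy as np

# The inequalities are exact for any distribution, including the empirical
# distribution of a sample, so these assertions always hold.
rng = np.random.default_rng(5)
x = rng.exponential(size=10_000)    # nonnegative, so |x| = x for Markov
y = rng.normal(size=10_000)

t = 2.0
assert (x >= t).mean() <= x.mean() / t                                # Markov
assert np.exp(x.mean()) <= np.mean(np.exp(x))                         # Jensen, g = exp
assert abs(np.mean(x * y)) <= np.sqrt(np.mean(x**2) * np.mean(y**2))  # Cauchy-Schwarz
print("all three inequalities hold on this sample")
```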
$\sigma(X)$ is the $\sigma$-field generated by $X$; it is
- the smallest $\sigma$-field w.r.t. which $X$ is measurable
- the smallest $\sigma$-field s.t. $A \in \sigma(X)$ iff $\exists B \in \mathcal{B}$ s.t. $A = X^{-1}(B)$

Events $A_1, \ldots, A_n$ are independent if for any sub-collection $A_{i_1}, \ldots, A_{i_k}$, $P(\cap_{j=1}^{k} A_{i_j}) = \prod_{j=1}^{k} P(A_{i_j})$. $\sigma$-fields are independent if this is true for any events from them. Random variables are independent if their $\sigma$-fields are.

Conditional expectation of $Y$ given $\sigma$-field $\mathscr{G}$, $E[Y | \mathscr{G}]$, satisfies
- $E[Y | \mathscr{G}]$ is $\mathscr{G}$-measurable
- $\int_A E[Y | \mathscr{G}] \, dP = \int_A Y \, dP$ for all $A \in \mathscr{G}$
Identification
With $Y$ observed, $Y$ has distribution $P$, and probability model $\mathcal{P} = \{P_\theta : \theta \in \Theta\}$:
- $\theta$ is identified in $\mathcal{P}$ if there exists a known $\psi$ s.t. $\theta = \psi(P_\theta)$.
- Two structures $\theta$ and $\tilde{\theta}$ in $\Theta$ are observationally equivalent if they imply the same distribution for the observed data, i.e. $P_\theta(A) = P_{\tilde{\theta}}(A)$ for all $A$.
- $\theta$ is identified (in $\Theta$) if there is no $\tilde{\theta} \neq \theta$ in $\Theta$ that is observationally equivalent to $\theta$.
Cramér-Rao Bound: in the parametric model $\{f_\theta : \theta \in \Theta\}$ with likelihood $f_\theta$, if appropriate derivatives and integrals can be interchanged, then for any unbiased estimator $\hat{\theta}$, $Var(\hat{\theta}) \geq I(\theta)^{-1}$, where $I(\theta) = E[s(\theta) s(\theta)']$ and $s(\theta) = \frac{\partial}{\partial \theta} \log f_\theta(X)$.

Hypothesis testing:
- $\alpha$ = Type I error rate = $P(\text{reject } H_0 \mid H_0 \text{ true})$
- $\beta$ = Type II error rate = $P(\text{accept } H_0 \mid H_1 \text{ true})$
- $1 - \beta$ = power
- size of test = $\sup_{\theta \in \Theta_0} P_\theta(\text{reject } H_0)$
- Neyman-Pearson Lemma: Let $f_0$ and $f_1$ be densities of $X$ under $H_0$ and $H_1$, and $C = \{x : f_1(x) > c f_0(x)\}$. Then among all tests with size $\leq P_0(C)$, the test that rejects when $X \in C$ is most powerful.
Projection: $P_V y$ is the projection of $y$ on $V$ if $P_V y \in V$ and $\|y - P_V y\| \leq \|y - v\|$ for all $v \in V$. $P_V y$ exists, is unique, and is a linear function of $y$.
- For any $v \in V$, $\hat{y} = P_V y$ iff $\langle y - \hat{y}, v \rangle = 0$ for all $v \in V$, iff $\hat{y} \in V$ and $(y - \hat{y}) \perp V$
- A linear $P$ is a projection map onto its range, $V$, iff $P$ is idempotent and symmetric.
Gauss-Markov: $Y = \theta + U$ with $\theta \in V$, a known subspace. If $E[U] = 0$ and $E[UU'] = \sigma^2 I$, then the best linear unbiased estimator (BLUE) of $\theta$ is $\hat{\theta} = P_V Y$, where $P_V$ is the projection onto $V$.

Convergence in probability: $X_n$ converge in probability to $X$ if $\forall \epsilon > 0$, $\lim_{n \to \infty} P(|X_n - X| > \epsilon) = 0$.
- If $E[(X_n - X)^2] \to 0$, then $X_n \xrightarrow{p} X$
- If $X_n \xrightarrow{p} X$ and $g$ continuous, then $g(X_n) \xrightarrow{p} g(X)$
- Weak LLN: if $X_1, \ldots, X_n$ are i.i.d. and $E[X_1]$ exists, then $\frac{1}{n} \sum_{i=1}^n X_i \xrightarrow{p} E[X_1]$
Convergence in distribution: $X_n$ converge in distribution to $X$ if $\forall x$ where $F_X$ is continuous, $\lim_{n \to \infty} F_{X_n}(x) = F_X(x)$.
- If $X_n \xrightarrow{d} X$ and $g$ is continuous, then $g(X_n) \xrightarrow{d} g(X)$
- Slutsky's lemma: if $X_n \xrightarrow{d} X$ and $Y_n \xrightarrow{p} c$ and $g$ is continuous, then $g(X_n, Y_n) \xrightarrow{d} g(X, c)$
- Levy's Continuity Theorem: $X_n \xrightarrow{d} X$ iff $E[e^{itX_n}] \to E[e^{itX}]$ for all $t$
- CLT: if $X_i$ are i.i.d. with $E[X_i] = \mu$ and $Var(X_i) = \sigma^2 < \infty$, then $\sqrt{n}(\bar{X}_n - \mu) \xrightarrow{d} N(0, \sigma^2)$
- Delta Method: suppose $\sqrt{n}(X_n - \theta) \xrightarrow{d} Z$ and $g$ is differentiable, then $\sqrt{n}(g(X_n) - g(\theta)) \xrightarrow{d} \nabla g(\theta)' Z$
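A numeric check of the delta method with the illustrative choice $g(x) = x^2$: the limiting variance of $\sqrt{n}(g(\bar{X}_n) - g(\mu))$ should be $g'(\mu)^2 \sigma^2$.

```python
import numpy as np

# Delta method check with the illustrative choice g(x) = x^2: if
# sqrt(n)(Xbar - mu) -> N(0, sigma^2), then sqrt(n)(g(Xbar) - g(mu)) has
# limiting variance g'(mu)^2 sigma^2 = (2 mu)^2 sigma^2. With mu = 2 and
# sigma = 1 that is 16.
rng = np.random.default_rng(6)
n, reps, mu = 400, 20_000, 2.0
xbar = rng.normal(mu, 1.0, size=(reps, n)).mean(axis=1)
stat = np.sqrt(n) * (xbar**2 - mu**2)
print(stat.var())                    # should be near 16
```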