The Gauss-Markov Theorem (pt. 2)

I continue the proof of the Gauss-Markov Theorem from my last post.

Theorem: Under the assumptions described last time, the estimators \hat \beta_1 and \hat \beta_2 are the best linear unbiased estimators of \beta_1 and \beta_2 respectively. That is, within the class of linear unbiased estimators, they have minimum variance.

Last time I showed that \hat \beta_2 is linear. Today, I show that it is unbiased; that is, the expected value of \hat \beta_2 is equal to \beta_2.

Proof: Recall that \hat \beta_2 = \sum k_i Y_i, where k_i = \frac{x_i}{\sum{x_i^2}} and x_i = X_i - \bar X. Since we assume the values of X are fixed, we treat the k_i as constants. Additionally, recall that Y_i = \beta_1 + \beta_2X_i + u_i. Therefore we may write,
\hat \beta_2 = \sum k_i Y_i = \sum k_i (\beta_1 + \beta_2X_i + u_i) = \beta_1\sum k_i + \beta_2\sum k_iX_i + \sum k_iu_i.
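As a quick sanity check (not part of the proof), here is a minimal Python sketch, using made-up values for X and for the true parameters, showing that the weighted sum \sum k_i Y_i reproduces the familiar OLS slope formula on an arbitrary sample:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([1.0, 2.0, 4.0, 7.0, 11.0])           # fixed regressor values (made up)
Y = 2.0 + 3.0 * X + rng.normal(0.0, 1.0, X.size)   # assumed true beta_1 = 2, beta_2 = 3

x = X - X.mean()          # deviations x_i = X_i - X-bar
k = x / np.sum(x**2)      # the weights k_i

slope_from_weights = np.sum(k * Y)                              # sum k_i Y_i
slope_from_formula = np.sum(x * (Y - Y.mean())) / np.sum(x**2)  # usual OLS slope
print(slope_from_weights, slope_from_formula)                   # the two agree
```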

Notice now that \sum k_i = \sum \frac{x_i}{\sum{x_i^2}} = \frac{1}{\sum{x_i^2}}\sum x_i, since \sum x_i^2 is a fixed number for the given sample and can be pulled out of the sum. Moreover, since the sum of the deviations from the mean is zero, \sum x_i = 0. Thus, \hat\beta_2 = \beta_2\sum k_i X_i + \sum k_i u_i. But we can do better: \sum k_iX_i = \sum \frac{x_i}{\sum{x_i^2}}X_i = \frac{1}{\sum{x_i^2}}\sum (X_i - \bar X)X_i = \frac{1}{\sum{x_i^2}}\sum (X_i^2 - X_i \bar X) = \frac{1}{\sum{x_i^2}}\left [ \sum X_i^2 - \sum X_i \bar X \right ] = \frac{1}{\sum{x_i^2}}\left [ \sum X_i^2 - \bar X \sum X_i \right ]. Notice that since \bar X = \frac{\sum X_i}{n}, \sum X_i = n \bar X. Thus, \sum k_iX_i = \frac{1}{\sum{x_i^2}}\left [ \sum X_i^2 - n \bar X^2 \right ].

So, what is \frac{1}{\sum{x_i^2}}? Well, \frac{1}{\sum{x_i^2}} = \frac{1}{\sum{(X_i - \bar X)^2}}. Notice that \sum (X_i - \bar X)^2 = \sum (X_i^2 - 2X_i\bar X + \bar X^2) = \sum X_i^2 - \sum 2X_i\bar X + \sum \bar X^2 = \sum X_i^2 - 2\bar X \sum X_i + n\bar X^2, since summing n copies of \bar X^2 gives n\bar X^2. Then, using \sum X_i = n\bar X again, \sum X_i^2 - 2\bar X\sum X_i + n\bar X^2 = \sum X_i^2 - 2n\bar X^2 + n\bar X^2 = \sum X_i^2 - n\bar X^2. Therefore, \frac{1}{\sum{x_i^2}} = \frac{1}{\sum X_i^2 - n\bar X^2}. Finally, \sum k_iX_i = \frac{\sum X_i^2 - n\bar X^2}{\sum X_i^2 - n\bar X^2} = 1.
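Purely as an illustration, the identities used in the last two steps (\sum k_i = 0, \sum x_i^2 = \sum X_i^2 - n\bar X^2, and \sum k_i X_i = 1) can be verified numerically on any fixed sample; the values of X below are made up:

```python
import numpy as np

X = np.array([1.0, 2.0, 4.0, 7.0, 11.0])   # any fixed sample (made up) works
n = X.size
x = X - X.mean()
k = x / np.sum(x**2)

print(np.sum(k))                                      # ~ 0: the beta_1 term vanishes
print(np.sum(x**2), np.sum(X**2) - n * X.mean()**2)   # both sides of the identity agree
print(np.sum(k * X))                                  # ~ 1: the coefficient on beta_2
```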

As a result, \hat \beta_2 = \beta_2 + \sum k_i u_i. Taking the expectation of both sides, and noting that the k_i can be treated as constants since they are non-stochastic, E(\hat \beta_2) = \beta_2 + \sum k_i E(u_i) = \beta_2, since E(u_i) = 0 by the third assumption (which can be found in part 1).
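To see unbiasedness in action, here is a small Monte Carlo sketch under assumed values (fixed X, true \beta_1 = 2 and \beta_2 = 3, disturbances with mean zero): averaging \hat \beta_2 over many simulated samples lands very close to the true \beta_2.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([1.0, 2.0, 4.0, 7.0, 11.0])   # fixed across replications (made up)
x = X - X.mean()
k = x / np.sum(x**2)

beta1, beta2 = 2.0, 3.0                    # assumed true parameters
estimates = []
for _ in range(20_000):
    u = rng.normal(0.0, 1.0, X.size)       # disturbances with E(u_i) = 0
    Y = beta1 + beta2 * X + u
    estimates.append(np.sum(k * Y))        # hat beta_2 = sum k_i Y_i
print(np.mean(estimates))                  # close to the true beta_2 = 3
```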

There we go! Next time I will finish up the proof and show that \hat \beta_2 has minimum variance.
