# What is a minimum-variance, mean-unbiased estimator?

##### 1 Answer

Of all estimators with the property of being "mean-unbiased", it is the one with the smallest variance; it is sometimes also referred to as the "best" estimator.

#### Explanation:

Say you observe some data on N individuals. Label one variable Y (the outcome) and another X (the explanatory variable), so the data are pairs (Xᵢ, Yᵢ) for i = 1, …, N.

**An estimator** is some function of the observed data designed to estimate some true underlying relationship.

So we have to have a belief about the true underlying relationship, and statisticians call this the specification assumption. Often, a linear specification is assumed:

Yᵢ = β₀ + β₁Xᵢ + εᵢ,  i = 1, …, N    (1)

Suppose we want an estimator of β₁, the slope parameter.

We use a hat to denote our estimator: β̂₁ = β̂₁(X, Y).

Note that this can be any function using the data (X,Y) and so there are limitless possible estimators. So we narrow down which to use by looking for those with nice properties.
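As a concrete sketch of this point, here are two of those limitless possible estimators of a slope, each just a function of the observed data (X, Y). The true parameter values (β₀ = 1, β₁ = 2) and the data-generating setup are hypothetical, chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data from a linear specification: Y_i = 1 + 2*X_i + eps_i
N = 100
X = rng.uniform(0, 10, N)
Y = 1.0 + 2.0 * X + rng.normal(0, 1, N)

def ols_slope(X, Y):
    """One estimator: sample covariance over sample variance (the OLS slope)."""
    return np.cov(X, Y, ddof=1)[0, 1] / np.var(X, ddof=1)

def endpoint_slope(X, Y):
    """Another estimator: the slope between the points with smallest and largest X."""
    i, j = np.argmin(X), np.argmax(X)
    return (Y[j] - Y[i]) / (X[j] - X[i])

# Both are valid functions of the data (X, Y), hence both are estimators of the slope.
print(ols_slope(X, Y))
print(endpoint_slope(X, Y))
```

Both functions produce estimates of the slope; the question the rest of the answer addresses is which of such estimators we should prefer.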

An estimator is said to be **mean-unbiased** if its expected value (mean) is the true parameter value:

E[β̂₁] = β₁

But for any given question there may be many estimators that satisfy this requirement, so **among all the possible estimators with the property of being mean-unbiased, we look for the one with the minimum variance**; that is, we want

Var(β̂₁) ≤ Var(β̃₁)

for every mean-unbiased estimator β̃₁.
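Both properties, unbiasedness and relative variance, can be checked by simulation. The sketch below (hypothetical values β₀ = 1, β₁ = 2, standard normal errors, a fixed design) repeatedly redraws the errors and compares the OLS slope to a cruder unbiased estimator, the slope through the two extreme-X points:

```python
import numpy as np

rng = np.random.default_rng(1)
beta0, beta1 = 1.0, 2.0           # hypothetical true parameters
N, reps = 50, 5000
X = rng.uniform(0, 10, N)         # design held fixed across replications

ols, endpoint = [], []
i, j = np.argmin(X), np.argmax(X)
for _ in range(reps):
    Y = beta0 + beta1 * X + rng.normal(0, 1, N)
    # OLS slope: sum((x - xbar)(y - ybar)) / sum((x - xbar)^2)
    xc, yc = X - X.mean(), Y - Y.mean()
    ols.append((xc @ yc) / (xc @ xc))
    # alternative unbiased estimator: slope through the min-X and max-X points
    endpoint.append((Y[j] - Y[i]) / (X[j] - X[i]))

ols, endpoint = np.array(ols), np.array(endpoint)
print(ols.mean(), endpoint.mean())   # both averages sit near beta1: both unbiased
print(ols.var(), endpoint.var())     # but the spreads around beta1 differ
```

Under this setup both estimators average out to the true slope, but the OLS slope clusters much more tightly around it, which is exactly the minimum-variance comparison described above.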

We like this property because it means our estimate will tend to be closest to the true value among all the unbiased estimators. Finding such an estimator is not always simple, given how many estimators are possible, but you can sometimes appeal to previous results, e.g., the Gauss-Markov theorem:

If you restrict your attention to linear models like (1) above, with certain assumptions about the error term ε (mean zero, constant variance, uncorrelated across observations), then the ordinary least squares (OLS) estimator has the minimum variance among all linear unbiased estimators of β₁.
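The Gauss-Markov comparison can be sketched numerically. Any linear estimator Σcᵢ Yᵢ of the slope is unbiased in model (1) exactly when Σcᵢ = 0 and ΣcᵢXᵢ = 1, and under homoskedastic errors its variance is σ²Σcᵢ². The code below (a sketch with a made-up fixed design) builds an arbitrary weight vector satisfying those constraints by perturbing the OLS weights, and compares the two variances:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 30
X = rng.uniform(0, 10, N)
sigma2 = 1.0                       # Var(eps_i), assumed constant (homoskedastic)

# OLS weights: beta1_hat = sum(w_i * Y_i) with w_i = (X_i - Xbar) / sum((X_j - Xbar)^2)
xc = X - X.mean()
w = xc / (xc @ xc)

# Build another linear unbiased estimator by adding a perturbation d that is
# orthogonal to both the constant vector and X, so the unbiasedness
# constraints sum(c) = 0 and sum(c * X) = 1 still hold.
d = rng.normal(size=N)
Q, _ = np.linalg.qr(np.column_stack([np.ones(N), X]))
d = d - Q @ (Q.T @ d)              # project out span{1, X}
c = w + d

# Variance of a linear estimator sum(c_i * Y_i) under homoskedastic errors:
var_ols   = sigma2 * (w @ w)
var_other = sigma2 * (c @ c)
print(var_ols, var_other)          # the OLS weights give the smaller variance
```

Because the perturbation is orthogonal to the OLS weights, the alternative estimator's variance is the OLS variance plus a nonnegative extra term, which is the heart of the Gauss-Markov argument.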