Properties of a Good Estimator

Small Sample Properties

  • Unbiased Estimator: Bias is the difference between the expected value of an estimator and the true value of the parameter. When this difference is zero, the estimator is called unbiased, i.e., \(E(\hat\beta) = \beta\).
  • Best Estimator: An estimator is called best when its variance is smaller than that of any other estimator, i.e., \(var(\hat\beta) < var(\beta^\ast)\), where \(\beta^\ast\) is any other estimator of \(\beta\).
  • Efficient Estimator: An estimator is called efficient when it satisfies the following two conditions:
    • \(\hat\beta\) is unbiased, i.e., \(E(\hat\beta) = \beta\).
    • \(\hat\beta\) is best, i.e., \(var(\hat\beta) < var(\beta^\ast)\).
  • Linear Estimator: An estimator is called linear when it is a linear function of the sample observations. For example, given observations \(y_{1}, y_{2}, y_{3}, \ldots, y_{n}\), an estimator of the form \(k_{1}y_{1} + k_{2}y_{2} + \cdots + k_{n}y_{n}\), where the \(k_{i}\) are constants, is linear; the sample mean (all \(k_{i} = 1/n\)) is a familiar case.
  • BLUE: An estimator is BLUE (Best Linear Unbiased Estimator) when it has three properties:
    • The estimator is linear.
    • The estimator is unbiased.
    • The estimator is best.

So an estimator is called BLUE when it combines the best, linear, and unbiased properties; the simulation sketch below illustrates all three for the sample mean.
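As a concrete illustration, here is a minimal simulation sketch in Python (NumPy assumed; the parameter values, sample size, and weights are made up for illustration only). It compares two linear estimators of a population mean, the equally weighted sample mean and an unequally weighted average, and shows that both are approximately unbiased while the sample mean has the smaller variance, which is what "best" refers to.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, reps = 5.0, 2.0, 30, 20_000    # true parameter, spread, sample size (illustrative)

# Two linear estimators of mu, each a weighted sum of the observations:
#   1) the sample mean: equal weights k_i = 1/n
#   2) an alternative unbiased estimator: unequal weights that still sum to 1
w = np.linspace(1.0, 5.0, n)
w /= w.sum()                                  # weights sum to 1, so E(w·y) = mu (still unbiased)

y = rng.normal(mu, sigma, size=(reps, n))     # 'reps' independent samples of size n
sample_mean   = y.mean(axis=1)                # (1/n)(y1 + y2 + ... + yn)
weighted_mean = y @ w                         # k1*y1 + k2*y2 + ... + kn*yn

print("E(sample mean)     ≈", sample_mean.mean())    # both close to mu = 5 -> unbiased
print("E(weighted mean)   ≈", weighted_mean.mean())
print("var(sample mean)   ≈", sample_mean.var())     # the smaller of the two -> 'best' here
print("var(weighted mean) ≈", weighted_mean.var())
```

Both candidates are linear in the observations and unbiased; the equally weighted sample mean comes out with the lower variance, so among these two it is the BLUE.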

  • MSE Estimator: MSE stands for minimum mean square error. This criterion combines the unbiasedness and best properties: an estimator is called a minimum MSE estimator when its mean square error is the smallest among competing estimators.

The formula for calculating MSE is \(MSE(\hat\beta) = var(\hat\beta) + \left[ bias(\hat\beta) \right]^{2}\).
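This decomposition can be checked numerically. The rough sketch below (Python/NumPy, with illustrative values chosen here) uses the biased variance estimator that divides by n instead of n − 1 and confirms that its simulated mean square error equals its variance plus its squared bias.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma2, n, reps = 4.0, 10, 50_000             # true variance and sample size (illustrative)

# Biased estimator of sigma^2: divisor n instead of n - 1 (np.var uses ddof=0 by default)
samples = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
est = samples.var(axis=1)

mse  = np.mean((est - sigma2) ** 2)           # Monte Carlo mean square error
var  = est.var()                              # variance of the estimator
bias = est.mean() - sigma2                    # bias = E(estimator) - true value

print("MSE          ≈", mse)
print("var + bias^2 ≈", var + bias ** 2)      # matches the MSE decomposition
```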

  • Sufficient Estimator: An estimator is called sufficient when it uses all the information a sample contains about the true parameter. It is very difficult to find examples of sufficient estimators; the arithmetic mean is the one usually cited, as it is sufficient for the population mean.

Large Sample Properties

A sample is called large when n tends to infinity; properties that hold in this limit are called asymptotic properties. The large sample properties are:

  • Asymptotic Unbiasedness: An estimator is asymptotically unbiased if, as the sample grows, its expected value approaches the true value of the parameter, i.e., \(\lim_{n\to\infty} E(\hat\beta) = \beta\).
  • Consistency: An estimator is called consistent when it fulfils the following two conditions (illustrated by the sketch after this list):
    • \(\hat\beta\) must be asymptotically unbiased.
    • The variance of \(\hat\beta\) must approach zero as n tends to infinity.
  • Asymptotic Efficiency: An estimator \(\hat\beta\) is called asymptotically efficient when it fulfils the following two conditions:
    • \(\hat\beta\) must be consistent.
    • \(var(\hat\beta) < var(\beta^\ast)\), where \(\hat\beta\) and \(\beta^\ast\) are both consistent estimators.
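A minimal sketch of consistency, assuming Python with NumPy and made-up parameter values: as the sample size n grows, the simulated expectation of the sample mean stays at the true value while its variance shrinks toward zero, satisfying both conditions above.

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, reps = 5.0, 2.0, 1_000             # true mean and spread (illustrative values)

# Consistency of the sample mean: E(xbar) stays at mu, var(xbar) ≈ sigma^2 / n shrinks to 0
for n in (10, 100, 1_000, 10_000):
    xbar = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
    print(f"n = {n:>6}   E(xbar) ≈ {xbar.mean():.3f}   var(xbar) ≈ {xbar.var():.5f}")
```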
