Lesson 3.2: Efficiency and the Cramér-Rao Lower Bound
We've learned that a good estimator should be unbiased (correct on average). But what about its precision? This lesson introduces **efficiency**, the criterion that judges an estimator by its variance. We will meet the theoretical 'speed limit' on how precise an unbiased estimator can be, the Cramér-Rao Lower Bound (CRLB), and define the 'best' estimators as those that achieve it.
Part 1: The Second Criterion - Precision
Unbiasedness is not enough. We need our estimates to be not only accurate on average, but also precise. We want an estimator that gives us answers that are tightly clustered around the true value.
The Dartboard Analogy: Steady Hands
Imagine two dart players, both of whom are **unbiased** (their darts are, on average, centered on the bullseye).
Player A (Inefficient)
Unbiased, but has high variance. Their darts are scattered widely all over the board, even though they average out to the center.
Player B (Efficient)
Unbiased and has low variance. Their darts form a tight, precise cluster right around the bullseye. This is the player you want on your team.
In statistics, we prefer Player B. **Efficiency** measures the "shakiness" of our estimator's hand, and it is captured by the estimator's variance: the lower the variance, the more efficient the estimator.
Part 2: Defining and Measuring Efficiency
2.1 Variance of an Estimator
The variance of an estimator measures how much the estimate is expected to vary from one sample to another.
A lower variance means the estimator is more precise, stable, and reliable.
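To see this concretely, here is a minimal NumPy sketch that repeats the same experiment many times and measures how much the sample mean varies from sample to sample. The normal distribution, parameter values, and sample size are illustrative choices, not part of the lesson.

```python
import numpy as np

rng = np.random.default_rng(42)

true_mu, sigma, n = 5.0, 2.0, 30   # hypothetical population and sample size
n_experiments = 10_000

# Draw many independent samples and compute the sample mean of each one.
samples = rng.normal(true_mu, sigma, size=(n_experiments, n))
estimates = samples.mean(axis=1)

# The spread of the estimates across repeated samples is the variance
# of the estimator; for the sample mean, theory gives Var = sigma^2 / n.
print(f"empirical variance of the sample mean: {estimates.var():.4f}")
print(f"theoretical sigma^2 / n:               {sigma**2 / n:.4f}")
```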
2.2 Relative Efficiency
We can compare two *unbiased* estimators, $\hat{\theta}_1$ and $\hat{\theta}_2$, by looking at the ratio of their variances.
Definition: Relative Efficiency
The relative efficiency of $\hat{\theta}_1$ to $\hat{\theta}_2$ is:

$$\text{eff}(\hat{\theta}_1, \hat{\theta}_2) = \frac{\text{Var}(\hat{\theta}_2)}{\text{Var}(\hat{\theta}_1)}$$

If this ratio is greater than 1, it means $\text{Var}(\hat{\theta}_1) < \text{Var}(\hat{\theta}_2)$, so $\hat{\theta}_1$ is the more efficient estimator.
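As an illustration, the sketch below compares the sample mean and sample median as estimators of the center of a normal distribution, a standard example that is not worked in the lesson itself. Both estimators are unbiased here, so the ratio of their empirical variances estimates the relative efficiency.

```python
import numpy as np

rng = np.random.default_rng(0)

n, n_experiments = 100, 20_000   # hypothetical sample size and repetitions

# For normal data, both the sample mean and the sample median are
# unbiased estimators of the center, so their variances are comparable.
samples = rng.normal(0.0, 1.0, size=(n_experiments, n))
means = samples.mean(axis=1)
medians = np.median(samples, axis=1)

# Relative efficiency of the mean to the median:
# eff = Var(median) / Var(mean); a ratio greater than 1 favors the mean.
eff = medians.var() / means.var()
print(f"relative efficiency of mean to median: {eff:.2f}")
```

For normal data this ratio converges to $\pi/2 \approx 1.57$, so the sample mean is the more efficient estimator; for heavy-tailed data the ranking can flip.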
Part 3: The 'Speed Limit' - Cramér-Rao Lower Bound (CRLB)
This raises a crucial question: Is there a limit to how efficient an estimator can be? Can we find an unbiased estimator with a variance of zero? The answer is no. The Cramér-Rao Lower Bound (CRLB) sets the theoretical limit.
The Core Idea: The CRLB is a mathematical "speed limit." It tells us the absolute minimum possible variance that *any* unbiased estimator can achieve for a given estimation problem. It is the pinnacle of precision.
Theorem: The Cramér-Rao Lower Bound (CRLB)
Let $X_1, X_2, \dots, X_n$ be an i.i.d. sample with PDF $f(x; \theta)$. Let $\hat{\theta}$ be any unbiased estimator of $\theta$.
Then the variance of $\hat{\theta}$ must satisfy:

$$\text{Var}(\hat{\theta}) \ge \frac{1}{I_n(\theta)}$$

where $I_n(\theta)$ is the **Fisher Information** for a sample of size $n$.
Fisher Information measures how much "information" a sample carries about the unknown parameter $\theta$. It is defined using the second derivative of the log-likelihood function:

$$I(\theta) = -E\left[\frac{\partial^2}{\partial \theta^2} \log f(X; \theta)\right]$$

For an i.i.d. sample of size $n$, the total information is $I_n(\theta) = n\,I(\theta)$, where $I(\theta)$ is the information from a single observation. The CRLB becomes:

$$\text{Var}(\hat{\theta}) \ge \frac{1}{n\,I(\theta)}$$
Intuition: A distribution with a very sharp, pointy likelihood function (high curvature) has high Fisher Information: the data point strongly towards a specific $\theta$. This leads to a lower CRLB and allows for more precise estimates.
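As a concrete illustration (a standard textbook case, not worked in the lesson itself), consider estimating the success probability $p$ from $n$ Bernoulli trials. The log-likelihood of a single observation and its second derivative are:

$$\log f(x; p) = x \log p + (1 - x)\log(1 - p), \qquad \frac{\partial^2}{\partial p^2} \log f(x; p) = -\frac{x}{p^2} - \frac{1 - x}{(1 - p)^2}$$

Taking the negative expectation with $E[X] = p$ gives the Fisher Information and the bound:

$$I(p) = \frac{p}{p^2} + \frac{1 - p}{(1 - p)^2} = \frac{1}{p(1 - p)}, \qquad \text{CRLB} = \frac{1}{n\,I(p)} = \frac{p(1 - p)}{n}$$

Since the sample proportion $\bar{X}$ is unbiased with $\text{Var}(\bar{X}) = p(1 - p)/n$, it attains this bound exactly.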
We can now define the undisputed champion of estimators.
Minimum Variance Unbiased Estimator (MVUE)
An unbiased estimator is called **efficient** if its variance actually reaches the Cramér-Rao Lower Bound. An efficient estimator is automatically the MVUE: no other unbiased estimator can have a lower variance.
An MVUE is the best possible unbiased estimator. It's like a dart player who is not only perfectly aimed (unbiased) but also has the steadiest possible hands (minimum variance). You cannot do better.
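A quick simulation can check this numerically. The sketch below (the distribution and parameter values are illustrative choices) verifies that the sample proportion's variance matches the Bernoulli CRLB derived in the worked example above.

```python
import numpy as np

rng = np.random.default_rng(7)

p, n, n_experiments = 0.3, 50, 50_000   # hypothetical parameter choices

# Estimate p with the sample proportion in many repeated experiments.
samples = rng.binomial(1, p, size=(n_experiments, n))
p_hats = samples.mean(axis=1)

# The CRLB for unbiased estimators of p is p(1-p)/n (derived above);
# the sample proportion, being efficient, should attain it.
crlb = p * (1 - p) / n
print(f"empirical variance of p-hat: {p_hats.var():.6f}")
print(f"CRLB p(1-p)/n:               {crlb:.6f}")
```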
What's Next? Two More Properties
We have now defined our two main criteria for a good estimator in finite samples: it should be **unbiased** (correct aim) and **efficient** (steady hands).
To complete our theory of estimation, we need to learn two more properties. The first is **Consistency**, our guarantee that the estimator improves with more data. The second is **Sufficiency**, a more subtle idea about whether our estimator uses all the available information in the sample.