## Overview
Point estimation uses sample data to calculate a single value (point estimate) as the best guess for an unknown population parameter.
## Key Terms
| Term | Definition |
|---|---|
| Parameter | Unknown population characteristic (μ, σ², p) |
| Estimator | Rule or formula used to calculate an estimate |
| Estimate | Specific value calculated from the data |
## Common Point Estimators
| Parameter | Point Estimator |
|---|---|
| Population mean (μ) | Sample mean (x̄) |
| Population variance (σ²) | Sample variance (s²) |
| Population proportion (p) | Sample proportion (p̂) |
| Population SD (σ) | Sample SD (s) |
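Each row of the table maps directly to a one-line computation. A minimal sketch, using a made-up sample of heights and hypothetical survey counts (all values here are assumptions for illustration):

```python
import statistics

# Hypothetical data: five height measurements (cm) and a 200-person survey.
heights = [170.2, 168.5, 174.1, 169.9, 172.3]
yes_answers, n_surveyed = 120, 200

x_bar = statistics.mean(heights)      # sample mean: estimates mu
s2 = statistics.variance(heights)     # sample variance (n - 1 denominator): estimates sigma^2
s = statistics.stdev(heights)         # sample SD: estimates sigma
p_hat = yes_answers / n_surveyed      # sample proportion: estimates p

print(x_bar, s2, s, p_hat)
```

Note that `statistics.variance` already uses the n − 1 denominator, matching the unbiased estimator discussed below.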
## Properties of Good Estimators
### 1. Unbiasedness

An estimator θ̂ is unbiased if its expected value equals the parameter:

E(θ̂) = θ

- x̄ is unbiased for μ
- s² is unbiased for σ² (with n − 1 in the denominator)
- p̂ is unbiased for p
### 2. Efficiency
Among unbiased estimators, the most efficient has the smallest variance.
### 3. Consistency

As n → ∞, the estimator converges in probability to the true parameter:

P(|θ̂ − θ| > ε) → 0 for every ε > 0
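Consistency is easy to see by simulation: as the sample size grows, the sample mean settles near the true mean. A small sketch, with μ = 5 and σ = 2 assumed for the demonstration:

```python
import random

random.seed(0)
mu = 5.0  # assumed true mean for the simulation

# As n grows, the sample mean should land closer and closer to mu.
for n in [10, 1_000, 100_000]:
    sample = [random.gauss(mu, 2.0) for _ in range(n)]
    x_bar = sum(sample) / n
    print(n, round(x_bar, 3))
```

With n = 100,000 the standard error of x̄ is about 2/√100000 ≈ 0.006, so the last estimate is extremely close to 5.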
### 4. Sufficiency

A sufficient estimator uses all the information about the parameter contained in the sample; no other statistic computed from the same sample adds information.
## Bias and Variance

MSE(θ̂) = Var(θ̂) + [Bias(θ̂)]²

For unbiased estimators the bias term is zero, so MSE = Variance.
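The decomposition MSE = Variance + Bias² can be verified numerically. A sketch, simulating the biased variance estimator (n denominator) with an assumed true variance of 4 and sample size 10:

```python
import random

random.seed(1)
sigma2, n, reps = 4.0, 10, 20_000  # assumed true variance, sample size, repetitions

estimates = []
for _ in range(reps):
    xs = [random.gauss(0.0, sigma2 ** 0.5) for _ in range(n)]
    m = sum(xs) / n
    estimates.append(sum((x - m) ** 2 for x in xs) / n)  # biased: divides by n

mean_est = sum(estimates) / reps
var_est = sum((e - mean_est) ** 2 for e in estimates) / reps
bias = mean_est - sigma2                                  # negative: underestimates
mse = sum((e - sigma2) ** 2 for e in estimates) / reps

# The identity MSE = Variance + Bias^2 holds exactly for these empirical moments.
print(round(mse, 3), round(var_est + bias ** 2, 3))
```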
## Methods of Estimation
### Method of Moments

Match sample moments to population moments and solve for the parameters:

(1/n) Σ xᵢᵏ = E(Xᵏ), for k = 1, 2, …
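As a sketch of the first-moment match, consider exponential data: E(X) = 1/λ, so setting x̄ = 1/λ gives the method-of-moments estimator λ̂ = 1/x̄. The rate λ = 2 below is an assumption for the illustration:

```python
import random

random.seed(2)
true_lam = 2.0  # assumed rate parameter

xs = [random.expovariate(true_lam) for _ in range(50_000)]
x_bar = sum(xs) / len(xs)

# Exponential(lambda): E(X) = 1/lambda, so matching the first moment gives
lam_mom = 1 / x_bar
print(round(lam_mom, 2))
```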
### Maximum Likelihood Estimation (MLE)

Find the parameter value θ̂ that maximizes the likelihood of observing the data:

L(θ) = ∏ᵢ f(xᵢ; θ)
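A minimal sketch of the idea for Bernoulli data, maximizing the log-likelihood over a grid of candidate p values (the counts 120 of 200 echo Example 2 below; for Bernoulli data the MLE works out to the sample proportion):

```python
import math

# Assumed Bernoulli data: 120 successes in 200 trials.
successes, n = 120, 200

def log_lik(p):
    """Bernoulli log-likelihood: k*log(p) + (n-k)*log(1-p)."""
    return successes * math.log(p) + (n - successes) * math.log(1 - p)

# Scan candidate values of p; the maximizer is the sample proportion.
grid = [i / 1000 for i in range(1, 1000)]
p_mle = max(grid, key=log_lik)
print(p_mle)  # 0.6
```

In practice the maximum is usually found by calculus (setting the derivative of the log-likelihood to zero) or numerical optimization rather than a grid scan.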
## Examples
### Example 1: Estimating the Mean

Sample data: 12, 15, 18, 21, 24

x̄ = (12 + 15 + 18 + 21 + 24) / 5 = 90 / 5 = 18

The point estimate of μ is 18.
### Example 2: Estimating a Proportion

In 200 voters, 120 support a candidate:

p̂ = 120 / 200 = 0.60

The point estimate of p is 0.60 (60%).
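Both examples reduce to one-line calculations:

```python
data = [12, 15, 18, 21, 24]
x_bar = sum(data) / len(data)   # Example 1: point estimate of mu
p_hat = 120 / 200               # Example 2: point estimate of p
print(x_bar, p_hat)  # 18.0 0.6
```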
### Example 3: Unbiased Variance

Why divide by n − 1?

If we divided by n:

E[(1/n) Σ (xᵢ − x̄)²] = ((n − 1)/n) σ² < σ²

so the estimator underestimates σ² on average. Using n − 1:

E(s²) = σ²

so s² is unbiased.
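A simulation makes the bias visible. A sketch, assuming true variance σ² = 9 and small samples of size 5, averaging both estimators over many repetitions:

```python
import random

random.seed(3)
sigma2, n, reps = 9.0, 5, 50_000  # assumed true variance, sample size, repetitions

biased, unbiased = [], []
for _ in range(reps):
    xs = [random.gauss(0.0, 3.0) for _ in range(n)]
    m = sum(xs) / n
    ss = sum((x - m) ** 2 for x in xs)
    biased.append(ss / n)          # divides by n: underestimates on average
    unbiased.append(ss / (n - 1))  # divides by n - 1: unbiased

# The first average lands near ((n-1)/n) * 9 = 7.2; the second near 9.
print(round(sum(biased) / reps, 2), round(sum(unbiased) / reps, 2))
```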
## Standard Error of Estimators
| Estimator | Standard Error |
|---|---|
| x̄ | σ/√n, or s/√n when σ is unknown |
| p̂ | √(p(1 − p)/n), or √(p̂(1 − p̂)/n) in practice |
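A quick check of the formulas, with the sample summaries (s = 4, n = 64, p̂ = 0.6, n = 200) assumed for illustration:

```python
import math

# Assumed sample summaries.
s, n_mean = 4.0, 64        # sample SD and size, for the SE of the mean
p_hat, n_prop = 0.6, 200   # sample proportion and size, for the SE of p-hat

se_mean = s / math.sqrt(n_mean)                     # s / sqrt(n)
se_prop = math.sqrt(p_hat * (1 - p_hat) / n_prop)   # sqrt(p(1-p)/n)
print(se_mean, round(se_prop, 4))  # 0.5 0.0346
```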
## Limitations
Point estimates:
- Give a single value, not a range
- Don't indicate uncertainty or precision
- Are almost never exactly equal to the true parameter (a single value is almost surely "wrong")
→ This motivates interval estimation (confidence intervals)