Point estimation is the process of estimating a parameter of a probability distribution from data observed from that distribution. It is one of the core topics in mathematical statistics. In this chapter, we explore the most common methods of point estimation: the method of moments, the method of maximum likelihood, and Bayes estimators. We also study important properties of estimators, including sufficiency and completeness, and the basic question of whether an estimator is the best possible one.

- Estimators
- The Method of Moments
- Maximum Likelihood
- Bayes Estimators
- Best Unbiased Estimators
- Sufficient, Complete and Ancillary Statistics
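As a small preview of the first two methods listed above, here is a minimal sketch (using the standard library only) that compares the method of moments and maximum likelihood for the uniform distribution on $(0, \theta)$, where the two estimators genuinely differ. The true value of `theta` and the sample size are arbitrary choices for illustration.

```python
import random

random.seed(1)
theta = 10.0  # true parameter, treated as unknown by the estimators
data = [random.uniform(0, theta) for _ in range(200)]

# Method of moments: E[X] = theta / 2, so matching the sample mean
# to the distribution mean gives the estimator 2 * xbar.
mom = 2 * sum(data) / len(data)

# Maximum likelihood: the likelihood (1/theta)^n on theta >= max(data)
# is decreasing in theta, so it is maximized at the sample maximum.
mle = max(data)

print(f"method of moments:  {mom:.3f}")
print(f"maximum likelihood: {mle:.3f}")
```

Note that the maximum likelihood estimator can never exceed the true parameter, while the method of moments estimator can err in either direction; the chapter's sections on unbiasedness and sufficiency make this comparison precise.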

- Normal Estimation Experiment
- Uniform Estimation Experiment
- Gamma Estimation Experiment
- Beta Estimation Experiment
- Pareto Estimation Experiment
- Beta Coin Experiment

- Introduction to Probability and Mathematical Statistics. Lee J. Bain and Max Engelhardt
- Statistical Inference. George Casella and Roger L. Berger
- Statistics. David Freedman, Robert Pisani and Robert Purves
- An Introduction to Mathematical Statistics and Its Applications. Richard J. Larsen and Morris L. Marx
- Elementary Statistics. Mario Triola
- Introductory Statistics. Neil A. Weiss
- Wikipedia statistics portal
- Wolfram MathWorld articles on probability and statistics

Far better an approximate answer to the right question, which is often vague, than an exact answer to the wrong question, which can always be made precise.

—John Tukey, Annals of Mathematical Statistics, **33** (1962)