Publications catalog - books

A History of Parametric Statistical Inference from Bernoulli to Fisher, 1713-1935

Anders Hald

Abstract/Description – provided by the publisher

Not available.

Keywords – provided by the publisher

Probability Theory and Stochastic Processes; History of Mathematical Sciences; Statistical Theory and Methods

Availability

Detected institution: Not detected
Year of publication: 2007
Browse: SpringerLink

Information

Resource type:

books

Print ISBN

978-0-387-46408-4

Electronic ISBN

978-0-387-46409-1

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

Publication rights information

© Springer Science+Business Media, LLC 2007

Subject coverage

Table of contents

The Three Revolutions in Parametric Statistical Inference

Anders Hald

The three revolutions in parametric statistical inference are due to Laplace [148], Gauss and Laplace (1809–1811), and Fisher [67].

The Three Revolutions in Parametric Statistical Inference | Pp. 1-8

James Bernoulli’s Law of Large Numbers for the Binomial, 1713, and Its Generalization

Anders Hald

James Bernoulli (1654–1705) graduated in theology from the University of Basel in 1676; at the same time he studied mathematics and astronomy. For the next seven years he spent most of his time traveling as tutor and scholar in Switzerland, France, the Netherlands, England, and Germany. Returning to Basel in 1683, he lectured on mathematics and experimental physics, and in 1687 he became professor of mathematics at the university. He and his younger brother John made essential contributions to Leibniz’s new infinitesimal calculus. He left his great work on probability, the Ars Conjectandi (The Art of Conjecturing), unfinished; it was published posthumously in 1713.

Part I - Binomial Statistical Inference | Pp. 11-15

De Moivre’s Normal Approximation to the Binomial, 1733, and Its Generalization

Anders Hald

Abraham de Moivre (1667–1754) was of a French Protestant family; from 1684 he studied mathematics in Paris. The persecution of the French Protestants caused him at the age of 21 to seek asylum in England. For the rest of his life he lived in London, earning his livelihood as a private tutor of mathematics and later also as a consultant to gamblers and insurance brokers. He became a prominent mathematician and a Fellow of the Royal Society in 1697, but he never got a university appointment as he had hoped. He wrote three outstanding books: the Miscellanea Analytica (1730), containing papers on mathematics and probability theory; the Doctrine of Chances (1718, 1738, 1756); and the Annuities upon Lives (1725, 1743, 1750, 1752), each new edition being an enlarged version of the previous one. His Doctrine contained new solutions to old problems and an astounding number of new results; it was the best textbook on probability theory until Laplace [159]. Here we only discuss his two proofs of Bernoulli’s law of large numbers and his two approximations to the binomial.
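
As a quick illustration of the approximation the chapter discusses (not de Moivre's own derivation), the sketch below compares an exact binomial probability with the de Moivre-Laplace normal approximation; the parameters n, p and the interval are made-up values.

```python
# Illustrative comparison of an exact binomial probability with the
# de Moivre-Laplace normal approximation (with continuity correction).
from scipy.stats import binom, norm

n, p = 100, 0.5            # made-up binomial parameters
a, b = 45, 55              # interval around the mean n*p

exact = binom.cdf(b, n, p) - binom.cdf(a - 1, n, p)

mu = n * p
sigma = (n * p * (1 - p)) ** 0.5
approx = norm.cdf((b + 0.5 - mu) / sigma) - norm.cdf((a - 0.5 - mu) / sigma)

print(f"exact = {exact:.4f}, normal approximation = {approx:.4f}")
```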

Part I - Binomial Statistical Inference | Pp. 17-24

Bayes’s Posterior Distribution of the Binomial Parameter and His Rule for Inductive Inference, 1764

Anders Hald

The English physician and philosopher David Hartley (1705–1757), founder of the Associationist school of psychologists, discusses some elementary applications of probability theory in his Observations on Man [118]. On the limit theorems he writes (pp. 338–339):

An ingenious Friend has communicated to me a Solution of the inverse Problem, in which he has shewn what the Expectation is, when an Event has happened p times, and failed q times, that the original Ratio of the Causes for the Happening or Failing of an Event should deviate in any given Degree from that of p to q. And it appears from this Solution, that where the Number of Trials is very great, the Deviation must be inconsiderable: Which shews that we may hope to determine the Proportions, and, by degrees, the whole Nature, of unknown Causes, by a sufficient Observation of their Effects.
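
In modern terms, the "inverse problem" Hartley's friend solved amounts to computing a posterior for the binomial parameter. A minimal sketch, assuming a uniform prior and made-up counts (nothing here is taken from the book itself):

```python
# After p successes and q failures, a uniform prior on the chance of
# success gives the Beta(p + 1, q + 1) posterior.  Numbers are made up.
from scipy.stats import beta

p, q = 60, 40          # observed successes and failures
d = 0.1                # "given degree" of deviation from p / (p + q)

posterior = beta(p + 1, q + 1)
ratio = p / (p + q)

# Posterior probability that the unknown chance lies within d of p/(p+q).
prob = posterior.cdf(ratio + d) - posterior.cdf(ratio - d)
print(f"P(|theta - {ratio:.2f}| < {d}) = {prob:.4f}")
```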

Part I - Binomial Statistical Inference | Pp. 25-29

Laplace’s Theory of Inverse Probability, 1774–1786

Anders Hald

Pierre Simon Laplace (1749–1827) was born into a middle-class family in a small town in Normandy, where he spent his first 16 years. His father destined him for an ecclesiastical career and sent him to the University of Caen, where he matriculated in the Faculty of Arts with the intention of continuing in the Faculty of Theology. However, after two years of study he left for Paris in 1768, bringing along a letter of recommendation from his mathematics teacher to d’Alembert. After having tested his abilities, d’Alembert secured him a post as teacher of mathematics at the École Militaire. He lived in Paris for the rest of his life.

Part II - Statistical Inference by Inverse Probability | Pp. 33-46

A Nonprobabilistic Interlude: The Fitting of Equations to Data, 1750–1805

Anders Hald

We consider the model y_i = f(x_{1i}, ..., x_{mi}; β_1, ..., β_m) + ε_i, i = 1, ..., n, where the ys represent the observations of a phenomenon, whose variation depends on the observed values of the xs, the βs are unknown parameters, and the εs random errors, distributed symmetrically about zero. Denoting the true value of y by η, the model may be described as a mathematical law giving the dependent variable η as a function of the independent variables x_1, ..., x_m, with unknown errors of observation equal to ε = y − η.
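
A minimal sketch of a linear special case of this model, fitted by least squares (the method that closes the period the chapter covers); the data and variable names below are illustrative assumptions, not an example from the book.

```python
# A linear special case of the model y = f(x; beta) + epsilon, fitted by
# least squares.  Data are made up.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])      # roughly y = 2x + noise

# Design matrix with a column of ones for the intercept.
X = np.column_stack([np.ones_like(x), x])

# Solve the least-squares problem X beta ~ y.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta

print("estimated parameters:", beta)
print("residuals:", residuals)
```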

Part II - Statistical Inference by Inverse Probability | Pp. 47-53

Gauss’s Derivation of the Normal Distribution and the Method of Least Squares, 1809

Anders Hald

Carl Friedrich Gauss (1777–1855) was born into a humble family in Brunswick, Germany. His extraordinary talents were noted at an early age, and his father allowed him to enter the local Gymnasium in 1788, where he excelled in mathematics and numerical calculations as well as in languages. Impressed by his achievements, a professor at the local Collegium Carolinum recommended him to the Duke of Brunswick, who gave Gauss a stipend, which made it possible for him to concentrate on study and research from 1792 until 1806, when the Duke died. For three years Gauss studied mathematics and classics at the Collegium Carolinum; in 1795 he went to the University of Göttingen and continued his studies for another three years. From 1798 he worked on his own in Brunswick until 1807, when he became professor of astronomy and director of the observatory in Göttingen, where he remained for the rest of his life.

Part II - Statistical Inference by Inverse Probability | Pp. 55-61

Credibility and Confidence Intervals by Laplace and Gauss

Anders Hald

It follows from Laplace’s 1774 and 1785 papers that the large-sample inverse probability limits for the binomial parameter θ are given by the relation P(|θ − h/n| ≤ u√(hk)/n^{3/2}) ≅ Φ(u) − Φ(−u) for u > 0, where h denotes the number of successes in n trials and k = n − h. In 1812 ([159], II, §16) he uses the normal approximation to the binomial to find large-sample direct probability limits for the relative frequency h/n as P(|h/n − θ| ≤ u√(θ(1 − θ)/n)) ≅ Φ(u) − Φ(−u). Noting that h/n = θ + O(n^{−1/2}), so that θ(1 − θ)/n may be replaced by hk/n³, and neglecting higher-order terms as in the two formulas above, he solves the inequality (8.2) with respect to θ and obtains θ ∈ h/n ± u√(hk)/n^{3/2} for u > 0.
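
A small numerical sketch of the large-sample limits discussed above, with made-up counts and the plug-in standard error √(hk)/n^{3/2}; to this order of approximation the inverse (credibility) and direct (confidence) limits coincide.

```python
# Large-sample limits for a binomial parameter theta (illustrative numbers).
from scipy.stats import norm

n, h = 1000, 620        # trials and successes (made up)
k = n - h
u = norm.ppf(0.975)     # two-sided 95% limits

freq = h / n
se = (h * k) ** 0.5 / n ** 1.5     # sqrt(h*k) / n^(3/2)

print(f"theta in {freq:.3f} +/- {u * se:.3f}  (95% limits)")
```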

Part II - Statistical Inference by Inverse Probability | Pp. 63-66

The Multivariate Posterior Distribution

Anders Hald

Irénée Jules Bienaymé (1796–1878) proposes to generalize Laplace’s inverse probability analysis of the binomial. Using the principle of inverse probability on the multinomial he gets the posterior distribution p(θ_1, ..., θ_m | n_1, ..., n_m) ∝ θ_1^{n_1} ⋯ θ_m^{n_m}, where the ns are nonnegative integers and Σn_i = n. In normed form this distribution is today called the Dirichlet distribution. The posterior mode is θ̂_i = n_i/n, with Σθ̂_i = 1.
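
A minimal sketch of the posterior described here, assuming a uniform prior and made-up multinomial counts; it checks that the posterior mode equals n_i/n.

```python
# Multinomial counts with a uniform prior give a Dirichlet posterior;
# its mode is n_i / n.  Counts are made up.
import numpy as np

counts = np.array([30, 50, 20])      # n_1, ..., n_m
alpha = counts + 1                   # Dirichlet parameters under a uniform prior

mode = counts / counts.sum()         # posterior mode theta_i = n_i / n
mean = alpha / alpha.sum()           # posterior mean (n_i + 1) / (n + m)

print("posterior mode:", mode)
print("posterior mean:", mean)
```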

Part II - Statistical Inference by Inverse Probability | Pp. 67-68

Edgeworth’s Genuine Inverse Method and the Equivalence of Inverse and Direct Probability in Large Samples, 1908 and 1909

Anders Hald

Francis Ysidro Edgeworth (1845–1926) was a complex personality with wide-ranging interests in both the humanities and the natural sciences. For several years he studied the classics at the universities of Dublin and Oxford; next he studied commercial law and qualified as a barrister; and finally he studied logic and mathematics on his own, using the acquired knowledge to write important books on ethics, utility, and economics.

Part II - Statistical Inference by Inverse Probability | Pp. 69-72