Precision (statistics)

In statistics, precision is the reciprocal of the variance, and the precision matrix (also known as concentration matrix) is the matrix inverse of the covariance matrix.[1][2][3] Thus, if we are considering a single random variable in isolation, its precision is the inverse of its variance: p=1/σ². Some particular statistical models define the term precision differently.
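
As a concrete illustration of these definitions, here is a minimal NumPy sketch (the data and variable names are illustrative, not from the source): the scalar precision is the reciprocal of a sample variance, and the precision matrix is the inverse of a sample covariance matrix.

```python
import numpy as np

# A small sample: rows are observations, columns are the random variables.
rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 3))

# Scalar case: precision of a single variable is the reciprocal of its variance.
precision = 1.0 / np.var(x[:, 0], ddof=1)

# Multivariate case: the precision (concentration) matrix is the inverse of the
# covariance matrix, provided the covariance matrix is invertible.
cov = np.cov(x, rowvar=False)
precision_matrix = np.linalg.inv(cov)
```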

One particular use of the precision matrix is in the context of Bayesian analysis of the multivariate normal distribution: for example, Bernardo & Smith prefer to parameterise the multivariate normal distribution in terms of the precision matrix, rather than the covariance matrix, because of certain simplifications that then arise.[4] For instance, if both the prior and the likelihood have Gaussian form, and the precision matrices of both exist (because their covariance matrices are of full rank and thus invertible), then the precision matrix of the posterior is simply the sum of the precision matrices of the prior and the likelihood.
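
A short sketch of this additivity, for the standard conjugate case of inferring an unknown Gaussian mean from a single observation with known covariance (the matrices and names below are our own illustrative choices, not from the source):

```python
import numpy as np

# Prior: theta ~ N(m0, prior_cov).  Data: y ~ N(theta, likelihood_cov).
prior_cov = np.array([[2.0, 0.3], [0.3, 1.0]])
likelihood_cov = np.array([[0.5, 0.0], [0.0, 0.5]])
m0 = np.zeros(2)
y = np.array([1.0, -0.5])

prior_prec = np.linalg.inv(prior_cov)
likelihood_prec = np.linalg.inv(likelihood_cov)

# The posterior precision is just the sum of the two precision matrices ...
posterior_prec = prior_prec + likelihood_prec
# ... and the posterior mean is the corresponding precision-weighted average.
posterior_mean = np.linalg.solve(posterior_prec, prior_prec @ m0 + likelihood_prec @ y)
```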

Being the inverse of a symmetric, positive-definite covariance matrix, the precision matrix of real-valued random variables, if it exists, is itself symmetric and positive definite.

Another reason the precision matrix may be useful is that if two dimensions i and j of a multivariate normal are conditionally independent given all the other dimensions, then the ij and ji elements of the precision matrix are 0. This means that precision matrices tend to be sparse when many of the dimensions are conditionally independent, which can lead to computational efficiencies when working with them. It also means that precision matrices are closely related to the idea of partial correlation.
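
A small illustration of this link (our own example, assuming the standard identity that the partial correlation between i and j given the rest is -P_ij / √(P_ii · P_jj) for precision matrix P):

```python
import numpy as np

# Precision matrix with a zero in the (0, 2) entry: variables 0 and 2 are
# conditionally independent given variable 1.
precision = np.array([[ 2.0, -0.8,  0.0],
                      [-0.8,  2.0, -0.8],
                      [ 0.0, -0.8,  2.0]])

# Partial correlation between i and j given all other variables:
#   rho_ij = -P_ij / sqrt(P_ii * P_jj)
d = np.sqrt(np.diag(precision))
partial_corr = -precision / np.outer(d, d)
np.fill_diagonal(partial_corr, 1.0)

# partial_corr[0, 2] is 0, reflecting the conditional independence, even though
# the covariance matrix inv(precision) has a nonzero (0, 2) entry.
cov = np.linalg.inv(precision)
```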

History

The term precision in this sense ("mensura praecisionis observationum") first appeared in the works of Gauss (1809), "Theoria motus corporum coelestium in sectionibus conicis solem ambientium" (page 212). Gauss's definition differs from the modern one by a factor of √2, since his precision h satisfies h = 1/(σ√2). He writes, for the density function of a normal random variable with precision h,

f(x) = (h/√π) exp(−h²x²).

Later, Whittaker & Robinson (1924), "Calculus of Observations", called this quantity the modulus, but the term has since dropped out of use.[5]

References

[1] DeGroot, Morris H. (1969). Optimal Statistical Decisions. New York: McGraw-Hill. p. 56.
[2] Davidson, Russell; MacKinnon, James G. (1993). Estimation and Inference in Econometrics. New York: Oxford University Press. p. 144. ISBN 0-19-506011-3.
[3] Dodge, Y. (2003). The Oxford Dictionary of Statistical Terms. Oxford University Press. ISBN 0-19-920613-9.
[4] Bernardo, J. M.; Smith, A. F. M. (2000). Bayesian Theory. Wiley. ISBN 0-471-49464-X.
[5] "Earliest known uses of some of the words in mathematics". jeff560.tripod.com.