

Wishart distribution
In statistics, the Wishart distribution is a generalization to multiple dimensions of the
chi-squared distribution, or, in the case of non-integer degrees of freedom, of the gamma
distribution. It is named in honor of John Wishart, who first formulated the distribution in
1928.[1]

It is a family of probability distributions defined over symmetric, nonnegative-definite
matrix-valued random variables ("random matrices"). These distributions are of great
importance in the estimation of covariance matrices in multivariate statistics. In Bayesian
statistics, the Wishart distribution is the conjugate prior of the inverse covariance matrix
of a multivariate normal random vector.

Wishart
Notation:   X ~ Wp(V, n)
Parameters: n > p − 1 degrees of freedom (real)
            V > 0 scale matrix (p × p, positive definite)
Support:    X (p × p) positive definite matrix
PDF:        f_X(x) = \frac{|x|^{(n-p-1)/2} e^{-\operatorname{tr}(V^{-1}x)/2}}{2^{np/2}\,|V|^{n/2}\,\Gamma_p(n/2)}
            (Γp is the multivariate gamma function; tr is the trace function)
Mean:       E[X] = nV
Mode:       (n − p − 1)V for n ≥ p + 1
Variance:   Var(X_{ij}) = n(v_{ij}^2 + v_{ii}v_{jj})
Entropy:    see below
CF:         \Theta \mapsto \left|I - 2i\,\Theta V\right|^{-n/2}

Contents
Definition
Occurrence
Probability density function
Use in Bayesian statistics
Choice of parameters
Properties
Log-expectation
Log-variance
Entropy
Cross-entropy
KL-divergence
Characteristic function

Theorem
Corollary 1
Corollary 2

Estimator of the multivariate normal distribution


Bartlett decomposition
Marginal distribution of matrix elements
The range of the shape parameter
Relationships to other distributions
See also
References
External links

Definition
Suppose X is an n × p matrix, each row of which is independently drawn from a p-variate normal distribution with zero mean:

    x_{(i)} = \left(x_i^{(1)}, \dots, x_i^{(p)}\right)^T \sim N_p(0, V).

Then the Wishart distribution is the probability distribution of the p × p random matrix

    S = X^T X = \sum_{i=1}^{n} x_{(i)} x_{(i)}^T,

known as the scatter matrix. One indicates that S has that probability distribution by writing

    S \sim W_p(V, n).
The positive integer n is the number of degrees of freedom. Sometimes this is written W(V, p, n). For n ≥ p the matrix S is invertible with probability 1 if V is invertible.

If p = V = 1 then this distribution is a chi-squared distribution with n degrees of freedom.
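As an illustration of this construction, the following Python sketch builds scatter matrices from rows drawn from N_p(0, V) and checks by Monte Carlo that their average approaches nV, the mean of the Wishart distribution. The dimension, degrees of freedom, scale matrix and sample sizes below are arbitrary choices, not values from the article.

    import numpy as np

    rng = np.random.default_rng(0)

    p, n = 3, 10                      # dimension and degrees of freedom (arbitrary choices)
    V = np.array([[2.0, 0.5, 0.0],    # an arbitrary positive definite scale matrix
                  [0.5, 1.0, 0.3],
                  [0.0, 0.3, 1.5]])

    def scatter_matrix(rng, V, n):
        """Draw n rows from N_p(0, V) and return the scatter matrix S = X^T X."""
        X = rng.multivariate_normal(np.zeros(len(V)), V, size=n)  # n x p matrix
        return X.T @ X                                            # p x p, S ~ W_p(V, n)

    # Monte Carlo check: E[S] = nV for the Wishart distribution.
    mean_est = np.mean([scatter_matrix(rng, V, n) for _ in range(20000)], axis=0)
    print(mean_est)        # close to n * V, up to Monte Carlo error
    print(n * V)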

Occurrence
The Wishart distribution arises as the distribution of the sample covariance matrix for a sample from a multivariate normal distribution. It occurs
frequently in likelihood-ratio tests in multivariate statistical analysis. It also arises in the spectral theory of random matrices and in
multidimensional Bayesian analysis.[2] It is also encountered in wireless communications when analyzing the performance of Rayleigh-fading MIMO wireless channels.[3]

Probability density function


The Wishart distribution can be characterized by its probability density function as follows:

Let X be a p × p symmetric matrix of random variables that is positive definite. Let V be a (fixed) positive definite matrix of size p × p.

Then, if n ≥ p, X has a Wishart distribution with n degrees of freedom if it has a probability density function given by

    f_X(x) = \frac{1}{2^{np/2}\,|V|^{n/2}\,\Gamma_p(n/2)}\, |x|^{(n-p-1)/2}\, e^{-\frac{1}{2}\operatorname{tr}(V^{-1}x)},

where |x| denotes the determinant of x and Γp(·) is the multivariate gamma function, defined as

    \Gamma_p\!\left(\frac{n}{2}\right) = \pi^{p(p-1)/4} \prod_{j=1}^{p} \Gamma\!\left(\frac{n}{2} - \frac{j-1}{2}\right).
In fact the above definition can be extended to any real n > p − 1. If n ≤ p − 1, then the Wishart distribution no longer has a density; instead it represents a
singular distribution that takes values in a lower-dimensional subspace of the space of p × p matrices.[4]
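The density above can be evaluated directly with the multivariate gamma function available in SciPy. The following sketch codes the formula as written and compares it with SciPy's own Wishart log-density; the scale matrix, degrees of freedom and evaluation point are arbitrary choices.

    import numpy as np
    from scipy.special import multigammaln
    from scipy.stats import wishart

    def wishart_logpdf(x, n, V):
        """Log-density of W_p(V, n) at x, coded directly from the formula above."""
        p = V.shape[0]
        _, logdet_x = np.linalg.slogdet(x)
        _, logdet_V = np.linalg.slogdet(V)
        return (0.5 * (n - p - 1) * logdet_x
                - 0.5 * np.trace(np.linalg.solve(V, x))
                - 0.5 * n * p * np.log(2)
                - 0.5 * n * logdet_V
                - multigammaln(0.5 * n, p))

    V = np.array([[1.0, 0.3], [0.3, 2.0]])   # arbitrary 2x2 scale matrix
    n = 5.0
    x = np.array([[4.0, 1.0], [1.0, 6.0]])   # an arbitrary positive definite evaluation point

    print(wishart_logpdf(x, n, V))           # direct formula
    print(wishart.logpdf(x, df=n, scale=V))  # SciPy's implementation, should agree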

Use in Bayesian statistics


In Bayesian statistics, in the context of the multivariate normal distribution, the Wishart distribution is the conjugate prior to the precision matrix
Ω = Σ−1, where Σ is the covariance matrix.

Choice of parameters
The least informative, proper Wishart prior is obtained by setting n = p.

The prior mean of Wp(V, n) is nV, suggesting that a reasonable choice for V would be n−1Σ0, where Σ0 is some prior guess for the covariance
matrix.
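The following Python sketch illustrates these choices in the standard conjugate update for the precision matrix when the mean is known: a prior W_p(V0, n0) combined with N observations gives a posterior W_p((V0^{-1} + S)^{-1}, n0 + N), where S is the scatter matrix about the known mean. The simulated data, the prior guess Σ0 and all variable names below are illustrative assumptions, not part of the article.

    import numpy as np

    rng = np.random.default_rng(1)

    p, N = 2, 500
    mu = np.zeros(p)                               # known mean
    Sigma_true = np.array([[1.0, 0.6], [0.6, 2.0]])
    data = rng.multivariate_normal(mu, Sigma_true, size=N)

    # Least informative proper prior: n0 = p, V0 = Sigma0 / n0 with Sigma0 a prior guess.
    Sigma0 = np.eye(p)
    n0 = p
    V0 = Sigma0 / n0

    # Conjugate update for the precision matrix Omega = Sigma^{-1}:
    #   prior     Omega ~ W_p(V0, n0)
    #   posterior Omega ~ W_p((V0^{-1} + S)^{-1}, n0 + N), with S = sum (x - mu)(x - mu)^T
    S = (data - mu).T @ (data - mu)
    Vn = np.linalg.inv(np.linalg.inv(V0) + S)
    nn = n0 + N

    # The posterior mean of Omega is nn * Vn; its inverse approximates Sigma_true for large N.
    print(np.linalg.inv(nn * Vn))
    print(Sigma_true)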

Properties

Log-expectation
The following formula plays a role in variational Bayes derivations for Bayes networks involving the Wishart distribution:

    \operatorname{E}[\ln|X|] = \psi_p\!\left(\frac{n}{2}\right) + p \ln 2 + \ln|V|,[5]

where ψp is the multivariate digamma function (the derivative of the log of the multivariate gamma function).
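A quick numerical check of this identity, with arbitrary parameter values, computing ψp through the ordinary digamma function:

    import numpy as np
    from scipy.special import psi
    from scipy.stats import wishart

    V = np.array([[1.0, 0.3], [0.3, 2.0]])
    n, p = 7.0, 2

    # Closed form: E[ln|X|] = psi_p(n/2) + p ln 2 + ln|V|,
    # with the multivariate digamma psi_p(a) = sum_i psi(a + (1 - i)/2).
    multidigamma = sum(psi(n / 2 + (1 - i) / 2) for i in range(1, p + 1))
    closed_form = multidigamma + p * np.log(2) + np.linalg.slogdet(V)[1]

    # Monte Carlo check using SciPy's Wishart sampler.
    X = wishart.rvs(df=n, scale=V, size=200000, random_state=0)
    mc = np.mean(np.linalg.slogdet(X)[1])

    print(closed_form, mc)   # should agree up to Monte Carlo error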

Log-variance
The following variance computation could be of help in Bayesian statistics:

    \operatorname{Var}[\ln|X|] = \sum_{i=1}^{p} \psi_1\!\left(\frac{n+1-i}{2}\right),

where ψ1 is the trigamma function. This comes up when computing the Fisher information of the Wishart random variable.
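A similar Monte Carlo check for the variance identity, again with arbitrary parameter values:

    import numpy as np
    from scipy.special import polygamma
    from scipy.stats import wishart

    V = np.array([[1.0, 0.3], [0.3, 2.0]])
    n, p = 7.0, 2

    # Closed form: Var[ln|X|] = sum_i psi_1((n + 1 - i)/2), psi_1 being the trigamma function.
    closed_form = sum(polygamma(1, (n + 1 - i) / 2) for i in range(1, p + 1))

    # Monte Carlo check.
    X = wishart.rvs(df=n, scale=V, size=200000, random_state=0)
    mc = np.var(np.linalg.slogdet(X)[1])

    print(closed_form, mc)   # should agree up to Monte Carlo error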


Entropy
The information entropy of the distribution has the following formula:[5]

    \operatorname{H}[X] = -\ln B(V, n) - \frac{n - p - 1}{2} \operatorname{E}[\ln|X|] + \frac{np}{2},

where B(V, n) is the normalizing constant of the distribution:

    B(V, n) = \frac{1}{|V|^{n/2}\, 2^{np/2}\, \Gamma_p(n/2)}.

This can be expanded as follows:

    \operatorname{H}[X] = \frac{p+1}{2}\ln|V| + \frac{1}{2}p(p+1)\ln 2 + \ln\Gamma_p\!\left(\frac{n}{2}\right) - \frac{n-p-1}{2}\psi_p\!\left(\frac{n}{2}\right) + \frac{np}{2}.
Cross-entropy
The cross-entropy of two Wishart distributions p_0 with parameters n_0, V_0 and p_1 with parameters n_1, V_1 is

    H(p_0, p_1) = \operatorname{E}_{p_0}[-\ln p_1]
                = \frac{n_1}{2}\ln|V_1| + \frac{n_1 p}{2}\ln 2 + \ln\Gamma_p\!\left(\frac{n_1}{2}\right)
                  - \frac{n_1 - p - 1}{2}\left(\psi_p\!\left(\frac{n_0}{2}\right) + p\ln 2 + \ln|V_0|\right)
                  + \frac{n_0}{2}\operatorname{tr}\!\left(V_1^{-1}V_0\right).

Note that when p_0 = p_1 we recover the entropy.

KL-divergence
The Kullback–Leibler divergence of p_1 from p_0 is

    D_{KL}(p_0 \,\|\, p_1) = H(p_0, p_1) - H(p_0)
                           = -\frac{n_1}{2}\ln\left|V_1^{-1}V_0\right|
                             + \frac{n_0}{2}\left(\operatorname{tr}\!\left(V_1^{-1}V_0\right) - p\right)
                             + \ln\frac{\Gamma_p(n_1/2)}{\Gamma_p(n_0/2)}
                             + \frac{n_0 - n_1}{2}\,\psi_p\!\left(\frac{n_0}{2}\right).
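A sketch implementing this expression, with illustrative helper names and parameter values: the divergence of a distribution from itself should be zero, and it should otherwise be positive.

    import numpy as np
    from scipy.special import multigammaln, psi

    def multidigamma(a, p):
        return sum(psi(a + (1 - i) / 2) for i in range(1, p + 1))

    def wishart_kl(n0, V0, n1, V1):
        """D_KL(W_p(V0, n0) || W_p(V1, n1)), coded from the expression above."""
        p = V0.shape[0]
        A = np.linalg.solve(V1, V0)          # V1^{-1} V0
        return (-n1 / 2 * np.linalg.slogdet(A)[1]
                + n0 / 2 * (np.trace(A) - p)
                + multigammaln(n1 / 2, p) - multigammaln(n0 / 2, p)
                + (n0 - n1) / 2 * multidigamma(n0 / 2, p))

    V0 = np.array([[1.0, 0.3], [0.3, 2.0]])
    V1 = np.array([[1.5, 0.0], [0.0, 1.0]])

    print(wishart_kl(6.0, V0, 6.0, V0))   # zero up to floating point: a distribution against itself
    print(wishart_kl(6.0, V0, 9.0, V1))   # strictly positive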

Characteristic function
The characteristic function of the Wishart distribution is

    \Theta \mapsto \left|I - 2i\,\Theta V\right|^{-n/2}.

In other words,

    \Theta \mapsto \operatorname{E}\!\left[\exp\!\left(i\operatorname{tr}(X\Theta)\right)\right] = \left|I - 2i\,\Theta V\right|^{-n/2},

where E[⋅] denotes expectation. (Here Θ and I are matrices the same size as V; I is the identity matrix, and i is the square root of −1.)[6]
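The identity can be spot-checked by Monte Carlo for a small symmetric Θ; the matrices and sample size below are arbitrary choices.

    import numpy as np
    from scipy.stats import wishart

    V = np.array([[1.0, 0.3], [0.3, 2.0]])
    n, p = 5.0, 2
    Theta = np.array([[0.05, 0.02], [0.02, 0.03]])   # an arbitrary small symmetric matrix

    # Closed form: |I - 2i Theta V|^{-n/2}
    closed_form = np.linalg.det(np.eye(p) - 2j * Theta @ V) ** (-n / 2)

    # Monte Carlo estimate of E[exp(i tr(X Theta))]
    X = wishart.rvs(df=n, scale=V, size=200000, random_state=0)
    mc = np.mean(np.exp(1j * np.trace(X @ Theta, axis1=1, axis2=2)))

    print(closed_form, mc)   # should agree up to Monte Carlo error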

Theorem
If a p × p random matrix X has a Wishart distribution with m degrees of freedom and variance matrix V (write X ~ W_p(V, m)), and C is a
q × p matrix of rank q, then[7]

    C X C^T \sim W_q\!\left(C V C^T, m\right).

Corollary 1
If z is a nonzero p × 1 constant vector, then:[7]

    \sigma_z^{-2}\, z^T X z \sim \chi^2_m.

In this case, \chi^2_m is the chi-squared distribution and \sigma_z^2 = z^T V z (note that \sigma_z^2 is a constant; it is positive because V is positive definite).

Corollary 2
Consider the case where z^T = (0, ..., 0, 1, 0, ..., 0) (that is, the j-th element is one and all others zero). Then corollary 1 above shows that

    \sigma_{jj}^{-1}\, X_{jj} \sim \chi^2_m

gives the marginal distribution of each of the elements on the matrix's diagonal.
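A quick simulation check of this corollary, with arbitrary V, n and index j, comparing the scaled diagonal element against a chi-squared distribution with n degrees of freedom:

    import numpy as np
    from scipy.stats import wishart, chi2, kstest

    V = np.array([[1.0, 0.3], [0.3, 2.0]])
    n = 6
    j = 1   # index of the diagonal element to check (0-based)

    X = wishart.rvs(df=n, scale=V, size=50000, random_state=0)

    # Corollary 2: X_jj / sigma_jj should be chi-squared with n degrees of freedom.
    scaled_diag = X[:, j, j] / V[j, j]
    print(np.mean(scaled_diag))                         # close to n
    print(kstest(scaled_diag, chi2(df=n).cdf).pvalue)   # test against chi^2_n; should not systematically reject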

George Seber points out that the Wishart distribution is not called the “multivariate chi-squared distribution” because the marginal distribution
of the off-diagonal elements is not chi-squared. Seber prefers to reserve the term multivariate for the case when all univariate marginals belong to
the same family.[8]

Estimator of the multivariate normal distribution


The Wishart distribution is the sampling distribution of the maximum-likelihood estimator (MLE) of the covariance matrix of a multivariate
normal distribution.[9] A derivation of the MLE uses the spectral theorem.

Bartlett decomposition
The Bartlett decomposition of a matrix X from a p-variate Wishart distribution with scale matrix V and n degrees of freedom is the factorization:

    X = L A A^T L^T,

where L is the Cholesky factor of V, and:

    A = \begin{pmatrix}
          c_1    & 0      & 0      & \cdots & 0 \\
          n_{21} & c_2    & 0      & \cdots & 0 \\
          n_{31} & n_{32} & c_3    & \cdots & 0 \\
          \vdots & \vdots & \vdots & \ddots & \vdots \\
          n_{p1} & n_{p2} & n_{p3} & \cdots & c_p
        \end{pmatrix},

where c_i^2 ~ χ²_{n−i+1} and n_ij ~ N(0, 1) independently.[10] This provides a useful method for obtaining random samples from a Wishart
distribution.[11]
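A minimal sampler based on this decomposition; the function name wishart_bartlett and the parameter values are illustrative, not a standard API.

    import numpy as np

    def wishart_bartlett(rng, V, n):
        """Draw one sample from W_p(V, n) via the Bartlett decomposition X = L A A^T L^T."""
        p = V.shape[0]
        L = np.linalg.cholesky(V)                     # Cholesky factor of the scale matrix
        A = np.zeros((p, p))
        for i in range(p):
            A[i, i] = np.sqrt(rng.chisquare(n - i))   # c_i^2 ~ chi^2_{n-i+1} (i is 0-based here)
            A[i, :i] = rng.standard_normal(i)         # n_ij ~ N(0, 1) below the diagonal
        LA = L @ A
        return LA @ LA.T

    rng = np.random.default_rng(0)
    V = np.array([[2.0, 0.5], [0.5, 1.0]])
    n = 8

    # Sanity check: the sample mean should approach nV.
    mean_est = np.mean([wishart_bartlett(rng, V, n) for _ in range(20000)], axis=0)
    print(mean_est)
    print(n * V)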

Marginal distribution of matrix elements


Let V be a 2 × 2 variance matrix characterized by correlation coefficient −1 < ρ < 1 and L its lower Cholesky factor:

    V = \begin{pmatrix} \sigma_1^2 & \rho\sigma_1\sigma_2 \\ \rho\sigma_1\sigma_2 & \sigma_2^2 \end{pmatrix},
    \qquad
    L = \begin{pmatrix} \sigma_1 & 0 \\ \rho\sigma_2 & \sqrt{1-\rho^2}\,\sigma_2 \end{pmatrix}.

Multiplying through the Bartlett decomposition above, we find that a random sample from the 2 × 2 Wishart distribution is

    X = L A A^T L^T
      = \begin{pmatrix}
          \sigma_1^2 c_1^2 & \sigma_1\sigma_2\left(\rho c_1^2 + \sqrt{1-\rho^2}\, c_1 n_{21}\right) \\
          \sigma_1\sigma_2\left(\rho c_1^2 + \sqrt{1-\rho^2}\, c_1 n_{21}\right) & \sigma_2^2\left(\left(1-\rho^2\right) c_2^2 + \left(\sqrt{1-\rho^2}\, n_{21} + \rho c_1\right)^2\right)
        \end{pmatrix}.

The diagonal elements, most evidently in the first element, follow the χ² distribution with n degrees of freedom (scaled by σ_1^2 and σ_2^2 respectively), as expected. The
off-diagonal element is less familiar but can be identified as a normal variance-mean mixture where the mixing density is a χ² distribution. The
corresponding marginal probability density for the off-diagonal element is therefore the variance-gamma distribution

    f(x_{12}) = \frac{\left|x_{12}\right|^{(n-1)/2}}{\Gamma\!\left(\frac{n}{2}\right)\sqrt{2^{n-1}\pi\left(1-\rho^2\right)\left(\sigma_1\sigma_2\right)^{n+1}}}\;
                \exp\!\left(\frac{\rho\, x_{12}}{\sigma_1\sigma_2\left(1-\rho^2\right)}\right)
                K_{\frac{n-1}{2}}\!\left(\frac{\left|x_{12}\right|}{\sigma_1\sigma_2\left(1-\rho^2\right)}\right),
where Kν(z) is the modified Bessel function of the second kind.[12] Similar results may be found for higher dimensions, but the interdependence
of the off-diagonal correlations becomes increasingly complicated. It is also possible to write down the moment-generating function even in the
noncentral case (essentially the nth power of Craig (1936)[13] equation 10) although the probability density becomes an infinite sum of Bessel
functions.
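As a numerical check of the off-diagonal density above, with arbitrary parameter values, the sketch below verifies that it integrates to one and that the probability it assigns to an interval matches a Monte Carlo estimate from SciPy's Wishart sampler.

    import numpy as np
    from scipy.special import kv, gamma
    from scipy.integrate import quad
    from scipy.stats import wishart

    sigma1, sigma2, rho, n = 1.0, 1.5, 0.4, 5   # arbitrary parameter choices

    def offdiag_pdf(x):
        """Variance-gamma marginal density of the off-diagonal element, as written above."""
        a = sigma1 * sigma2
        c = 1.0 / (gamma(n / 2) * np.sqrt(2.0 ** (n - 1) * np.pi * (1 - rho ** 2) * a ** (n + 1)))
        return (c * np.abs(x) ** ((n - 1) / 2)
                * np.exp(rho * x / (a * (1 - rho ** 2)))
                * kv((n - 1) / 2, np.abs(x) / (a * (1 - rho ** 2))))

    # The density should integrate to 1 ...
    total = quad(offdiag_pdf, -np.inf, 0)[0] + quad(offdiag_pdf, 0, np.inf)[0]

    # ... and P(0 < X_12 < 2) from the density should match a Monte Carlo estimate.
    V = np.array([[sigma1 ** 2, rho * sigma1 * sigma2],
                  [rho * sigma1 * sigma2, sigma2 ** 2]])
    X = wishart.rvs(df=n, scale=V, size=200000, random_state=0)
    mc = np.mean((X[:, 0, 1] > 0) & (X[:, 0, 1] < 2))

    print(total, quad(offdiag_pdf, 0, 2)[0], mc)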

The range of the shape parameter


It can be shown[14] that the Wishart distribution can be defined if and only if the shape parameter n belongs to the set

    \Lambda_p := \{0, 1, \dots, p-1\} \cup (p-1, \infty).

This set is named after Gindikin, who introduced it[15] in the seventies in the context of gamma distributions on homogeneous cones. However,
for the new parameters in the discrete spectrum of the Gindikin ensemble, namely,

    \Lambda_p^* := \{0, 1, \dots, p-1\},

the corresponding Wishart distribution has no Lebesgue density.

Relationships to other distributions


The Wishart distribution is related to the inverse-Wishart distribution, denoted by W_p^{-1}, as follows: If X ~ Wp(V, n) and if we do the change
of variables C = X^{-1}, then C ~ W_p^{-1}(V^{-1}, n). This relationship may be derived by noting that the absolute value of the Jacobian
determinant of this change of variables is |C|^{p+1}; see, for example, equation (15.15) in [16].
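A small simulation consistent with this relationship, using arbitrary parameters: the Monte Carlo mean of X^{-1} should approach the inverse-Wishart mean V^{-1}/(n − p − 1).

    import numpy as np
    from scipy.stats import wishart

    V = np.array([[1.0, 0.3], [0.3, 2.0]])
    n, p = 8, 2

    # If X ~ W_p(V, n), then C = X^{-1} ~ W_p^{-1}(V^{-1}, n).  As a spot check, the Monte Carlo
    # mean of X^{-1} should approach the inverse-Wishart mean V^{-1} / (n - p - 1).
    X = wishart.rvs(df=n, scale=V, size=100000, random_state=0)
    mc_mean = np.mean(np.linalg.inv(X), axis=0)

    print(mc_mean)
    print(np.linalg.inv(V) / (n - p - 1))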
In Bayesian statistics, the Wishart distribution is a conjugate prior for the precision parameter of the multivariate normal distribution, when the
mean parameter is known.[17]
A generalization is the multivariate gamma distribution.
A different type of generalization is the normal-Wishart distribution, essentially the product of a multivariate normal distribution with a Wishart
distribution.

See also
Chi-squared distribution
Complex Wishart distribution
F-distribution
Gamma distribution
Hotelling's T-squared distribution
Inverse-Wishart distribution
Multivariate gamma distribution
Student's t-distribution
Wilks' lambda distribution

References
1. Wishart, J. (1928). "The generalised product moment distribution in samples from a normal multivariate population". Biometrika. 20A (1–2):
32–52. doi:10.1093/biomet/20A.1-2.32 (https://doi.org/10.1093%2Fbiomet%2F20A.1-2.32). JFM 54.0565.02 (https://zbmath.org
/?format=complete&q=an:54.0565.02). JSTOR 2331939 (https://www.jstor.org/stable/2331939).
2. Gelman, Andrew (2003). Bayesian Data Analysis (http://www.stat.columbia.edu/~gelman/book/) (2nd ed.). Boca Raton, Fla.: Chapman &
Hall. p. 582. ISBN 158488388X. Retrieved 3 June 2015.
3. Zanella, A.; Chiani, M.; Win, M.Z. (April 2009). "On the marginal distribution of the eigenvalues of wishart matrices". IEEE Transactions on
Communications. 57 (4): 1050–1060. doi:10.1109/TCOMM.2009.04.070143 (https://doi.org/10.1109%2FTCOMM.2009.04.070143).
4. Uhlig, H. (1994). "On Singular Wishart and Singular Multivariate Beta Distributions". The Annals of Statistics. 22: 395–405. doi:10.1214/aos
/1176325375 (https://doi.org/10.1214%2Faos%2F1176325375).
5. C.M. Bishop, Pattern Recognition and Machine Learning, Springer 2006, p. 693.


6. Anderson, T. W. (2003). An Introduction to Multivariate Statistical Analysis (3rd ed.). Hoboken, N. J.: Wiley Interscience. p. 259.
ISBN 0-471-36091-0.
7. Rao, C. R. (1965). Linear Statistical Inference and its Applications. Wiley. p. 535.
8. Seber, George A. F. (2004). Multivariate Observations. Wiley. ISBN 978-0471691211.
9. Chatfield, C.; Collins, A. J. (1980). Introduction to Multivariate Analysis. London: Chapman and Hall. pp. 103–108. ISBN 0-412-16030-7.
10. Anderson, T. W. (2003). An Introduction to Multivariate Statistical Analysis (3rd ed.). Hoboken, N. J.: Wiley Interscience. p. 257.
ISBN 0-471-36091-0.
11. Smith, W. B.; Hocking, R. R. (1972). "Algorithm AS 53: Wishart Variate Generator". Journal of the Royal Statistical Society, Series C. 21 (3):
341–345. JSTOR 2346290 (https://www.jstor.org/stable/2346290).
12. Pearson, Karl; Jeffery, G. B.; Elderton, Ethel M. (December 1929). "On the Distribution of the First Product Moment-Coefficient, in Samples
Drawn from an Indefinitely Large Normal Population". Biometrika. Biometrika Trust. 21: 164–201. doi:10.2307/2332556 (https://doi.org
/10.2307%2F2332556). JSTOR 2332556 (https://www.jstor.org/stable/2332556).
13. Craig, Cecil C. (1936). "On the Frequency Function of xy" (http://projecteuclid.org/euclid.aoms/1177732541). Ann. Math. Statist. 7: 1–15.
doi:10.1214/aoms/1177732541 (https://doi.org/10.1214%2Faoms%2F1177732541).
14. Peddada, Shyamal Das; Richards, Donald St. P. (1991). "Proof of a Conjecture of M. L. Eaton on the Characteristic Function of
the Wishart Distribution". Annals of Probability. 19 (2): 868–874. doi:10.1214/aop/1176990455 (https://doi.org
/10.1214%2Faop%2F1176990455).
15. Gindikin, S. G. (1975). "Invariant generalized functions in homogeneous domains". Funct. Anal. Appl. 9 (1): 50–52. doi:10.1007/BF01078179
(https://doi.org/10.1007%2FBF01078179).
16. Dwyer, Paul S. (1967). "Some Applications of Matrix Derivatives in Multivariate Analysis". J. Amer. Statist. Assoc. 62 (318): 607–625.
JSTOR 2283988 (https://www.jstor.org/stable/2283988).
17. Bishop, C. M. (2006). Pattern Recognition and Machine Learning. Springer.

External links
A C++ library for random matrix generation (https://github.com/zweng/rmg)
