A random vector X = (X_1, ..., X_n) follows a multivariate normal distribution, also sometimes called a multivariate Gaussian distribution (in honor of Carl Friedrich Gauss, who was not the first to write about the normal distribution), if it satisfies any of the following equivalent conditions:

  • every linear combination Y = a_1 X_1 + ... + a_n X_n is normally distributed;

  • there is a random vector Z = (Z_1, ..., Z_m), whose components are independent standard normal random variables, a vector μ = (μ_1, ..., μ_n), and an n-by-m matrix A such that X = A Z + μ (this construction is illustrated numerically in the sketch after this list);

  • there is a vector μ and a symmetric, positive semidefinite matrix Γ such that the characteristic function of X is

\varphi_X(u) = \exp\left( i\,\mu^T u - \tfrac{1}{2}\, u^T \Gamma u \right).
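The second condition is constructive, which makes it easy to check numerically. The following is a minimal sketch, assuming NumPy; the particular A, μ, and sample size are arbitrary illustrative choices. It draws many samples of X = A Z + μ and confirms that the sample mean approaches μ and the sample covariance approaches A A^T (anticipating the identification of Γ below).

```python
import numpy as np

rng = np.random.default_rng(0)

n, m = 3, 2                       # X is n-dimensional, Z is m-dimensional
A = rng.standard_normal((n, m))   # arbitrary n-by-m matrix (illustrative)
mu = np.array([1.0, -2.0, 0.5])   # arbitrary mean vector (illustrative)

# Draw many independent copies of Z (one per column) and form X = A Z + mu.
Z = rng.standard_normal((m, 200_000))
X = A @ Z + mu[:, None]

# Sample mean should be close to mu, sample covariance close to A A^T.
print(X.mean(axis=1).round(2))               # approximately mu
print(np.max(np.abs(np.cov(X) - A @ A.T)))   # small
```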

The following condition is not quite equivalent to the ones above, since it fails to allow for a singular covariance matrix:

  • there is a vector μ = (μ_1, ..., μ_n) and a symmetric, positive-definite (hence invertible) matrix Γ such that X has density

f_X(x_1, \ldots, x_n)\, dx_1 \cdots dx_n = \det(2\pi\Gamma)^{-1/2} \exp\left( -\tfrac{1}{2} (x-\mu)^T \Gamma^{-1} (x-\mu) \right) dx_1 \cdots dx_n.
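As a quick sanity check of this formula (a sketch assuming NumPy and SciPy; the specific μ, Γ, and evaluation point x are arbitrary choices), one can compare the density written out above with scipy.stats.multivariate_normal:

```python
import numpy as np
from scipy.stats import multivariate_normal

mu = np.array([0.0, 1.0])
Gamma = np.array([[2.0, 0.3],
                  [0.3, 0.5]])    # symmetric, positive-definite
x = np.array([0.4, 0.9])
d = x - mu

# det(2*pi*Gamma)^(-1/2) * exp(-(1/2) (x-mu)^T Gamma^{-1} (x-mu))
density = np.linalg.det(2 * np.pi * Gamma) ** -0.5 \
          * np.exp(-0.5 * d @ np.linalg.solve(Gamma, d))

print(np.isclose(density, multivariate_normal(mean=mu, cov=Gamma).pdf(x)))  # True
```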

The vector μ in these conditions is the expected value of X, and the matrix Γ = A A^T is the covariance matrix of the components X_i. It is important to realize that the covariance matrix must be allowed to be singular. That case arises frequently in statistics; for example, in the distribution of the vector of residuals in ordinary linear regression problems. Note also that the X_i are in general not independent: they can be seen as the result of applying the linear transformation A to a collection of independent Gaussian variables Z.
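To make the singular case concrete (a sketch assuming NumPy; the matrix A below is an arbitrary rank-deficient example): if a row of A is a linear combination of the others, then Γ = A A^T is singular and the corresponding component of X is a deterministic function of the other components, so no density with respect to dx_1 ... dx_n exists, even though X is multivariate normal by the conditions above.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])   # third row = sum of the first two (rank 2)
Gamma = A @ A.T              # 3-by-3 covariance matrix of X = A Z + mu

print(np.linalg.matrix_rank(Gamma))  # 2, not 3: Gamma is singular
print(np.linalg.det(Gamma))          # ~0.0
```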

Proof sketch

Multivariate Gaussian density

Recall the definition of the characteristic function of a random vector.
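Concretely, the definition being recalled is the standard one: for a random vector X with values in R^n,

\varphi_X(u) = \mathbb{E}\left[ e^{\,i\,u^T X} \right], \qquad u \in \mathbb{R}^n.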

Recall the characterization of Gaussian random variables by their characteristic functions.
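The characterization needed here is the one-dimensional fact that a real random variable Y is Gaussian with mean μ and variance σ² exactly when

\varphi_Y(t) = \exp\left( i \mu t - \tfrac{1}{2} \sigma^2 t^2 \right).

In particular, for a vector Z of independent standard normal components, independence factors the characteristic function:

\varphi_Z(u) = \prod_{j=1}^{m} e^{-u_j^2/2} = e^{-\|u\|^2/2}.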

Calculate the characteristic function of X in terms of the characteristic function of Z.
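Writing X = A Z + μ as in the second condition, and noting that u^T (A Z + μ) = (A^T u)^T Z + u^T μ,

\varphi_X(u) = \mathbb{E}\left[ e^{\,i\,u^T (A Z + \mu)} \right] = e^{\,i\,\mu^T u}\, \mathbb{E}\left[ e^{\,i\,(A^T u)^T Z} \right] = e^{\,i\,\mu^T u}\, \varphi_Z(A^T u).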

Deduce the characteristic function of X in terms of the mean vector and covariance matrix.
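Substituting \varphi_Z(A^T u) = e^{-\|A^T u\|^2/2} and using \|A^T u\|^2 = u^T A A^T u yields

\varphi_X(u) = \exp\left( i\,\mu^T u - \tfrac{1}{2}\, u^T A A^T u \right),

which is exactly the characteristic function in the third condition, with Γ = A A^T.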