This article is not about Gauss–Markov processes.
In statistics, the Gauss–Markov theorem states that in a linear model in which the errors have expectation zero, are uncorrelated, and have equal variances, the best linear unbiased estimators of the coefficients are the least-squares estimators. More generally, the best linear unbiased estimator of any linear combination of the coefficients is its least-squares estimator. The errors are not assumed to be normally distributed, nor are they assumed to be independent (but only uncorrelated, a weaker condition), nor are they assumed to be identically distributed (but only homoscedastic, a weaker condition, defined below).
More explicitly, and more concretely, suppose we have

$$Y_i = \beta_0 + \beta_1 x_i + \varepsilon_i$$

for $i = 1, \ldots, n$, where $\beta_0$ and $\beta_1$ are non-random but unobservable parameters, the $x_i$ are non-random and observable, the $\varepsilon_i$ are random, and so the $Y_i$ are random. (We set $x$ in lowercase because it is not random, and $Y$ in capital because it is random.) The random variables $\varepsilon_i$ are called the "errors". The Gauss–Markov assumptions state that

$$\operatorname{E}(\varepsilon_i) = 0 \quad \text{and} \quad \operatorname{Var}(\varepsilon_i) = \sigma^2 < \infty \quad \text{for all } i$$

(i.e., all errors have the same variance; that is "homoscedasticity"), and

$$\operatorname{Cov}(\varepsilon_i, \varepsilon_j) = 0 \quad \text{for } i \neq j;$$

that is "uncorrelatedness."
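To make these assumptions concrete, here is a minimal simulation sketch; the parameter values, sample size, and design points are hypothetical, chosen only for illustration. Independent normal draws are used purely as a convenient way to produce errors with mean zero and equal variances; independence implies the weaker condition of uncorrelatedness, and the theorem itself requires neither normality nor independence.

```python
import numpy as np

# Hypothetical setup for illustration: all values below are arbitrary choices.
rng = np.random.default_rng(0)
n = 100
beta_0, beta_1 = 2.0, 0.5        # non-random but (in practice) unobservable
x = np.linspace(0.0, 10.0, n)    # non-random and observable design points

# Errors with mean zero and a common variance sigma^2 (homoscedasticity).
# Independent draws are used only for convenience; normality is not
# required by the theorem at all.
sigma = 1.0
eps = rng.normal(0.0, sigma, size=n)

Y = beta_0 + beta_1 * x + eps    # the observable random responses Y_i
```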
A linear unbiased estimator of $\beta_1$ is a linear combination

$$c_1 Y_1 + \cdots + c_n Y_n$$

in which the coefficients $c_i$ are not allowed to depend on the coefficients $\beta_i$, since those are not observable, but are allowed to depend on the $x_i$, since those are observable, and whose expected value remains $\beta_1$ even if the values of the $\beta_i$ change. (The dependence of the coefficients on the $x_i$ is typically nonlinear; the estimator is linear in that which is random, namely the $Y_i$; that is why this is "linear" regression.)
The mean squared error of such an estimator is

$$\operatorname{E}\!\left( (c_1 Y_1 + \cdots + c_n Y_n - \beta_1)^2 \right),$$

i.e., it is the expectation of the square of the difference between the estimator and the parameter to be estimated. (The mean squared error of an estimator coincides with the estimator's variance if the estimator is unbiased; for biased estimators the mean squared error is the sum of the variance and the square of the bias.)
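The variance-plus-squared-bias decomposition can be checked numerically. The following sketch (same hypothetical setup, replicated over many simulated samples) compares an unbiased estimator of $\beta_1$ with a deliberately shrunken, biased one; for the sample moments the decomposition holds as an algebraic identity.

```python
import numpy as np

# Same hypothetical setup, replicated over many simulated samples so that
# expectations can be approximated by Monte Carlo averages.
rng = np.random.default_rng(2)
n, beta_0, beta_1, sigma = 100, 2.0, 0.5, 1.0
x = np.linspace(0.0, 10.0, n)
reps = 20_000
Y = beta_0 + beta_1 * x + rng.normal(0.0, sigma, size=(reps, n))

xbar = x.mean()
c = (x - xbar) / np.sum((x - xbar) ** 2)
unbiased = Y @ c           # one estimate of beta_1 per simulated sample
biased = 0.8 * unbiased    # a deliberately shrunken, biased estimator

# Sample version of the identity  MSE = variance + bias^2.
for est in (unbiased, biased):
    mse = np.mean((est - beta_1) ** 2)
    bias = est.mean() - beta_1
    print(np.isclose(mse, est.var() + bias ** 2))  # True for both
```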
The best linear unbiased estimator is the one with the smallest mean squared error. The "least-squares estimators" of $\beta_0$ and $\beta_1$ are the functions $\widehat{\beta}_0$ and $\widehat{\beta}_1$ of the $Y$s and the $x$s that make the sum of squares of residuals

$$\sum_{i=1}^{n} \left( Y_i - \widehat{\beta}_0 - \widehat{\beta}_1 x_i \right)^2$$

as small as possible.
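A minimal sketch of the least-squares computation, under the same hypothetical setup: the closed-form estimates come from setting the partial derivatives of the sum of squared residuals to zero, and a spot check confirms that perturbing either estimate increases that sum.

```python
import numpy as np

# Same hypothetical setup as the earlier sketches.
rng = np.random.default_rng(3)
n, beta_0, beta_1, sigma = 100, 2.0, 0.5, 1.0
x = np.linspace(0.0, 10.0, n)
Y = beta_0 + beta_1 * x + rng.normal(0.0, sigma, size=n)

# Closed-form least-squares estimates, from setting the partial derivatives
# of the sum of squared residuals with respect to each coefficient to zero.
xbar, Ybar = x.mean(), Y.mean()
b1_hat = np.sum((x - xbar) * (Y - Ybar)) / np.sum((x - xbar) ** 2)
b0_hat = Ybar - b1_hat * xbar

def ssr(b0, b1):
    """Sum of squares of residuals for a candidate line."""
    return np.sum((Y - b0 - b1 * x) ** 2)

# Perturbing either estimate can only increase the sum of squared residuals.
assert ssr(b0_hat, b1_hat) <= ssr(b0_hat + 0.1, b1_hat)
assert ssr(b0_hat, b1_hat) <= ssr(b0_hat, b1_hat + 0.1)

# Cross-check against NumPy's own least-squares polynomial fit.
assert np.allclose([b1_hat, b0_hat], np.polyfit(x, Y, 1))
```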
The main idea of the proof is that the least-squares estimators are uncorrelated with every linear unbiased estimator of zero, i.e., with every linear combination

$$a_1 Y_1 + \cdots + a_n Y_n$$

whose coefficients do not depend upon the unobservable $\beta_i$ but whose expected value remains zero regardless of how the values of $\beta_0$ and $\beta_1$ change. Since every linear unbiased estimator of $\beta_1$ differs from the least-squares estimator by such a linear unbiased estimator of zero, and uncorrelated terms have additive variances, no competitor can have a smaller variance (and hence, being unbiased, a smaller mean squared error) than the least-squares estimator.
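The following sketch illustrates both the proof idea and the theorem's conclusion under the same hypothetical setup. A competing linear unbiased estimator (a hypothetical "endpoints" slope, which uses only the first and last observations) differs from the least-squares slope by a linear unbiased estimator of zero; the simulation shows that this difference is uncorrelated with the least-squares estimator and that the competitor has the larger variance.

```python
import numpy as np

# Same hypothetical setup, replicated so that variances and covariances of
# the estimators can be approximated by Monte Carlo.
rng = np.random.default_rng(4)
n, beta_0, beta_1, sigma = 100, 2.0, 0.5, 1.0
x = np.linspace(0.0, 10.0, n)
reps = 50_000
Y = beta_0 + beta_1 * x + rng.normal(0.0, sigma, size=(reps, n))

# Least-squares slope in every replication.
xbar = x.mean()
c = (x - xbar) / np.sum((x - xbar) ** 2)
ols = Y @ c

# A competing linear unbiased estimator: the slope of the line through the
# first and last observations. Its weights a_i also satisfy sum(a_i) = 0
# and sum(a_i * x_i) = 1, so it is unbiased, but it ignores interior data.
endpoints = (Y[:, -1] - Y[:, 0]) / (x[-1] - x[0])

# Their difference is a linear unbiased estimator of zero, and (up to
# simulation error) it is uncorrelated with the least-squares slope.
zero_est = endpoints - ols
print(np.cov(ols, zero_est)[0, 1])  # approximately 0
print(ols.var() < endpoints.var())  # True: least squares wins on variance
```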
External links
For a brief history of the theorem and an explanation of its name, see the entry on the Gauss–Markov theorem in