## Degrees of freedom (statistics) - Wikipedia, the free encyclopedia

In statistics, the number of *degrees of freedom* is the number of values
in the final calculation of a statistic that are free to vary.^[1]

The number of independent ways in which a dynamic system can move without
violating any constraint imposed on it is called its number of degrees of
freedom. In other words, the degrees of freedom can be defined as the
minimum number of independent coordinates that specify the position of the
system completely.

Estimates of statistical parameters can be based upon different amounts of
information or data. The number of independent pieces of information that
go into the estimate of a parameter is called the degrees of freedom. In
general, the degrees of freedom of an estimate of a parameter are equal to
the number of independent scores that go into the estimate minus the number
of parameters used as intermediate steps in the estimation of the parameter
itself (which, for the sample variance, is one, since the sample mean is
the only intermediate step).^[2]
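A minimal sketch of this counting, using plain Python (the data values are illustrative): estimating the sample mean first imposes one constraint on the deviations, leaving n − 1 of them free, which is why the sample variance divides by n − 1.

```python
# Sketch: the sample variance has n - 1 degrees of freedom because the
# sample mean is estimated first (one intermediate parameter).
data = [4.0, 7.0, 13.0, 16.0]   # illustrative values
n = len(data)
mean = sum(data) / n

# The n deviations from the mean are constrained to sum to zero,
# so only n - 1 of them can vary freely.
deviations = [x - mean for x in data]
assert abs(sum(deviations)) < 1e-9

# Dividing the sum of squared deviations by n - 1 (not n) accounts
# for the one degree of freedom used up by estimating the mean.
sample_variance = sum(d * d for d in deviations) / (n - 1)
print(sample_variance)  # 30.0
```

The same unbiased (n − 1) estimator is what `statistics.variance` in the Python standard library computes.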

Mathematically, degrees of freedom is the number of dimensions of the
domain of a random vector, or essentially the number of 'free' components:
how many components need to be known before the vector is fully determined.

The term is most often used in the context of linear models (linear
regression, analysis of variance), where certain random vectors are
constrained to lie in linear subspaces, and the number of degrees of
freedom is the dimension of the subspace. The degrees of freedom are also
commonly associated with the squared lengths (or "sum of squares" of the