Originally, the term Zipf's law referred to the observation of the Harvard linguist George Kingsley Zipf that the frequency of use of the nth-most-frequently-used word in any natural language is inversely proportional to n.

Mathematically, that is impossible if there are infinitely many words in a language, since (letting c > 0 denote the constant of proportionality that would make the sum of all relative frequencies equal to 1) we have

    c/1 + c/2 + c/3 + ... = c (1 + 1/2 + 1/3 + ...) = ∞,

because the harmonic series diverges; no choice of c can then make the relative frequencies sum to 1.

Empirical studies have found that in English, the frequencies of the approximately 1000 most-frequently-used words are approximately proportional to 1/n^(1+ε), where ε is just slightly more than zero. After about the 1000th word, the frequencies drop off faster.
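
A rank-frequency table of this kind can be produced with a few lines of code. The sketch below is only illustrative: the file name corpus.txt and the crude tokenization are assumptions, not details of any of the studies mentioned above.

    # Minimal sketch: build a rank-frequency table for the words of a text file.
    # The file name "corpus.txt" and the regex tokenization are illustrative assumptions.
    import re
    from collections import Counter

    with open("corpus.txt", encoding="utf-8") as f:
        words = re.findall(r"[a-z']+", f.read().lower())

    counts = Counter(words)
    total = sum(counts.values())

    # Under Zipf's law the relative frequency of the nth-ranked word
    # should be roughly proportional to 1/n.
    for n, (word, count) in enumerate(counts.most_common(20), start=1):
        print(f"{n:4d}  {word:15s}  {count / total:.5f}")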

As long as the exponent 1 + ε exceeds 1, it is possible for such a law to hold with infinitely many words, since if s > 1 then

    1/1^s + 1/2^s + 1/3^s + ... < ∞.

The value of this sum is ζ(s), where ζ is Riemann's zeta function.

The term Zipf's law has consequently come to be used to refer to frequency distributions of "rank data" in which the relative frequency of the nth-ranked item is given by the zeta distribution, 1/(n^s ζ(s)), where s > 1 is a parameter indexing this family of probability distributions. Indeed, the term Zipf's law sometimes simply means the zeta distribution, since probability distributions are sometimes called "laws".
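
As a sketch of this normalization, the probabilities 1/(n^s ζ(s)) can be computed and checked numerically; the use of scipy.special.zeta for ζ(s) and the particular value of s are assumptions made for illustration.

    # Sketch: the zeta (Zipf) distribution assigns probability 1/(n^s * zeta(s)) to rank n.
    # scipy.special.zeta(s) evaluates the Riemann zeta function for s > 1.
    from scipy.special import zeta

    s = 2.0                      # illustrative exponent; any s > 1 works
    Z = zeta(s)                  # Riemann zeta function value, pi^2/6 for s = 2

    def zeta_pmf(n):
        """Probability of rank n under the zeta distribution with parameter s."""
        return 1.0 / (n ** s * Z)

    # The probabilities over all ranks sum to 1; a long truncated sum comes close.
    partial = sum(zeta_pmf(n) for n in range(1, 1_000_000))
    print(f"zeta({s}) = {Z:.6f}, truncated sum of probabilities = {partial:.6f}")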

A more general law proposed by Benoit Mandelbrot has frequencies

    f(n) ∝ 1/(n + q)^s,  n = 1, 2, 3, ...,

where s > 1 and q ≥ 0 are parameters.

This is the Zipf-Mandelbrot law. The "constant" in this case is the reciprocal of the Hurwitz zeta function, here ζ(s, q + 1) = 1/(1 + q)^s + 1/(2 + q)^s + 1/(3 + q)^s + ...
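
The same numerical check can be sketched for the Zipf-Mandelbrot case; here scipy.special.zeta with a second argument is used as the Hurwitz zeta function, and the particular values of s and q are illustrative assumptions.

    # Sketch: Zipf-Mandelbrot probabilities proportional to 1/(n + q)^s for ranks n = 1, 2, ...
    # With two arguments, scipy.special.zeta(s, a) is the Hurwitz zeta function
    # (the sum over k >= 0 of 1/(k + a)^s), so the normalizing constant is 1/zeta(s, q + 1).
    from scipy.special import zeta

    s, q = 2.0, 2.7              # illustrative parameters, s > 1 and q >= 0
    C = 1.0 / zeta(s, q + 1)     # reciprocal of the Hurwitz zeta value

    def zipf_mandelbrot_pmf(n):
        """Probability of rank n under the Zipf-Mandelbrot law."""
        return C / (n + q) ** s

    print([round(zipf_mandelbrot_pmf(n), 4) for n in range(1, 6)])
    print(sum(zipf_mandelbrot_pmf(n) for n in range(1, 1_000_000)))  # close to 1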

Zipf's law is an experimental law, not a theoretical one. The causes of Zipfian distributions in real life are a matter of some controversy. However, Zipfian distributions are commonly observed in many kinds of phenomena.

Zipf's law is often demonstrated by scatterplotting the data, with the axes being log(rank order) and log(frequency). If the points lie close to a single straight line, the distribution approximately follows Zipf's law, and the magnitude of the line's slope estimates the exponent (called b in the examples below).
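
A minimal version of that check fits a straight line to the log-log points with numpy; the counts used here are synthetic stand-ins for real observations, and the magnitude of the fitted slope estimates the exponent.

    # Sketch: test for Zipf-like behaviour by fitting log(frequency) against log(rank).
    # The counts below are synthetic stand-ins for observed data.
    import numpy as np

    counts = np.array([1200, 640, 410, 300, 245, 200, 170, 150, 133, 120], dtype=float)
    counts = np.sort(counts)[::-1]           # frequencies in decreasing order
    ranks = np.arange(1, len(counts) + 1)    # rank order 1, 2, 3, ...

    # Under Zipf's law the points (log n, log f(n)) are nearly collinear
    # and the slope is close to -1.
    slope, intercept = np.polyfit(np.log(ranks), np.log(counts), 1)
    print(f"estimated exponent: {-slope:.2f}")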

Examples of collections approximately obeying Zipf's law:

  • frequency of accesses to web pages
    • in particular the access counts recorded on Wikipedia's list of most popular pages, with b approximately equal to 0.3
    • page access counts on the Polish Wikipedia (data for late July 2003) approximately obey Zipf's law with b about 0.5
  • words in the English language
  • sizes of settlements
  • income distribution amongst individuals
  • size of earthquakes

It has been pointed out that Zipfian distributions can also be regarded as Pareto distributions with an exchange of variables.
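
A brief sketch of that exchange of variables, writing the Pareto tail exponent as a (a label chosen here for illustration): if, among N items, the number whose size exceeds x falls off like a Pareto tail, then the size of the nth-largest item is a power of its rank.

    % Sketch: from a Pareto tail to a Zipf-type rank-size law.
    \[
      \#\{\text{items of size} > x\} \approx N\left(\frac{x_m}{x}\right)^{a}
      \quad\Longrightarrow\quad
      n \approx N\left(\frac{x_m}{x_n}\right)^{a}
      \quad\Longrightarrow\quad
      x_n \approx x_m\left(\frac{N}{n}\right)^{1/a} \propto n^{-1/a}.
    \]
    % Reading size as a function of rank, rather than rank as a function of size,
    % turns the Pareto tail into a Zipf-type law with exponent 1/a.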

