Astronomers describe the distribution of galaxies in the universe by means of a correlation function. Unless otherwise specified, "correlation function" refers to the two-point autocorrelation function: a function of one variable (distance) that describes the excess probability, relative to a uniform random distribution, of finding two galaxies separated by that distance. It can be thought of as a lumpiness factor: the higher its value at some distance scale, the lumpier the universe is at that scale.
The following definition (from Peebles 1980) is often cited:
- Given a randomly chosen galaxy, the correlation function describes the probability that another galaxy will be found within a given distance of it.
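In the standard notation of Peebles (1980), this definition is written in terms of the mean galaxy number density $n$: the probability of finding a galaxy in a small volume $\delta V$ at distance $r$ from a randomly chosen galaxy is

```latex
\delta P = n \, [1 + \xi(r)] \, \delta V
```

where $\xi(r)$ is the two-point correlation function. For an unclustered (Poisson) distribution $\xi(r) = 0$, so $\xi(r)$ measures the excess probability over random.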
The n-point autocorrelation functions for n greater than 2, and the cross-correlation functions between particular object types, are defined analogously to the two-point autocorrelation function.
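In practice the two-point function is estimated from pair counts: the simplest "natural" estimator compares normalised pair counts in the data, DD(r), against those in an unclustered random catalogue covering the same volume, RR(r), giving ξ(r) ≈ DD/RR − 1. The sketch below is illustrative only; the function names and the toy clustered catalogue are invented for this example, and real analyses use optimised estimators and tree-based pair counting.

```python
import math
import random

def count_pairs_in_bin(points, r_lo, r_hi):
    """Count unordered pairs of 3D points whose separation lies in [r_lo, r_hi)."""
    count = 0
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if r_lo <= math.dist(points[i], points[j]) < r_hi:
                count += 1
    return count

def xi_natural(data, randoms, r_lo, r_hi):
    """Natural estimator xi = DD/RR - 1, with each pair count
    normalised by the total number of pairs in its catalogue."""
    dd = count_pairs_in_bin(data, r_lo, r_hi) / (len(data) * (len(data) - 1) / 2)
    rr = count_pairs_in_bin(randoms, r_lo, r_hi) / (len(randoms) * (len(randoms) - 1) / 2)
    return dd / rr - 1

random.seed(42)
# Toy "data": 20 tight clumps of 5 galaxies each, inside a unit box.
centres = [(random.random(), random.random(), random.random()) for _ in range(20)]
data = [(x + random.gauss(0, 0.01), y + random.gauss(0, 0.01), z + random.gauss(0, 0.01))
        for (x, y, z) in centres for _ in range(5)]
# Unclustered comparison catalogue: uniform points in the same box.
randoms = [(random.random(), random.random(), random.random()) for _ in range(500)]

# Strongly clustered data gives xi >> 0 at small separations.
xi_small = xi_natural(data, randoms, 0.0, 0.05)
```

For the clustered toy catalogue, `xi_small` comes out strongly positive, reflecting the excess of close pairs over random; for an unclustered catalogue it would scatter around zero.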
The correlation function is important for theoretical models of cosmology because it provides a means of testing models that make different assumptions about the contents of the universe. Computer simulations of galaxy formation reproduce the observed clustering most closely when they assume cold dark matter, which is therefore the favored model.