In the fields of algorithm analysis and computational complexity theory, the running time or space requirements of an algorithm are expressed as a function of the problem size: a measure, in some suitable sense, of the size of the input to the algorithm. The problem size must be clearly defined before an analysis of the algorithm can be attempted.

For many problems, the problem size is taken to be the number of bits required to encode the input. For instance, if the problem is to square a given integer n, the input size is typically measured as the logarithm of n, since that is (up to rounding) the number of bits needed to write n in binary notation. However, the encoding of the input is often not canonical: for a problem in graph theory, for example, several different problem sizes can be defined, since a graph can be encoded either as a list of edges or as an adjacency matrix, and the two encodings can differ greatly in length.
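The contrast between these size measures can be sketched in a few lines of Python. This is only an illustration, not part of any standard library: the helper names below (`bits_to_encode`, `edge_list_size`, `adjacency_matrix_size`) are invented for this example, and the "sizes" are counted in bits and list/matrix entries respectively rather than in any particular serialized format.

```python
def bits_to_encode(n: int) -> int:
    # Number of bits needed to write n in binary notation --
    # the usual problem size for numeric inputs such as "square n".
    return max(1, n.bit_length())

def edge_list_size(n_vertices: int, edges) -> int:
    # Size when a graph is encoded as a list of edges:
    # proportional to the number of edges.
    return len(edges)

def adjacency_matrix_size(n_vertices: int, edges) -> int:
    # Size when the same graph is encoded as an n-by-n
    # adjacency matrix: proportional to n^2, independent of
    # how many edges the graph actually has.
    return n_vertices * n_vertices

# The integer 12345 needs 14 bits, not 12345 "units" of input.
print(bits_to_encode(12345))               # 14

# A sparse graph: 1000 vertices but only 5 edges.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]
print(edge_list_size(1000, edges))         # 5
print(adjacency_matrix_size(1000, edges))  # 1000000
```

For this sparse graph the two encodings differ by a factor of 200,000, so an algorithm that is "linear in the problem size" means something very different under each encoding.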