In computer science, an in-place algorithm is one that transforms a data structure using only a minimal, constant amount of extra memory (or disk) space; the input is overwritten by the output as the algorithm runs.
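A minimal sketch of the idea, using Python as an illustrative language (the article names none): reversing an array in place overwrites the input with the output while holding only two index variables of extra state.

```python
def reverse_in_place(a):
    """Reverse the list a using O(1) extra space by swapping ends inward."""
    i, j = 0, len(a) - 1
    while i < j:
        a[i], a[j] = a[j], a[i]  # constant-space swap; no auxiliary list
        i += 1
        j -= 1
    return a
```

After the call, the original list itself holds the reversed order; no second array is ever allocated.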

For example, sorting algorithms that can rearrange arrays into a desired order in place include bubble sort, selection sort, insertion sort, and heapsort.
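As a concrete instance, here is a sketch of selection sort, one of the in-place sorts listed above, in Python: it permutes the array using only a few index variables.

```python
def selection_sort(a):
    """Sort a in place by repeatedly swapping the minimum of the
    unsorted suffix into position; extra space is O(1)."""
    n = len(a)
    for i in range(n - 1):
        m = i
        for j in range(i + 1, n):
            if a[j] < a[m]:
                m = j  # track index of smallest remaining element
        a[i], a[m] = a[m], a[i]
    return a
```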

Quicksort is commonly described as an in-place algorithm, but strictly speaking it is not one: most implementations require O(log n) extra space to hold the stack frames of its divide-and-conquer recursion.
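The distinction can be seen in a sketch of quicksort with Lomuto partitioning (one common variant, assumed here for illustration): the partition step itself needs only O(1) extra space, but each recursive call adds a stack frame, giving O(log n) space on average.

```python
def quicksort(a, lo=0, hi=None):
    """Quicksort with in-place Lomuto partitioning. Partitioning uses O(1)
    extra space, but the recursion consumes stack frames (O(log n) on
    average), which is why quicksort is not strictly in-place."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return a
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]  # move small element into the low side
            i += 1
    a[i], a[hi] = a[hi], a[i]  # place pivot in its final position
    quicksort(a, lo, i - 1)    # each call here costs one stack frame
    quicksort(a, i + 1, hi)
    return a
```

Recursing into the smaller partition first (and looping on the larger) is a standard refinement that bounds the stack to O(log n) even in the worst case; the plain version above keeps the sketch short.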

In computational complexity theory, strictly in-place algorithms have O(1) space complexity. This strict class is very restrictive, however: even storing a single index into an n-element input takes O(log n) bits, so algorithms that use O(log n) extra space are usually still regarded as in-place.

Functional programming languages often discourage or do not support in-place algorithms, which overwrite data rather than merely constructing new data; such overwriting is a kind of side effect. It is possible in principle to construct in-place algorithms carefully so that they modify only data that is no longer being used, but this is rarely done in practice. See purely functional data structures.
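The side-effect distinction can be illustrated even in an imperative language; this Python sketch contrasts a function that builds new data with a method that overwrites its input in place.

```python
data = [3, 1, 2]

# Functional style: sorted() constructs a new list and leaves `data` intact.
fresh = sorted(data)

# In-place style: list.sort() overwrites `data` -- a side effect that a
# purely functional language would disallow or discourage.
data.sort()
```

After both calls, `fresh` and `data` hold the same sorted values, but only the second call destroyed the original ordering.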