In mathematics, a smooth function is one that is infinitely differentiable, i.e., has derivatives of all finite orders.
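In symbols (a standard formulation of this definition, not spelled out in the original):

```latex
f \in C^{\infty}(\mathbb{R})
\quad\Longleftrightarrow\quad
f^{(n)}(x) \ \text{exists for all } n \in \mathbb{N} \text{ and all } x \in \mathbb{R}.
```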

For example, the exponential function is smooth: its derivative is the exponential function itself, so derivatives of every order exist.
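As a quick symbolic check (a sketch using sympy; not part of the original article):

```python
import sympy as sp

x = sp.symbols('x')

# d^n/dx^n exp(x) = exp(x) for every n, so derivatives of all orders exist.
assert all(sp.diff(sp.exp(x), x, n) == sp.exp(x) for n in range(10))
```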

It is often useful to construct smooth functions that are zero outside a given interval, but not inside it. This is possible; on the other hand, no power series can have that property, since an analytic function that vanishes on an interval must vanish identically. This shows that there is a large gap between smooth functions and analytic functions, and that Taylor's theorem cannot in general be applied to expand smooth functions.
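Why a power series cannot do this (a standard one-line argument, sketched here for the simple case where the series vanishes on an interval around its centre c):

```latex
f(x) = \sum_{n=0}^{\infty} a_n (x - c)^n, \qquad
f \equiv 0 \text{ near } c
\;\Longrightarrow\;
a_n = \frac{f^{(n)}(c)}{n!} = 0 \text{ for all } n
\;\Longrightarrow\;
f \equiv 0 \text{ on its interval of convergence.}
```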

To give an explicit construction of such functions, we can start with a function such as f(x) = exp(-1/x), defined initially for x > 0. Not only do we have f(x) -> 0 as x -> 0 from above, we have f(x)/x^n -> 0 for every n (equivalently, P(1/x)f(x) -> 0 for any polynomial P), because exponential decay dominates any polynomial growth in 1/x. That means that setting f(x) = 0 for x ≤ 0 gives a smooth function: every derivative of f on x > 0 has the form (polynomial in 1/x)·exp(-1/x), so all one-sided derivatives at 0 exist and equal 0. Products such as f(x)f(1-x) can then be made with any required interval as support; in this case the interval [0,1]. Such functions have an extremely slow 'lift-off' from 0.
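A minimal numerical sketch of this construction (in Python; the function names are my own, not from the article):

```python
import math

def f(x):
    """exp(-1/x) for x > 0, extended by 0 for x <= 0; smooth on all of R."""
    return math.exp(-1.0 / x) if x > 0 else 0.0

def bump(x):
    """Smooth function f(x) * f(1 - x), supported exactly on [0, 1]."""
    return f(x) * f(1.0 - x)

# The 'lift-off' from 0 is extremely slow: near the endpoints the values
# are far smaller than any power of the distance to the endpoint.
for x in (-0.5, 0.0, 0.05, 0.5, 0.95, 1.0, 1.5):
    print(f"bump({x:5.2f}) = {bump(x):.3e}")
```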

Thinking in terms of complex analysis, a function like g(z) = exp(-1/z^2) (with g(0) = 0) is smooth for z taking real values, but has an essential singularity at z = 0. That is, the behaviour near z = 0 is bad; it just happens that one cannot detect this by looking at real arguments alone.
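The essential singularity can be seen numerically by approaching 0 along the real axis versus the imaginary axis (a sketch, with g as defined above):

```python
import cmath

def g(z):
    """exp(-1/z^2): smooth on the real line (taking g(0) = 0), but with an
    essential singularity at z = 0 in the complex plane."""
    return cmath.exp(-1.0 / z**2)

# Along the real axis |g| tends to 0, while along the imaginary axis
# -1/z^2 = +1/t^2, so |g| blows up -- the singularity is invisible
# from real arguments alone.
for t in (0.5, 0.2, 0.1):
    print(f"|g({t})| = {abs(g(t)):.3e}   |g({t}j)| = {abs(g(t * 1j)):.3e}")
```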
