A dynamical system is a deterministic process in which a variable's value changes over time according to a well-defined rule that involves only the variable's current value.

Table of contents
1 Dynamical systems and chaos theory
2 Types of dynamical systems
3 Examples of dynamical systems

Dynamical systems and chaos theory

This branch of mathematics deals with the long-term qualitative behavior of dynamical systems. Here, the focus is not on finding precise solutions to the equations defining the dynamical system (which is often hopeless), but rather on answering questions like "Will the system settle down to a steady state in the long term, and if so, what are the possible steady states?" or "Does the long-term behavior of the system depend on its initial condition?"

An important goal is to describe the fixed points, or steady states, of a given dynamical system; these are values of the variable that will not change over time. Some of these fixed points are attractive, meaning that if the system starts out in a nearby state, it will converge towards the fixed point.
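
As an illustration, here is a minimal Python sketch of an attractive fixed point; the map f(x) = cos x is an assumed example, not taken from the text above. Starting from a nearby state, repeated application of the rule converges to the state that the rule leaves unchanged.

    import math

    def f(x):
        # Example map with an attractive fixed point: the unique solution
        # of cos(x) = x (approximately 0.7390851).
        return math.cos(x)

    x = 1.0                 # start in a nearby state
    for _ in range(50):
        x = f(x)

    print(x)                # converges to approximately 0.7390851
    print(abs(f(x) - x))    # nearly 0, so x is (numerically) a fixed point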

Similarly, one is interested in periodic points, states of the system which repeat themselves after several timesteps. Periodic points can also be attractive. Sarkovskii's theorem is a remarkable statement about the possible periods of periodic points of a one-dimensional discrete dynamical system.
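
A short Python sketch can make this concrete; the logistic map with parameter r = 3.2 is an assumed example, chosen because at that parameter the map has an attractive periodic point of period 2.

    def f(x, r=3.2):
        # Logistic map; at r = 3.2 it has an attractive cycle of period 2.
        return r * x * (1 - x)

    x = 0.5
    for _ in range(1000):   # let transient behavior die out
        x = f(x)

    # The state now alternates between two values; applying f twice returns
    # (approximately) to the starting value, i.e. x is a periodic point.
    print(x, f(x), f(f(x)))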

Even simple nonlinear dynamical systems often exhibit seemingly random, effectively unpredictable behavior that has been called chaos. The branch of dynamical systems that deals with the precise definition and investigation of chaos is called chaos theory.
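
The following Python sketch, an illustrative setup using the logistic map at parameter 4 (a standard example of a chaotic map), demonstrates the hallmark of chaos: sensitive dependence on initial conditions.

    def f(x):
        # Logistic map at parameter 4, a standard example of a chaotic map.
        return 4 * x * (1 - x)

    a = 0.2
    b = 0.2 + 1e-10         # an almost identical initial condition
    for _ in range(60):
        a, b = f(a), f(b)

    # The tiny initial difference has grown to order 1: the two trajectories
    # have become completely uncorrelated, so long-term prediction fails.
    print(abs(a - b))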

Types of dynamical systems

A dynamical system is called discrete if time is measured in discrete steps; these are modeled as recurrence relations, as for instance in the logistic map

    x_{n+1} = 4 x_n (1 - x_n),

where n denotes the discrete time steps and x is the variable changing over time. If time is measured continuously, the resulting continuous dynamical systems are expressed as ordinary differential equations, for instance

    dx/dt = x (1 - x),

where x is the variable that changes with time t.
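
Both examples can be simulated in a few lines of Python; the sketch below is illustrative, and the forward Euler scheme with a fixed step dt is merely one simple (assumed) way to approximate the differential equation.

    # Discrete system: the logistic map is iterated directly.
    x = 0.1
    for _ in range(10):
        x = 4 * x * (1 - x)
    print("discrete state after 10 steps:", x)

    # Continuous system dx/dt = x(1 - x): forward Euler approximation.
    x, t, dt = 0.1, 0.0, 0.01
    while t < 10.0:
        x += dt * x * (1 - x)   # x(t + dt) is roughly x(t) + dt * dx/dt
        t += dt
    print("continuous state at t = 10:", x)   # approaches the steady state x = 1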

The changing variable x is often a real number, but it can also be a vector in R^k.

We distinguish between linear dynamical systems and nonlinear dynamical systems. In linear systems, the right-hand side of the equation is an expression that depends linearly on x, as in

    x_{n+1} = 3 x_n.

If two solutions to a linear system are given, then their sum is also a solution ("superposition principle"). In general, the solutions form a vector space, which allows one to use linear algebra and simplifies the analysis significantly. For linear continuous systems, the Laplace transform can also be used to turn the differential equation into an algebraic equation.
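
The following Python sketch checks the superposition principle numerically for a linear discrete system x_{n+1} = A x_n in R^2; the matrix A is an arbitrary assumed example.

    A = [[0.5, 0.2],
         [0.1, 0.8]]        # an arbitrary linear rule x_{n+1} = A x_n

    def step(x):
        # One time step: the matrix-vector product A x.
        return [A[0][0] * x[0] + A[0][1] * x[1],
                A[1][0] * x[0] + A[1][1] * x[1]]

    def run(x, steps=5):
        for _ in range(steps):
            x = step(x)
        return x

    x0 = [1.0, 0.0]                           # one initial state
    y0 = [0.0, 1.0]                           # another initial state
    u = run(x0)
    v = run(y0)
    s = run([x0[0] + y0[0], x0[1] + y0[1]])   # evolve the summed state

    # Superposition: evolving the sum equals the sum of the evolved solutions.
    print(s)
    print([u[0] + v[0], u[1] + v[1]])         # identical up to rounding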

The two examples given earlier are nonlinear systems. These are much harder to analyze and often exhibit the phenomenon known as chaos, which makes their long-term behavior effectively unpredictable; see also nonlinearity.

Examples of dynamical systems

See also: List of dynamical system topics