Dynamical system

The Lorenz attractor arises in the study of the Lorenz oscillator, a dynamical system.

In mathematics, a dynamical system is a set of relationships among two or more measurable quantities, in which a fixed rule describes how the quantities evolve over time in response to their own values. Examples include the mathematical models that describe the swinging of a clock pendulum, the flow of water in a pipe, and the number of fish each springtime in a lake.
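As a rough illustration (not taken from the source), the discrete-time logistic map is a standard toy model for a population measured once each season: a single fixed rule maps this year's normalized population to next year's value. The growth parameter r = 3.2 and the starting value below are arbitrary choices made for this sketch.

```python
def logistic_map(x, r=3.2):
    """Fixed rule: next season's normalized population from this season's."""
    return r * x * (1.0 - x)

# Evolve a normalized population of 0.4 over 20 springs.
x = 0.4
populations = [x]
for _ in range(20):
    x = logistic_map(x)
    populations.append(x)

print(populations[:5])
```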

At any given time a dynamical system has a state given by a set of real numbers (a vector) that can be represented by a point in an appropriate state space (a geometrical manifold). The evolution rule of the dynamical system is a function that describes what future states follow from the current state. Often the function is deterministic; that is, for a given time interval only one future state follows from the current state.<ref>Strogatz, S. H. (2001). Nonlinear Dynamics and Chaos: With Applications to Physics, Biology and Chemistry. Perseus Publishing.</ref><ref>Katok, A., & Hasselblatt, B. (1995). Introduction to the Modern Theory of Dynamical Systems. Cambridge University Press, Cambridge.</ref> However, some systems are stochastic, in that random events also affect the evolution of the state variables.
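A minimal sketch of such a deterministic evolution rule, using the Lorenz system mentioned in the figure caption: the state is a point (x, y, z) in a three-dimensional state space, and the rule advances it by a small time step. The forward Euler integration scheme and the classical parameter values sigma = 10, rho = 28, beta = 8/3 are assumptions of this illustration, not prescribed by the text.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Deterministic evolution rule: map the current state (x, y, z)
    to the state one time step dt later via a forward Euler step."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return np.array([x + dt * dx, y + dt * dy, z + dt * dz])

# Iterating the rule from an initial state traces a trajectory in state space;
# for these parameters the trajectory settles onto the Lorenz attractor.
state = np.array([1.0, 1.0, 1.0])
trajectory = [state]
for _ in range(10000):
    state = lorenz_step(state)
    trajectory.append(state)
```

Because the rule is a function of the current state alone, the same initial point always produces the same trajectory; a stochastic system would instead add a random term at each step.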

