
Chaos theory is a field of study in mathematics, with applications in several disciplines including physics, engineering, economics, biology, and philosophy. It describes the behavior of certain dynamical systems – that is, systems whose state evolves with time – that are highly sensitive to initial conditions, an effect popularly referred to as the butterfly effect. Small differences in initial conditions (such as those due to rounding errors in numerical computation) grow exponentially and yield widely diverging outcomes for such systems, rendering long-term prediction impossible in general and making the behavior of chaotic systems appear random.[1] This happens even though these systems are deterministic, meaning that their future behavior is fully determined by their initial conditions, with no random elements involved.[2] In other words, the deterministic nature of these systems does not make them predictable.[3][4] This behavior is known as deterministic chaos, or simply chaos.
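This sensitivity can be illustrated with a minimal sketch using the logistic map, a standard example of a chaotic system; the parameter value, initial conditions, and iteration count below are illustrative choices, not taken from the text above.

```python
# Sketch: sensitivity to initial conditions in the logistic map
# x_{n+1} = r * x_n * (1 - x_n), which is chaotic at r = 4.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-10)  # perturb the initial condition by 10^-10

# The separation between the two trajectories grows roughly exponentially
# until it saturates at the size of the system's state space [0, 1].
separations = [abs(x - y) for x, y in zip(a, b)]
```

Although the two starting points differ by only one part in ten billion, after a few dozen iterations the trajectories are effectively unrelated, which is exactly why long-term prediction fails.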
Chaotic behavior can be observed in many natural systems, such as climate models and weather.[5][6] Explanation of such behavior may be sought through analysis of a chaotic mathematical model, or through analytical techniques such as recurrence plots and Poincaré maps.
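A recurrence plot, one of the techniques mentioned above, can be sketched in a few lines: it marks the pairs of times (i, j) at which a trajectory returns close to a state it has visited before. The threshold, trajectory, and length here are illustrative assumptions.

```python
def recurrence_matrix(xs, eps):
    """Binary recurrence matrix: R[i][j] = 1 when states at times i and j
    lie within eps of each other."""
    n = len(xs)
    return [[1 if abs(xs[i] - xs[j]) < eps else 0 for j in range(n)]
            for i in range(n)]

# Example trajectory from the chaotic logistic map x_{n+1} = 4 x_n (1 - x_n).
xs = [0.3]
for _ in range(99):
    xs.append(4 * xs[-1] * (1 - xs[-1]))

# Plotting R as a black-and-white image reveals the system's recurrence
# structure; chaotic systems show short, broken diagonal line segments.
R = recurrence_matrix(xs, eps=0.05)
```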
Randomness means different things in various fields. Commonly, it means lack of pattern or predictability in events.
The Oxford English Dictionary defines "random" as "Having no definite aim or purpose; not sent or guided in a particular direction; made, done, occurring, etc., without method or conscious choice; haphazard." This concept of randomness suggests a non-order or non-coherence in a sequence of symbols or steps, such that there is no intelligible pattern or combination.
Applied usage in science, mathematics and statistics recognizes a lack of predictability when referring to randomness, but admits regularities in the occurrences of events whose outcomes are not certain. For example, when throwing two dice and counting the total, we can say that a sum of 7 will randomly occur twice as often as 4, since 7 can be formed in six of the 36 equally likely outcomes while 4 can be formed in only three. This view, where randomness simply refers to situations where the certainty of the outcome is at issue, applies to concepts of chance, probability, and information entropy. In these situations, randomness implies a measure of uncertainty, and notions of haphazardness are irrelevant.
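The dice regularity above can be checked by exhaustively counting the 36 equally likely outcomes, a short sketch rather than anything from the text:

```python
from itertools import product

# Count, for each possible total, how many of the 36 equally likely
# (die1, die2) outcomes produce it.
counts = {}
for a, b in product(range(1, 7), repeat=2):
    counts[a + b] = counts.get(a + b, 0) + 1

# Six outcomes sum to 7 (1+6, 2+5, ..., 6+1) but only three sum to 4
# (1+3, 2+2, 3+1), so a 7 is exactly twice as likely as a 4.
```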
The fields of mathematics, probability, and statistics use formal definitions of randomness. In statistics, a random variable is an assignment of a numerical value to each possible outcome in a sample space. This association facilitates the identification and the calculation of probabilities of the events. A random process is a sequence of random variables describing a process whose outcomes do not follow a deterministic pattern, but follow an evolution described by probability distributions. These and other constructs are extremely useful in probability theory.
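A minimal sketch of a random process is a simple symmetric random walk: each position in the sequence is a random variable, and the sequence evolves by a probability distribution (each step is +1 or -1 with equal probability) rather than by a deterministic rule. The step count and seed below are illustrative assumptions.

```python
import random

random.seed(0)  # fixing the seed makes this particular run reproducible

def random_walk(steps):
    """Return the path (S_0, S_1, ..., S_steps) of a symmetric random walk."""
    position = 0
    path = [position]
    for _ in range(steps):
        position += random.choice([-1, 1])  # each step drawn from a distribution
        path.append(position)
    return path

walk = random_walk(100)
```

Contrast this with the deterministic chaotic systems above: here no amount of knowledge of the current state pins down the next one, whereas a chaotic system's next state is fully determined yet still unpredictable in the long run.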

