From Wikipedia, the free encyclopedia

For use in technical analysis of financial instruments, see Stochastic oscillator.


In probability theory, a purely **stochastic** system is one whose state is non-deterministic (i.e., "random") so that the subsequent state of the system is determined probabilistically. Any system or process that must be analyzed using probability theory is stochastic at least in part.^{[1]}^{[2]} Stochastic systems and processes play a fundamental role in mathematical models of phenomena in many fields of science, engineering, and economics.

*Stochastic* comes from the Greek word στόχος, which means "aim". The word also denotes a target stick; the pattern of arrows around a target stick stuck in a hillside is a representative image of a stochastic process.

The use of the term *stochastic* to mean *based on the theory of probability* goes back to a 1917 publication by Ladislaus Bortkiewicz (1868–1931).^{[3]} Bortkiewicz used it in the sense of *making conjectures* that the Greek term has borne since the days of the ancient philosophers, and after the title of *Ars Conjectandi* that Jakob Bernoulli gave to his work (published in 1713) on probability theory.^{[4]}

In mathematics, specifically in probability theory, the field of stochastic processes has been^{[when?]} a major area of research.^{[citation needed]}

A stochastic matrix is a square matrix of non-negative real entries in which each row sums to one (a right stochastic matrix), each column sums to one (a left stochastic matrix), or both (a doubly stochastic matrix).
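A minimal sketch in Python can make the definition concrete. The matrix below is a hypothetical two-state example chosen for illustration; each row is a probability distribution, and repeatedly multiplying a starting distribution by the matrix drives it toward the chain's stationary distribution:

```python
import numpy as np

# A right (row) stochastic matrix: each row is a probability distribution.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Check the defining properties: non-negative entries, rows summing to one.
assert np.all(P >= 0) and np.allclose(P.sum(axis=1), 1.0)

# The distribution after n steps of the Markov chain is pi @ P^n.
pi = np.array([1.0, 0.0])   # start with certainty in state 0
for _ in range(50):
    pi = pi @ P             # one transition step
# pi now approximates the stationary distribution (5/6, 1/6) of this chain.
```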

In artificial intelligence, stochastic programs work by using probabilistic methods to solve problems, as in simulated annealing, stochastic neural networks, stochastic optimization, genetic algorithms, and genetic programming. A problem itself may be stochastic as well, as in planning under uncertainty.
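Simulated annealing, mentioned above, illustrates the probabilistic flavour of these methods: worse candidate solutions are sometimes accepted, with a probability that shrinks as a "temperature" parameter cools, letting the search escape local optima. The sketch below is a minimal, illustrative implementation on a toy one-dimensional function (the function and all parameter values are assumptions for demonstration, not part of any standard):

```python
import math
import random

def f(x):
    # Toy multimodal objective: global minimum near x ≈ -1.3.
    return x * x + 10 * math.sin(x)

def anneal(T0=10.0, cooling=0.99, steps=2000, seed=0):
    rng = random.Random(seed)
    x = rng.uniform(-2, 2)            # arbitrary starting point
    best_x, best_f = x, f(x)
    T = T0
    for _ in range(steps):
        cand = x + rng.gauss(0, 1)    # random neighbouring proposal
        delta = f(cand) - f(x)
        # Always accept improvements; accept uphill moves with
        # probability exp(-delta / T), which falls as T cools.
        if delta < 0 or rng.random() < math.exp(-delta / T):
            x = cand
            if f(x) < best_f:
                best_x, best_f = x, f(x)
        T *= cooling                  # geometric cooling schedule
    return best_x, best_f

best_x, best_f = anneal()
```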

One of the simplest continuous-time stochastic processes is Brownian motion. This was first observed by botanist Robert Brown while looking through a microscope at pollen grains in water.
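A standard discrete approximation of Brownian motion sums independent Gaussian increments: over a time step dt, the displacement is drawn from N(0, dt). A short sketch (step count and seed are arbitrary choices):

```python
import random

def brownian_path(n_steps=1000, seed=42):
    # Approximate standard Brownian motion on [0, 1] by summing
    # independent N(0, dt) increments.
    rng = random.Random(seed)
    dt = 1.0 / n_steps
    path = [0.0]                      # Brownian motion starts at 0
    for _ in range(n_steps):
        path.append(path[-1] + rng.gauss(0, dt ** 0.5))
    return path

path = brownian_path()
```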

The name "Monte Carlo" for the stochastic Monte Carlo method was popularized by physics researchers Stanislaw Ulam, Enrico Fermi, John von Neumann, and Nicholas Metropolis, among others. The name is a reference to the Monte Carlo Casino in Monaco where Ulam's uncle would borrow money to gamble.^{[5]} The use of randomness and the repetitive nature of the process are analogous to the activities conducted at a casino.

Random methods of computation and experimentation (generally considered forms of stochastic simulation) can arguably be traced back to the earliest pioneers of probability theory (see, e.g., Buffon's needle and the work on small samples by William Sealy Gosset), but are more specifically traced to the pre-electronic computing era. What usually distinguishes a Monte Carlo simulation is that it systematically "inverts" the typical mode of simulation, treating deterministic problems by *first* finding a probabilistic analog (see Simulated annealing). Previous methods of simulation and statistical sampling generally did the opposite: using simulation to test a previously understood deterministic problem. Though examples of the "inverted" approach do exist historically, they were not considered a general method until the popularity of the Monte Carlo method spread.
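The classic textbook illustration of this "inversion" is estimating the deterministic constant π by random sampling: a uniform random point in the unit square lands inside the quarter circle with probability π/4, so counting hits recovers π. A minimal sketch:

```python
import random

def estimate_pi(n=1_000_000, seed=1):
    # Recast the deterministic quantity pi as a probability:
    # P(point in quarter circle) = pi / 4 for uniform points in the unit square.
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / n

pi_estimate = estimate_pi()
```

The error of such an estimate shrinks like 1/sqrt(n), which is why Monte Carlo methods consume so many random numbers.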

Perhaps the most famous early use was by Enrico Fermi in 1930, when he used a random method to calculate the properties of the newly discovered neutron. Monte Carlo methods were central to the simulations required for the Manhattan Project, though they were severely limited by the computational tools of the time. Therefore, it was only after electronic computers were first built (from 1945 on) that Monte Carlo methods began to be studied in depth. In the 1950s they were used at Los Alamos for early work relating to the development of the hydrogen bomb, and became popularized in the fields of physics, physical chemistry, and operations research. The RAND Corporation and the U.S. Air Force were two of the major organizations responsible for funding and disseminating information on Monte Carlo methods during this time, and the methods began to find wide application in many different fields.

Uses of Monte Carlo methods require large amounts of random numbers, and it was their use that spurred the development of pseudorandom number generators, which were far quicker to use than the tables of random numbers which had been previously used for statistical sampling.
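One of the earliest pseudorandom generator designs is the linear congruential generator (LCG), which iterates a simple modular recurrence. The sketch below uses the multiplier, increment, and modulus associated with the classic C-library `rand()` family; the constants are an illustrative choice, not a recommendation for serious statistical work:

```python
def lcg(seed, a=1103515245, c=12345, m=2 ** 31):
    # Linear congruential generator: state -> (a * state + c) mod m.
    # Yields values scaled into [0, 1).
    state = seed
    while True:
        state = (a * state + c) % m
        yield state / m

gen = lcg(seed=2024)
sample = [next(gen) for _ in range(5)]
# The sequence is fully determined by the seed: "pseudo" random.
```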

In biological systems, introducing stochastic "noise" has been found^{[by whom?]} to help improve the signal strength of the internal feedback loops for balance and other vestibular communication. It has been found to help diabetic and stroke patients with balance control.^{[6]} Many biochemical events also lend themselves to stochastic analysis. Gene expression, for example, has a stochastic component through the molecular collisions — as during binding and unbinding of RNA polymerase to a gene promoter — via the solution's Brownian motion.
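Stochastic analysis of such biochemical events is commonly done with the Gillespie stochastic simulation algorithm. The sketch below applies it to the simplest gene-expression model, a birth-death process in which mRNA is produced at a constant rate and degraded in proportion to its copy number; the rate constants are hypothetical values chosen only for illustration:

```python
import random

def gillespie_mrna(k=10.0, g=1.0, t_end=50.0, seed=3):
    # Gillespie algorithm for: ∅ -> mRNA at rate k, mRNA -> ∅ at rate g*m.
    rng = random.Random(seed)
    t, m = 0.0, 0
    while t < t_end:
        total = k + g * m               # combined rate of all events
        t += rng.expovariate(total)     # exponential waiting time
        if rng.random() < k / total:
            m += 1                      # transcription event
        else:
            m -= 1                      # degradation event
    return m

count = gillespie_mrna()
# At steady state the copy number fluctuates around k/g (here, about 10).
```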

Stochastic effect, or "chance effect" is one classification of radiation effects that refers to the random, statistical nature of the damage. In contrast to the deterministic effect, severity is independent of dose. Only the *probability* of an effect increases with dose.

Simonton (2003, *Psychological Bulletin*) argues that scientific creativity is constrained stochastic behaviour, such that new theories in all sciences are, at least in part, the product of a stochastic process.

In statistics, the results of a stochastic process can only be known after it has been computed or observed.

**Stochastic ray tracing** is the application of Monte Carlo simulation to the computer graphics ray tracing algorithm. "Distributed ray tracing samples the integrand at many randomly chosen points and averages the results to obtain a better approximation. It is essentially an application of the Monte Carlo method to 3D computer graphics, and for this reason is also called *Stochastic ray tracing*."^{[citation needed]}
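The core idea can be sketched without a full renderer: a pixel's value is estimated by averaging the results of rays cast through randomly chosen sub-pixel positions. In the illustration below, `trace()` is a stand-in for a real ray caster, returning the "radiance" of a hypothetical scene with a sharp diagonal edge across the pixel:

```python
import random

def trace(x, y):
    # Stand-in for a ray caster; the scene is a diagonal edge
    # covering exactly half of the unit pixel.
    return 1.0 if y < x else 0.0

def render_pixel(samples=10_000, seed=7):
    # Monte Carlo estimate of the pixel integral: average the
    # radiance at randomly sampled sub-pixel points.
    rng = random.Random(seed)
    total = sum(trace(rng.random(), rng.random()) for _ in range(samples))
    return total / samples

value = render_pixel()   # converges to the true coverage, 0.5
```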

Although most computers are deterministic machines, their complexity makes deterministic analysis impossible. Consequently, **stochastic forensics** analyzes computer crime by viewing computers as stochastic processes.

In music, mathematical processes based on probability can generate **stochastic** elements.

Stochastic processes may be used in music to compose a fixed piece or may be produced in performance. Stochastic music was pioneered by Iannis Xenakis, who coined the term *stochastic music*. Specific examples of mathematics, statistics, and physics applied to music composition are the use of the statistical mechanics of gases in *Pithoprakta*, statistical distribution of points on a plane in *Diamorphoses*, minimal constraints in *Achorripsis*, the normal distribution in *ST/10* and *Atrées*, Markov chains in *Analogiques*, game theory in *Duel* and *Stratégie*, group theory in *Nomos Alpha* (for Siegfried Palm), set theory in *Herma* and *Eonta*,^{[7]} and Brownian motion in *N'Shima*.^{[citation needed]} Xenakis frequently used computers to produce his scores, such as the *ST* series including *Morsima-Amorsima* and *Atrées*, and founded CEMAMu. Earlier, John Cage and others had composed *aleatoric* or indeterminate music, which is created by chance processes but lacks a strict mathematical basis (Cage's *Music of Changes*, for example, uses a system of charts based on the *I Ching*).

When color reproductions are made, the image is separated into its component colors by taking multiple photographs filtered for each color. One resultant film or plate represents each of the cyan, magenta, yellow, and black data. Color printing is a binary system, where ink is either present or not present, so all color separations to be printed must be translated into dots at some stage of the workflow. Traditional amplitude-modulated line screens were prone to moiré but were used until stochastic screening became available. A stochastic (or frequency-modulated) dot pattern creates a sharper image.
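A toy illustration of the frequency-modulated idea, assuming the simplest possible scheme (comparing each continuous-tone value against an independent random threshold, a primitive form of stochastic screening; production screens use carefully designed noise patterns instead):

```python
import random

def stochastic_screen(image, seed=5):
    # Each value in [0, 1] becomes an ink dot (1) with probability equal
    # to its tone, so dots fall at irregular positions rather than on a
    # fixed amplitude-modulated grid.
    rng = random.Random(seed)
    return [[1 if value > rng.random() else 0 for value in row]
            for row in image]

gray = [[0.25] * 20 for _ in range(20)]   # a flat 25% tint
dots = stochastic_screen(gray)
# On average about 25% of cells carry ink, matching the tint.
```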

Non-deterministic approaches in language studies are largely inspired by the work of Ferdinand de Saussure, for example, in functionalist linguistic theory, which argues that competence is based on performance.^{[8]}^{[9]} This distinction in functional theories of grammar should be carefully distinguished from the *langue* and *parole* distinction. To the extent that linguistic knowledge is constituted by experience with language, grammar is argued to be probabilistic and variable rather than fixed and absolute. This conception of grammar as probabilistic and variable follows from the idea that one's competence changes in accordance with one's experience with language. Though this conception has been contested,^{[10]} it has also provided the foundation for modern statistical natural language processing^{[11]} and for theories of language learning and change.^{[12]}

Stochastic social science theory is similar to systems theory in that events are interactions of systems, although with a marked emphasis on unconscious processes. The event creates its own conditions of possibility, rendering it unpredictable if only because of the number of variables involved. Stochastic social science theory can be seen as an elaboration of a kind of 'third axis' on which to situate human behavior, alongside the traditional 'nature vs. nurture' opposition. See Julia Kristeva on her usage of the 'semiotic', Luce Irigaray on reverse Heideggerian epistemology, and Pierre Bourdieu on polythetic space for examples of stochastic social science theory.^{[citation needed]}

Uncertainty assessment for future performance predictions of wells in oil reservoirs is performed using stochastic methods.^{[13]}

Manufacturing processes are assumed to be stochastic processes. This assumption is largely valid for both continuous and batch manufacturing processes. Testing and monitoring of the process are recorded using a process control chart, which plots a given process control parameter over time. Typically a dozen or more parameters are tracked simultaneously. Statistical models are used to define limit lines, which determine when corrective actions must be taken to bring the process back to its intended operational window.
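A minimal sketch of how such limit lines are typically set, assuming the common Shewhart-style rule of placing them three standard deviations from the mean of an in-control baseline run (all readings below are hypothetical):

```python
import statistics

# Establish control limits from an in-control baseline run.
baseline = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 9.7, 10.0, 10.3, 10.0]
center = statistics.mean(baseline)
sd = statistics.stdev(baseline)
ucl = center + 3 * sd   # upper control limit
lcl = center - 3 * sd   # lower control limit

# Monitor subsequent readings against the limits.
new_readings = [10.0, 10.2, 9.9, 11.2, 10.1]
alarms = [x for x in new_readings if not (lcl <= x <= ucl)]
# Readings outside the limits (here, 11.2) trigger corrective action.
```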

This same approach is used in the service industry where parameters are replaced by processes related to service level agreements.

The financial markets use stochastic models to represent the seemingly random behaviour of assets such as stocks, commodities, relative currency prices (i.e., the price of one currency compared to that of another, such as the price of the US Dollar compared to that of the Euro), and interest rates. These models are then used by quantitative analysts to value options on stock prices, bond prices, and interest rates (see Markov models). Moreover, stochastic modelling is at the heart of the insurance industry.

Marketing, the changing movement of audience tastes and preferences, and the appeal of certain film and television debuts (i.e., their opening weekends, word-of-mouth, top-of-mind awareness among surveyed groups, star name recognition, and other elements of social media outreach and advertising) are determined in part by stochastic modeling. A recent attempt at repeat-business analysis was done by Japanese scholars^{[citation needed]} and is part of the Cinematic Contagion Systems patented by Geneva Media Holdings; such modeling has been used in data collection from the time of the original Nielsen ratings to modern studio and television test audiences.

1. M. Kac & J. Logan, in *Fluctuation Phenomena*, eds. E. W. Montroll & J. L. Lebowitz, North-Holland, Amsterdam, 1976.
2. E. Nelson, *Quantum Fluctuations*, Princeton University Press, Princeton, 1985.
3. *The Oxford English Dictionary* s.v. "stochastic" quotes: "Die an der Wahrscheinlichkeitstheorie orientierte, somit auf 'das Gesetz der Grossen Zahlen' sich gründende Betrachtung empirischer Vielheiten möge als Stochastik ... bezeichnet werden." ("The study of empirical multiplicities oriented toward probability theory, and thus grounded in 'the law of large numbers', may be called Stochastik.") L. von Bortkiewicz, *Die Iterationen*, 1917, 3.
4. Jeff Miller et al., "Earliest Known Uses of Some of the Words of Mathematics (S)". Retrieved 2009-03-10.
5. Douglas Hubbard, *How to Measure Anything: Finding the Value of Intangibles in Business*, John Wiley & Sons, 2007, p. 46.
6. Priplata, A., et al., "Noise-Enhanced Balance Control in Patients with Diabetes and Patients with Stroke", *Annals of Neurology* 2006; 59: 4–12. doi:10.1002/ana.20670. PMID 16287079.
7. Ilias Chrissochoidis, Stavros Houliaras, and Christos Mitsakis, "Set theory in Xenakis' *EONTA*", in *International Symposium Iannis Xenakis*, ed. Anastasia Georgaki and Makis Solomos (Athens: The National and Kapodistrian University, 2005), 241–249.
8. Newmeyer, Frederick. 2001. "The Prague School and North American functionalist approaches to syntax", *Journal of Linguistics* 37, pp. 101–126: "Since most American functionalists adhere to this trend, I will refer to it and its practitioners with the initials 'USF'. Some of the more prominent USFs are Joan Bybee, William Croft, Talmy Givon, John Haiman, Paul Hopper, Marianne Mithun and Sandra Thompson. In its most extreme form (Hopper 1987, 1988), USF rejects the Saussurean dichotomies such as langue vs. parole and synchrony vs. diachrony. All adherents of this tendency feel that the Chomskyan advocacy of a sharp distinction between competence and performance is at best unproductive and obscurantist; at worst theoretically unmotivated."
9. Bybee, Joan. "Usage-based phonology", p. 213 in Darnell, Mike (ed.), *Functionalism and Formalism in Linguistics: General Papers*. John Benjamins Publishing Company, 1999.
10. Chomsky, N. (1959). Review of Skinner's *Verbal Behavior*, *Language* 35: 26–58.
11. Manning, C. & Schütze, H. (1999). *Foundations of Statistical Natural Language Processing*, MIT Press, Cambridge, MA.
12. Bybee, J. (2007). *Frequency of Use and the Organization of Language*. Oxford: Oxford University Press.
13. Gharib Shirangi, M., "History matching production data and uncertainty assessment with an efficient TSVD parameterization algorithm", *Journal of Petroleum Science and Engineering*. http://www.sciencedirect.com/science/article/pii/S0920410513003227

- An 8-foot-tall (2.4 m) Probability Machine comparing stock market returns to the randomness of beans dropping through a quincunx pattern, on YouTube, from Index Funds Advisors (IFA.com).
- *Formalized Music: Thought and Mathematics in Composition* by Iannis Xenakis, ISBN 1-57647-079-2
- *Frequency and the Emergence of Linguistic Structure* by Joan Bybee and Paul Hopper (eds.), ISBN 1-58811-028-1 / ISBN 90-272-2948-1 (Eur.)

- The dictionary definition of stochastic at Wiktionary

- Software

- Intermorphic Noatikl: a stochastic / trans-generative music creativity system for Mac and Windows, with VST and AU plugins; successor to Koan.
- Intermorphic Mixtikl: a 12-track generative music lab with the integrated Noatikl stochastic engine, for iPhone, iPad, iPod touch, Mac, and Windows, usable in web browsers and as VST and AU plugins.