: ''For a matrix whose elements are stochastic, see [[Random matrix]].''


In [[mathematics]], a '''stochastic matrix''' (also termed '''probability matrix''', '''transition matrix''',<ref>{{cite doi|10.1007/0-387-21525-5_1}}</ref> '''substitution matrix''', or '''Markov matrix''') is a [[matrix (mathematics)|matrix]] used to describe the transitions of a [[Markov chain]]. Each of its entries is a [[nonnegative]] [[real number]] representing a [[probability]]. It has found use in [[probability theory]], [[statistics]] and [[linear algebra]], as well as [[computer science]] and [[population genetics]].
There are several different definitions and types of stochastic matrices:


:A '''right stochastic matrix''' is a square matrix of nonnegative real numbers, with each row summing to 1.
 
:A '''left stochastic matrix''' is a square matrix of nonnegative real numbers, with each column summing to 1.
 
:A '''[[doubly stochastic matrix]]''' is a square matrix of nonnegative real numbers with each row and column summing to 1.
 
In the same vein, one may define '''[[Probability vector|stochastic vector]]''' (also called '''probability vector''') as a [[Euclidean vector|vector]] whose elements are nonnegative real numbers which sum to 1. Thus, each row of a right stochastic matrix (or column of a left stochastic matrix) is a stochastic vector.
 
A common convention in English language mathematics literature is to use [[row vector]]s of probabilities and right stochastic matrices rather than [[column vector]]s of probabilities and left stochastic matrices; this article follows that convention.
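
As an illustrative sketch of these definitions (assuming the [[NumPy]] library; the helper function names here are made up for this example and are not part of any standard API), the following Python code checks which of the above properties a given square matrix satisfies:

<syntaxhighlight lang="python">
import numpy as np

def is_right_stochastic(P, tol=1e-12):
    """Square matrix of nonnegative reals whose rows each sum to 1."""
    P = np.asarray(P, dtype=float)
    return (P.ndim == 2 and P.shape[0] == P.shape[1]
            and np.all(P >= -tol)
            and np.allclose(P.sum(axis=1), 1.0, atol=tol))

def is_left_stochastic(P, tol=1e-12):
    """Square matrix of nonnegative reals whose columns each sum to 1."""
    return is_right_stochastic(np.asarray(P, dtype=float).T, tol)

def is_doubly_stochastic(P, tol=1e-12):
    """Both row and column sums equal 1."""
    return is_right_stochastic(P, tol) and is_left_stochastic(P, tol)

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
print(is_right_stochastic(P))   # True
print(is_left_stochastic(P))    # False
print(is_doubly_stochastic(P))  # False
</syntaxhighlight>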
 
==Definition and properties==
A stochastic matrix describes a [[Markov chain]] <math>\boldsymbol{X}_{t}</math> over a [[finite set|finite]] [[Probability space|state space]] ''S''.
 
If the [[probability]] of moving from <math>i</math> to <math>j</math> in one time step is <math>\Pr(j|i)=P_{i,j}</math>, the stochastic matrix ''P'' is given by using <math>P_{i,j}</math> as the element in the <math>i^{th}</math> row and <math>j^{th}</math> column, e.g.,
 
:<math>P=\left(\begin{matrix}p_{1,1}&p_{1,2}&\dots&p_{1,j}&\dots\\
p_{2,1}&p_{2,2}&\dots&p_{2,j}&\dots\\
\vdots&\vdots&\ddots&\vdots&\ddots\\
p_{i,1}&p_{i,2}&\dots&p_{i,j}&\dots\\
\vdots&\vdots&\ddots&\vdots&\ddots
\end{matrix}\right).</math>
 
Since the total probability of transitioning from state <math>i</math> to all possible states must be 1, this matrix is a right stochastic matrix, so that
 
:<math>\sum_j P_{i,j}=1.\,</math>
 
The probability of transitioning from <math>i</math> to <math>j</math> in two steps is then given by the <math>(i,j)^{th}</math> element of the square of <math>P</math>:
 
:<math>\left(P ^{2}\right)_{i,j}.</math>
 
In general, the matrix of probabilities for transitioning from any state to any other state in ''k'' steps of a finite Markov chain with transition matrix <math>P</math> is given by <math>P^k</math>.
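
For instance, the following minimal sketch (assuming NumPy; the 3-state matrix is a hypothetical example, not taken from the text) computes two-step and ''k''-step transition probabilities as matrix powers:

<syntaxhighlight lang="python">
import numpy as np

# A hypothetical 3-state right stochastic matrix (each row sums to 1).
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.4, 0.3],
              [0.0, 0.5, 0.5]])

# Two-step transition probabilities: the (i, j) entry of P squared.
P2 = np.linalg.matrix_power(P, 2)
print(P2[0, 2])            # probability of going from state 0 to state 2 in two steps

# k-step transition probabilities are the entries of P^k.
Pk = np.linalg.matrix_power(P, 10)
print(Pk.sum(axis=1))      # each row of P^k still sums to 1
</syntaxhighlight>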
 
An initial distribution is given as a [[row vector]].
 
A stationary probability vector <math>\boldsymbol{\pi}</math> is defined as a vector that does not change under application of the transition matrix; that is, it is defined as a left [[eigenvector]] of the probability matrix, associated with [[eigenvalue]] 1:
 
:<math>\boldsymbol{\pi}P=\boldsymbol{\pi}.</math>
 
The [[Perron–Frobenius theorem]] ensures that every stochastic matrix has such a vector, and that the largest absolute value of an eigenvalue is always 1. In general, there may be several such vectors.  However, for a matrix with strictly positive entries, this vector is unique and can be computed by observing that for any <math>i</math> we have the following limit,
 
:<math>\lim_{k\rightarrow\infty}\left(P^k \right)_{i,j}=\boldsymbol{\pi}_j,</math>
 
where <math>\boldsymbol{\pi}_{j}</math> is the <math>j^{th}</math> element of the row vector <math>\boldsymbol{\pi}</math>. This implies that the long-term probability of being in state <math>j</math> is independent of the initial state <math>i</math>. That both of these computations give the same stationary vector is a form of an [[ergodic theorem]], which holds in a wide variety of [[dissipative dynamical system]]s: the system evolves, over time, to a [[stationary state]]. Intuitively, a stochastic matrix represents a Markov chain in which no probability mass is lost: applying the stochastic matrix to a probability distribution redistributes the probability mass of the original distribution while preserving its total mass. If this process is applied repeatedly, the distribution converges to a stationary distribution for the Markov chain.
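
The stationary vector can be found numerically in either of the two ways just described, as in the following sketch (assuming NumPy and a hypothetical strictly positive matrix chosen for illustration):

<syntaxhighlight lang="python">
import numpy as np

# A hypothetical strictly positive right stochastic matrix.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.4, 0.5]])

# Approach 1: left eigenvector of P for eigenvalue 1 (an eigenvector of P transposed).
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
pi = pi / pi.sum()                   # normalize so the entries sum to 1

# Approach 2: the rows of P^k converge to pi as k grows.
Pk = np.linalg.matrix_power(P, 50)

print(pi)                            # stationary distribution
print(Pk[0])                         # approximately equal to pi
print(pi @ P)                        # unchanged by P: pi P = pi
</syntaxhighlight>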
 
== Example: the cat and mouse ==
 
Suppose you have a timer and a row of five adjacent boxes, with a cat in the first box and a mouse in the fifth box at time zero. The cat and the mouse both jump to a random adjacent box when the timer advances. For example, if the cat is in the second box and the mouse in the fourth one, the probability is one fourth that the cat will be in the first box and the mouse in the fifth after the timer advances. If the cat is in the first box and the mouse in the fifth one, the probability is one that the cat will be in box two and the mouse will be in box four after the timer advances. The cat eats the mouse if both end up in the same box, at which time the game ends. The [[random variable]] ''K'' gives the number of time steps the mouse stays in the game.
 
The [[Markov chain]] that represents this game contains the following five states specified by the combination of positions (cat,mouse):
 
* State 1: (1,3)
* State 2: (1,5)
* State 3: (2,4)
* State 4: (3,5)
* State 5: game over: (2,2), (3,3) & (4,4).
 
We use a stochastic matrix to represent the [[transition probabilities]] of this system (note that rows and columns in this matrix are indexed by the possible states listed above),
 
:<math> P =
\begin{bmatrix}
    0  & 0    & 1/2  &  0  & 1/2 \\
    0  & 0    & 1    &  0  & 0 \\
  1/4  & 1/4  & 0    & 1/4  & 1/4 \\
    0  & 0    & 1/2  &  0  & 1/2 \\
    0  & 0    & 0    &  0  & 1
\end{bmatrix}.</math>
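
The following sketch (assuming NumPy) reproduces this matrix and checks that it is right stochastic; the state ordering matches the list above:

<syntaxhighlight lang="python">
import numpy as np

# Transition matrix of the cat-and-mouse chain, states ordered as above.
P = np.array([
    [0,    0,    1/2, 0,    1/2],   # state 1: (1,3)
    [0,    0,    1,   0,    0  ],   # state 2: (1,5)
    [1/4,  1/4,  0,   1/4,  1/4],   # state 3: (2,4)
    [0,    0,    1/2, 0,    1/2],   # state 4: (3,5)
    [0,    0,    0,   0,    1  ],   # state 5: game over
])

print(np.allclose(P.sum(axis=1), 1.0))   # True: P is right stochastic

# Distribution after k steps, starting in state 2 (cat in box 1, mouse in box 5).
x0 = np.array([0.0, 1, 0, 0, 0])
for k in (1, 2, 10):
    print(k, x0 @ np.linalg.matrix_power(P, k))
</syntaxhighlight>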
 
===Long-term averages===
 
No matter what the initial state, the cat will eventually catch the mouse, and the stationary state '''&pi;''' = (0,0,0,0,1) is approached as a limit. To compute the long-term average or expected value of a stochastic variable Y, for each state S<sub>j</sub> and time t<sub>k</sub> there is a contribution of Y<sub>j,k</sub>·P(S=S<sub>j</sub>,t=t<sub>k</sub>). Survival can be treated as a binary variable, with Y=1 for a surviving state and Y=0 for the terminated state. The states with Y=0 don't contribute to the long-term average.
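
As a numerical illustration of this bookkeeping (a minimal sketch assuming NumPy and the matrix ''P'' defined above), the survival indicator Y can be summed over time steps; the running total approaches the expected survival time computed in closed form in the next subsection:

<syntaxhighlight lang="python">
import numpy as np

P = np.array([
    [0,    0,    1/2, 0,    1/2],
    [0,    0,    1,   0,    0  ],
    [1/4,  1/4,  0,   1/4,  1/4],
    [0,    0,    1/2, 0,    1/2],
    [0,    0,    0,   0,    1  ],
])

x = np.array([0.0, 1, 0, 0, 0])   # start in state 2: cat in box 1, mouse in box 5
Y = np.array([1, 1, 1, 1, 0])     # survival indicator: 1 for states 1-4, 0 for game over

expected_steps = 0.0
for t in range(200):              # truncate the infinite sum at a large horizon
    expected_steps += x @ Y       # contribution P(mouse still alive after t steps)
    x = x @ P                     # advance the distribution by one time step

print(expected_steps)             # approaches 4.5, the expected survival time
</syntaxhighlight>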
 
===Phase-type representation===
[[Image:Mousesurvival.jpg|thumb|right|350px|The survival function of the mouse. Note the mouse will survive at least the first time step.]]
 
As state 5 is an absorbing state, the distribution of time to absorption is [[discrete phase-type distribution|discrete phase-type distributed]]. Suppose the system starts in state 2, represented by the vector <math>[0,1,0,0,0]</math>. The states where the mouse has perished don't contribute to the survival average, so state 5 can be ignored. The initial state and transition matrix can be reduced to,
 
:<math>\boldsymbol{\tau}=[0,1,0,0]</math>
 
and,
 
:<math>T=\begin{bmatrix}
    0  & 0    & 1/2  &  0\\
    0  & 0    & 1    &  0\\
  1/4  & 1/4  &  0  & 1/4\\
    0  & 0    & 1/2  &  0
\end{bmatrix}\,,</math>  note that <math>(I-T)^{-1}\boldsymbol{1}
=\begin{bmatrix}2.75 \\ 4.5 \\ 3.5 \\ 2.75\end{bmatrix}\,,</math>
where <math>I</math> is the [[identity matrix]], and <math>\mathbf{1}</math> represents a column vector of all ones that acts as a sum over states.
 
Since each state is occupied for one time step, the expected time of the mouse's survival is just the [[Matrix polynomial#Matrix geometrical series|sum]] of the probabilities of occupation over all surviving states and time steps,
 
:<math>E[K]=\boldsymbol{\tau}(I+T+T^2+\cdots)\boldsymbol{1}=\boldsymbol{\tau}(I-T)^{-1}\boldsymbol{1}=4.5.</math>
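
This value can be checked numerically with a short sketch (assuming NumPy), using the reduced vector and matrix above:

<syntaxhighlight lang="python">
import numpy as np

tau = np.array([0.0, 1, 0, 0])        # start in state 2
T = np.array([
    [0,    0,    1/2, 0  ],
    [0,    0,    1,   0  ],
    [1/4,  1/4,  0,   1/4],
    [0,    0,    1/2, 0  ],
])
ones = np.ones(4)

N = np.linalg.inv(np.eye(4) - T)      # (I - T)^{-1}
print(N @ ones)                       # [2.75, 4.5, 3.5, 2.75]
print(tau @ N @ ones)                 # expected survival time E[K] = 4.5
</syntaxhighlight>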
 
Higher order moments are given by
 
:<math>E[K(K-1)\dots(K-n+1)]=n!\boldsymbol{\tau}(I-{T})^{-n}{T}^{n-1}\mathbf{1}\,.</math>
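
For example, the second factorial moment can be combined with the mean to obtain the variance of the survival time; the following sketch (assuming NumPy, and reusing tau and T from above) evaluates the formula:

<syntaxhighlight lang="python">
import numpy as np
from math import factorial

tau = np.array([0.0, 1, 0, 0])
T = np.array([
    [0,    0,    1/2, 0  ],
    [0,    0,    1,   0  ],
    [1/4,  1/4,  0,   1/4],
    [0,    0,    1/2, 0  ],
])
ones = np.ones(4)
I = np.eye(4)

def factorial_moment(n):
    """E[K(K-1)...(K-n+1)] = n! * tau (I-T)^{-n} T^{n-1} 1."""
    U = np.linalg.matrix_power(np.linalg.inv(I - T), n)
    V = np.linalg.matrix_power(T, n - 1)
    return factorial(n) * (tau @ U @ V @ ones)

EK  = factorial_moment(1)             # 4.5, as computed above
EK2 = factorial_moment(2) + EK        # E[K^2] = E[K(K-1)] + E[K]
print(EK, EK2 - EK**2)                # mean and variance of the survival time
</syntaxhighlight>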
 
==See also==
* [[Muirhead's inequality]]
* [[Perron–Frobenius theorem]]
* [[Doubly stochastic matrix]]
* [[Discrete phase-type distribution]]
* [[Probabilistic automaton]]
* [[Models of DNA evolution]]
* [[Markov kernel]], the equivalent of a stochastic matrix over a continuous state space
 
==References==
{{Reflist}}
*  G. Latouche, V. Ramaswami. ''Introduction to Matrix Analytic Methods in Stochastic Modeling'', 1st edition. Chapter 2: PH Distributions; ASA SIAM, 1999.
 
[[Category:Matrices]]
[[Category:Markov models]]
