In [[probability theory]], the '''craps principle''' is a theorem about [[Event (probability theory)|event]] [[probabilities]] under repeated [[Independent and identically-distributed random variables|iid]] trials. Let <math>E_1</math> and <math>E_2</math> denote two [[mutually exclusive]] events which might occur on a given trial. Then for each trial, the [[conditional probability]] that <math>E_1</math> occurs given that <math>E_1</math> or <math>E_2</math> occurs is
:<math>\operatorname{P}\left[E_1\mid E_1\cup E_2\right]=\frac{\operatorname{P}[E_1]}{\operatorname{P}[E_1]+\operatorname{P}[E_2]}</math>

The events <math>E_1</math> and <math>E_2</math> need not be [[collectively exhaustive]].
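The principle can be checked numerically. The following Python sketch (illustrative only; the two event probabilities are arbitrary choices, not from any source) estimates the conditional probability by simulation and compares it with the formula:

<syntaxhighlight lang="python">
import random

# Arbitrary illustrative probabilities for two mutually exclusive events;
# they need not sum to 1 (the events are not collectively exhaustive).
p1, p2 = 0.2, 0.3

random.seed(1)
wins1 = decided = 0
for _ in range(1_000_000):
    u = random.random()      # one iid trial
    if u < p1:               # E1 occurs
        wins1 += 1
        decided += 1
    elif u < p1 + p2:        # E2 occurs (disjoint from E1)
        decided += 1
    # otherwise neither event occurs on this trial

print(wins1 / decided)       # estimate of P[E1 | E1 or E2]
print(p1 / (p1 + p2))        # craps principle: 0.4
</syntaxhighlight>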
==Proof==

Since <math>E_1</math> and <math>E_2</math> are mutually exclusive,

:<math> \operatorname{P}[E_1\cup E_2]=\operatorname{P}[E_1]+\operatorname{P}[E_2]</math>

Also, since <math>E_1</math> is a subset of <math>E_1\cup E_2</math>,

:<math> E_1\cap(E_1\cup E_2)=E_1</math>

By the definition of [[conditional probability]],

:<math> \operatorname{P}[E_1\cap(E_1\cup E_2)]=\operatorname{P}\left[E_1\mid E_1\cup E_2\right]\operatorname{P}\left[E_1\cup E_2\right]</math>

Combining these three equations yields the desired result:

:<math> \operatorname{P}\left[E_1\mid E_1\cup E_2\right]=\frac{\operatorname{P}[E_1\cap(E_1\cup E_2)]}{\operatorname{P}[E_1\cup E_2]}=\frac{\operatorname{P}[E_1]}{\operatorname{P}[E_1]+\operatorname{P}[E_2]}</math>
==Application==

If the trials are repetitions of a game between two players, and the events are

:<math>E_1:\mathrm{player\ 1\ wins}</math>
:<math>E_2:\mathrm{player\ 2\ wins}</math>

then the craps principle gives the respective conditional probabilities of each player winning a certain repetition, given that someone wins (i.e., given that a [[draw (tie)|draw]] does not occur). In fact, the result is affected only by the relative marginal probabilities of winning, <math>\operatorname{P}[E_1]</math> and <math>\operatorname{P}[E_2]</math>; in particular, the probability of a draw is irrelevant.
===Stopping===

If the game is played repeatedly until someone wins, then this conditional probability is the probability that the player eventually wins the game.
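For instance, the following Python sketch (with arbitrary per-round win probabilities, chosen only for illustration) plays each game until a decision and estimates player 1's overall winning probability, which matches the craps-principle ratio:

<syntaxhighlight lang="python">
import random

p1, p2 = 0.2, 0.3   # arbitrary per-round win probabilities; the rest are draws

def play_until_decided(rng):
    """Repeat rounds until player 1 or player 2 wins; return True if player 1 wins."""
    while True:
        u = rng.random()
        if u < p1:
            return True     # player 1 wins this round
        if u < p1 + p2:
            return False    # player 2 wins this round
        # otherwise a draw: play another round

rng = random.Random(2)
games = 200_000
p1_wins = sum(play_until_decided(rng) for _ in range(games))
print(p1_wins / games)      # approx. p1 / (p1 + p2) = 0.4
</syntaxhighlight>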
==Etymology==

If the game being played is [[craps]], then this principle can greatly simplify the computation of the probability of winning in a certain scenario. Specifically, if the first roll is a 4, 5, 6, 8, 9, or 10, then the dice are repeatedly re-rolled until one of two events occurs:

:<math>E_1:\textrm{the\ original\ roll\ (called\ 'the\ point')\ is\ rolled\ (a\ win)}</math>
:<math>E_2:\textrm{a\ 7\ is\ rolled\ (a\ loss)}</math>

Since <math>E_1</math> and <math>E_2</math> are mutually exclusive, the craps principle applies. For example, if the original roll was a 4, then the probability of winning is
:<math>\frac{3/36}{3/36 + 6/36}=\frac{1}{3}</math>

since two dice give three ways to roll a 4 and six ways to roll a 7.
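This can be confirmed by enumerating the 36 equally likely two-dice outcomes; the short Python sketch below (illustrative only) computes the same value exactly:

<syntaxhighlight lang="python">
from itertools import product
from fractions import Fraction

# All 36 equally likely outcomes of rolling two dice.
totals = [a + b for a, b in product(range(1, 7), repeat=2)]

point = 4
p_point = Fraction(totals.count(point), 36)   # 3/36 ways to roll a 4
p_seven = Fraction(totals.count(7), 36)       # 6/36 ways to roll a 7

# Craps principle: probability the point is rolled before a 7.
print(p_point / (p_point + p_seven))          # 1/3
</syntaxhighlight>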
This avoids having to sum the [[infinite series]] corresponding to all the possible outcomes:

:<math>\sum_{i=0}^{\infty}\operatorname{P}[\textrm{first\ }i\textrm{\ rolls\ are\ ties,\ }(i+1)^\textrm{th}\textrm{\ roll\ is\ 'the\ point'}]</math>
Mathematically, the probability of rolling <math>i</math> ties followed by rolling the point is

:<math>\operatorname{P}[\textrm{first\ }i\textrm{\ rolls\ are\ ties,\ }(i+1)^\textrm{th}\textrm{\ roll\ is\ 'the\ point'}] = (1-\operatorname{P}[E_1]-\operatorname{P}[E_2])^i\operatorname{P}[E_1]</math>
The summation becomes an infinite [[geometric series]]:

:<math>\sum_{i=0}^{\infty} (1-\operatorname{P}[E_1]-\operatorname{P}[E_2])^i\operatorname{P}[E_1] = \operatorname{P}[E_1] \sum_{i=0}^{\infty} (1-\operatorname{P}[E_1]-\operatorname{P}[E_2])^i</math>
::<math> = \frac{\operatorname{P}[E_1]}{1-(1-\operatorname{P}[E_1]-\operatorname{P}[E_2])} = \frac{\operatorname{P}[E_1]}{\operatorname{P}[E_1]+\operatorname{P}[E_2]}</math>

which agrees with the earlier result.
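As a numerical check (an illustrative sketch using the point-4 values from the example above), the partial sums of the series converge to the closed form:

<syntaxhighlight lang="python">
p1, p2 = 3/36, 6/36   # point-4 probabilities from the example above

# Partial sum of the geometric series; converges to p1 / (p1 + p2) = 1/3.
partial = sum((1 - p1 - p2) ** i * p1 for i in range(200))
print(partial)          # ~0.33333...
print(p1 / (p1 + p2))   # exactly 1/3
</syntaxhighlight>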
==References==

* {{cite book |author=Pitman, Jim |title=Probability |publisher=Springer-Verlag |location=Berlin |year=1993 |isbn=0-387-97974-3}}

[[Category:Statistical theorems]]
[[Category:Probability theory]]
[[Category:Statistical principles]]