Craps principle
In [[probability theory]], the '''craps principle''' is a theorem about [[Event (probability theory)|event]] [[probabilities]] under repeated [[Independent and identically-distributed random variables|iid]] trials. Let <math>E_1</math> and <math>E_2</math> denote two [[mutually exclusive]] events which might occur on a given trial. Then for each trial, the [[conditional probability]] that <math>E_1</math> occurs given that <math>E_1</math> or <math>E_2</math> occurs is
 
:<math>\operatorname{P}\left[E_1\mid E_1\cup E_2\right]=\frac{\operatorname{P}[E_1]}{\operatorname{P}[E_1]+\operatorname{P}[E_2]}</math>
 
The events <math>E_1</math> and <math>E_2</math> need not be [[collectively exhaustive]].
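The identity can be sanity-checked numerically. The following sketch (with illustrative probabilities not taken from the article) simulates iid trials and compares the conditional frequency of <math>E_1</math> against the formula:

```python
import random

random.seed(0)

# Illustrative per-trial probabilities (assumed, not from the article):
p1, p2 = 0.2, 0.3   # P[E1], P[E2]; E1 and E2 are mutually exclusive

wins_e1 = 0   # trials on which E1 occurred
decided = 0   # trials on which E1 or E2 occurred
for _ in range(200_000):
    u = random.random()
    if u < p1:            # E1 occurs
        wins_e1 += 1
        decided += 1
    elif u < p1 + p2:     # E2 occurs
        decided += 1
    # otherwise neither occurs; such trials are ignored when
    # conditioning on the union of E1 and E2

estimate = wins_e1 / decided
print(estimate, p1 / (p1 + p2))   # simulated vs. exact; both ≈ 0.4
```
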
==Proof==
Since <math>E_1</math> and <math>E_2</math> are mutually exclusive,
 
:<math> \operatorname{P}[E_1\cup E_2]=\operatorname{P}[E_1]+\operatorname{P}[E_2]</math>
 
Also due to mutual exclusion,
 
:<math> E_1\cap(E_1\cup E_2)=E_1</math>
 
By [[conditional probability]],
 
:<math> \operatorname{P}[E_1\cap(E_1\cup E_2)]=\operatorname{P}\left[E_1\mid E_1\cup E_2\right]\operatorname{P}\left[E_1\cup E_2\right]</math>
 
Combining these three gives

:<math>\operatorname{P}\left[E_1\mid E_1\cup E_2\right]=\frac{\operatorname{P}[E_1\cap(E_1\cup E_2)]}{\operatorname{P}[E_1\cup E_2]}=\frac{\operatorname{P}[E_1]}{\operatorname{P}[E_1]+\operatorname{P}[E_2]}</math>

which is the desired result.
 
==Application==
 
If the trials are repetitions of a game between two players, and the events are
 
:<math>E_1:\mathrm{ player\ 1\ wins}</math>
:<math>E_2:\mathrm{ player\ 2\ wins}</math>
 
then the craps principle gives the respective conditional probabilities of each player winning a certain repetition, given that someone wins (i.e., given that a [[draw (tie)|draw]] does not occur). The result depends only on the relative marginal probabilities of winning <math>\operatorname{P}[E_1]</math> and <math>\operatorname{P}[E_2]</math>; in particular, the probability of a draw is irrelevant.
 
===Stopping===
If the game is played repeatedly until someone wins, then this conditional probability is the probability that the player wins the game.
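This can be illustrated with a small simulation (a sketch with assumed probabilities, not from the article): two scenarios share the same ratio <math>\operatorname{P}[E_1]:\operatorname{P}[E_2]</math> but have very different draw probabilities, and both yield the same winning probability for player 1.

```python
import random

def play_until_win(p1, p2, rng):
    """Repeat iid trials until E1 or E2 occurs; True means E1 won.

    Hypothetical helper: p1 and p2 are the per-trial probabilities of
    the mutually exclusive events E1 and E2.
    """
    while True:
        u = rng.random()
        if u < p1:
            return True       # E1: player 1 wins
        if u < p1 + p2:
            return False      # E2: player 2 wins
        # otherwise a draw: play another round

rng = random.Random(1)
results = []
# Same ratio P[E1]:P[E2] = 2:3, very different draw probabilities
for p1, p2 in [(0.2, 0.3), (0.04, 0.06)]:
    wins = sum(play_until_win(p1, p2, rng) for _ in range(100_000))
    results.append(wins / 100_000)

print(results)   # both ≈ 0.4: the draw probability is irrelevant
```
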
 
==Etymology==
If the game being played is [[craps]], then this principle can greatly simplify the computation of the probability of winning in a certain scenario. Specifically, if the first roll is a 4, 5, 6, 8, 9, or 10, then the dice are repeatedly re-rolled until one of two events occurs:
:<math>E_1:\textrm{ the\ original\ roll\ (called\ 'the\ point')\ is\ rolled\ (a\ win) }</math>
:<math>E_2:\textrm{ a\ 7\ is\ rolled\ (a\ loss) }</math>
 
Since <math>E_1</math> and <math>E_2</math> are mutually exclusive, the craps principle applies. For example, if the original roll was a 4, then the probability of winning is
 
:<math>\frac{3/36}{3/36 + 6/36}=\frac{1}{3}</math>
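The same computation extends to every possible point, using the standard counts of ways to roll each total with two fair dice. A short sketch:

```python
from collections import Counter
from fractions import Fraction

# Number of ways to roll each total with two fair dice
ways = Counter(a + b for a in range(1, 7) for b in range(1, 7))

# Craps principle: P[the point is made before a 7]
#   = P[roll point] / (P[roll point] + P[roll 7])
#   = ways[point] / (ways[point] + ways[7])
win = {p: Fraction(ways[p], ways[p] + ways[7]) for p in (4, 5, 6, 8, 9, 10)}

for point, prob in sorted(win.items()):
    print(point, prob)   # e.g. point 4 -> 1/3, point 6 -> 5/11
```
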
 
This avoids having to sum the [[infinite series]] corresponding to all the possible outcomes:
 
:<math>\sum_{i=0}^{\infty}\operatorname{P}[\textrm{first\ }i\textrm{\ rolls\ are\ ties,\ }(i+1)^\textrm{th}\textrm{\ roll\ is\ 'the\ point'}]</math>
 
Mathematically, the probability of rolling <math>i</math> ties followed by rolling the point is
 
:<math>\operatorname{P}[\textrm{first\ }i\textrm{\ rolls\ are\ ties,\ }(i+1)^\textrm{th}\textrm{\ roll\ is\ 'the\ point'}]
= (1-\operatorname{P}[E_1]-\operatorname{P}[E_2])^i\operatorname{P}[E_1]
</math>
 
The summation becomes an infinite [[geometric series]]:
 
:<math>\sum_{i=0}^{\infty} (1-\operatorname{P}[E_1]-\operatorname{P}[E_2])^i\operatorname{P}[E_1]
= \operatorname{P}[E_1] \sum_{i=0}^{\infty} (1-\operatorname{P}[E_1]-\operatorname{P}[E_2])^i
</math>
 
::<math> = \frac{\operatorname{P}[E_1]}{1-(1-\operatorname{P}[E_1]-\operatorname{P}[E_2])}
= \frac{\operatorname{P}[E_1]}{\operatorname{P}[E_1]+\operatorname{P}[E_2]}
</math>
 
which agrees with the earlier result.
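The agreement is easy to verify numerically for the point of 4 (a sketch; the probabilities are those given above):

```python
# Point of 4: P[E1] = 3/36 (roll a 4), P[E2] = 6/36 (roll a 7)
p1, p2 = 3 / 36, 6 / 36
tie = 1 - p1 - p2        # probability a roll settles nothing

# Partial sum of the geometric series; tie**i shrinks fast, so a few
# hundred terms already agree with the closed form to machine precision.
partial = sum(tie**i * p1 for i in range(200))
closed = p1 / (p1 + p2)

print(partial, closed)   # both ≈ 1/3
```
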
 
==References==
{{cite book |author=Pitman, Jim |title=Probability |publisher=Springer-Verlag |location=Berlin |year=1993 |pages= |isbn=0-387-97974-3 |oclc= |doi=}}
 
[[Category:Statistical theorems]]
[[Category:Probability theory]]
[[Category:Statistical principles]]

Revision as of 15:47, 25 February 2014
