{{about|the form of [[Bayes' theorem]]|the decision rule|Bayes estimator|the use of Bayes factor in model selection|Bayes factor}}
{{Bayesian statistics}}

In [[probability theory]] and applications, '''Bayes' rule''' relates the [[odds]] of event <math>A_1</math> to event <math>A_2</math>, before (prior to) and after (posterior to) [[Conditional probability|conditioning]] on another event <math>B</math>. The odds on event <math>A_1</math> to event <math>A_2</math> is simply the ratio of the probabilities of the two events. The prior odds is the ratio of the unconditional or prior probabilities, and the posterior odds is the ratio of the conditional or posterior probabilities given the event <math>B</math>. The relationship is expressed in terms of the '''likelihood ratio''' or '''Bayes factor''', <math>\Lambda</math>. By definition, this is the ratio of the conditional probabilities of the event <math>B</math> given that <math>A_1</math> is the case or that <math>A_2</math> is the case, respectively. The rule simply states: '''posterior odds equals prior odds times Bayes factor''' (Gelman et al., 2003, Chapter 1).

When arbitrarily many events <math>A</math> are of interest, not just two, the rule can be rephrased as '''posterior is proportional to prior times likelihood''', <math>P(A|B)\propto P(A) P(B|A)</math>, where the proportionality symbol means that the left hand side is proportional to (i.e., equals a constant times) the right hand side as <math>A</math> varies, for fixed or given <math>B</math> (Lee, 2012; Bertsch McGrayne, 2012). In this form it goes back to Laplace (1774) and to Cournot (1843); see Fienberg (2006).

Bayes' rule is an equivalent way to formulate [[Bayes' theorem]]. If we know the odds for and against <math>A</math>, we also know the probabilities of <math>A</math>. It may be preferred to Bayes' theorem in practice for a number of reasons.

Bayes' rule is widely used in [[statistics]], [[science]] and [[engineering]], for instance in [[Bayesian model selection|model selection]], probabilistic [[expert systems]] based on [[Bayes networks]], [[statistical proof]] in legal proceedings, email spam filters, and so on (Rosenthal, 2005; Bertsch McGrayne, 2012). As an elementary fact from the calculus of probability, Bayes' rule tells us how unconditional and conditional probabilities are related, whether we work with a [[frequentist interpretation of probability]] or a [[Bayesian probability|Bayesian interpretation of probability]]. Under the Bayesian interpretation it is frequently applied in the situation where <math>A_1</math> and <math>A_2</math> are competing hypotheses, and <math>B</math> is some observed evidence. The rule shows how one's judgement on whether <math>A_1</math> or <math>A_2</math> is true should be updated on observing the evidence <math>B</math> (Gelman et al., 2003).
==The rule==

===Single event===

Given events <math>A_1</math>, <math>A_2</math> and <math>B</math>, Bayes' rule states that the conditional odds of <math>A_1:A_2</math> given <math>B</math> are equal to the marginal odds of <math>A_1:A_2</math> multiplied by the [[Bayes factor]] or [[likelihood ratio]] <math>\Lambda</math>:

:<math>O(A_1:A_2|B) = \Lambda(A_1:A_2|B) \cdot O(A_1:A_2) ,</math>

where

:<math>\Lambda(A_1:A_2|B) = \frac{P(B|A_1)}{P(B|A_2)}.</math>

Here, the odds and conditional odds, also known as prior odds and posterior odds, are defined by

:<math>O(A_1:A_2) = \frac{P(A_1)}{P(A_2)},</math>

:<math>O(A_1:A_2|B) = \frac{P(A_1|B)}{P(A_2|B)}.</math>
In the special case that <math>A_1 = A</math> and <math>A_2 = \neg A</math>, one writes <math>O(A)=O(A:\neg A)</math>, and uses a similar abbreviation for the Bayes factor and for the conditional odds. The odds on <math>A</math> is by definition the odds for and against <math>A</math>. Bayes' rule can then be written in the abbreviated form

:<math>O(A|B) = O(A) \cdot \Lambda(A|B) ,</math>

or in words: the posterior odds on <math>A</math> equals the prior odds on <math>A</math> times the likelihood ratio for <math>A</math> given information <math>B</math>. In short, '''posterior odds equals prior odds times likelihood ratio'''.

The rule is frequently applied when <math>A_1 = A</math> and <math>A_2 = \neg A</math> are two competing hypotheses concerning the cause of some event <math>B</math>. The prior odds on <math>A</math>, in other words the odds between <math>A</math> and <math>\neg A</math>, expresses our initial beliefs concerning whether or not <math>A</math> is true. The event <math>B</math> represents some evidence, information, data, or observations. The likelihood ratio is the ratio of the chances of observing <math>B</math> under the two hypotheses <math>A</math> and <math>\neg A</math>. The rule tells us how our prior beliefs concerning whether or not <math>A</math> is true need to be updated on receiving the information <math>B</math>.
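
A minimal numerical sketch of this update (in Python, with arbitrary illustrative values for <math>P(A)</math>, <math>P(B|A)</math> and <math>P(B|\neg A)</math>, not taken from the references):

<syntaxhighlight lang="python">
# Odds-form update O(A|B) = O(A) * Lambda(A|B), using illustrative numbers only.
def posterior_odds(p_a, p_b_given_a, p_b_given_not_a):
    """Posterior odds on A given B, from the prior probability of A
    and the chances of observing B under A and under not-A."""
    prior_odds = p_a / (1.0 - p_a)                # O(A) = P(A) / P(not A)
    bayes_factor = p_b_given_a / p_b_given_not_a  # Lambda(A|B)
    return prior_odds * bayes_factor              # O(A|B)

odds = posterior_odds(p_a=0.10, p_b_given_a=0.8, p_b_given_not_a=0.2)
print(odds)                 # 4/9, about 0.44 in favour of A
print(odds / (1.0 + odds))  # converted back to a probability: P(A|B) is about 0.31
</syntaxhighlight>

Only the ratio of the two likelihoods enters the calculation, not their absolute values.
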
===Many events===

If we think of <math>A</math> as arbitrary and <math>B</math> as fixed, then we can rewrite Bayes' theorem <math>P(A|B)=P(A)P(B|A)/P(B)</math> in the form <math>P(A|B) \propto P(A)P(B|A)</math>, where the proportionality symbol means that, as <math>A</math> varies but <math>B</math> is kept fixed, the left hand side is equal to a constant times the right hand side.

In words, '''posterior is proportional to prior times likelihood'''. This version of Bayes' theorem was first called "Bayes' rule" by Cournot (1843). Cournot popularized the earlier work of Laplace (1774), who had independently discovered Bayes' rule. The work of Bayes was published posthumously (1763) but remained more or less unknown until Cournot drew attention to it; see Fienberg (2006).

Bayes' rule may be preferred to the usual statement of Bayes' theorem for a number of reasons. One is that it is intuitively simpler to understand. Another is that normalizing probabilities is sometimes unnecessary: one sometimes only needs to know ratios of probabilities. Finally, the normalization is often easier to carry out after simplifying the product of prior and likelihood by deleting any factors which do not depend on <math>A</math>, so that we need not actually compute the denominator <math>P(B)</math> in the usual statement of Bayes' theorem <math>P(A|B)=P(A)P(B|A)/P(B)</math>.
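
A sketch of this unnormalized form, assuming three illustrative hypotheses with made-up prior probabilities and likelihoods; the normalization is deferred to a single final division:

<syntaxhighlight lang="python">
# Posterior proportional to prior times likelihood, for hypotheses A_1, A_2, A_3
# and fixed evidence B; all numbers are illustrative.
priors      = [0.5, 0.3, 0.2]   # P(A_1), P(A_2), P(A_3)
likelihoods = [0.1, 0.4, 0.7]   # P(B|A_1), P(B|A_2), P(B|A_3)

unnormalised = [p * l for p, l in zip(priors, likelihoods)]  # proportional to P(A_i|B)
evidence = sum(unnormalised)                                 # P(B), by total probability
posteriors = [u / evidence for u in unnormalised]

print(posteriors)  # [0.161..., 0.387..., 0.451...]; the ratios were fixed before normalising
</syntaxhighlight>
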
In [[Bayesian statistics]], Bayes' rule is often applied with a so-called [[improper prior]], for instance, a uniform probability distribution over all real numbers. In that case, the prior distribution does not exist as a probability measure within conventional probability theory, and Bayes' theorem itself is not available.
===Series of events===

Bayes' rule may be applied a number of times. Each time we observe a new event, we update the odds between the events of interest, say <math>A_1</math> and <math>A_2</math>, by taking account of the new information. For two events (information, evidence) <math>B</math> and <math>C</math>,

:<math> O(A_1:A_2|B \cap C) = \Lambda(A_1:A_2|B \cap C) \cdot \Lambda(A_1:A_2|B) \cdot O(A_1:A_2) ,</math>

where

:<math>\Lambda(A_1:A_2|B) = \frac{P(B|A_1)}{P(B|A_2)} ,</math>

:<math>\Lambda(A_1:A_2|B \cap C) = \frac{P(C|A_1 \cap B)}{P(C|A_2 \cap B)} .</math>

In the special case of two complementary events <math>A</math> and <math>\neg A</math>, the equivalent notation is

:<math> O(A|B \cap C) = \Lambda(A|B \cap C) \cdot \Lambda(A|B) \cdot O(A).</math>
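
A short sketch of the sequential update, with arbitrary illustrative conditional probabilities for the two pieces of evidence:

<syntaxhighlight lang="python">
# Updating the odds A_1 : A_2 with evidence B and then C (illustrative numbers only).
prior_odds = 2.0           # O(A_1:A_2) = P(A_1) / P(A_2)
bf_b       = 0.50 / 0.25   # Lambda(A_1:A_2|B)       = P(B|A_1) / P(B|A_2)
bf_c       = 0.90 / 0.30   # Lambda(A_1:A_2|B and C) = P(C|A_1 and B) / P(C|A_2 and B)

posterior_odds = prior_odds * bf_b * bf_c
print(posterior_odds)      # 12.0, i.e. 12:1 on A_1 against A_2 after seeing both B and C
</syntaxhighlight>
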
==Derivation==

Consider two instances of [[Bayes' theorem]]:

:<math>P(A_1|B) = \frac{1}{P(B)} \cdot P(B|A_1) \cdot P(A_1),</math>

:<math>P(A_2|B) = \frac{1}{P(B)} \cdot P(B|A_2) \cdot P(A_2).</math>

Dividing the first by the second cancels the factor <math>1/P(B)</math> and gives

:<math>\frac{P(A_1|B)}{P(A_2|B)} = \frac{P(B|A_1)}{P(B|A_2)} \cdot \frac{P(A_1)}{P(A_2)}.</math>

Now defining

:<math>O(A_1:A_2|B) \triangleq \frac{P(A_1|B)}{P(A_2|B)},</math>

:<math>O(A_1:A_2) \triangleq \frac{P(A_1)}{P(A_2)},</math>

:<math>\Lambda(A_1:A_2|B) \triangleq \frac{P(B|A_1)}{P(B|A_2)},</math>

this implies

:<math>O(A_1:A_2|B) = \Lambda(A_1:A_2|B) \cdot O(A_1:A_2).</math>

A similar derivation applies for conditioning on multiple events, using the appropriate [[Bayes' theorem#Further extensions|extension of Bayes' theorem]].
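
The cancellation of <math>P(B)</math> can also be checked numerically; the following sketch uses an arbitrary illustrative model with three mutually exclusive events:

<syntaxhighlight lang="python">
# Numerical check that posterior odds = Bayes factor * prior odds (P(B) cancels).
p_a         = {"A1": 0.2, "A2": 0.5, "A3": 0.3}  # prior probabilities P(A_i)
p_b_given_a = {"A1": 0.6, "A2": 0.3, "A3": 0.5}  # likelihoods P(B|A_i)

p_b = sum(p_a[k] * p_b_given_a[k] for k in p_a)              # P(B), by total probability
posterior = {k: p_b_given_a[k] * p_a[k] / p_b for k in p_a}  # Bayes' theorem

lhs = posterior["A1"] / posterior["A2"]                                  # O(A_1:A_2|B)
rhs = (p_b_given_a["A1"] / p_b_given_a["A2"]) * (p_a["A1"] / p_a["A2"])  # Lambda times prior odds
print(lhs, rhs)  # both 0.8, up to floating-point rounding
</syntaxhighlight>
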
==Examples==

===Frequentist example=== <!-- Currently references Bayes' theorem for convenience - should later be replaced with complete example -->

Consider the [[Bayes' theorem#Drug testing|drug testing example]] in the article on Bayes' theorem.

The same results may be obtained using Bayes' rule. The prior odds on an individual being a drug-user are 199 to 1 against, as <math>\textstyle 0.5\%=\frac{1}{200}</math> and <math>\textstyle 99.5\%=\frac{199}{200}</math>. The [[Bayes factor]] when an individual tests positive is <math>\textstyle \frac{0.99}{0.01} = 99:1</math> in favour of being a drug-user: this is the ratio of the probability of a drug-user testing positive to the probability of a non-drug user testing positive. The posterior odds on being a drug-user are therefore <math>\textstyle 1 \times 99 : 199 \times 1 = 99:199</math>, which is very close to <math>\textstyle 100:200 = 1:2</math>. In round numbers, only one in three of those testing positive is actually a drug-user.
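
The same arithmetic written out as a short sketch, using the numbers quoted above (prevalence 0.5%, probability 0.99 of a drug-user testing positive, probability 0.01 of a non-drug user testing positive):

<syntaxhighlight lang="python">
# Drug-testing example in odds form.
prior_odds   = 0.005 / 0.995   # 1 : 199 on being a drug-user
bayes_factor = 0.99 / 0.01     # 99 : 1 in favour, given a positive test

posterior_odds = prior_odds * bayes_factor              # 99/199, about 0.497
posterior_prob = posterior_odds / (1 + posterior_odds)  # back to a probability
print(posterior_prob)          # about 0.332: roughly one in three positives is a drug-user
</syntaxhighlight>
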
===Model selection===

{{main|Bayesian model selection}}
== External links ==

* Sharon Bertsch McGrayne (2012), "The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant from Two Centuries of Controversy", Yale University Press.
* Andrew Gelman, John B. Carlin, Hal S. Stern, and Donald B. Rubin (2003), "Bayesian Data Analysis", Second Edition, CRC Press.
* Stephen E. Fienberg (2006), "When did Bayesian inference become "Bayesian"?", ''Bayesian Analysis'', vol. 1, nr. 1, pp. 1–40.
* Peter M. Lee (2012), "Bayesian Statistics: An Introduction", Wiley.
* [http://www.inference.phy.cam.ac.uk/mackay/itila/ The on-line textbook: Information Theory, Inference, and Learning Algorithms], by [[David J.C. MacKay]], discusses Bayesian model comparison in Chapters 3 and 28.
* <span class="citation" id=refRosenthal2005b>Rosenthal, Jeffrey S. (2005): ''Struck by Lightning: the Curious World of Probabilities''. HarperCollins, 2005, ISBN 978-0-00-200791-7.</span>
* Stone, JV (2013). Chapter 1 of the book [http://jim-stone.staff.shef.ac.uk/BookBayes2012/BayesRuleBookMain.html "Bayes’ Rule: A Tutorial Introduction"], University of Sheffield, Psychology.
* Pierre Bessière, Emmanuel Mazer, Juan-Manuel Ahuactzin and Kamel Mekhnacha (2013), "[http://www.crcpress.com/product/isbn/9781439880326 Bayesian Programming]", CRC Press.

[[Category:Bayesian inference|Rule]]
[[Category:Model selection]]
[[Category:Statistical ratios]]

[[ar:عامل بايز]]
[[ja:ベイズ因子]]
| |