In [[statistics]], '''Basu's theorem''' states that any [[completeness (statistics)|boundedly complete]] [[sufficient statistic]] is [[statistical independence|independent]] of any [[ancillary statistic]]. This is a 1955 result of [[Debabrata Basu]].<ref>Basu (1955)</ref>

It is often used in statistics as a tool to prove independence of two statistics, by first demonstrating one is complete sufficient and the other is ancillary, then appealing to the theorem.{{Citation needed|date=December 2009}} An example of this is to show that the sample mean and sample variance of a normal distribution are independent statistics, which is done in the Example section below. This property (the independence of the sample mean and the sample variance) characterizes the normal distribution.

== Statement ==

Let ''P<sub>θ</sub>'' be a family of distributions on a [[measurable space]] (''X'', ''Σ''). If ''T'' is a boundedly complete sufficient statistic for ''θ'' and ''A'' is ancillary to ''θ'', then ''T'' is independent of ''A''.

=== Proof ===

Let ''P<sub>θ</sub><sup>T</sup>'' and ''P<sub>θ</sub><sup>A</sup>'' be the [[marginal distribution]]s of ''T'' and ''A'' respectively. Then for any measurable set ''B'' in the range of ''A'',

:<math>P_\theta^A(B) = P_\theta (A^{-1} B) = \int_{T(X)} P_\theta(A^{-1}B | T=t) \ P_\theta^T (dt) \,</math>

The marginal ''P<sub>θ</sub><sup>A</sup>'' does not depend on ''θ'' because ''A'' is ancillary. Likewise, ''P<sub>θ</sub>''(·|''T = t'') does not depend on ''θ'' because ''T'' is sufficient. Therefore:

:<math> \int_{T(X)} \big[ P(A^{-1}B | T=t) - P^A(B) \big] \ P_\theta^T (dt) = 0 \,</math>

Note that the integrand (the function inside the integral) is a bounded function of ''t'' alone and does not depend on ''θ''. Therefore, since ''T'' is boundedly complete:

:<math>P(A^{-1}B | T=t) = P^A(B) \quad \text{for all }t\,</math>

Therefore, ''A'' is independent of ''T''.

==Example==

===Independence of sample mean and sample variance of a normal distribution===

Let ''X''<sub>1</sub>, ''X''<sub>2</sub>, ..., ''X''<sub>''n''</sub> be [[Independent and identically-distributed random variables|independent, identically distributed]] [[Normal distribution|normal]] [[random variable]]s with [[mean]] ''μ'' and [[variance]] ''σ''<sup>2</sup>.

Then with respect to the parameter ''μ'', one can show that

:<math>\widehat{\mu}=\frac{\sum X_i}{n},\,</math>

the sample mean, is a complete sufficient statistic – it carries all the information the sample contains about ''μ'', and no more – and

:<math>\widehat{\sigma}^2=\frac{\sum \left(X_i-\bar{X}\right)^2}{n-1},\,</math>

the sample variance, is an ancillary statistic – its distribution does not depend on ''μ''.

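The ancillarity claim can also be checked empirically. The following is a minimal simulation sketch, not part of the article's argument; the sample size, replication count, and parameter values are arbitrary choices. Shifting ''μ'' should leave the sampling distribution of the sample variance unchanged:

```python
import random
import statistics

# Illustrative simulation (arbitrary constants): the sampling distribution
# of the unbiased sample variance depends on sigma^2 but not on mu.
random.seed(1)

def mean_of_sample_variances(mu, sigma=2.0, n=10, reps=20000):
    """Average the unbiased sample variance over `reps` samples of size n
    drawn from N(mu, sigma^2)."""
    total = 0.0
    for _ in range(reps):
        xs = [random.gauss(mu, sigma) for _ in range(n)]
        total += statistics.variance(xs)  # divides by n - 1
    return total / reps

v_low = mean_of_sample_variances(mu=0.0)
v_high = mean_of_sample_variances(mu=100.0)
# Both averages should be close to sigma^2 = 4, regardless of mu.
print(v_low, v_high)
```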
Therefore, from Basu's theorem it follows that these statistics are independent.

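This conclusion can be illustrated numerically. The sketch below (all constants are arbitrary choices, not from the article) simulates many normal samples and verifies that the empirical correlation between the sample mean and the sample variance is near zero:

```python
import random
import statistics

# Illustrative simulation: under normal sampling, the sample mean and the
# unbiased sample variance are independent, so their correlation is zero.
random.seed(0)

def mean_and_variance(n=10, mu=5.0, sigma=2.0):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    return statistics.mean(xs), statistics.variance(xs)

pairs = [mean_and_variance() for _ in range(20000)]
means = [m for m, v in pairs]
variances = [v for m, v in pairs]

# Pearson correlation computed by hand to stay stdlib-portable.
mx = statistics.mean(means)
mv = statistics.mean(variances)
cov = sum((m - mx) * (v - mv) for m, v in pairs) / len(pairs)
corr = cov / (statistics.pstdev(means) * statistics.pstdev(variances))
print(corr)  # close to 0
```

Note that this only checks zero correlation, a necessary but weaker consequence of the full independence that Basu's theorem delivers.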
This independence result can also be proven by [[Cochran's theorem]].

Further, this property (that the sample mean and sample variance of the normal distribution are independent) ''[[characterization (mathematics)|characterizes]]'' the normal distribution – no other distribution has this property.<ref>{{cite journal |doi=10.2307/2983669 |first=R.C. |last=Geary |authorlink=Roy C. Geary |year=1936 |title=The Distribution of the "Student's" Ratio for the Non-Normal Samples |journal=Supplement to the Journal of the Royal Statistical Society |volume=3 |issue=2 |pages=178–184 |jfm=63.1090.03 |jstor=2983669}}</ref>

==Notes==
{{Reflist}}
{{More footnotes|date=December 2009}}

==References==
* {{cite journal |last=Basu |first=D. |authorlink=Debabrata Basu |title=On Statistics Independent of a Complete Sufficient Statistic |journal=[[Sankhya (journal)|Sankhyā]] |volume=15 |issue=4 |year=1955 |pages=377–380 |mr=74745 |zbl=0068.13401 |jstor=25048259}}
* Mukhopadhyay, Nitis (2000). ''Probability and Statistical Inference''. Statistics: A Series of Textbooks and Monographs. '''162'''. Florida: CRC Press USA. ISBN 0-8247-0379-0.
* {{cite journal |doi=10.2307/2685927 |last=Boos |first=Dennis D. |coauthors=Oliver, Jacqueline M. Hughes |date=August 1998 |title=Applications of Basu's Theorem |journal=[[The American Statistician]] |volume=52 |issue=3 |pages=218–221 |publisher=[[American Statistical Association]] |location=Boston |mr=1650407 |jstor=2685927}}
* {{cite journal |authorlink=Malay Ghosh |title=Basu's Theorem with Applications: A Personalistic Review |first=Malay |last=Ghosh |journal=Sankhyā: the Indian Journal of Statistics, Series A |volume=64 |number=3 |date=October 2002 |pages=509–531 |mr=1985397 |jstor=25051412}}

{{Statistics|state=collapsed}}

[[Category:Statistical theorems]]
[[Category:Statistical inference]]
[[Category:Articles containing proofs]]
| |
Friends contact him Royal Cummins. Climbing is what love doing. Years in the past we moved to Arizona but my wife desires us to transfer. Managing individuals is what I do in my working day job.
Visit my web site :: http://www.gauhaticommercecollege.in/content/dont-allow-auto-repair-best-you