In [[probability theory]], '''Donsker's theorem''', named after [[Monroe D. Donsker]], identifies a certain [[stochastic process]] as a limit of [[empirical process]]es. It is sometimes called the '''functional central limit theorem'''.
 
A centered and scaled version of the [[empirical distribution function]] ''F''<sub>''n''</sub> defines an [[empirical process]]
 
: <math> G_n(x)= \sqrt n ( F_n(x) - F(x) ) \, </math>
 
indexed by ''x''&nbsp;∈&nbsp;'''R'''.
 
'''Theorem''' (Donsker, Skorokhod, Kolmogorov). The sequence of ''G''<sub>''n''</sub>(''x''), as random elements of the [[Skorokhod space]] <math>\mathcal{D}(-\infty,\infty)</math>, [[convergence in distribution|converges in distribution]] to a [[Gaussian process]] ''G'' with zero mean and covariance given by
: <math>\operatorname{cov}[G(s), G(t)] = E[G(s) G(t)] = \min\{F(s), F(t)\} - F(s)F(t). \,</math>
The process ''G''(''x'') can be written as ''B''(''F''(''x'')) where ''B'' is a standard [[Brownian bridge]] on the unit interval.
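The covariance in the theorem can be illustrated numerically. The following is a minimal Monte Carlo sketch (assuming NumPy is available; the sample size, the number of replications, and the evaluation points ''s'', ''t'' are arbitrary choices), which simulates ''G''<sub>''n''</sub> for Uniform[0,1] data, where ''F''(''x'')&nbsp;=&nbsp;''x'', and compares the empirical covariance at (''s'',&nbsp;''t'') with the Brownian-bridge value min{''s'',&nbsp;''t''}&nbsp;−&nbsp;''st'':

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_process(sample, x):
    # G_n(x) = sqrt(n) * (F_n(x) - F(x)); for Uniform[0,1] data, F(x) = x
    n = len(sample)
    F_n = np.mean(sample[:, None] <= x, axis=0)  # empirical CDF at each x
    return np.sqrt(n) * (F_n - x)

n, reps = 500, 20000
s, t = 0.3, 0.7
vals = np.empty((reps, 2))
for i in range(reps):
    sample = rng.uniform(size=n)
    vals[i] = empirical_process(sample, np.array([s, t]))

emp_cov = np.cov(vals[:, 0], vals[:, 1])[0, 1]
theory = min(s, t) - s * t  # covariance of the Brownian bridge, here 0.09
print(emp_cov, theory)
```

With these settings the empirical covariance agrees with the theoretical value to within Monte Carlo error.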
 
==History==
By the classical [[central limit theorem]], for fixed ''x'', the random variable ''G''<sub>''n''</sub>(''x'') [[converges in distribution]] to a [[normal distribution|Gaussian (normal)]] [[random variable]] ''G''(''x'') with zero mean and variance ''F''(''x'')(1&nbsp;−&nbsp;''F''(''x'')) as the sample size ''n'' grows.
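This pointwise statement is easy to check by simulation, because ''n''&nbsp;''F''<sub>''n''</sub>(''x'') has a Binomial(''n'',&nbsp;''F''(''x'')) distribution. A sketch (assuming NumPy; the values of ''p''&nbsp;=&nbsp;''F''(''x''), ''n'', and the test point ''z'' are arbitrary choices) compares the simulated distribution of ''G''<sub>''n''</sub>(''x'') with the limiting N(0,&nbsp;''F''(''x'')(1&nbsp;−&nbsp;''F''(''x''))) law:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)

p, n, reps = 0.4, 2000, 50000          # p = F(x) at the fixed point x
counts = rng.binomial(n, p, size=reps)  # n * F_n(x) ~ Binomial(n, p)
g = np.sqrt(n) * (counts / n - p)       # draws of G_n(x)

z = 0.25
sigma = sqrt(p * (1 - p))               # limiting standard deviation
empirical = np.mean(g <= z)             # simulated P(G_n(x) <= z)
gaussian = 0.5 * (1 + erf(z / (sigma * sqrt(2))))  # N(0, sigma^2) CDF at z
print(empirical, gaussian)
```

The two probabilities match up to the Berry–Esseen error and Monte Carlo noise.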
 
Kolmogorov (1933) showed that when ''F'' is [[continuous function|continuous]], the supremum <math>\scriptstyle\sup_t G_n(t)</math> and the supremum of the absolute value, <math>\scriptstyle\sup_t |G_n(t)|</math>, [[convergence in distribution|converge in distribution]] to the laws of the same functionals of the [[Brownian bridge]] ''B''(''t''); see the [[Kolmogorov–Smirnov test]]. In 1949 Doob asked whether the convergence in distribution held for more general functionals, thus formulating a problem of [[weak convergence of measures|weak convergence]] of random functions in a suitable [[function space]].<ref>{{cite journal
|first=Joseph L. |last=Doob|authorlink=Joseph L. Doob
|title=Heuristic approach to the Kolmogorov–Smirnov theorems
|journal=[[Annals of Mathematical Statistics]]
|volume=20 |issue= |pages=393–403 |year=1949
|doi=10.1214/aoms/1177729991 |mr=30732 | zbl = 0035.08901
}}</ref>
 
In 1952 Donsker stated and proved (not quite correctly)<ref name="dudley1999">{{cite book
|first=R.M. |last=Dudley|authorlink=Richard M. Dudley
|title=Uniform Central Limit Theorems
|publisher=Cambridge University Press
|year=1999
|isbn=0-521-46102-2
}}</ref> a general extension for the Doob–Kolmogorov heuristic approach. In the original paper, Donsker proved that the convergence in law of ''G<sub>n</sub>'' to the Brownian bridge holds for [[uniform distribution (continuous)|Uniform[0,1]]] distributions with respect to uniform convergence in ''t'' over the interval [0,1].<ref>{{cite journal
|first=M. D. |last=Donsker |authorlink=Monroe D. Donsker
|title=Justification and extension of Doob's heuristic approach to the Kolmogorov–Smirnov theorems
|journal=[[Annals of Mathematical Statistics]]
|volume=23 |issue= |pages=277–281 |year=1952
|doi=10.1214/aoms/1177729445 |mr=47288 | zbl = 0046.35103
}}</ref>
 
However, Donsker's formulation was not quite correct because of the problem of measurability of the functionals of discontinuous processes. In 1956 Skorokhod and Kolmogorov defined a separable metric ''d'', called the ''Skorokhod metric'', on the space of [[càdlàg function]]s on [0,1], such that convergence in ''d'' to a continuous function is equivalent to convergence in the sup norm, and showed that ''G<sub>n</sub>'' converges in law in <math>\mathcal{D}[0,1]</math> to the Brownian bridge.
 
Later, Dudley reformulated Donsker's result to avoid the problem of measurability and the need for the Skorokhod metric. One can prove<ref name="dudley1999" /> that there exist ''X<sub>i</sub>'', iid uniform in [0,1], and a sequence of sample-continuous Brownian bridges ''B''<sub>''n''</sub>, such that
:<math>\|G_n-B_n\|_\infty</math>
is measurable and [[convergence in probability|converges in probability]] to 0. An improved version of this result, providing more detail on the rate of convergence, is the [[Komlós–Major–Tusnády approximation]].
 
==See also==
*[[Glivenko–Cantelli theorem]]
*[[Kolmogorov–Smirnov test]]
 
== References ==
{{reflist}}
 
{{DEFAULTSORT:Donsker's Theorem}}
[[Category:Probability theorems]]
[[Category:Statistical theorems]]
[[Category:Empirical process]]
