Probability integral transform

In [[statistics]], the '''probability integral transform''' or '''transformation''' relates to the result that data values that are modelled as being [[random variable]]s from any given [[continuous distribution]] can be converted to random variables having a [[uniform distribution]].<ref name=Dodge/> This holds exactly provided that the distribution being used is the true distribution of the random variables; if the distribution is one fitted to the data the result will hold approximately in large samples.
 
The result is sometimes modified or extended so that the result of the transformation is a standard distribution other than the uniform distribution, such as the [[exponential distribution]].
 
==Applications==
 
One use for the probability integral transform in statistical [[data analysis]] is to provide the basis for testing whether a set of observations can reasonably be modelled as arising from a specified distribution. Specifically, the probability integral transform is applied to construct an equivalent set of values, and a test is then made of whether a uniform distribution is appropriate for the constructed dataset. Examples of this are [[P-P plot]]s and [[Kolmogorov-Smirnov test]]s.
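This testing procedure can be sketched in a few lines of Python. NumPy, SciPy, and the particular distribution and sample size are assumptions of this illustration, not part of the article: the candidate CDF is applied to the data, and the transformed values are checked for uniformity with a Kolmogorov-Smirnov test.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Draw data from the hypothesised distribution (here: standard normal).
data = rng.normal(loc=0.0, scale=1.0, size=500)

# Probability integral transform: apply the candidate CDF to each value.
u = stats.norm.cdf(data)

# If the candidate distribution is correct, u should look uniform on [0, 1].
statistic, p_value = stats.kstest(u, "uniform")
print(p_value)
```

A large p-value indicates no evidence against uniformity, hence no evidence against the hypothesised distribution; fitting the distribution to the same data first would make this test only approximate, as noted above.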
 
A second use for the transformation is in the theory related to [[Copula (statistics)|copulas]] which are a means of both defining and working with distributions for statistically dependent multivariate data. Here the problem of defining or manipulating a [[joint probability distribution]] for a set of random variables is simplified or reduced in apparent complexity by applying the probability integral transform to each of the components and then working with a joint distribution for which the marginal variables have uniform distributions.
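A minimal numerical sketch of this idea follows; NumPy, SciPy, and the chosen correlation and sample size are illustrative assumptions. Each correlated normal margin is mapped through its own CDF, so both margins become uniform while the dependence structure (the copula) is preserved.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Correlated bivariate normal data (hypothetical example).
cov = [[1.0, 0.7], [0.7, 1.0]]
xy = rng.multivariate_normal([0.0, 0.0], cov, size=50_000)

# Transform each margin by its own CDF: both margins are now uniform,
# but the dependence between them survives the transform.
u = stats.norm.cdf(xy[:, 0])
v = stats.norm.cdf(xy[:, 1])

print(np.corrcoef(u, v)[0, 1])
```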
 
A third use is based on applying the inverse of the probability integral transform to convert random variables from a uniform distribution to have a selected distribution: this is known as [[inverse transform sampling]].
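A brief Python sketch of inverse transform sampling (NumPy assumed), using the unit-mean exponential distribution that also appears in the examples below: since the CDF is F(x) = 1 − exp(−x), its inverse is F<sup>−1</sup>(u) = −ln(1 − u).

```python
import numpy as np

rng = np.random.default_rng(1)

# Inverse transform sampling for an exponential distribution with unit mean:
# apply the inverse CDF, -ln(1 - u), to uniform draws u.
u = rng.uniform(size=100_000)
x = -np.log1p(-u)  # log1p(-u) computes log(1 - u) accurately for small u

print(x.mean())
```

The sample mean should be close to 1, the mean of the target distribution.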
 
==Examples==
 
Suppose that a random variable ''X'' has a [[continuous distribution]] for which the [[cumulative distribution function]] is ''F''<sub>''X''</sub>. Then the random variable ''Y'' defined as
 
:<math>Y=F_X(X) \,,</math>
has a uniform distribution.<ref name=Dodge>Dodge, Y. (2003) ''The Oxford Dictionary of Statistical Terms'', OUP. ISBN 0-19-920613-9</ref>
 
For an illustrative example, let ''X'' be a random variable with a standard normal distribution N(0,1). Its CDF is

:<math>\Phi(x) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^x e^{-t^2/2} \, dt
            = \frac12\Big[\, 1 + \operatorname{erf}\Big(\frac{x}{\sqrt{2}}\Big)\,\Big],\quad x\in\mathbb{R},
\,</math>
where <math>\operatorname{erf}</math> is the [[error function]]. Then the new random variable ''Y'', defined by ''Y''=&Phi;(''X''), is uniformly distributed.
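This can be checked empirically. The Python sketch below (NumPy and SciPy are assumed, and the seed and sample size are arbitrary) applies &Phi; to standard normal draws and compares the sample mean and variance with the Uniform(0,1) values of 1/2 and 1/12.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x = rng.standard_normal(100_000)

# Probability integral transform: Y = Phi(X) should be uniform on (0, 1).
y = stats.norm.cdf(x)

print(y.mean(), y.var())  # compare with 1/2 and 1/12 for Uniform(0,1)
```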
 
If ''X'' has an [[exponential distribution]] with unit mean, then
:<math>F(x)=1-\exp(-x),</math>
and the immediate result of the probability integral transform is that
:<math>Y=1-\exp(-X)</math>
has a uniform distribution. However, the symmetry of the uniform distribution can then be used to show that
:<math>Y'=\exp(-X)</math>
also has a uniform distribution.
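Both claims are easy to verify numerically. In the Python sketch below (NumPy assumed; seed and sample size are arbitrary), the two transforms differ only by the reflection ''u'' &rarr; 1 &minus; ''u'', which maps Uniform(0,1) to itself.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.exponential(scale=1.0, size=100_000)  # exponential with unit mean

y = 1.0 - np.exp(-x)   # probability integral transform: uniform on (0, 1)
y_prime = np.exp(-x)   # equals 1 - y: uniform by the symmetry argument

print(y.mean(), y_prime.mean())  # both should be close to 1/2
```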
 
==References==
<references/>
 
{{DEFAULTSORT:Probability Integral Transform}}
[[Category:Theory of probability distributions]]

Revision as of 12:57, 16 March 2013
