{{Dablink|See [[method of moments (probability theory)]] for an account of a technique for proving convergence in distribution.}}
{{Unreferenced|date=December 2009}}
 
In [[statistics]] and [[econometrics]], the '''method of moments''' is a method of [[estimation]] of population [[statistical parameter|parameters]]. One starts by deriving equations that relate the population [[moment (mathematics)|moments]] (i.e., the [[expected value]]s of powers of the [[random variable]] under consideration) to the parameters of interest. Then a sample is drawn and the population moments are estimated from the sample. The equations are then solved for the parameters of interest, using the sample moments in place of the (unknown) population moments. This yields estimates of those parameters.
 
==Method==
Suppose that the problem is to estimate <math>k</math> unknown parameters <math>\theta_{1}, \theta_{2}, \dots, \theta_{k}</math> characterizing the [[probability distribution|distribution]] <math>f_{W}(w; \theta)</math> of the random variable <math>W</math>. Suppose the first <math>k</math> moments of the true distribution (the "population moments") can be expressed as functions of the  <math>\theta</math>s:
 
:<math>\mu_{1} \equiv E[W^1]=g_{1}(\theta_{1}, \theta_{2}, \dots, \theta_{k}) , </math>
:<math>\mu_{2} \equiv E[W^2]=g_{2}(\theta_{1}, \theta_{2}, \dots, \theta_{k})  ,</math>
:::<math>\vdots </math>
:<math>\mu_{k} \equiv E[W^k]=g_{k}(\theta_{1}, \theta_{2}, \dots, \theta_{k})  .</math>
 
Suppose a sample of size <math>n</math> is drawn, resulting in the values <math>w_1, \dots, w_n</math>. For <math>j=1,\dots,k</math>, let
:<math>\hat{\mu}_{j}=\frac{1}{n}\sum_{i=1}^{n} w_{i}^{j}</math>
be the ''j''-th sample moment, an estimate of <math>\mu_{j}</math>. The method of moments estimator for <math>\theta_{1}, \theta_{2}, \dots, \theta_{k}</math>, denoted by <math>\hat{\theta}_{1}, \hat{\theta}_{2}, \dots, \hat{\theta}_{k}</math>, is defined as the solution (if there is one) to the equations:{{citation needed|date=September 2011}}
:<math>\hat{\mu}_{1} = g_{1}(\hat{\theta}_{1}, \hat{\theta}_{2}, \dots, \hat{\theta}_{k}) ,</math>
:<math>\hat{\mu}_{2} = g_{2}(\hat{\theta}_{1}, \hat{\theta}_{2}, \dots, \hat{\theta}_{k}) ,</math>
:::<math>\vdots  </math>
:<math>\hat{\mu}_{k} = g_{k}(\hat{\theta}_{1}, \hat{\theta}_{2}, \dots, \hat{\theta}_{k}) .</math>
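
For distributions whose moment equations have a closed-form solution, the computation is immediate. The following is a minimal illustrative sketch (Python with NumPy; the normal distribution is chosen here only for illustration and is not part of the formal statement above):

<syntaxhighlight lang="python">
import numpy as np

# Illustrative sketch: method of moments for W ~ N(mu, sigma^2).
# Population moments: g1(mu, sigma) = mu and g2(mu, sigma) = mu^2 + sigma^2.
rng = np.random.default_rng(seed=0)
w = rng.normal(loc=2.0, scale=3.0, size=10_000)  # hypothetical sample

m1 = np.mean(w)      # first sample moment, estimates E[W]
m2 = np.mean(w**2)   # second sample moment, estimates E[W^2]

mu_hat = m1                      # solves  m1 = mu
sigma_hat = np.sqrt(m2 - m1**2)  # solves  m2 = mu^2 + sigma^2

print(mu_hat, sigma_hat)  # roughly 2.0 and 3.0
</syntaxhighlight>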
 
==Example==
Suppose ''X''<sub>1</sub>, ..., ''X''<sub>''n''</sub> are [[independent identically distributed]] [[random variables]] with a [[gamma distribution]] with [[probability density function]]
 
:<math>{x^{\alpha-1} e^{-x/\beta} \over \beta^\alpha\, \Gamma(\alpha)} \,\! </math>
 
for ''x'' > 0, and 0 otherwise.
 
The first moment, i.e., the [[expected value]], of a random variable with this probability distribution is
 
:<math>\operatorname{E}(X_1)=\alpha\beta\,</math>
 
and the second moment, i.e., the expected value of its square, is
 
:<math>\operatorname{E}(X_1^2)=\beta^2\alpha(\alpha+1).\,</math>
 
These are the "[[Statistical population|population]] moments".
 
The first and second "[[Sample (statistics)|sample]] moments" ''m''<sub>1</sub> and ''m''<sub>2</sub> are respectively
 
:<math>m_{1} = {X_1+\cdots+X_n \over n} \,\!</math>
 
and
 
:<math>m_{2} = {X_1^2+\cdots+X_n^2 \over n}.\,\!</math>
 
Equating the population moments with the sample moments, we get
 
:<math>\alpha\beta = m_{1} \,\!</math>
 
and
 
:<math>\beta^2\alpha(\alpha+1) = m_{2}.\,\!</math>
 
Solving these two equations for α and β (noting that subtracting the square of the first equation from the second gives <math>m_{2} - m_{1}^2 = \alpha\beta^2\,\!</math>), we get
 
:<math>\alpha={ m_{1}^2 \over m_{2} - m_{1}^2}\,\!</math>
 
and
 
:<math>\beta={ m_{2} - m_{1}^2 \over m_{1}}.\,\!</math>
 
We then use these two quantities as estimates, based on the sample, of the two unobservable population parameters α and β.
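
The algebra above can be checked numerically. The following sketch (Python with NumPy; the true parameter values and sample size are arbitrary choices for illustration) draws a gamma sample and recovers the parameters from the first two sample moments:

<syntaxhighlight lang="python">
import numpy as np

# Sketch of the gamma example: recover alpha and beta from m1 and m2.
rng = np.random.default_rng(seed=1)
alpha_true, beta_true = 3.0, 2.0
x = rng.gamma(shape=alpha_true, scale=beta_true, size=50_000)

m1 = np.mean(x)     # sample analogue of E[X] = alpha * beta
m2 = np.mean(x**2)  # sample analogue of E[X^2] = beta^2 * alpha * (alpha + 1)

alpha_hat = m1**2 / (m2 - m1**2)  # alpha = m1^2 / (m2 - m1^2)
beta_hat = (m2 - m1**2) / m1      # beta = (m2 - m1^2) / m1

print(alpha_hat, beta_hat)  # should be close to 3.0 and 2.0
</syntaxhighlight>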
 
==Advantages and disadvantages of this method==
The method of moments is fairly simple and yields [[consistent estimator]]s (under very weak assumptions), though these estimators are often [[bias of an estimator|biased]].
 
In some respects, when estimating parameters of a known family of probability distributions, this method has been superseded by [[Ronald Fisher|Fisher]]'s method of [[maximum likelihood]], because maximum likelihood estimators have a higher probability of being close to the quantities to be estimated and are more often unbiased.
 
However, in some cases, as in the above example of the gamma distribution, the likelihood equations may be intractable without computers, whereas the method-of-moments estimators can be quickly and easily calculated by hand.
 
Estimates by the method of moments may be used as the first approximation to the solutions of the likelihood equations, and successive improved approximations may then be found by the [[Newton–Raphson method]]. In this way the method of moments and the method of maximum likelihood are symbiotic.
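
As a sketch of this refinement for the gamma example above (assuming SciPy's special functions are available; the equation <math>\log\alpha - \psi(\alpha) = \log \bar{x} - \overline{\log x}</math> used below is the standard profile likelihood equation for the gamma shape parameter, with <math>\hat\beta = \bar{x}/\hat\alpha</math>):

<syntaxhighlight lang="python">
import numpy as np
from scipy.special import digamma, polygamma

def gamma_mle_from_mom(x, n_steps=10):
    """Refine the method-of-moments estimate of the gamma shape alpha by
    Newton-Raphson on the profile score f(a) = log(a) - digamma(a) - c,
    where c = log(mean(x)) - mean(log(x)); then beta = mean(x) / alpha."""
    m1, m2 = np.mean(x), np.mean(x**2)
    a = m1**2 / (m2 - m1**2)             # method-of-moments starting point
    c = np.log(m1) - np.mean(np.log(x))
    for _ in range(n_steps):             # Newton-Raphson updates
        f = np.log(a) - digamma(a) - c
        f_prime = 1.0 / a - polygamma(1, a)  # polygamma(1, .) is trigamma
        a -= f / f_prime
    return a, np.mean(x) / a             # (alpha_hat, beta_hat)
</syntaxhighlight>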
 
In some cases, infrequent with large samples but not so infrequent with small samples, the estimates given by the method of moments fall outside of the parameter space, in which case it does not make sense to rely on them. That problem never arises in the method of maximum likelihood. Also, estimates by the method of moments are not necessarily [[sufficiency (statistics)|sufficient statistics]], i.e., they sometimes fail to take into account all relevant information in the sample.
 
When estimating other structural parameters (e.g., parameters of a [[utility|utility function]], instead of parameters of a known probability distribution), appropriate probability distributions may not be known, and moment-based estimates may be preferred to maximum likelihood estimation.
 
==See also==
* [[Generalized method of moments]]
 
{{Statistics}}
 
{{DEFAULTSORT:Method Of Moments (Statistics)}}
[[Category:Fitting probability distributions]]
