{{Probability distribution|
  name      =Generalized inverse Gaussian|
  type      =density|
  pdf_image  =[[Image:GIG distribution pdf.svg|325px|Probability density plots of GIG distributions]]|
  cdf_image  =|
  parameters =''a'' > 0, ''b'' > 0, ''p'' real|
  support    =''x'' > 0|
  pdf        =<math>f(x) = \frac{(a/b)^{p/2}}{2 K_p(\sqrt{ab})} x^{(p-1)} e^{-(ax + b/x)/2}</math>|
  cdf        =|
  mean      =<math>\frac{\sqrt{b}\ K_{p+1}(\sqrt{a b}) }{ \sqrt{a}\ K_{p}(\sqrt{a b})}</math>|
  median    =|
  mode      =<math>\frac{(p-1)+\sqrt{(p-1)^2+ab}}{a}</math>|
  variance  =<math>\left(\frac{b}{a}\right)\left[\frac{K_{p+2}(\sqrt{ab})}{K_p(\sqrt{ab})}-\left(\frac{K_{p+1}(\sqrt{ab})}{K_p(\sqrt{ab})}\right)^2\right]</math>|
  skewness  =|
  kurtosis  =|
  entropy    =|
  mgf        =<math>\left(\frac{a}{a-2t}\right)^{\frac{p}{2}}\frac{K_p(\sqrt{b(a-2t)})}{K_p(\sqrt{ab})}</math>|
  char      =<math>\left(\frac{a}{a-2it}\right)^{\frac{p}{2}}\frac{K_p(\sqrt{b(a-2it)})}{K_p(\sqrt{ab})}</math>|
}}
In [[probability theory]] and [[statistics]], the '''generalized inverse Gaussian distribution''' ('''GIG''')  is a three-parameter family of continuous [[probability distribution]]s with [[probability density function]]
 
:<math>f(x) = \frac{(a/b)^{p/2}}{2 K_p(\sqrt{ab})} x^{(p-1)} e^{-(ax + b/x)/2},\qquad x>0,</math>
 
where ''K<sub>p</sub>'' is a [[modified Bessel function]] of the second kind, ''a''&nbsp;>&nbsp;0, ''b''&nbsp;>&nbsp;0, and ''p'' is a real parameter. It is used extensively in [[geostatistics]], statistical linguistics, and finance. This distribution was first proposed by Étienne Halphen.<ref>
{{Cite book
  | last = Seshadri
  | first = V.
  | contribution = Halphen's laws
  | editor-last = Kotz
  | editor-first = S.
  | editor2-last = Read
  | editor2-first = C. B.
  | editor3-last = Banks
  | editor3-first = D. L.
  | title = Encyclopedia of Statistical Sciences, Update Volume 1
  | pages = 302–306
  | publisher = Wiley
  | place = New York
  | year = 1997
  | postscript = <!--None-->}}
</ref><ref>{{cite doi|10.1061/(ASCE)1084-0699(1999)4:3(189)}}</ref><ref>Étienne Halphen was the uncle of the mathematician [[Georges Henri Halphen]].</ref>
It was rediscovered and popularised by [[Ole Barndorff-Nielsen]], who called it the generalized inverse Gaussian distribution. It is also known as the '''Sichel distribution''', after [[Herbert Sichel]]. Its statistical properties are discussed in Bent Jørgensen's lecture notes.<ref>
{{cite book
  | last = Jørgensen
  | first = Bent
  | title = Statistical Properties of the Generalized Inverse Gaussian Distribution
  | publisher = Springer-Verlag
  | year = 1982
  | location = New York–Berlin
  | series = Lecture Notes in Statistics
  | volume = 9
  | isbn = 0-387-90665-7
  | id = {{MathSciNet | id = 0648107}}}}</ref>
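
The density above can be evaluated directly from the modified Bessel function and checked against SciPy. The following is a minimal sketch (assuming a SciPy version that provides <code>scipy.stats.geninvgauss</code>; the mapping onto SciPy's shape-plus-scale parameterization via the change of variables <math>X=\sqrt{b/a}\,Y</math>, and the parameter values used, are illustrative assumptions rather than part of the text above):
<syntaxhighlight lang="python">
import numpy as np
from scipy.special import kv          # modified Bessel function of the second kind, K_nu
from scipy.stats import geninvgauss   # SciPy's GIG (one shape parameter plus a scale)

def gig_pdf(x, a, b, p):
    """GIG density f(x) = (a/b)^(p/2) / (2 K_p(sqrt(ab))) * x^(p-1) * exp(-(a x + b/x)/2), x > 0."""
    norm = (a / b) ** (p / 2) / (2.0 * kv(p, np.sqrt(a * b)))
    return norm * x ** (p - 1) * np.exp(-(a * x + b / x) / 2.0)

# Illustrative parameter values (assumptions, not taken from the article).
a, b, p = 2.0, 3.0, 0.7
x = np.linspace(0.1, 5.0, 50)

# Assumed mapping: if Y ~ geninvgauss(p, sqrt(ab)), then X = sqrt(b/a) * Y has the density above.
scipy_gig = geninvgauss(p, np.sqrt(a * b), scale=np.sqrt(b / a))
assert np.allclose(gig_pdf(x, a, b, p), scipy_gig.pdf(x))
</syntaxhighlight>
Under the same mapping, <code>scipy_gig.mean()</code> should reproduce (up to numerical integration error) the mean <math>\sqrt{b}\,K_{p+1}(\sqrt{ab})/(\sqrt{a}\,K_p(\sqrt{ab}))</math> quoted in the infobox.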
 
==Special cases==
The [[Inverse Gaussian distribution|inverse Gaussian]] and [[Gamma distribution|gamma]] distributions are special cases of the generalized inverse Gaussian distribution for ''p'' = -1/2 and ''b'' = 0, respectively.<ref name=JKB/>  Specifically, an inverse Gaussian distribution of the form
: <math> f(x;\mu,\lambda) = \left[\frac{\lambda}{2 \pi x^3}\right]^{1/2} \exp{\frac{-\lambda (x-\mu)^2}{2 \mu^2 x}}</math>
is a GIG with <math>a = \lambda/\mu^2</math>, <math>b = \lambda</math>, and <math>p=-1/2</math>.  A gamma distribution of the form
:<math>g(x;\alpha,\beta) = \beta^{\alpha}\frac{1}{\Gamma(\alpha)} x^{\alpha-1} e^{-\beta x} </math>
is a GIG with <math>a = 2 \beta</math>, <math>b = 0</math>, and <math>p = \alpha</math>.
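
These identifications can be illustrated numerically. The sketch below is only illustrative: SciPy's <code>invgauss</code> parameterization (shape <math>\mu/\lambda</math> with scale <math>\lambda</math>) and the tiny positive value standing in for the limiting case <math>b = 0</math> are assumptions, not part of the statements above.
<syntaxhighlight lang="python">
import numpy as np
from scipy.special import kv
from scipy.stats import invgauss, gamma as gamma_dist

def gig_pdf(x, a, b, p):
    # GIG density as given above
    return (a / b) ** (p / 2) / (2 * kv(p, np.sqrt(a * b))) * x ** (p - 1) * np.exp(-(a * x + b / x) / 2)

x = np.linspace(0.1, 6.0, 60)

# Inverse Gaussian IG(mu, lambda) vs GIG(a = lambda/mu^2, b = lambda, p = -1/2).
mu, lam = 1.5, 2.0
assert np.allclose(invgauss(mu / lam, scale=lam).pdf(x),
                   gig_pdf(x, lam / mu ** 2, lam, -0.5))

# Gamma(alpha, rate beta) vs GIG(a = 2*beta, b = 0, p = alpha); b = 0 is a limiting
# case, so a tiny positive b stands in for it here.
alpha, beta = 3.0, 1.2
assert np.allclose(gamma_dist(alpha, scale=1 / beta).pdf(x),
                   gig_pdf(x, 2 * beta, 1e-12, alpha))
</syntaxhighlight>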
 
Other special cases include the [[inverse-gamma distribution]], for ''a''=0, and the [[hyperbolic distribution]], for ''p''=0.<ref name=JKB/>
 
==Entropy ==
The entropy of the generalized inverse Gaussian distribution  is given as{{citation needed|date=February 2012}}
:<math>H(f(x))=\frac{1}{2} \log \left(\frac{b}{a}\right)+\log \left(2 K_p\left(\sqrt{a b}\right)\right)-
(p-1) \frac{\left[\frac{d}{d\nu}K_\nu\left(\sqrt{ab}\right)\right]_{\nu=p}}{K_p\left(\sqrt{a b}\right)}+\frac{\sqrt{a b}}{2 K_p\left(\sqrt{a b}\right)}\left( K_{p+1}\left(\sqrt{a b}\right) + K_{p-1}\left(\sqrt{a b}\right)\right)
</math>
 
where <math>\left[\frac{d}{d\nu}K_\nu\left(\sqrt{a b}\right)\right]_{\nu=p}</math> is the derivative of the modified Bessel function of the second kind with respect to the order <math>\nu</math>, evaluated at <math>\nu=p</math>.
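
As an illustration, the closed-form expression can be checked against direct numerical integration of <math>-\int_0^\infty f \log f \, dx</math>. In the sketch below the order derivative of <math>K_\nu</math> is approximated by a central finite difference; the step size and parameter values are assumptions made for illustration.
<syntaxhighlight lang="python">
import numpy as np
from scipy.special import kv
from scipy.integrate import quad

a, b, p = 2.0, 3.0, 0.7          # illustrative values
w = np.sqrt(a * b)
Kp = kv(p, w)

def gig_pdf(x):
    return (a / b) ** (p / 2) / (2 * Kp) * x ** (p - 1) * np.exp(-(a * x + b / x) / 2)

# Closed-form entropy; dK_nu/dnu at nu = p is approximated by a central difference.
eps = 1e-6
dK_dnu = (kv(p + eps, w) - kv(p - eps, w)) / (2 * eps)
H_closed = (0.5 * np.log(b / a) + np.log(2 * Kp)
            - (p - 1) * dK_dnu / Kp
            + w * (kv(p + 1, w) + kv(p - 1, w)) / (2 * Kp))

# Differential entropy by numerical integration of -f log f over (0, inf).
def integrand(x):
    f = gig_pdf(x)
    return -f * np.log(f) if f > 0 else 0.0

H_numeric, _ = quad(integrand, 0, np.inf)
assert np.isclose(H_closed, H_numeric, atol=1e-6)
</syntaxhighlight>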
 
==Conjugate prior for Gaussian==
The GIG distribution is [[conjugate prior|conjugate]] to the [[normal distribution]] when serving as the mixing distribution in a [[normal variance-mean mixture]].<ref>Dimitris Karlis, "An EM type algorithm for maximum likelihood estimation of the normal–inverse Gaussian distribution", Statistics & Probability Letters 57 (2002) 43–52.</ref><ref>Barndorff-Nielsen, O. E., 1997. ''Normal Inverse Gaussian Distributions and stochastic volatility modelling''. Scand. J. Statist. 24, 1–13.</ref> Let the prior distribution for some hidden variable, say <math>z</math>, be GIG:
:<math>
P(z|a,b,p) = \text{GIG}(z|a,b,p)
</math>
and let there be <math>T</math> observed data points, <math>X=x_1,\ldots,x_T</math>, with normal likelihood function, conditioned on <math>z</math>:
:<math>
P(X|z,\alpha,\beta) = \prod_{i=1}^T N(x_i|\alpha+\beta z,z)
</math>
where <math>N(x|\mu,v)</math> is the normal distribution with mean <math>\mu</math> and variance <math>v</math>. Then the posterior for <math>z</math>, given the data, is also GIG:
:<math>
P(z|X,a,b,p,\alpha,\beta) = \text{GIG}(z|a+T\beta^2,b+S,p-\tfrac{T}{2})
</math>
where <math>\textstyle S = \sum_{i=1}^T (x_i-\alpha)^2</math>.<ref group=note>Due to the conjugacy, these details can be derived without solving integrals, by noting that
:<math>P(z|X,a,b,p,\alpha,\beta)\propto P(z|a,b,p)P(X|z,\alpha,\beta)</math>.
Omitting all factors independent of <math>z</math>, the right-hand side can be simplified to give an ''un-normalized'' GIG distribution, from which the posterior parameters can be identified.</ref>
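
The conjugacy can be demonstrated numerically: the sketch below (with illustrative data and parameter values, all of which are assumptions) checks that the product of the prior and the likelihood is proportional to the GIG density with the updated parameters.
<syntaxhighlight lang="python">
import numpy as np
from scipy.special import kv
from scipy.stats import norm

def gig_pdf(z, a, b, p):
    return (a / b) ** (p / 2) / (2 * kv(p, np.sqrt(a * b))) * z ** (p - 1) * np.exp(-(a * z + b / z) / 2)

# Illustrative prior parameters, likelihood parameters, and "observations".
a, b, p = 2.0, 3.0, 0.7
alpha, beta = 0.5, 1.2
x = np.array([0.3, 1.1, 2.4, 0.8, 1.7])
T, S = len(x), np.sum((x - alpha) ** 2)

# Conjugate update: posterior is GIG(a + T*beta^2, b + S, p - T/2).
a_post, b_post, p_post = a + T * beta ** 2, b + S, p - T / 2

# The un-normalized posterior prior(z) * likelihood(z) should be proportional to the
# updated GIG density, i.e. the ratio should be the same constant for every z.
z = np.linspace(0.2, 4.0, 30)
unnorm = gig_pdf(z, a, b, p) * np.prod(norm.pdf(x[:, None], alpha + beta * z, np.sqrt(z)), axis=0)
ratio = unnorm / gig_pdf(z, a_post, b_post, p_post)
assert np.allclose(ratio, ratio[0])
</syntaxhighlight>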
 
==Notes==
{{reflist|group=note}}
 
 
==References==
{{reflist|refs=
 
<ref name=JKB>{{Citation | last1=Johnson | first1=Norman L. | last2=Kotz | first2=Samuel | last3=Balakrishnan | first3=N. | title=Continuous univariate distributions. Vol. 1 | publisher=[[John Wiley & Sons]] | location=New York | edition=2nd | series=Wiley Series in Probability and Mathematical Statistics: Applied Probability and Statistics | isbn=978-0-471-58495-7 | mr= 1299979| year=1994 |pages=284–285}}</ref>
 
}}
 
== See also ==
*[[Inverse Gaussian distribution]]
*[[Gamma distribution]]
 
 
{{ProbDistributions|continuous-semi-infinite}}
 
{{DEFAULTSORT:Generalized Inverse Gaussian Distribution}}
[[Category:Continuous distributions]]
[[Category:Exponential family distributions]]
[[Category:Probability distributions]]
