In [[probability theory]] and [[directional statistics]], a '''wrapped probability distribution''' is a continuous [[probability distribution]] that describes data points that lie on a unit [[n-sphere|''n''-sphere]]. In one dimension, a wrapped distribution will consist of points on the [[unit circle]].
 
Any [[probability density function]] (pdf) <math>p(\phi)</math> on the line can be "wrapped" around the circumference of a circle of unit radius.<ref name="Mardia99">{{cite book |title=Directional Statistics |last=Mardia |first=Kantilal |authorlink=Kantilal Mardia |coauthors=Jupp, Peter E. |year=1999|publisher=Wiley |location= |isbn=978-0-471-95333-3 |url=http://www.amazon.com/Directional-Statistics-Kanti-V-Mardia/dp/0471953334/ref=sr_1_1?s=books&ie=UTF8&qid=1311003484&sr=1-1#reader_0471953334 |accessdate=2011-07-19}}</ref> That is, the pdf of the wrapped variable
 
:<math>\theta=\phi \mod 2\pi</math> in some interval of length <math>2\pi</math>
 
is
: <math>
p_w(\theta)=\sum_{k=-\infty}^\infty {p(\theta+2\pi k)},
</math>
 
which is a [[periodic summation|periodic sum]] of period <math>2\pi</math>. The preferred interval is generally <math>(-\pi,\pi]</math>, for which <math>\ln(e^{i\theta})=\arg(e^{i\theta})=\theta</math>.
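The periodic sum can be evaluated numerically by truncating it. The sketch below (a minimal illustration assuming a normal unwrapped density with arbitrary parameters) checks that the wrapped density still integrates to one over an interval of length <math>2\pi</math>:

```python
import math

def normal_pdf(x, mu=0.0, sigma=0.7):
    """Unwrapped density p(x): a normal pdf with arbitrary illustrative parameters."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def wrapped_pdf(theta, k_max=50):
    """Approximate p_w(theta) by truncating the periodic sum at |k| <= k_max."""
    return sum(normal_pdf(theta + 2 * math.pi * k) for k in range(-k_max, k_max + 1))

# The wrapped density must still integrate to 1 over any interval of length 2*pi.
n = 1000
h = 2 * math.pi / n
total = sum(wrapped_pdf(-math.pi + (i + 0.5) * h) for i in range(n)) * h
print(round(total, 6))  # 1.0
```

For a density as concentrated as this one, only a few terms of the sum contribute appreciably; `k_max=50` is deliberately generous.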
 
==Theory==
In most situations, a process involving [[circular statistics]] produces angles (<math>\phi</math>) which may lie anywhere on the real line, and are described by an "unwrapped" probability density function <math>p(\phi)</math>. However, a measurement will yield a "measured" angle <math>\theta</math> which lies in some interval of length <math>2\pi</math> (for example <math>[0,2\pi)</math>). In other words, a measurement cannot tell whether the "true" angle <math>\phi</math> or a "wrapped" angle <math>\phi+2\pi a</math> has been measured, where ''a'' is some unknown integer. That is:
 
:<math>\theta=\phi+2\pi a.</math>
 
If we wish to calculate the expected value of some function of the measured angle it will be:
 
:<math>\langle f(\theta)\rangle=\int_{-\infty}^\infty p(\phi)f(\phi+2\pi a)d\phi.</math>
 
We can express the integral as a sum of integrals over periods of <math>2\pi</math> (e.g. 0 to <math>2\pi</math>):
 
:<math>\langle f(\theta)\rangle=\sum_{k=-\infty}^\infty \int_{2\pi k}^{2\pi(k+1)} p(\phi)f(\phi+2\pi a)d\phi.</math>
 
Changing the variable of integration to <math>\theta'=\phi-2\pi k</math> and exchanging the order of integration and summation, we have
 
:<math>\langle f(\theta)\rangle= \int_0^{2\pi} p_w(\theta')f(\theta'+2\pi a')d\theta'</math>
 
where <math>p_w(\theta')</math> is the pdf of the "wrapped" distribution and <math>a'=a+k</math> is another unknown integer. The unknown integer <math>a'</math> introduces an ambiguity into the expectation value of <math>f(\theta)</math>. A particular instance of this problem is encountered when attempting to take the [[Mean of circular quantities|mean of a set of measured angles]]. If, instead of the measured angles, we introduce the parameter <math>z=e^{i\theta}</math>, then ''z'' has an unambiguous relationship to the "true" angle <math>\phi</math>, since:
 
:<math>z=e^{i\theta}=e^{i\phi}.</math>
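The practical consequence can be seen with two hypothetical measurements straddling the <math>\pm\pi</math> seam; the arithmetic mean of the angles fails, while averaging ''z'' does not:

```python
import cmath
import math

# Two hypothetical measurements just either side of the +/- pi seam.
angles = [math.pi - 0.1, -math.pi + 0.1]

# The naive arithmetic mean is 0, pointing in exactly the wrong direction.
naive_mean = sum(angles) / len(angles)

# Averaging z = e^{i*theta} and taking the argument recovers the true direction.
z_mean = sum(cmath.exp(1j * a) for a in angles) / len(angles)
circular_mean = cmath.phase(z_mean)  # close to +/- pi, as expected
```

Both measured angles point nearly due "west" on the circle, and the circular mean correctly reports an angle of magnitude <math>\pi</math>.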
 
Calculating the expectation value of a function of ''z'' will yield unambiguous answers:
 
:<math>\langle f(z)\rangle= \int_0^{2\pi} p_w(\theta')f(e^{i\theta'})d\theta'</math>
 
and it is for this reason that the ''z'' parameter is the preferred statistical variable to use in circular statistical analysis rather than the measured angles <math>\theta</math>. This suggests, and it is shown below, that the wrapped distribution function may itself be expressed as a function of ''z'' so that:
 
:<math>\langle f(z)\rangle= \oint p_w(z)f(z)\,dz</math>
 
where <math>p_w(z)</math> is defined such that <math>p_w(\theta)\,d\theta=p_w(z)\,dz</math>. This concept can be extended to the multivariate context by replacing the single sum with <math>F</math> sums, one for each dimension of the feature space:
: <math>
p_w(\vec\theta)=\sum_{k_1=-\infty}^{\infty}\cdots\sum_{k_F=-\infty}^{\infty}{p(\vec\theta+2\pi k_1\mathbf{e}_1+\dots+2\pi k_F\mathbf{e}_F)}
</math>
where <math>\mathbf{e}_k=(0,\dots,0,1,0,\dots,0)^{\mathsf{T}}</math> is the <math>k</math>th Euclidean basis vector.
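As a sketch of the multivariate case (assuming, purely for illustration, an independent bivariate normal wrapped onto the 2-torus), the doubly wrapped density again integrates to one:

```python
import math

def normal_pdf(x, sigma):
    """Zero-mean normal pdf with standard deviation sigma."""
    return math.exp(-0.5 * (x / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def wrapped_pdf_2d(t1, t2, s1=0.9, s2=0.6, k_max=6):
    """Truncated double periodic sum: F = 2 wrapping sums, one per dimension."""
    return sum(
        normal_pdf(t1 + 2 * math.pi * k1, s1) * normal_pdf(t2 + 2 * math.pi * k2, s2)
        for k1 in range(-k_max, k_max + 1)
        for k2 in range(-k_max, k_max + 1)
    )

# The doubly wrapped density integrates to 1 over the 2*pi x 2*pi torus.
n = 64
h = 2 * math.pi / n
total = sum(wrapped_pdf_2d(i * h, j * h) for i in range(n) for j in range(n)) * h * h
```

For an independent product density the double sum factorizes, but the code keeps the general form above for clarity.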
 
==Expression in terms of characteristic functions==
A fundamental wrapped distribution is the [[Dirac comb]], which is a wrapped [[Dirac delta function|delta function]]:
 
:<math>\Delta_{2\pi}(\theta)=\sum_{k=-\infty}^{\infty}{\delta(\theta+2\pi k)}.</math>
 
Using the delta function, a general wrapped distribution can be written
 
:<math>p_w(\theta)=\sum_{k= -\infty}^{\infty}\int_{-\infty}^\infty p(\theta')\delta(\theta-\theta'+2\pi k)\,d\theta'.</math>
 
Exchanging the order of summation and integration, any wrapped distribution can be written as the convolution of the "unwrapped" distribution and a Dirac comb:
 
:<math>p_w(\theta)=\int_{-\infty}^\infty p(\theta')\Delta_{2\pi}(\theta-\theta')\,d\theta'.</math>
 
The Dirac comb may also be expressed as a sum of exponentials, so we may write:
 
:<math>p_w(\theta)=\frac{1}{2\pi}\,\int_{-\infty}^\infty p(\theta')\sum_{n=-\infty}^{\infty}e^{in(\theta-\theta')}\,d\theta'</math>
 
again exchanging the order of summation and integration,
 
:<math>p_w(\theta)=\frac{1}{2\pi}\,\sum_{n=-\infty}^{\infty}\int_{-\infty}^\infty p(\theta')e^{in(\theta-\theta')}\,d\theta'</math>
 
Using the definition of the [[Characteristic function (probability theory)|characteristic function]] <math>\phi(s)</math> of <math>p(\theta)</math>, the inner integral equals <math>\phi(-n)</math>, which yields a [[Laurent series]] about zero for the wrapped distribution in terms of the characteristic function of the unwrapped distribution:<ref name="Mardia72">{{Cite book|title=Statistics of Directional Data |last=Mardia |first=K. |year=1972 |publisher=Academic press |location=New York}}
</ref>
 
:<math>p_w(\theta)=\frac{1}{2\pi}\,\sum_{n=-\infty}^{\infty} \phi(-n)\,e^{in\theta} = \frac{1}{2\pi}\,\sum_{n=-\infty}^{\infty} \phi(-n)\,z^n </math>
 
or
 
:<math>p_w(z)=\frac{1}{2\pi i}\,\sum_{n=-\infty}^{\infty} \phi(-n)\,z^{n-1}. </math>
 
By analogy with linear distributions, the <math>\phi(m)</math> are referred to as the characteristic function of the wrapped distribution<ref name="Mardia72"/> (or, perhaps more accurately, the characteristic [[sequence]]). This is an instance of the [[Poisson summation formula]]: the Fourier coefficients of the Fourier series for the wrapped distribution are simply the values of the Fourier transform of the unwrapped distribution at integer arguments.
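This correspondence can be verified numerically. The sketch below assumes a wrapped normal distribution, whose unwrapped characteristic function is <math>\phi(n)=e^{in\mu-n^2\sigma^2/2}</math>; it computes the Fourier coefficients of the wrapped density by quadrature and compares them with <math>\phi</math> at integer arguments:

```python
import cmath
import math

mu, sigma = 0.4, 0.8  # arbitrary illustrative parameters

def normal_pdf(x):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def wrapped_pdf(theta, k_max=10):
    return sum(normal_pdf(theta + 2 * math.pi * k) for k in range(-k_max, k_max + 1))

# Fourier coefficient <z^n> of the wrapped density, computed by quadrature,
# versus the characteristic function of the unwrapped normal at integer n.
n_grid = 512
h = 2 * math.pi / n_grid
for n in range(-3, 4):
    num = sum(wrapped_pdf(j * h) * cmath.exp(1j * n * j * h) for j in range(n_grid)) * h
    exact = cmath.exp(1j * n * mu - 0.5 * n * n * sigma ** 2)
    assert abs(num - exact) < 1e-6
```

Because the wrapped density is smooth and periodic, the equally spaced quadrature rule converges very rapidly here.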
 
==Moments==
The moments of the wrapped distribution <math>p_w(z)</math> are defined as:
 
:<math>
\langle z^m \rangle = \oint p_w(z)z^m \, dz.
</math>
 
Expressing <math>p_w(z)</math> in terms of the characteristic function and exchanging the order of integration and summation yields:
 
:<math>
\langle z^m \rangle = \frac{1}{2\pi i}\sum_{n=-\infty}^\infty \phi(-n)\oint z^{m+n-1}\,dz.
</math>
 
From the [[Residue (complex analysis)|theory of residues]] we have
 
:<math>
\oint z^{m+n-1}\,dz = 2\pi i \delta_{m+n}
</math>
 
where <math>\delta_k</math> is the [[Kronecker delta]] function. It follows that the moments are simply equal to the characteristic function of the unwrapped distribution for integer arguments:
 
:<math>
\langle z^m \rangle = \phi(m).
</math>
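Because <math>e^{im\theta}</math> is unchanged by adding multiples of <math>2\pi</math>, this identity can also be checked by Monte Carlo: wrap draws from the unwrapped distribution and average <math>z^m</math>. The sketch below assumes a normal unwrapped distribution with arbitrary parameters:

```python
import cmath
import math
import random

random.seed(0)
mu, sigma = 0.5, 0.7  # arbitrary illustrative parameters

# Wrapping each draw into [0, 2*pi) does not change e^{i*m*theta},
# which is exactly why the wrapped moments equal phi(m).
wrapped = [random.gauss(mu, sigma) % (2 * math.pi) for _ in range(200_000)]

for m in (1, 2):
    moment = sum(cmath.exp(1j * m * t) for t in wrapped) / len(wrapped)
    phi_m = cmath.exp(1j * m * mu - 0.5 * m ** 2 * sigma ** 2)
    assert abs(moment - phi_m) < 0.02  # Monte Carlo tolerance
```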
 
== Entropy ==
The [[Entropy (information theory)|information entropy]] of a circular distribution with probability density <math>f_w(\theta)</math> is defined as:<ref name="Mardia99"/>
 
:<math>H = -\int_\Gamma f_w(\theta)\,\ln(f_w(\theta))\,d\theta</math>
 
where <math>\Gamma</math> is any interval of length <math>2\pi</math>. If both the probability density and its logarithm can be expressed as a [[Fourier series]] (or more generally, any [[integral transform]] on the circle) then the orthogonality property may be used to obtain a series representation for the entropy which may reduce to a [[closed form expression|closed form]].
 
The moments of the distribution, <math>\phi(n)</math>, are the Fourier coefficients for the Fourier series expansion of the probability density:
 
:<math>f_w(\theta)=\frac{1}{2\pi}\sum_{n=-\infty}^\infty \phi_n e^{-in\theta}</math>
 
If the logarithm of the probability density can also be expressed as a Fourier series:
 
:<math>\ln(f_w(\theta))=\sum_{m=-\infty}^\infty c_m e^{im\theta}</math>
 
where
 
:<math>c_m=\frac{1}{2\pi}\int_\Gamma \ln(f_w(\theta))e^{-i m \theta}\,d\theta,</math>
 
then, exchanging the order of integration and summation, the entropy may be written as:
 
:<math>H=-\frac{1}{2\pi}\sum_{m=-\infty}^\infty\sum_{n=-\infty}^\infty c_m \phi_n \int_\Gamma e^{i(m-n)\theta}\,d\theta</math>
 
Using the orthogonality of the Fourier basis, the integral may be reduced to:
 
:<math>H=-\sum_{n=-\infty}^\infty c_n \phi_n</math>
 
For the particular case when the probability density is symmetric about the mean, <math>c_{-m}=c_m</math> and the logarithm may be written:
 
:<math>\ln(f_w(\theta))= c_0 + 2\sum_{m=1}^\infty c_m \cos(m\theta)</math>
 
and
 
:<math>c_m=\frac{1}{2\pi}\int_\Gamma \ln(f_w(\theta))\cos(m\theta)\,d\theta</math>
 
and, since normalization requires that <math>\phi_0=1</math>, the entropy may be written:
 
:<math>H=-c_0-2\sum_{n=1}^\infty c_n \phi_n</math>
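As a concrete check of this series, consider the wrapped Cauchy distribution with parameter <math>\rho</math>, for which <math>\phi_n=\rho^n</math>, <math>c_0=\ln((1-\rho^2)/2\pi)</math> and <math>c_m=\rho^m/m</math> for <math>m\ge 1</math>; the series then sums to the closed form <math>\ln(2\pi(1-\rho^2))</math>. A sketch with an arbitrary <math>\rho</math>:

```python
import math

rho = 0.6  # wrapped Cauchy concentration, 0 <= rho < 1 (arbitrary choice)

# Fourier data for the wrapped Cauchy: phi_n = rho**n, c_0 = log((1-rho^2)/(2*pi)),
# and c_m = rho**m / m for m >= 1.
c0 = math.log((1 - rho ** 2) / (2 * math.pi))
series = -c0 - 2 * sum((rho ** m / m) * rho ** m for m in range(1, 200))

# Known closed form for the wrapped Cauchy entropy.
closed_form = math.log(2 * math.pi * (1 - rho ** 2))
assert abs(series - closed_form) < 1e-9
```

The terms decay like <math>\rho^{2m}</math>, so truncating the sum at 200 terms is far more than sufficient.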
 
==See also==
* [[Wrapped normal distribution]]
* [[Wrapped Cauchy distribution]]
 
{{More footnotes|date=July 2011}}
 
==References==
<references/>
* {{Cite book|title=Statistics of Earth Science Data |last=Borradaile |first=Graham |year=2003 |publisher=Springer |isbn=978-3-540-43603-4 |url=http://books.google.com/books?id=R3GpDglVOSEC&printsec=frontcover&source=gbs_navlinks_s#v=onepage&q=&f=false |accessdate=31 December 2009}}
* {{Cite book|title=Statistical Analysis of Circular Data |last=Fisher |first=N. I. |year=1996 |publisher=Cambridge University Press |location= |isbn=978-0-521-56890-6 |url=http://books.google.com/books?id=IIpeevaNH88C&dq=%22circular+variance%22+fisher&source=gbs_navlinks_s |accessdate=2010-02-09}}
 
==External links==
* [http://www.codeproject.com/Articles/190833/Circular-Values-Math-and-Statistics-with-Cplusplus Circular Values Math and Statistics with C++11], A C++11 infrastructure for circular values (angles, time-of-day, etc.) mathematics and statistics
 
{{ProbDistributions|directional}}
{{Use dmy dates|date=September 2010}}
 
{{DEFAULTSORT:Wrapped Distribution}}
[[Category:Types of probability distributions]]
[[Category:Directional statistics]]

Revision as of 15:41, 26 August 2013
