In [[mathematics]], particularly [[linear algebra]] and [[functional analysis]], the '''spectral theorem''' is any of a number of results about [[linear operator]]s or about [[matrix (mathematics)|matrices]]. In broad terms the spectral [[theorem]] provides conditions  under which an [[Operator (mathematics)|operator]] or a matrix can be [[Diagonalizable matrix|diagonalized]] (that is, represented as a [[diagonal matrix]] in some basis).  This concept of diagonalization is relatively straightforward for operators on finite-dimensional spaces, but requires some modification for operators on infinite-dimensional spaces.  In general, the spectral theorem identifies a class of [[linear operator]]s that can be modelled by [[multiplication operator]]s, which are as simple as one can hope to find. In more abstract language, the spectral theorem is a statement about commutative [[C*-algebra]]s. See also [[spectral theory]] for a historical perspective.
 
Examples of operators to which the spectral theorem applies are [[self-adjoint operator]]s or more generally [[normal operator]]s on [[Hilbert space]]s.
 
The spectral theorem also provides a [[canonical form|canonical]] decomposition, called the '''spectral decomposition''', '''eigenvalue decomposition''', or '''[[eigendecomposition of a matrix|eigendecomposition]]''', of the underlying vector space on which the operator acts.
 
[[Augustin Louis Cauchy]] proved the spectral theorem for [[Hermitian matrix|self-adjoint matrices]], i.e., that every real symmetric matrix is diagonalizable; he was also the first to be systematic about determinants. The spectral theorem as generalized by [[John von Neumann]] is today the most important result of operator theory.<ref>[http://www.sciencedirect.com/science/article/pii/0315086075900324 Cauchy and the spectral theory of matrices by Thomas Hawkins]</ref><ref>[http://www.mathphysics.com/opthy/OpHistory.html A Short History of Operator Theory by Evans M. Harrell II]</ref>
 
In this article we consider mainly the simplest kind of spectral theorem, that for a [[self-adjoint]] operator on a Hilbert space. However, as noted above, the spectral theorem also holds for normal operators on a Hilbert space.
 
== Finite-dimensional case ==<!-- This section is linked from [[Singular value decomposition]] -->
 
=== Hermitian maps and Hermitian matrices ===
We begin by considering a [[Hermitian matrix]] on '''C'''<sup>''n''</sup> or '''R'''<sup>''n''</sup>. More generally we consider a [[Hermitian operator|Hermitian map]] ''A'' on a finite-dimensional [[real number|real]] or [[complex number|complex]] [[inner product space]] ''V'' endowed with a positive definite Hermitian [[inner product]]. The Hermitian condition means
 
:<math> (\forall x,y\in V): \langle A x ,\, y \rangle =  \langle x ,\, A y \rangle .</math>
 
An equivalent condition is that ''A''* = ''A'', where ''A''* is the [[hermitian conjugate]] of ''A''. In the case that ''A'' is identified with a Hermitian matrix, the matrix of ''A''* can be identified with its [[conjugate transpose]]. If ''A'' is a real matrix, this is equivalent to ''A''<sup>T</sup> = ''A'' (that is, ''A'' is a [[symmetric matrix]]).
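The defining identity can be checked numerically; the following is a minimal sketch (the random matrix and vectors are illustrative, and <code>np.vdot</code> conjugates its first argument, matching the convention <math>\langle x, y \rangle = x^* y</math>):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# Any matrix of the form B + B* is Hermitian.
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B + B.conj().T

x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
y = rng.standard_normal(4) + 1j * rng.standard_normal(4)

# np.vdot conjugates its first argument: vdot(a, b) = a* b.
assert np.isclose(np.vdot(A @ x, y), np.vdot(x, A @ y))  # <Ax, y> = <x, Ay>
</syntaxhighlight>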
 
This condition easily implies that all eigenvalues of a Hermitian map are real: it is enough to apply it to the case when ''x''&nbsp;=&nbsp;''y'' is an eigenvector. (Recall that an [[eigenvector]] of a linear map ''A'' is a (non-zero) vector ''x'' such that ''Ax''&nbsp;=&nbsp;''λx'' for some scalar&nbsp;''λ''. The value ''λ'' is the corresponding [[eigenvalue]].)
 
'''Theorem'''. There exists an [[orthonormal basis]] of ''V'' consisting of eigenvectors of ''A''. Each eigenvalue is real.
 
We provide a sketch of a proof for the case where the underlying field of scalars is the [[complex number]]s.
 
By the [[fundamental theorem of algebra]], applied to the [[characteristic polynomial]] of ''A'', there is at least one eigenvalue λ<sub>1</sub> and eigenvector ''e''<sub>1</sub>. Then since
:<math>\lambda_1 \langle e_1, e_1 \rangle = \langle A (e_1), e_1 \rangle = \langle e_1, A(e_1) \rangle = \bar\lambda_1 \langle e_1, e_1 \rangle </math>
we find that λ<sub>1</sub> is real. Now consider the space ''K'' =&nbsp;span{''e''<sub>1</sub>}<sup>&perp;</sup>, the [[orthogonal complement]] of ''e''<sub>1</sub>. By Hermiticity, ''K'' is an [[invariant subspace]] of ''A''. Applying the same argument to ''K'' shows that ''A'' has an eigenvector ''e''<sub>2</sub> &isin; ''K''. Finite induction then finishes the proof.
 
The spectral theorem holds also for symmetric maps on finite-dimensional real inner product spaces, but the existence of an eigenvector does not follow immediately from the  [[fundamental theorem of algebra]]. The easiest way to prove it is probably to consider ''A'' as a Hermitian matrix and use the fact that all eigenvalues of a Hermitian matrix are real.
 
If one chooses the eigenvectors of ''A'' as an orthonormal basis, the matrix representation of ''A'' in this basis is diagonal. Equivalently, ''A'' can be written as a linear combination of pairwise orthogonal projections, called its '''spectral decomposition'''. Let
 
:<math> V_\lambda = \{\,v \in V: A v = \lambda v\,\}</math>
 
be the eigenspace corresponding to an eigenvalue &lambda;. Note that the definition does not depend on any choice of specific eigenvectors. ''V'' is the orthogonal direct sum of the spaces ''V''<sub>&lambda;</sub>, where the index ranges over eigenvalues. If ''P''<sub>&lambda;</sub> denotes the [[Orthogonal_projection#Orthogonal_projections|orthogonal projection]] onto ''V''<sub>&lambda;</sub>, and ''&lambda;''<sub>1</sub>,&nbsp;...,&nbsp;''&lambda;''<sub>''m''</sub> are the eigenvalues of ''A'', then the spectral decomposition may be written as:
 
:<math>A =\lambda_1 P_{\lambda_1} +\cdots+\lambda_m P_{\lambda_m}. \, </math>
 
The spectral decomposition is a special case of both the [[Schur decomposition]] and the [[singular value decomposition]].
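Numerically, this decomposition can be sketched with NumPy's <code>eigh</code>, which returns real eigenvalues and orthonormal eigenvectors for a Hermitian input (the matrix below is illustrative, and the rounding used to group eigenvalues into eigenspaces is a numerical convenience, not part of the theorem):

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])  # real symmetric, hence Hermitian

eigvals, eigvecs = np.linalg.eigh(A)  # real eigenvalues, orthonormal columns

# Rebuild A as a sum of eigenvalues times orthogonal projections
# onto the eigenspaces: A = sum_k lambda_k P_k.
reconstructed = np.zeros_like(A)
for lam in np.unique(np.round(eigvals, 10)):
    cols = eigvecs[:, np.isclose(eigvals, lam)]
    P = cols @ cols.conj().T          # orthogonal projection onto V_lambda
    reconstructed += lam * P

assert np.allclose(A, reconstructed)
</syntaxhighlight>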
 
=== Normal matrices ===
{{main|Normal matrix}}
The spectral theorem extends to a more general class of matrices. Let ''A'' be an operator on a finite-dimensional inner product space. ''A'' is said to be [[normal matrix|normal]] if ''A''<sup>*</sup> ''A'' = ''A A''<sup>*</sup>. One can show that ''A'' is normal if and only if it is unitarily diagonalizable: by the [[Schur decomposition]], we have ''A'' = ''U T U''<sup>*</sup>, where ''U'' is unitary and ''T'' upper-triangular. Since ''A'' is normal, ''T T''<sup>*</sup> = ''T''<sup>*</sup> ''T'', so ''T'' must be diagonal, because a normal upper-triangular matrix is diagonal. The converse is obvious.
 
In other words, ''A'' is normal if and only if there exists a [[unitary matrix]] ''U'' such that
 
:<math>A=U D U^* \;</math>
 
where ''D'' is a [[diagonal matrix]]. Then, the entries of the diagonal of ''D'' are the [[eigenvalue]]s of ''A''. The column vectors of ''U'' are the eigenvectors of ''A'' and they are orthonormal. Unlike the Hermitian case, the entries of ''D'' need not be real.
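A minimal numerical sketch (the rotation matrix below is illustrative; for a normal matrix with repeated eigenvalues one would instead use a Schur decomposition to obtain orthonormal eigenvectors):

<syntaxhighlight lang="python">
import numpy as np

# A rotation by 90 degrees: normal (A*A = AA*) but not Hermitian.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])
assert np.allclose(A.conj().T @ A, A @ A.conj().T)  # normality

w, U = np.linalg.eig(A)  # eigenvalues are +i and -i: not real

# For distinct eigenvalues of a normal matrix, the unit eigenvectors
# are automatically orthogonal, so U is unitary here.
assert np.allclose(U.conj().T @ U, np.eye(2))
assert np.allclose(A, U @ np.diag(w) @ U.conj().T)  # A = U D U*
</syntaxhighlight>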
 
== Compact self-adjoint operators ==
{{main|Compact operator on Hilbert space}}
In Hilbert spaces in general, the statement of the spectral theorem for [[compact operator|compact]] [[self-adjoint operators]] is virtually the same as in the finite-dimensional case.
 
'''Theorem'''. Suppose ''A'' is a compact self-adjoint operator on a Hilbert space ''V''. There is an [[orthonormal basis]] of ''V'' consisting of eigenvectors of ''A''. Each eigenvalue is real.
 
As in the case of Hermitian matrices, the key point is to prove the existence of at least one nonzero eigenvector. Since one cannot rely on determinants to show the existence of eigenvalues, one instead uses a maximization argument analogous to the variational characterization of eigenvalues. The above spectral theorem holds for real or complex Hilbert spaces.
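One standard ingredient of such an argument: for a nonzero compact self-adjoint operator ''A'' one has

:<math> \|A\| = \max_{\|x\| = 1} |\langle Ax, x \rangle|, </math>

and compactness guarantees that the maximum is attained; any maximizing unit vector is then an eigenvector of ''A'' with eigenvalue <math>\pm\|A\|</math>.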
 
If the compactness assumption is removed, it is not true that every self-adjoint operator has eigenvectors.
 
== Bounded self-adjoint operators ==
{{See also|Eigenfunction|Self-adjoint operator#Spectral theorem}}
 
The next generalization we consider is that of [[bounded operator|bounded]] self-adjoint operators on a Hilbert space. Such operators may have no eigenvalues: for instance, let ''A'' be the operator of multiplication by ''t'' on ''L''<sup>2</sup>[0, 1], that is

:<math> [A \varphi](t) = t \varphi(t). \;</math>

This operator has no eigenvectors: if <math>t \varphi(t) = \lambda \varphi(t)</math> for almost every ''t'', then <math>\varphi = 0</math> almost everywhere.
 
'''Theorem'''.<ref>{{citation | last = Hall |first = B.C. |title = Quantum Theory for Mathematicians | page = 147 |publisher = Springer | year = 2013}}</ref> Let ''A'' be a bounded self-adjoint operator on a Hilbert space ''H''. Then there exist a [[measure space]] (''X'', &Sigma;, &mu;), a real-valued [[ess sup|essentially bounded]] measurable function ''f'' on ''X'', and a unitary operator ''U'':''H'' &rarr; ''L''<sup>2</sup><sub>&mu;</sub>(''X'') such that
 
:<math> U^* T U = A \;</math>
 
where ''T'' is the [[multiplication operator]]:
 
:<math> [T \varphi](x) = f(x) \varphi(x), \;</math>

and <math>\|T\| = \|f\|_\infty</math>.
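A finite-dimensional analogue may help fix ideas: take ''X'' = {1, ..., ''n''} with counting measure, so that ''L''<sup>2</sup><sub>&mu;</sub>(''X'') = '''C'''<sup>''n''</sup> and the multiplication operator is a diagonal matrix. The following sketch is illustrative, not part of the theorem:

<syntaxhighlight lang="python">
import numpy as np

# Finite-dimensional analogue of U* T U = A: L^2(X) = C^n for
# X = {1, ..., n} with counting measure, and T multiplies by f.
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])          # self-adjoint

f, V = np.linalg.eigh(A)            # f: eigenvalues, V: orthonormal basis
U = V.conj().T                      # unitary U : H -> L^2(X)
T = np.diag(f)                      # multiplication operator by f

assert np.allclose(U.conj().T @ T @ U, A)                    # U* T U = A
assert np.isclose(np.linalg.norm(T, 2), np.max(np.abs(f)))   # ||T|| = ||f||_inf
</syntaxhighlight>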
 
This is the beginning of the vast research area of functional analysis called [[operator theory]]; see also the [[spectral measure#Spectral measure|spectral measure]].
 
There is also an analogous spectral theorem for bounded [[normal operator]]s on Hilbert spaces.  The only difference in the conclusion is that now <math>f</math> may be complex-valued.
 
An alternative formulation of the spectral theorem expresses the operator <math>A</math> as an integral of the coordinate function  over the operator's [[Eigenvector#Infinite dimensions|spectrum]] with respect to a [[projection-valued measure]].
 
: <math> A = \int_{\sigma(A)} \lambda \, d E_{\lambda} </math>
 
When the normal operator in question is [[compact operator|compact]], this version of the spectral theorem reduces to the finite-dimensional spectral theorem above, except that the operator is expressed as a linear combination of possibly infinitely many projections.
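In the discrete-spectrum case the projection-valued measure has atoms at the eigenvalues, so the integral collapses to a sum of projections; the same measure also yields the standard functional calculus <math>g(A) = \int_{\sigma(A)} g(\lambda) \, d E_{\lambda}</math>. A numerical sketch of the discrete case (the choice ''g'' = exp and the comparison against <code>scipy.linalg.expm</code> are illustrative):

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import expm

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # self-adjoint, spectrum {1, 3}

w, V = np.linalg.eigh(A)

# With discrete spectrum, the measure has atoms P_k = v_k v_k*, and
# g(A) = integral of g(lambda) dE collapses to a finite sum.
g = np.exp
gA = sum(g(lam) * np.outer(v, v.conj()) for lam, v in zip(w, V.T))

assert np.allclose(gA, expm(A))
</syntaxhighlight>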
 
== General self-adjoint operators ==
Many important linear operators which occur in [[Mathematical analysis|analysis]], such as [[differential operators]], are unbounded. There is also a spectral theorem for [[self-adjoint operator]]s that applies in these cases.  To give an example, any constant coefficient differential operator is unitarily equivalent to a multiplication operator. Indeed the unitary operator that implements this equivalence is the [[Fourier transform]]; the multiplication operator is a type of [[Multiplier (Fourier analysis)|Fourier multiplier]].
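For example, with the Fourier transform convention <math>(\mathcal{F}\varphi)(\xi) = \int_{\mathbf{R}} \varphi(x) e^{-2\pi i x \xi} \, dx</math>, the operator <math>-\tfrac{d^2}{dx^2}</math> on <math>L^2(\mathbf{R})</math> satisfies

:<math> \left(\mathcal{F} \left(-\tfrac{d^2}{dx^2}\right) \mathcal{F}^{-1} \varphi\right)(\xi) = 4\pi^2 \xi^2 \, \varphi(\xi), </math>

so it is unitarily equivalent to multiplication by the function <math>4\pi^2 \xi^2</math>.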
 
In general, the spectral theorem for self-adjoint operators may take several equivalent forms.
 
'''Spectral theorem in the form of multiplication operator'''. ''For each self-adjoint operator '''T''' acting in a Hilbert space '''H''', there exists a unitary operator mapping '''H''' isometrically onto a space '''L<sup>2</sup>(M, μ)''' on which '''T''' is represented as a multiplication operator.''
 
The Hilbert space ''H'' on which a self-adjoint operator ''T'' acts may be decomposed into a direct sum of Hilbert spaces ''H<sub>i</sub>'' in such a way that the restriction of ''T'' to each ''H<sub>i</sub>'' has a simple spectrum. Such a decomposition is unique up to unitary equivalence and is called an ''ordered spectral representation''.
 
== See also ==
* [[Spectral theory]]
* [[Matrix decomposition]]
* [[Canonical form]]
* [[Jordan normal form|Jordan decomposition]], of which the spectral decomposition is a special case.
* [[Singular value decomposition]], a generalisation of the spectral theorem to arbitrary matrices.
* [[Eigendecomposition of a matrix]]
 
== References ==
{{reflist}}
* [[Sheldon Axler]], ''Linear Algebra Done Right'', Springer Verlag, 1997
* [[Paul Halmos]], [http://www.jstor.org/stable/2313117 "What Does the Spectral Theorem Say?"], ''American Mathematical Monthly'', volume 70, number 3 (1963), pages 241&ndash;247 [http://www.math.wsu.edu/faculty/watkins/Math502/pdfiles/spectral.pdf Other link]
* [[Michael C. Reed|M. Reed]] and [[Barry Simon|B. Simon]], ''Methods of Mathematical Physics'', vols I–IV, Academic Press 1972.
* [[Gerald Teschl|G. Teschl]], ''Mathematical Methods in Quantum Mechanics with Applications to Schrödinger Operators'', http://www.mat.univie.ac.at/~gerald/ftp/book-schroe/, American Mathematical Society, 2009.
*{{citation | last = Hall |first = B.C. |title = Quantum Theory for Mathematicians | year = 2013 |publisher = Springer}}
 
[[Category:Spectral theory|*]]
[[Category:Linear algebra]]
[[Category:Matrix theory]]
[[Category:Singular value decomposition]]
[[Category:Theorems in functional analysis]]
