In [[mathematics]], '''operator theory''' is the branch of [[functional analysis]] that focuses on [[bounded linear operator]]s, but which includes [[closed operator]]s and [[nonlinear operator]]s.


Operator theory also includes the study of [[linear algebra|algebra]]s of operators.


==Single operator theory==
Single operator theory deals with the properties and classification of single operators. For example, the classification of [[normal operator]]s in terms of their [[spectrum of an operator|spectra]] falls into this category.
 
===Spectrum of operators===
{{Main|Spectral theorem}}
The '''spectral theorem''' is any of a number of results about [[linear operator]]s or about [[matrix (mathematics)|matrices]]. In broad terms the spectral [[theorem]] provides conditions under which an [[Operator (mathematics)|operator]] or a matrix can be [[Diagonalizable matrix|diagonalized]] (that is, represented as a [[diagonal matrix]] in some basis).  This concept of diagonalization is relatively straightforward for operators on finite-dimensional spaces, but requires some modification for operators on infinite-dimensional spaces.  In general, the spectral theorem identifies a class of [[linear operator]]s that can be modelled by [[multiplication operator]]s, which are as simple as one can hope to find. In more abstract language, the spectral theorem is a statement about commutative [[C*-algebra]]s. See also [[spectral theory]] for a historical perspective.
 
Examples of operators to which the spectral theorem applies are [[self-adjoint operator]]s or more generally [[normal operator]]s on [[Hilbert space]]s.
 
The spectral theorem also provides a [[canonical form|canonical]] decomposition, called the '''spectral decomposition''', '''eigenvalue decomposition''', or '''[[eigendecomposition of a matrix|eigendecomposition]]''', of the underlying vector space on which the operator acts.
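In the finite-dimensional self-adjoint case the spectral decomposition can be checked directly. The sketch below (an illustration added here, not part of the article, using NumPy's <code>eigh</code>) verifies ''A'' = ''U D U''* for a small Hermitian matrix:

```python
import numpy as np

# A Hermitian (self-adjoint) matrix: equal to its own conjugate transpose.
A = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])
assert np.allclose(A, A.conj().T)

# eigh returns real eigenvalues (ascending) and an orthonormal eigenbasis U.
eigenvalues, U = np.linalg.eigh(A)
D = np.diag(eigenvalues)

# Spectral decomposition: A = U D U*, with U unitary and D real diagonal.
assert np.allclose(U @ D @ U.conj().T, A)
assert np.allclose(U.conj().T @ U, np.eye(2))
```

For this matrix the trace is 5 and the determinant 4, so the eigenvalues are 1 and 4, both real, as the theorem requires for a self-adjoint operator.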
 
====Normal operators====
{{main|Normal matrix}}
 
Normal operators are important because the [[spectral theorem]] holds for them. Today, the class of normal operators is well-understood. Examples of normal operators are
* [[unitary operator]]s: ''N*'' = ''N''<sup>−1</sup>
* [[Hermitian operator]]s (i.e., selfadjoint operators): ''N*'' = ''N''; (also, anti-selfadjoint operators: ''N*'' = −''N'')
* [[positive operator]]s: ''N'' = ''MM*'' for some operator ''M''
* [[Normal matrix|normal matrices]] can be seen as normal operators if one takes the Hilbert space to be '''C'''<sup>''n''</sup>.
 
The spectral theorem extends to a more general class of matrices. Let ''A'' be an operator on a finite-dimensional inner product space. ''A'' is said to be [[normal matrix|normal]] if ''A''<sup>*</sup> ''A'' = ''A A''<sup>*</sup>. One can show that ''A'' is normal if and only if it is unitarily diagonalizable: by the [[Schur decomposition]], we have ''A'' = ''U T U''<sup>*</sup>, where ''U'' is unitary and ''T'' is upper-triangular.
Since ''A'' is normal, ''T T''<sup>*</sup> = ''T''<sup>*</sup> ''T''. Therefore ''T'' must be diagonal, since a normal upper-triangular matrix is diagonal. The converse is obvious.
 
In other words, ''A'' is normal if and only if there exists a [[unitary matrix]] ''U'' such that
 
:<math>A=U D U^* \;</math>
 
where ''D'' is a [[diagonal matrix]]. Then, the entries of the diagonal of ''D'' are the [[eigenvalue]]s of ''A''. The column vectors of ''U'' are the eigenvectors of ''A'' and they are orthonormal. Unlike the Hermitian case, the entries of ''D'' need not be real.
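As a numerical illustration (added here; a sketch, not part of the article), a plane rotation is unitary, hence normal but not Hermitian, and its unitary diagonalization has non-real eigenvalues on the unit circle. For a normal matrix with distinct eigenvalues, NumPy's <code>eig</code> returns unit-norm eigenvectors that are mutually orthogonal, so the eigenvector matrix is unitary:

```python
import numpy as np

# Rotation by pi/3: unitary, hence normal, but not Hermitian.
theta = np.pi / 3
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(A.conj().T @ A, A @ A.conj().T)  # normality

# Distinct eigenvalues of a normal matrix have orthogonal eigenvectors,
# so U here is unitary (up to phases on its columns).
w, U = np.linalg.eig(A)
D = np.diag(w)

assert np.allclose(U @ D @ U.conj().T, A)        # A = U D U*
assert np.allclose(U.conj().T @ U, np.eye(2))    # U is unitary
assert np.allclose(np.abs(w), 1.0)               # spectrum on the unit circle
assert not np.allclose(w.imag, 0.0)              # eigenvalues are not real
```

The last assertion shows the contrast with the Hermitian case: the diagonal of ''D'' need not be real.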
 
===Polar decomposition===
{{Main|Polar decomposition}}
The '''polar decomposition''' of any [[bounded linear operator]] ''A'' between complex [[Hilbert space]]s is a canonical factorization as the product of a [[partial isometry]] and a non-negative operator.
 
The polar decomposition for matrices generalizes as follows: if ''A'' is a bounded linear operator then there is a unique factorization of ''A'' as a product ''A'' = ''UP'' where ''U'' is a partial isometry, ''P'' is a non-negative self-adjoint operator and the initial space of ''U'' is the closure of the range of ''P''.
 
The operator ''U'' must be weakened to a partial isometry, rather than unitary, because of the following issue. If ''A'' is the [[shift operator|one-sided shift]] on ''l''<sup>2</sup>('''N'''), then |''A''| = (''A*A'')<sup>½</sup> = ''I''. So if ''A'' = ''U'' |''A''|, then ''U'' must be ''A'', which is not unitary.
 
The existence of a polar decomposition is a consequence of [[Douglas' lemma]]:
 
:'''Lemma''' If ''A'', ''B'' are bounded operators on a Hilbert space ''H'', and ''A*A'' &le; ''B*B'', then there exists a contraction ''C'' such that ''A = CB''. Furthermore, ''C'' is unique if ''Ker''(''B*'') &sub; ''Ker''(''C'').
 
The operator ''C'' can be defined by ''C''(''Bh'') = ''Ah'', extended by continuity to the closure of ''Ran''(''B''), and set to zero on its orthogonal complement, so that ''C'' is defined on all of ''H''. This is well defined because ''A*A'' ≤ ''B*B'' implies ''Ker''(''B'') ⊂ ''Ker''(''A'').
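In finite dimensions the operator described above (agreeing with ''Bh'' ↦ ''Ah'' on ''Ran''(''B'') and vanishing on its orthogonal complement) can be written with the Moore–Penrose pseudoinverse as ''C'' = ''AB''<sup>+</sup>. The sketch below (an added sanity check; the pseudoinverse shortcut is this illustration's, not the article's) builds a pair with ''A*A'' ≤ ''B*B'' and verifies Douglas' lemma numerically:

```python
import numpy as np

rng = np.random.default_rng(0)

# Take any B, and A = K B for a contraction K; then A*A = B*K*KB <= B*B.
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
K = M / (np.linalg.norm(M, 2) + 1.0)   # operator norm < 1: a contraction
A = K @ B

# Douglas' construction via the pseudoinverse: C agrees with Bh -> Ah on
# Ran(B) and is zero on its orthogonal complement.
C = A @ np.linalg.pinv(B)

assert np.allclose(C @ B, A)                   # C(Bh) = Ah
assert np.linalg.norm(C, 2) <= 1.0 + 1e-10     # C is a contraction
```

Here ''C'' = ''KBB''<sup>+</sup>, and ''BB''<sup>+</sup> is the orthogonal projection onto ''Ran''(''B''), which is why ''C'' inherits the contraction bound from ''K''.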
 
In particular, if ''A*A'' = ''B*B'', then ''C'' is a partial isometry, which is unique if ''Ker''(''B*'') ⊂ ''Ker''(''C'').
In general, for any bounded operator ''A'',
 
:<math>A^*A = (A^*A)^{\frac{1}{2}} (A^*A)^{\frac{1}{2}},</math>
 
where (''A*A'')<sup>½</sup> is the unique positive square root of ''A*A'' given by the usual [[functional calculus]]. So by the lemma, we have
 
:<math>A = U (A^*A)^{\frac{1}{2}}</math>
 
for some partial isometry ''U'', which is unique if ''Ker''(''A*'') ⊂ ''Ker''(''U''). Take ''P'' to be (''A*A'')<sup>½</sup> and one obtains the polar decomposition ''A'' = ''UP''. Notice that an analogous argument can be used to show ''A = P'U' '', where ''P' '' is positive and ''U' '' a partial isometry.
 
When ''H'' is finite dimensional, ''U'' can be extended to a unitary operator; this is not true in general (see example above). Alternatively, the polar decomposition can be shown using the operator version of [[singular value decomposition#Bounded operators on Hilbert spaces|singular value decomposition]].
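In the finite-dimensional case, the route through the singular value decomposition is explicit: from a full SVD ''A'' = ''W''Σ''V''*, take ''U'' = ''WV''* (unitary, since ''H'' is finite-dimensional) and ''P'' = ''V''Σ''V''* = (''A*A'')<sup>½</sup>. A brief numerical sketch (added here as an illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

# Full SVD: A = W diag(s) Vh, with W, Vh.conj().T unitary and s >= 0.
W, s, Vh = np.linalg.svd(A)

# Polar decomposition A = U P: U = W V* is unitary in finite dimensions,
# and P = V diag(s) V* is the positive square root of A*A.
U = W @ Vh
P = Vh.conj().T @ np.diag(s) @ Vh

assert np.allclose(U @ P, A)                     # A = U P
assert np.allclose(U.conj().T @ U, np.eye(3))    # U is unitary
assert np.allclose(P @ P, A.conj().T @ A)        # P^2 = A*A
```

The same algebra gives the other-sided factorization ''A'' = ''P′U′'' with ''P′'' = ''W''Σ''W''*.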
 
By property of the [[continuous functional calculus]], ''|A|'' is in the [[C*-algebra]] generated by ''A''. A similar but weaker statement holds for the partial isometry: ''U'' is in the [[von Neumann algebra]] generated by ''A''. If ''A'' is invertible, the polar part ''U'' will be in the [[C*-algebra]] as well.
 
==Operator algebras==
The theory of [[operator algebra]]s brings [[algebra over a field|algebra]]s of operators such as [[C*-algebra]]s to the fore.
 
===C*-algebras===
{{Main|C*-algebra}}
A C*-algebra, ''A'', is a [[Banach algebra]] over the field of [[complex number]]s, together with a [[Map (mathematics)|map]] * : ''A'' → ''A''. One writes ''x*'' for the image of an element ''x'' of ''A''. The map * has the following properties:
 
* It is an [[Semigroup with involution|involution]]: for every ''x'' in ''A''
::<math> x^{**} = (x^*)^* =  x </math>
 
* For all ''x'', ''y'' in ''A'':
::<math> (x + y)^* = x^* + y^* </math>
::<math> (x y)^* = y^* x^*</math>
 
* For every λ in '''C''' and every ''x'' in ''A'':
::<math> (\lambda x)^* = \overline{\lambda} x^* .</math>
 
* For all ''x'' in ''A'':
::<math>  \|x^* x \| = \|x\|\|x^*\|.</math>
 
'''Remark.''' The first three identities say that ''A'' is a [[*-algebra]]. The last identity is called the '''C* identity''' and is equivalent to:
 
:<math>\|xx^*\| = \|x\|^2 .</math>
 
The C*-identity is a very strong requirement. For instance, together with the [[spectral radius|spectral radius formula]], it implies that the C*-norm is uniquely determined by the algebraic structure:
 
::<math> \|x\|^2 = \|x^* x\| = \sup\{|\lambda| : x^* x - \lambda \,1 \text{ is not invertible} \}.</math>
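For the concrete C*-algebra of ''n'' × ''n'' complex matrices with the operator (spectral) norm, both the C* identity and the spectral-radius formula can be checked numerically. The sketch below is an illustration added here, not part of the article:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

def op_norm(m):
    """Operator (spectral) norm: the largest singular value."""
    return np.linalg.norm(m, 2)

# C* identity in B(C^n): ||x* x|| = ||x||^2.
assert np.allclose(op_norm(x.conj().T @ x), op_norm(x) ** 2)

# Spectral radius formula: ||x||^2 is the largest |lambda| for which
# x*x - lambda 1 fails to be invertible, i.e. the spectral radius of x*x.
spectral_radius = max(abs(np.linalg.eigvals(x.conj().T @ x)))
assert np.allclose(spectral_radius, op_norm(x) ** 2)
```

Since ''x*x'' is positive semi-definite, its spectral radius is its largest eigenvalue, which equals the square of the largest singular value of ''x''; this is the sense in which the norm is determined by the algebraic structure alone.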
 
==See also==
* [[Invariant subspace]]
* [[Functional calculus]]
* [[Spectral theory]]
** [[Resolvent formalism]]
* [[Compact operator]]
** [[Fredholm theory]] of [[integral equation]]s
***[[Integral operator]]
***[[Fredholm operator]]
* [[Self-adjoint operator]]
* [[Unbounded operator]]
** [[Differential operator]]
* [[Umbral calculus]]
* [[Contraction mapping]]
* [[Positive operator]] on a [[Hilbert space]]
* [[Perron–Frobenius theorem#Generalizations|Nonnegative operator]] on a [[ordered vector space|partially ordered vector space]]
 
==External links==
* [http://www.mathphysics.com/opthy/OpHistory.html History of Operator Theory]
==References==
* [[John B. Conway|Conway, J. B.]]: ''A Course in Functional Analysis'', 2nd edition, Springer-Verlag, 1994, ISBN 0-387-97245-5
*{{Cite isbn|9780582237438}}
 
[[Category:Operator theory| ]]

Revision as of 17:49, 27 January 2014
