{{More footnotes|date=November 2009}}
 
In mathematics, and in particular [[linear algebra]], a '''skew-symmetric''' (or '''antisymmetric''' or '''antimetric'''<ref>{{cite book |author=Richard A. Reyment, [[K. G. Jöreskog]], Leslie F. Marcus |title=Applied Factor Analysis in the Natural Sciences | publisher=Cambridge University Press | year= 1996 | isbn=0-521-57556-7 | page=68}}</ref>) '''matrix''' is a [[square matrix]] ''A'' whose [[transpose]] equals its negative; that is, it satisfies the condition {{nowrap|1=&minus;''A'' = ''A''<sup>T</sup>.}} If the entry in the {{nowrap|1=''i''&thinsp;th row}} and {{nowrap|1=''j''&thinsp;th column}} is ''a<sub>ij</sub>'', i.e. {{nowrap|1=''A'' = (''a''<sub>''ij''</sub>)}}, then the skew-symmetric condition is {{nowrap|1=''a<sub>ij</sub>'' = −''a<sub>ji</sub>''.}} For example, the following matrix is skew-symmetric:
 
:<math>\begin{bmatrix}
0 & 2 & -1 \\
-2 & 0 & -4 \\
1 & 4 & 0\end{bmatrix}.</math>
 
==Properties==
 
We assume that the underlying [[field (mathematics)|field]] is not of [[Characteristic (algebra)|characteristic]] 2: that is, that {{nowrap|1=1 + 1 ≠ 0}} where 1 denotes the multiplicative identity and 0 the additive identity of the given field. Otherwise, a skew-symmetric matrix is just the same thing as a [[symmetric matrix]].
 
Sums and scalar multiples of skew-symmetric matrices are again skew-symmetric. Hence, the skew-symmetric matrices form a [[vector space]]. Its [[dimension of a vector space|dimension]] is ''n''(''n''&minus;1)/2.
 
Let Mat<sub>''n''</sub> denote the space of {{nowrap|1=''n'' &times; ''n''}} matrices. A skew-symmetric matrix is determined by ''n''(''n''&nbsp;&minus;&nbsp;1)/2 scalars (the number of entries above the [[main diagonal]]); a [[symmetric matrix]] is determined by ''n''(''n''&nbsp;+&nbsp;1)/2 scalars (the number of entries on or above the main diagonal). If Skew<sub>''n''</sub> denotes the space of {{nowrap|1=''n'' &times; ''n''}} skew-symmetric matrices and Sym<sub>''n''</sub> denotes the space of {{nowrap|1=''n'' &times; ''n''}} symmetric matrices, then {{nowrap|1=Mat<sub>''n''</sub> = Skew<sub>''n''</sub> + Sym<sub>''n''</sub>}} and {{nowrap|1=Skew<sub>''n''</sub> &cap; Sym<sub>''n''</sub> = {0}}}, i.e.
:<math> \mbox{Mat}_n = \mbox{Skew}_n \oplus \mbox{Sym}_n , </math>
where ⊕ denotes the [[Direct sum of modules|direct sum]]. Let {{nowrap|1=''A'' &isin; Mat<sub>''n''</sub>}}; then
:<math> A = \frac{1}{2}(A - A^{\mathsf{T}}) + \frac{1}{2}(A + A^{\mathsf{T}}) . </math>
Notice that {{nowrap|1=½(''A'' &minus; ''A''<sup>T</sup>) &isin; Skew<sub>''n''</sub>}} and {{nowrap|1=½(''A'' + ''A''<sup>T</sup>) &isin; Sym<sub>''n''</sub>.}} This is true for every [[square matrix]] ''A'' with entries from any [[field (mathematics)|field]] whose [[characteristic (algebra)|characteristic]] is different from 2.
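For example,
:<math>A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} = \underbrace{\begin{bmatrix} 0 & -\tfrac{1}{2} \\ \tfrac{1}{2} & 0 \end{bmatrix}}_{\frac{1}{2}(A - A^{\mathsf{T}})} + \underbrace{\begin{bmatrix} 1 & \tfrac{5}{2} \\ \tfrac{5}{2} & 4 \end{bmatrix}}_{\frac{1}{2}(A + A^{\mathsf{T}})}.</math>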
 
Denote with <math>\langle \cdot,\cdot \rangle</math> the standard [[inner product]] on '''R'''<sup>''n''</sup>. The real ''n''-by-''n'' matrix ''A'' is skew-symmetric if and only if
:<math>\langle Ax,y \rangle = - \langle x, Ay\rangle \quad \forall x,y\in\Bbb{R}^n.</math>
This is also equivalent to <math>\langle x,Ax \rangle = 0</math> for all ''x'' (one implication being obvious, the other a plain consequence of <math>\langle x+y, A(x+y)\rangle =0</math> for all ''x'' and ''y'').
Since this definition is independent of the choice of [[basis (linear algebra)|basis]], skew-symmetry is a property that depends only on the [[linear operator]] A and a choice of [[inner product]].
 
All [[main diagonal]] entries of a skew-symmetric matrix must be zero, so the [[trace of a matrix|trace]] is zero: if {{nowrap|1=''A'' = (''a''<sub>''ij''</sub>)}} is skew-symmetric, then setting {{nowrap|1=''i'' = ''j''}} in {{nowrap|1=''a''<sub>''ij''</sub> = &minus;''a''<sub>''ji''</sub>}} gives {{nowrap|1=''a''<sub>''ii''</sub> = &minus;''a''<sub>''ii''</sub>}}; hence {{nowrap|1=''a''<sub>''ii''</sub> = 0.}}
 
3&nbsp;&times;&nbsp;3 skew-symmetric matrices can be used to represent [[cross product]]s as matrix multiplications.
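Explicitly, the cross product of a fixed vector '''a''' = (''a''<sub>1</sub>, ''a''<sub>2</sub>, ''a''<sub>3</sub>) with '''b''' is the matrix–vector product
:<math>\mathbf{a} \times \mathbf{b} = [\mathbf{a}]_{\times}\,\mathbf{b}, \qquad
[\mathbf{a}]_{\times} = \begin{bmatrix} 0 & -a_3 & a_2 \\ a_3 & 0 & -a_1 \\ -a_2 & a_1 & 0 \end{bmatrix}.</math>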
 
===Determinant===
 
Let ''A'' be an ''n''&times;''n'' skew-symmetric matrix. The [[determinant]] of ''A'' satisfies
 
:det(''A'') = det(''A''<sup>T</sup>) = det(&minus;''A'') = (&minus;1)<sup>''n''</sup>det(''A'').
In particular, if ''n'' is odd this gives det(''A'') = &minus;det(''A''), and since the underlying field is not of characteristic 2, the determinant vanishes. This result is called '''Jacobi's theorem''', after [[Carl Gustav Jacobi]] (Eves, 1980).
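For instance, the 3&nbsp;&times;&nbsp;3 example matrix from the lead has vanishing determinant, as cofactor expansion along the first row confirms:
:<math>\det\begin{bmatrix} 0 & 2 & -1 \\ -2 & 0 & -4 \\ 1 & 4 & 0\end{bmatrix} = -2\det\begin{bmatrix}-2 & -4\\ 1 & 0\end{bmatrix} - \det\begin{bmatrix}-2 & 0\\ 1 & 4\end{bmatrix} = -2\cdot 4 - (-8) = 0.</math>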
 
The even-dimensional case is more interesting. It turns out that the determinant of ''A'' for ''n'' even can be written as the square of a [[polynomial]] in the entries of ''A'', which was first proved by Cayley:<ref>{{cite journal
| last1 = Cayley
| first1 = Arthur
| authorlink1 = Arthur Cayley
| year = 1847
| title = Sur les determinants gauches
| trans_title = On skew determinants
| journal = Crelle's Journal
| volume = 38
| pages = 93–96
}}</ref>
 
:det(''A'') = Pf(''A'')<sup>2</sup>.
 
This polynomial is called the ''[[Pfaffian]]'' of ''A'' and is denoted Pf(''A''). Thus the determinant of a real skew-symmetric matrix is always non-negative. This last fact can also be proved in an elementary way: the eigenvalues of a real skew-symmetric matrix are purely imaginary (see below), and to every eigenvalue there corresponds the conjugate eigenvalue with the same multiplicity; therefore, as the determinant is the product of the eigenvalues, each one repeated according to its multiplicity, it follows at once that the determinant, if it is not 0, is a positive real number.
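For example,
:<math>\operatorname{Pf}\begin{bmatrix} 0 & a \\ -a & 0 \end{bmatrix} = a,
\qquad
\operatorname{Pf}\begin{bmatrix} 0 & a & b & c \\ -a & 0 & d & e \\ -b & -d & 0 & f \\ -c & -e & -f & 0 \end{bmatrix} = af - be + cd,</math>
and in each case the determinant is the square of the Pfaffian: ''a''<sup>2</sup> and (''af''&nbsp;&minus;&nbsp;''be''&nbsp;+&nbsp;''cd'')<sup>2</sup>, respectively.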
 
The number of distinct terms ''s''(''n'') in the expansion of the determinant of a skew-symmetric matrix of order ''n'' was considered already by Cayley, Sylvester, and Pfaff. Due to cancellations, this number is quite small compared with the number of terms of a generic matrix of order ''n'', which is ''n''!. The sequence ''s''(''n'') {{OEIS|A002370}} is
:1, 0, 1, 0, 6, 0, 120, 0, 5250, 0, 395010, 0, &hellip;
and it is encoded in the [[exponential generating function]]
:<math>\sum_{n=0}^\infty \frac{s(n)}{n!}x^n=(1-x^2)^{-\frac{1}{4}}\exp\left(\frac{x^2}{4}\right).</math>
The latter yields the asymptotics (for ''n'' even)
:<math>s(n)=\pi^ {-\frac{1}{2} } 2^ {\frac{3}{4}} \Gamma\left (3/4 \right) (n/e)^ {n-\frac{1}{4} } \left (1+O\big(\frac{1}{n}\big)\right).</math>
 
The numbers of positive and negative terms are each approximately half of the total, although their difference takes larger and larger positive and negative values as ''n'' increases {{OEIS|A167029}}.
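For the 4&nbsp;&times;&nbsp;4 matrix displayed above, for instance,
:<math>\det(A) = (af - be + cd)^2 = a^2f^2 + b^2e^2 + c^2d^2 - 2abef + 2acdf - 2bcde,</math>
an expansion with six distinct terms (four positive, two negative), in agreement with ''s''(4)&nbsp;=&nbsp;6.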
 
=== Spectral theory ===
 
Since a matrix is similar to its own transpose, and a skew-symmetric matrix satisfies {{nowrap|1=''A''<sup>T</sup> = &minus;''A''}}, every skew-symmetric matrix is similar to its own negative. It follows that the [[eigenvalue]]s of a skew-symmetric matrix always come in pairs ±λ (except in the odd-dimensional case, where there is an additional unpaired 0 eigenvalue). From the spectral theorem, for a real skew-symmetric matrix the nonzero eigenvalues are all pure [[imaginary number|imaginary]] and thus are of the form ''i''λ<sub>1</sub>, &minus;''i''λ<sub>1</sub>, ''i''λ<sub>2</sub>, &minus;''i''λ<sub>2</sub>, … where each of the λ<sub>''k''</sub> are real.
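For example, the 3&nbsp;&times;&nbsp;3 example matrix from the lead has [[characteristic polynomial]] <math>-\lambda(\lambda^2 + 21)</math>, so its eigenvalues are the unpaired 0 together with the purely imaginary pair <math>\pm i\sqrt{21}</math>.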
 
Real skew-symmetric matrices are [[normal matrix|normal matrices]] (they commute with their [[adjoint matrix|adjoints]]) and are thus subject to the [[spectral theorem]], which states that any real skew-symmetric matrix can be diagonalized by a [[unitary matrix]]. Since the eigenvalues of a real skew-symmetric matrix are imaginary, it is not possible to diagonalize one by a real matrix. However, it is possible to bring every skew-symmetric matrix to a [[block matrix|block diagonal]] form by an [[orthogonal matrix|orthogonal transformation]]. Specifically, every 2''n''&nbsp;×&nbsp;2''n'' real skew-symmetric matrix can be written in the form {{nowrap|1=''A'' = ''Q'' Σ ''Q''<sup>T</sup>}} where ''Q'' is orthogonal and
:<math>\Sigma = \begin{bmatrix}
\begin{matrix}0 & \lambda_1\\ -\lambda_1 & 0\end{matrix} &  0 & \cdots & 0 \\
0 & \begin{matrix}0 & \lambda_2\\ -\lambda_2 & 0\end{matrix} &  & 0 \\
\vdots &  & \ddots & \vdots \\
0 & 0 & \cdots & \begin{matrix}0 & \lambda_r\\ -\lambda_r & 0\end{matrix} \\
& & & & \begin{matrix}0 \\ & \ddots \\ & & 0 \end{matrix}
\end{bmatrix}</math>
for real λ<sub>''k''</sub>. The nonzero eigenvalues of this matrix are ±''i''λ<sub>''k''</sub>. In the odd-dimensional case Σ always has at least one row and column of zeros.
 
More generally, every complex skew-symmetric matrix can be written in the form ''A'' = ''U'' Σ ''U''<sup>T</sup> where ''U'' is unitary and Σ has the block-diagonal form given above with complex λ<sub>''k''</sub>. This is an example of the Youla decomposition of a complex square matrix.<ref>{{cite journal|doi=10.4153/CJM-1961-059-8|first=D. C.  |last=Youla|title=A normal form for a matrix under the unitary congruence group|journal=Canad. J. Math. |volume=13|pages=694–704 |year=1961}}</ref>
 
==Alternating forms==
 
We begin with a special case of the definition.  An '''alternating form''' φ on a [[vector space]] ''V'' over a [[field (mathematics)|field]] ''K'', not of [[characteristic (algebra)|characteristic]] 2, is defined to be a [[bilinear form]]
 
: &phi; : ''V'' &times; ''V'' &rarr; ''K''
 
such that
 
: &phi;(''v'',''w'') = &minus;&phi;(''w'',''v'').
 
This defines a form with desirable properties for vector spaces over fields of characteristic not equal to 2, but in a vector space over a field of characteristic 2 the definition fails: every element is its own additive inverse, so the skew-symmetry condition coincides with symmetry and no longer forces the form to vanish on the diagonal, unlike the case above. However, we may extend the definition to vector spaces over fields of characteristic 2 as follows:
 
In the case where the [[vector space]] ''V'' is over a field of arbitrary [[characteristic (algebra)|characteristic]], including characteristic 2, we instead require that for all vectors ''v'' in ''V''
 
: &phi;(''v'',''v'') = 0.
 
In any characteristic, this condition implies skew-symmetry, as seen below; and when the field is not of characteristic 2 the two conditions are equivalent, since setting ''w'' = ''v'' in the skew-symmetry condition gives &phi;(''v'',''v'') = &minus;&phi;(''v'',''v'') and hence &phi;(''v'',''v'') = 0.
 
: 0 = &phi;(''v'' + ''w'',''v'' + ''w'') = &phi;(''v'',''v'') + &phi;(''v'',''w'') + &phi;(''w'',''v'') + &phi;(''w'',''w'') = &phi;(''v'',''w'') + &phi;(''w'',''v'')
 
Whence,
 
: &phi;(''v'',''w'') = &minus;&phi;(''w'',''v'').
 
Thus, we have a definition that now holds for vector spaces over fields of all characteristics.
 
Such a φ will be represented by a skew-symmetric matrix ''A'' with {{nowrap|1=φ(''v'', ''w'') = ''v''<sup>T</sup>''Aw''}}, once a [[basis (linear algebra)|basis]] of ''V'' is chosen; conversely, an ''n''&times;''n'' skew-symmetric matrix ''A'' on ''K''<sup>''n''</sup> gives rise to an alternating form sending (''v'', ''w'') to ''v''<sup>T</sup>''Aw''.
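For example, the standard [[symplectic vector space|symplectic form]] on ''K''<sup>2</sup>,
:<math>\varphi(v, w) = v_1 w_2 - v_2 w_1 = v^{\mathsf{T}} \begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix} w,</math>
is alternating, and its matrix with respect to the standard basis is skew-symmetric.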
 
==Infinitesimal rotations==
{{main|Euler's rotation theorem#Generators of rotations}}
{{main|Rotation matrix#Infinitesimal rotations}}
{{main|Infinitesimal strain theory#Infinitesimal rotation tensor}}
Skew-symmetric matrices over the field of real numbers form the [[tangent space]] to the real [[orthogonal group]] O(''n'') at the identity matrix; formally, the [[special orthogonal Lie algebra]]. In this sense, then, skew-symmetric matrices can be thought of as ''infinitesimal rotations''.
 
Another way of saying this is that the space of skew-symmetric matrices forms the [[Lie algebra]] o(''n'') of the [[Lie group]] O(''n'').
The Lie bracket on this space is given by the [[commutator]]:
 
:<math>[A,B] = AB - BA.\,</math>
 
It is easy to check that the commutator of two skew-symmetric matrices is again skew-symmetric:
 
:<math>[A,B]^{\mathsf{T}} = B^{\mathsf{T}}A^{\mathsf{T}}-A^{\mathsf{T}}B^{\mathsf{T}} = BA-AB = -[A,B] \, .</math>
 
The [[matrix exponential]] of a skew-symmetric matrix ''A'' is then an [[orthogonal matrix]] ''R'':
 
:<math>R=\exp(A)=\sum_{n=0}^\infty \frac{A^n}{n!}.</math>
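That ''R'' is indeed orthogonal can be checked directly: since ''A'' and {{nowrap|1=''A''<sup>T</sup> = &minus;''A''}} commute,
:<math>R^{\mathsf{T}}R = \exp(A^{\mathsf{T}})\exp(A) = \exp(A^{\mathsf{T}} + A) = \exp(0) = I.</math>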
 
The image of the [[exponential map]] of a Lie algebra always lies in the [[Connected space|connected component]] of the Lie group that contains the identity element. In the case of the Lie group O(''n''), this connected component is the [[special orthogonal group]] SO(''n''), consisting of all orthogonal matrices with determinant 1. So ''R'' = exp(''A'') will have determinant +1. Moreover, since the exponential map of a connected compact Lie group is always surjective, it turns out that ''every'' orthogonal matrix with unit determinant can be written as the exponential of some skew-symmetric matrix. In the particularly important case of dimension ''n''&nbsp;=&nbsp;2, the exponential representation for an orthogonal matrix reduces to the well-known [[complex number#Polar form|polar form]] of a complex number of unit modulus. Indeed, if ''n''&nbsp;=&nbsp;2, a special orthogonal matrix has the form
:<math>\begin{bmatrix}
a & -b  \\
b & \,a \end{bmatrix},</math>
with ''a''<sup>2</sup>&nbsp;+&nbsp;''b''<sup>2</sup>&nbsp;=&nbsp;1. Therefore, putting ''a''&nbsp;=&nbsp;cos&nbsp;''θ'' and ''b''&nbsp;=&nbsp;sin&nbsp;''θ'', it can be written
:<math>\begin{bmatrix}
\cos\,\theta & -\sin\,\theta \\
\sin\,\theta & \,\cos\,\theta \end{bmatrix}=
\exp\left( \theta             
\begin{bmatrix}
0 & -1 \\
1  &\,0 \end{bmatrix}
\right),
</math>
which corresponds exactly to the polar form cos''θ'' + ''i''sin''θ'' = e<sup>''iθ''</sup> of a complex number of unit modulus.
 
The exponential representation of an orthogonal matrix of order ''n'' can also be obtained starting from the fact that in dimension ''n'' any special orthogonal matrix ''R'' can be written as {{nowrap|1=''R'' = ''Q'' ''S'' ''Q''<sup>T</sup>}}, where ''Q'' is orthogonal and ''S'' is a [[Block matrix#Block diagonal matrix|block diagonal matrix]] with <math>\scriptstyle\lfloor {n/2}\rfloor</math> blocks of order 2, plus one of order 1 if ''n'' is odd; since each single block of order 2 is also an orthogonal matrix, it admits an exponential form. Correspondingly, the matrix ''S'' can be written as the exponential of a skew-symmetric block matrix Σ of the form above, {{nowrap|1=''S'' = exp(Σ)}}, so that {{nowrap|1=''R'' = ''Q'' exp(Σ) ''Q''<sup>T</sup> = exp(''Q'' Σ ''Q''<sup>T</sup>)}}, the exponential of the skew-symmetric matrix ''Q'' Σ ''Q''<sup>T</sup>. Conversely, the surjectivity of the exponential map, together with the above-mentioned block-diagonalization for skew-symmetric matrices, implies the block-diagonalization for orthogonal matrices.
 
== Coordinate-free ==
More intrinsically (i.e., without using coordinates), skew-symmetric linear transformations on a vector space ''V'' with an [[inner product]] may be defined as the [[bivector]]s on the space, which are sums of simple bivectors ([[blade (geometry)|2-blades]]) <math>v \wedge w</math>. The correspondence is given by the map <math>v \wedge w \mapsto v^* \otimes w - w^* \otimes v,</math> where <math>v^*</math> is the covector dual to the vector <math>v</math>; in orthonormal coordinates these are exactly the elementary skew-symmetric matrices. This characterization is used in interpreting the [[curl (mathematics)|curl]] of a vector field (naturally a 2-vector) as an infinitesimal rotation or "curl", hence the name.
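For example, with the convention that (''v''* ⊗ ''w'')(''x'') = ⟨''v'', ''x''⟩ ''w'', the 2-blade ''e''<sub>1</sub> ∧ ''e''<sub>2</sub> in '''R'''<sup>2</sup> corresponds to the matrix
:<math>e_1^* \otimes e_2 - e_2^* \otimes e_1 = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix},</math>
which generates the infinitesimal rotations of the plane.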
 
== Skew-symmetrizable matrix ==
An ''n''-by-''n'' matrix ''A'' is said to be '''skew-symmetrizable''' if there exist an invertible [[diagonal matrix]] ''D'' and a skew-symmetric matrix ''S'' such that {{nowrap|1=''A'' = ''DS''.}} For '''real''' ''n''-by-''n'' matrices, the condition that ''D'' have positive entries is sometimes added.<ref>{{cite arXiv |last1= Fomin |last2= Zelevinsky |first1= Sergey |first2= Andrei |eprint= math/0104151 |title= Cluster algebras I: Foundations |year= 2001 |version= 1 | page=15}}</ref>
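For example,
:<math>A = \begin{bmatrix} 0 & 1 \\ -2 & 0 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 2 \end{bmatrix} \begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix}</math>
is skew-symmetrizable, with {{nowrap|1=''D'' = diag(1, 2)}} having positive entries, although ''A'' itself is not skew-symmetric.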
 
==See also==
*[[Symmetric matrix]]
*[[Skew-Hermitian matrix]]
*[[Symplectic matrix]]
*[[Symmetry in mathematics]]
 
==References==
{{reflist}}
 
==Further reading==
* {{cite book
|last=Eves
|first=Howard
|authorlink=Howard Eves
|title=Elementary Matrix Theory
|publisher=Dover Publications
|year=1980
|isbn=978-0-486-63946-8}}
* {{SpringerEOM
|urlname=S/s085720
|title=Skew-symmetric matrix
|last=Suprunenko|first=D. A.}}
*{{cite journal
|title=On the number of distinct terms in the expansion of symmetric and skew determinants.
|last=Aitken|first=A. C.
|year=1944
|journal=Edinburgh Math. Notes }}
 
== External links ==
* {{Cite web|url=http://mathworld.wolfram.com/AntisymmetricMatrix.html |title=Antisymmetric matrix|work= Wolfram Mathworld}}
* {{Cite web|url=http://www.tu-chemnitz.de/mathematik/hapack/ |title=HAPACK - Software for (Skew-)Hamiltonian Eigenvalue Problems |first1=Peter |last1=Benner |first2=Daniel |last2=Kressner}}
* {{Cite journal|doi=10.1145/355791.355799|title=Algorithm 530: An Algorithm for Computing the Eigensystem of Skew-Symmetric Matrices and a Class of Symmetric Matrices [F2]|year=1978|last1=Ward|first1=R. C.|last2=Gray|first2=L. J.|journal=ACM Transactions on Mathematical Software|volume=4|issue=3|pages=286}} [http://www.netlib.org/toms/530 Fortran] [http://jblevins.org/mirror/amiller/toms530.f90 Fortran90]
 
{{DEFAULTSORT:Skew-Symmetric Matrix}}
[[Category:Matrices]]
