In [[linear algebra]], a '''diagonal matrix''' is a [[matrix (mathematics)|matrix]] (usually a [[square matrix]]) in which the entries outside the [[main diagonal]] (↘) are all zero. The diagonal entries themselves may or may not be zero. Thus, the matrix D = (d<sub>i,j</sub>) with ''n'' columns and ''n'' rows is diagonal if:
 
:<math>d_{i,j} = 0 \mbox{ if } i \ne j\ \forall i,j \in \{1, 2, \ldots, n\}</math>
 
For example, the following matrix is diagonal:
:<math>\begin{bmatrix}
1 & 0 & 0\\
0 & 4 & 0\\
0 & 0 & -3\end{bmatrix}</math>
 
The term ''diagonal matrix'' may sometimes refer to a '''rectangular diagonal matrix''', which is an ''m''-by-''n'' matrix with only the entries of the form ''d<sub>i,i</sub>'' possibly non-zero. For example:
:<math>\begin{bmatrix}
1 & 0 & 0\\
0 & 4 & 0\\
0 & 0 & -3\\
0 & 0 & 0\\
\end{bmatrix}</math> or <math>\begin{bmatrix}
1 & 0 & 0 & 0 & 0\\
0 & 4 & 0& 0 & 0\\
0 & 0 & -3& 0 & 0\end{bmatrix}</math>
 
However, in the remainder of this article we will consider only square matrices. Any square diagonal matrix is also a [[symmetric matrix]]. Also, if the entries come from the [[field (mathematics)|field]] '''R''' or '''C''', then it is a [[normal matrix]] as well. Equivalently, we can define a diagonal matrix as a matrix that is both [[triangular matrix|upper-]] and [[triangular matrix|lower-triangular]]. The [[identity matrix]] ''I''<sub>''n''</sub> and any square [[zero matrix]] are diagonal. Any one-by-one matrix is diagonal.
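
The defining condition is easy to check numerically. The following is a minimal sketch in Python with NumPy (an illustrative choice, not part of the article's formal treatment):

<syntaxhighlight lang="python">
import numpy as np

D = np.diag([1, 4, -3])               # square diagonal matrix built from its diagonal entries
print(D)

# A matrix is diagonal exactly when every off-diagonal entry is zero:
off_diagonal = D - np.diag(np.diag(D))
print(np.allclose(off_diagonal, 0))    # True
</syntaxhighlight>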
 
== Scalar matrix ==<!-- Linked from [[Scalar matrix]] and [[Scalar transformation]] -->
A diagonal matrix with all its main diagonal entries equal is a '''scalar matrix''', that is, a scalar multiple &lambda;''I'' of the [[identity matrix]] ''I''. Its effect on a vector is [[scalar multiplication]] by &lambda;. For example, a 3&times;3 scalar matrix has the form:
:<math>
  \begin{bmatrix}
    \lambda &      0 & 0      \\
          0 & \lambda & 0      \\
          0 &      0 & \lambda
  \end{bmatrix} \equiv \lambda \boldsymbol{I_3}
</math>
 
The scalar matrices are the [[center of an algebra|center]] of the algebra of matrices: that is, they are precisely the matrices that commute with all other square matrices of the same size.
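
As an illustration (a Python/NumPy sketch; the matrix ''A'' is an arbitrary random example), a scalar matrix commutes with every square matrix of the same size, whereas a general diagonal matrix need not:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))    # an arbitrary 3x3 matrix

S = 2.5 * np.eye(3)                # scalar matrix 2.5 * I_3
D = np.diag([1.0, 4.0, -3.0])      # diagonal, but not scalar

print(np.allclose(S @ A, A @ S))   # True: scalar matrices are central
print(np.allclose(D @ A, A @ D))   # False in general, since D has distinct diagonal entries
</syntaxhighlight>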
 
For an abstract vector space ''V'' (rather than the concrete vector space <math>K^n</math>), or more generally a [[module (ring theory)|module]] ''M'' over a [[ring (algebra)|ring]] ''R'', with the [[endomorphism algebra]] End(''M'') (the algebra of linear operators on ''M'') replacing the algebra of matrices, the analog of scalar matrices is the '''scalar transformations'''. Formally, scalar multiplication is a linear map, inducing a map <math>R \to \operatorname{End}(M)</math> (sending a scalar &lambda; to the corresponding scalar transformation, multiplication by &lambda;) that exhibits End(''M'') as an ''R''-[[Algebra (ring theory)|algebra]]. For vector spaces, or more generally [[free module]]s <math>M \cong R^n</math>, for which the endomorphism algebra is isomorphic to a matrix algebra, the scalar transformations are exactly the [[center of a ring|center]] of the endomorphism algebra, and the invertible scalar transformations are the center of the [[general linear group]] GL(''V''), where they are denoted Z(''V''), following the usual notation for the center.
 
== Matrix operations ==
The operations of matrix addition and [[matrix multiplication]] are especially simple for diagonal matrices. Write diag(''a''<sub>1</sub>,...,''a''<sub>''n''</sub>) for a diagonal matrix whose diagonal entries starting in the upper left corner are ''a''<sub>1</sub>,...,''a''<sub>''n''</sub>. Then, for addition, we have
 
:diag(''a''<sub>1</sub>,...,''a''<sub>''n''</sub>) + diag(''b''<sub>1</sub>,...,''b''<sub>''n''</sub>) = diag(''a''<sub>1</sub>+''b''<sub>1</sub>,...,''a''<sub>''n''</sub>+''b''<sub>''n''</sub>)
 
and for [[matrix multiplication]],
 
:diag(''a''<sub>1</sub>,...,''a''<sub>''n''</sub>) &middot; diag(''b''<sub>1</sub>,...,''b''<sub>''n''</sub>) = diag(''a''<sub>1</sub>''b''<sub>1</sub>,...,''a''<sub>''n''</sub>''b''<sub>''n''</sub>).
 
The diagonal matrix diag(''a''<sub>1</sub>,...,''a''<sub>''n''</sub>) is [[invertible matrix|invertible]] [[if and only if]] the entries ''a''<sub>1</sub>,...,''a''<sub>''n''</sub> are all non-zero. In this case, we have
 
:diag(''a''<sub>1</sub>,...,''a''<sub>''n''</sub>)<sup>-1</sup> = diag(''a''<sub>1</sub><sup>-1</sup>,...,''a''<sub>''n''</sub><sup>-1</sup>).
 
In particular, the diagonal matrices form a [[subring]] of the ring of all ''n''-by-''n'' matrices.
 
Multiplying an ''n''-by-''n'' matrix ''A'' from the ''left'' with diag(''a''<sub>1</sub>,...,''a''<sub>''n''</sub>) amounts to multiplying the ''i''-th ''row'' of ''A'' by ''a''<sub>''i''</sub> for all ''i''; multiplying the matrix ''A'' from the ''right'' with diag(''a''<sub>1</sub>,...,''a''<sub>''n''</sub>) amounts to multiplying the ''i''-th ''column'' of ''A'' by ''a''<sub>''i''</sub> for all ''i''.
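
These entrywise rules, together with the row- and column-scaling effect of one-sided multiplication, can be checked directly (a minimal NumPy sketch; the particular entries are arbitrary):

<syntaxhighlight lang="python">
import numpy as np

a = np.array([1.0, 4.0, -3.0])
b = np.array([2.0, 0.5, 5.0])
Da, Db = np.diag(a), np.diag(b)

print(np.allclose(Da + Db, np.diag(a + b)))          # addition is entrywise
print(np.allclose(Da @ Db, np.diag(a * b)))          # multiplication is entrywise
print(np.allclose(np.linalg.inv(Da), np.diag(1/a)))  # inverse exists since all a_i != 0

A = np.arange(9.0).reshape(3, 3)
print(np.allclose(Da @ A, a[:, None] * A))           # left multiplication scales the rows
print(np.allclose(A @ Da, A * a[None, :]))           # right multiplication scales the columns
</syntaxhighlight>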
 
== Operator matrix in eigenbasis ==
 
{{Main|Transformation_matrix#Finding the matrix of a transformation|Eigenvalues and eigenvectors|l1=Finding the matrix of a transformation}}
 
As explained in [[transformation_matrix#Finding_the_matrix_of_a_transformation|determining coefficients of operator matrix]], there is a special basis, ''e''<sub>1</sub>, ..., ''e''<sub>''n''</sub>, for which the matrix takes the diagonal form. Being diagonal means that all coefficients <math>a_{i,j}</math> with <math>i \ne j</math> are zero in the defining equation <math>A \vec e_j = \sum_i a_{i,j} \vec e_i</math>, leaving only one term per sum. The surviving diagonal elements, <math>a_{i,i}</math>, are known as '''eigenvalues''' and denoted <math>\lambda_i</math> in the equation, which reduces to <math>A \vec e_i = \lambda_i \vec e_i</math>. The resulting equation is known as the '''eigenvalue equation'''<ref>{{cite book |last=Nearing |first=James |year=2010 |title=Mathematical Tools for Physics |url=http://www.physics.miami.edu/nearing/mathmethods |chapter=Chapter 7.9: Eigenvalues and Eigenvectors |chapterurl=http://www.physics.miami.edu/~nearing/mathmethods/operators.pdf |accessdate=January 1, 2012 |isbn=048648212X}}</ref> and is used to derive the [[characteristic polynomial]] and, further, [[eigenvalues and eigenvectors]].
 
In other words, the [[eigenvalue]]s of diag(''λ''<sub>1</sub>, ..., ''λ''<sub>''n''</sub>) are ''λ''<sub>1</sub>, ..., ''λ''<sub>''n''</sub>, with associated [[eigenvectors]] ''e''<sub>1</sub>, ..., ''e''<sub>''n''</sub>.
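
For a diagonal matrix the eigenvalue equation can therefore be read off directly, and a numerical eigensolver recovers the same data (a NumPy sketch with arbitrary diagonal entries):

<syntaxhighlight lang="python">
import numpy as np

D = np.diag([2.0, -1.0, 5.0])
eigenvalues, eigenvectors = np.linalg.eig(D)

print(eigenvalues)    # the diagonal entries 2, -1, 5 (possibly reordered)
print(eigenvectors)   # the standard basis vectors, up to ordering and sign
print(np.allclose(D @ eigenvectors[:, 0],
                  eigenvalues[0] * eigenvectors[:, 0]))  # A e_i = lambda_i e_i
</syntaxhighlight>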
 
== Other properties ==
 
The [[determinant]] of diag(''a''<sub>1</sub>, ..., ''a''<sub>''n''</sub>) is the product ''a''<sub>1</sub>...''a''<sub>''n''</sub>.
 
The [[adjugate]] of a diagonal matrix is again diagonal.
 
A square matrix is diagonal if and only if it is triangular and [[Normal_matrix|normal]].
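
The first two properties are straightforward to confirm numerically (a NumPy sketch; here the adjugate is computed as det(''D'')·''D''<sup>-1</sup>, which is valid when ''D'' is invertible):

<syntaxhighlight lang="python">
import numpy as np

d = np.array([1.0, 4.0, -3.0])
D = np.diag(d)

print(np.isclose(np.linalg.det(D), d.prod()))   # determinant = product of diagonal entries

adjugate = np.linalg.det(D) * np.linalg.inv(D)  # adj(D) = det(D) * D^{-1} for invertible D
print(np.allclose(adjugate, np.diag(adjugate.diagonal())))  # the adjugate is again diagonal
</syntaxhighlight>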
 
== Uses ==
Diagonal matrices occur in many areas of linear algebra. Because of the simple description of the matrix operations and eigenvalues/eigenvectors given above, it is often desirable to represent a given matrix or [[linear operator|linear map]] by a diagonal matrix.
 
In fact, a given ''n''-by-''n'' matrix ''A'' is [[similar matrix|similar]] to a diagonal matrix (meaning that there is an invertible matrix ''X'' such that ''X<sup>-1</sup>AX'' is diagonal) if and only if it has ''n'' [[linearly independent]] eigenvectors. Such matrices are said to be [[diagonalizable matrix|diagonalizable]].
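
A small worked example (a NumPy sketch with an arbitrary diagonalizable matrix ''A''): collecting ''n'' linearly independent eigenvectors as the columns of ''X'' puts ''X''<sup>-1</sup>''AX'' into diagonal form.

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])                   # symmetric, hence diagonalizable

eigenvalues, X = np.linalg.eig(A)            # columns of X are eigenvectors of A
D = np.linalg.inv(X) @ A @ X                 # X^{-1} A X

print(np.allclose(D, np.diag(eigenvalues)))  # True: A is similar to diag(lambda_1, lambda_2)
</syntaxhighlight>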
 
Over the [[field (mathematics)|field]] of [[real number|real]] or [[complex number|complex]] numbers, more is true. The [[spectral theorem]] says that every [[normal matrix]] is [[matrix similarity|unitarily similar]] to a diagonal matrix (if ''AA''<sup>*</sup> = ''A''<sup>*</sup>''A'' then there exists a [[unitary matrix]] ''U'' such that ''UAU''<sup>*</sup> is diagonal). Furthermore, the [[singular value decomposition]] implies that for any matrix ''A'', there exist unitary matrices ''U'' and ''V'' such that ''UAV''<sup>*</sup> is diagonal with nonnegative entries.
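
The singular value decomposition statement can be illustrated likewise (a NumPy sketch; note that <code>numpy.linalg.svd</code> returns the factorization ''A'' = ''U''Σ''V''<sup>*</sup>, so it is ''U''<sup>*</sup>''AV'' that is the diagonal factor with nonnegative entries):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 2))            # an arbitrary rectangular matrix

U, s, Vh = np.linalg.svd(A)                # A = U @ Sigma @ Vh, with s the singular values
Sigma = np.zeros((3, 2))
Sigma[:2, :2] = np.diag(s)

print(np.allclose(U.conj().T @ A @ Vh.conj().T, Sigma))  # U* A V is (rectangular) diagonal
print(np.all(s >= 0))                                    # singular values are nonnegative
</syntaxhighlight>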
 
== Operator theory ==
In [[operator theory]], particularly the study of [[PDEs]], operators are especially easy to understand, and PDEs easy to solve, if the operator is diagonal with respect to the basis with which one is working; this corresponds to a [[separable partial differential equation]]. Therefore, a key technique for understanding operators is a change of coordinates – in the language of operators, an [[integral transform]] – which changes the basis to an [[eigenbasis]] of [[eigenfunction]]s, making the equation separable. An important example of this is the [[Fourier transform]], which diagonalizes constant-coefficient differentiation operators (or more generally translation-invariant operators), such as the Laplacian operator in the [[heat equation]].
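
A discrete analogue makes this concrete (a NumPy sketch; the circulant second-difference matrix below stands in for a constant-coefficient, translation-invariant operator such as the one-dimensional Laplacian with periodic boundary conditions):

<syntaxhighlight lang="python">
import numpy as np

n = 8
# First column of a circulant "second difference" operator.
c = np.zeros(n)
c[0], c[1], c[-1] = -2.0, 1.0, 1.0
L = np.array([np.roll(c, k) for k in range(n)]).T   # circulant: L[i, j] = c[(i - j) mod n]

F = np.fft.fft(np.eye(n))                  # discrete Fourier transform matrix
L_hat = F @ L @ np.linalg.inv(F)           # the operator expressed in the Fourier basis

print(np.allclose(L_hat, np.diag(np.diag(L_hat))))  # True: diagonal in the Fourier basis
print(np.allclose(np.diag(L_hat), np.fft.fft(c)))   # eigenvalues are the DFT of the stencil
</syntaxhighlight>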
 
Especially easy are [[multiplication operator]]s, which are defined as multiplication by (the values of) a fixed function – the values of the function at each point correspond to the diagonal entries of a matrix.
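
In the finite-dimensional picture, multiplying pointwise by the sampled values of a fixed function is exactly applying a diagonal matrix (a minimal NumPy sketch):

<syntaxhighlight lang="python">
import numpy as np

x = np.linspace(0.0, 1.0, 5)
f = np.sin(np.pi * x)              # the fixed multiplier function, sampled on a grid
v = np.cos(np.pi * x)              # an arbitrary vector of samples

M = np.diag(f)                     # the multiplication operator as a matrix
print(np.allclose(M @ v, f * v))   # True: the diagonal entries are the function values
</syntaxhighlight>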
 
== See also ==
{{colbegin}}
* [[Anti-diagonal matrix]]
* [[Banded matrix]]
* [[Bidiagonal matrix]]
* [[Diagonally dominant matrix]]
* [[Diagonalizable matrix]]
* [[Multiplication operator]]
* [[Tridiagonal matrix]]
* [[Toeplitz matrix]]
* [[Toral Lie algebra]]
* [[Circulant matrix]]
{{colend}}
 
== References ==
 
<references />
* Roger A. Horn and Charles R. Johnson, ''Matrix Analysis'', Cambridge University Press, 1985. ISBN 0-521-30586-1 (hardback), ISBN 0-521-38632-2 (paperback).
 
[[Category:Matrix normal forms]]
[[Category:Sparse matrices]]
