Commutation matrix
In [[mathematics]], especially in [[linear algebra]] and [[Matrix (mathematics)|matrix theory]], the '''commutation matrix''' is used for transforming the vectorized form of a [[matrix (mathematics)|matrix]] into the vectorized form of its [[transpose]]. Specifically, the commutation matrix '''K'''<sup>(m,n)</sup> is the ''mn × mn'' matrix which, for any ''m × n'' matrix '''A''', transforms vec('''A''') into vec('''A'''<sup>T</sup>):
:'''K'''<sup>(m,n)</sup> vec('''A''') = vec('''A'''<sup>T</sup>).
Here vec('''A''') is the ''mn × 1'' [[column vector]] obtained by stacking the columns of '''A''' on top of one another:
:vec('''A''') = [ '''A'''<sub>1,1</sub>, ..., '''A'''<sub>m,1</sub>, '''A'''<sub>1,2</sub>, ..., '''A'''<sub>m,2</sub>, ..., '''A'''<sub>1,n</sub>, ..., '''A'''<sub>m,n</sub> ]<sup>T</sup>
where '''A''' = ['''A'''<sub>i,j</sub>].
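The definition can be checked numerically. The following is a minimal sketch in Python with NumPy (the helper name <code>commutation_matrix</code> is illustrative, not a standard library routine); it builds '''K'''<sup>(m,n)</sup> directly from its defining action on vec('''A''') and verifies the identity above.
<syntaxhighlight lang="python">
import numpy as np

def commutation_matrix(m, n):
    # Build K^(m,n) from its defining property: it sends the column-stacked
    # vec(A) of an m x n matrix A to vec(A^T).
    K = np.zeros((m * n, m * n))
    for i in range(m):
        for j in range(n):
            # A[i, j] sits at position i + j*m in vec(A)
            # and at position j + i*n in vec(A^T).
            K[j + i * n, i + j * m] = 1
    return K

m, n = 3, 2
A = np.arange(m * n).reshape(m, n)
vecA = A.flatten(order="F")                  # stack the columns of A
K = commutation_matrix(m, n)
assert np.array_equal(K @ vecA, A.T.flatten(order="F"))   # K vec(A) = vec(A^T)
</syntaxhighlight>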
The commutation matrix is a special type of [[permutation matrix]], and is therefore [[orthogonal matrix|orthogonal]]. Its [[transpose]] and [[invertible matrix|inverse]] both equal '''K'''<sup>(n,m)</sup>; in particular, when ''m'' = ''n'' it is [[symmetric matrix|symmetric]] and an [[involution (mathematics)#Linear algebra|involution]].
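Continuing the sketch above (with the same assumed helper <code>commutation_matrix</code>), these properties can be checked on a non-square example:
<syntaxhighlight lang="python">
# K^(3,2) has exactly one 1 in every row and column: a permutation matrix,
# hence orthogonal, with transpose and inverse both equal to K^(2,3).
K32 = commutation_matrix(3, 2)
K23 = commutation_matrix(2, 3)
assert np.array_equal(K32.sum(axis=0), np.ones(6))
assert np.array_equal(K32.sum(axis=1), np.ones(6))
assert np.array_equal(K32.T, K23)
assert np.array_equal(K32 @ K23, np.eye(6))

# In the square case m = n the matrix is symmetric and an involution.
K22 = commutation_matrix(2, 2)
assert np.array_equal(K22, K22.T)
assert np.array_equal(K22 @ K22, np.eye(4))
</syntaxhighlight>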
The main use of the commutation matrix, and the source of its name, is to commute the [[Kronecker product]]: for every ''m × n'' matrix '''A''' and every ''r × q'' matrix '''B''',
:'''K'''<sup>(r,m)</sup>('''A''' <math>\otimes</math> '''B''')'''K'''<sup>(n,q)</sup> = '''B''' <math>\otimes</math> '''A'''.
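As a rough numerical check of this identity (again reusing the illustrative <code>commutation_matrix</code> helper from the first sketch):
<syntaxhighlight lang="python">
# For A (m x n) and B (r x q), check  K^(r,m) (A ⊗ B) K^(n,q) = B ⊗ A.
rng = np.random.default_rng(0)
m, n, r, q = 2, 3, 4, 5
A = rng.standard_normal((m, n))
B = rng.standard_normal((r, q))

lhs = commutation_matrix(r, m) @ np.kron(A, B) @ commutation_matrix(n, q)
assert np.allclose(lhs, np.kron(B, A))
</syntaxhighlight>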
An explicit form for the commutation matrix is as follows: if '''e'''<sub>r,j</sub> denotes the j-th canonical vector of dimension ''r'' (i.e. the vector with 1 in the j-th coordinate and 0 elsewhere) then
:<math>\mathbf{K}^{(r,m)} = \sum_{i=1}^{r} \sum_{j=1}^{m} \left( \mathbf{e}_{r,i} \mathbf{e}_{m,j}^{\mathrm{T}} \right) \otimes \left( \mathbf{e}_{m,j} \mathbf{e}_{r,i}^{\mathrm{T}} \right).</math>
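A direct translation of this explicit formula (a sketch only, using 0-based indices; it reproduces the <code>commutation_matrix</code> helper from the earlier snippets):
<syntaxhighlight lang="python">
def commutation_matrix_explicit(r, m):
    # Sum of Kronecker products of outer products of canonical basis vectors.
    K = np.zeros((r * m, r * m))
    for i in range(r):
        for j in range(m):
            e_ri = np.zeros((r, 1)); e_ri[i] = 1   # canonical vector of dimension r
            e_mj = np.zeros((m, 1)); e_mj[j] = 1   # canonical vector of dimension m
            K += np.kron(e_ri @ e_mj.T, e_mj @ e_ri.T)
    return K

assert np.array_equal(commutation_matrix_explicit(3, 2), commutation_matrix(3, 2))
</syntaxhighlight>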
==References==
Jan R. Magnus and Heinz Neudecker (1988), ''Matrix Differential Calculus with Applications in Statistics and Econometrics'', Wiley.
[[Category:Linear algebra]]
[[Category:Matrices]]