:''This article is about the transpose of a matrix. For other uses, see [[Transposition (disambiguation)|Transposition]].''
:''Note that this article assumes that matrices are taken over a commutative ring. These results may not hold in the non-commutative case.''

[[File:Matrix transpose.gif|thumb|200px|right|The transpose '''A'''<sup>T</sup> of a matrix '''A''' can be obtained by reflecting the elements along its main diagonal. Repeating the process on the transposed matrix returns the elements to their original positions.]]

In [[linear algebra]], the '''transpose''' of a [[matrix (mathematics)|matrix]] '''A''' is another matrix '''A'''<sup>T</sup> (also written '''A'''′, '''A'''<sup>tr</sup>, <sup>t</sup>'''A''' or '''A'''<sup>t</sup>) created by any one of the following equivalent actions:
* reflect '''A''' over its [[main diagonal]] (which runs from top-left to bottom-right) to obtain '''A'''<sup>T</sup>
* write the rows of '''A''' as the columns of '''A'''<sup>T</sup>
* write the columns of '''A''' as the rows of '''A'''<sup>T</sup>

Formally, the ''i'' th row, ''j'' th column element of '''A'''<sup>T</sup> is the ''j'' th row, ''i'' th column element of '''A''':
:<math>[\mathbf{A}^\mathrm{T}]_{ij} = [\mathbf{A}]_{ji}</math>

If '''A''' is an {{nowrap|''m'' × ''n''}} matrix, then '''A'''<sup>T</sup> is an {{nowrap|''n'' × ''m''}} matrix.

The transpose of a matrix was introduced in 1858 by the British mathematician [[Arthur Cayley]].<ref>Arthur Cayley (1858) [http://books.google.com/books?id=flFFAAAAcAAJ&pg=PA31#v=onepage&q&f=false "A memoir on the theory of matrices,"] ''Philosophical Transactions of the Royal Society of London'', '''148''': 17–37. The transpose (or "transposition") is defined on page 31.</ref>

== Examples ==
*<math>\begin{bmatrix}
1 & 2 \end{bmatrix}^{\mathrm{T}}
= \,
\begin{bmatrix}
1 \\
2 \end{bmatrix}
</math>

*<math>\begin{bmatrix}
1 & 2 \\
3 & 4 \end{bmatrix}^{\mathrm{T}}
=
\begin{bmatrix}
1 & 3 \\
2 & 4 \end{bmatrix}
</math>

*<math>
\begin{bmatrix}
1 & 2 \\
3 & 4 \\
5 & 6 \end{bmatrix}^{\mathrm{T}}
=
\begin{bmatrix}
1 & 3 & 5 \\
2 & 4 & 6 \end{bmatrix}
</math>

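Examples like these are easy to check by machine. The following is a minimal sketch in plain Python (the function name `transpose` is an illustrative choice, not something defined by this article); it builds '''A'''<sup>T</sup> by reading each column of '''A''' as a row, exactly as in the definition above:

```python
def transpose(a):
    """Return the transpose of a matrix given as a list of rows."""
    # Row j of the result is column j of the input: [A^T]_{ji} = [A]_{ij}.
    return [[a[i][j] for i in range(len(a))] for j in range(len(a[0]))]

# The third example above: a 3 x 2 matrix becomes a 2 x 3 matrix.
print(transpose([[1, 2], [3, 4], [5, 6]]))  # [[1, 3, 5], [2, 4, 6]]
```

Note that transposing a 1 × ''n'' row turns it into an ''n'' × 1 column, matching the first example.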
== Properties ==
For matrices '''A''', '''B''' and scalar ''c'' we have the following properties of the transpose:
{{ordered list
|1= <math>( \mathbf{A}^\mathrm{T} ) ^\mathrm{T} = \mathbf{A} \quad \,</math>
:The operation of taking the transpose is an [[Involution (mathematics)|involution]] (self-[[Inverse matrix|inverse]]).
|2= <math>(\mathbf{A}+\mathbf{B}) ^\mathrm{T} = \mathbf{A}^\mathrm{T} + \mathbf{B}^\mathrm{T} \,</math>
:The transpose respects addition.
|3= <math>\left( \mathbf{A B} \right) ^\mathrm{T} = \mathbf{B}^\mathrm{T} \mathbf{A}^\mathrm{T} \,</math>
:Note that the order of the factors reverses. From this one can deduce that a [[square matrix]] '''A''' is [[Invertible matrix|invertible]] if and only if '''A'''<sup>T</sup> is invertible, and in this case we have ('''A'''<sup>−1</sup>)<sup>T</sup> = ('''A'''<sup>T</sup>)<sup>−1</sup>. By induction, this result extends to the general case of multiple matrices: ('''A'''<sub>1</sub>'''A'''<sub>2</sub>...'''A'''<sub>''k''−1</sub>'''A'''<sub>''k''</sub>)<sup>T</sup> = '''A'''<sub>''k''</sub><sup>T</sup>'''A'''<sub>''k''−1</sub><sup>T</sup>...'''A'''<sub>2</sub><sup>T</sup>'''A'''<sub>1</sub><sup>T</sup>.
|4= <math>(c \mathbf{A})^\mathrm{T} = c \mathbf{A}^\mathrm{T} \,</math>
:The transpose of a [[Scalar (mathematics)|scalar]] is the same scalar. Together with (2), this states that the transpose is a [[linear map]] from the [[Vector space|space]] of {{nowrap|''m'' × ''n''}} matrices to the space of {{nowrap|''n'' × ''m''}} matrices.
|5= <math>\det(\mathbf{A}^\mathrm{T}) = \det(\mathbf{A}) \,</math>
:The [[determinant]] of a square matrix is the same as that of its transpose.
|6= The [[dot product]] of two column [[vector space|vector]]s '''a''' and '''b''' can be computed as
:<math> \mathbf{a} \cdot \mathbf{b} = \mathbf{a}^{\mathrm{T}} \mathbf{b},</math>
which is written as ''a''<sub>''i''</sub> ''b''<sup>''i''</sup> in [[Einstein notation]].
|7= If '''A''' has only real entries, then '''A'''<sup>T</sup>'''A''' is a [[positive-semidefinite matrix]].
|8= <math>(\mathbf{A}^\mathrm{T})^{-1} = (\mathbf{A}^{-1})^\mathrm{T} \,</math>
:The transpose of an invertible matrix is also invertible, and its inverse is the transpose of the inverse of the original matrix. The notation '''A'''<sup>−T</sup> is sometimes used for either of these equivalent expressions.
|9= If '''A''' is a square matrix, then its [[Eigenvalue, eigenvector and eigenspace|eigenvalues]] are equal to the eigenvalues of its transpose, since they share the same [[characteristic polynomial]].
}}

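Several of these identities can be verified numerically. The sketch below, in plain Python (the helper names `transpose` and `matmul` are illustrative, not part of the article), checks property (3), the reversal of factors, on a small rectangular example:

```python
def transpose(a):
    """Transpose a matrix given as a list of rows."""
    return [list(row) for row in zip(*a)]

def matmul(a, b):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

A = [[1, 2], [3, 4], [5, 6]]   # 3 x 2
B = [[7, 8, 9], [10, 11, 12]]  # 2 x 3

# Property (3): transposing a product reverses the order of the factors.
assert transpose(matmul(A, B)) == matmul(transpose(B), transpose(A))
```

The shapes alone show why the order must reverse: '''AB''' is 3 × 3, and on the right-hand side only '''B'''<sup>T</sup>'''A'''<sup>T</sup> (a 3 × 2 times a 2 × 3 product) is even defined.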
== Special transpose matrices ==
A square matrix whose transpose is equal to itself is called a [[symmetric matrix]]; that is, '''A''' is symmetric if
:<math>\mathbf{A}^{\mathrm{T}} = \mathbf{A} .</math>

A square matrix whose transpose is equal to its negative is called a [[skew-symmetric matrix]]; that is, '''A''' is skew-symmetric if
:<math>\mathbf{A}^{\mathrm{T}} = -\mathbf{A} .</math>

A square [[complex number|complex]] matrix whose transpose is equal to the matrix with every entry replaced by its [[complex conjugate]] is called a [[Hermitian matrix]] (equivalently, such a matrix is equal to its [[conjugate transpose]]); that is, '''A''' is Hermitian if
:<math>\mathbf{A}^{\mathrm{T}} = \mathbf{A}^{*} .</math>

A square [[complex number|complex]] matrix whose transpose is equal to the negation of its complex conjugate is called a [[skew-Hermitian matrix]]; that is, '''A''' is skew-Hermitian if
:<math>\mathbf{A}^{\mathrm{T}} = -\mathbf{A}^{*} .</math>

A square matrix whose transpose is equal to its inverse is called an [[orthogonal matrix]]; that is, '''A''' is orthogonal if
:<math>\mathbf{A}^{\mathrm{T}} = \mathbf{A}^{-1} .</math>

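These definitions can be illustrated with small integer matrices. A minimal sketch in plain Python (the helpers are hypothetical names for this illustration, not from the article); it exhibits a symmetric, a skew-symmetric, and an orthogonal matrix, the last being a permutation matrix so that '''P'''<sup>T</sup>'''P''' = '''I''' holds exactly in integer arithmetic:

```python
def transpose(a):
    return [list(row) for row in zip(*a)]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

def identity(n):
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

S = [[1, 2], [2, 3]]    # symmetric: S^T == S
K = [[0, 5], [-5, 0]]   # skew-symmetric: K^T == -K
P = [[0, 1], [1, 0]]    # permutation matrix, hence orthogonal: P^T P == I

assert transpose(S) == S
assert transpose(K) == [[-x for x in row] for row in K]
assert matmul(transpose(P), P) == identity(2)
```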
== Transpose of a linear map ==
{{see also|Dual space#Transpose of a linear map|l1=Dual space'' (section ''Transpose of a linear map'')''}}

The transpose may be defined using a [[coordinate-free]] approach:

If {{nowrap|1=''f'' : ''V'' → ''W''}} is a [[linear operator|linear map]] between [[vector space]]s ''V'' and ''W'' with respective [[dual space]]s ''V''<sup>∗</sup> and ''W''<sup>∗</sup>, the ''transpose'' of ''f'' is the linear map {{nowrap|1=<sup>t</sup>''f'' : ''W''<sup>∗</sup> → ''V''<sup>∗</sup>}} that satisfies
:<math> {}^\mathrm{t} f (\phi ) = \phi \circ f \quad \forall \phi \in W^* .</math>

This definition is independent of any bilinear form on the vector spaces, unlike the adjoint ([[#Adjoint|below]]).

If the matrix ''A'' describes a linear map with respect to [[basis (linear algebra)|bases]] of ''V'' and ''W'', then the matrix ''A''<sup>T</sup> describes the transpose of that linear map with respect to the dual bases.

=== Transpose of a bilinear form ===
{{main|Bilinear form}}

Every linear map to the dual space {{nowrap|1=''f'' : ''V'' → ''V''<sup>∗</sup>}} defines a bilinear form {{nowrap|1=''B'' : ''V'' × ''V'' → ''F''}} via the relation {{nowrap|1=''B''('''v''', '''w''') = ''f''('''v''')('''w''')}}. Defining the transpose of this bilinear form as the bilinear form <sup>t</sup>''B'' given by the transpose {{nowrap|1=<sup>t</sup>''f'' : ''V''<sup>∗∗</sup> → ''V''<sup>∗</sup>}}, i.e. {{nowrap|1=<sup>t</sup>''B''('''w''', '''v''') = <sup>t</sup>''f''('''w''')('''v''')}}, we find that {{nowrap|1=''B''('''v''', '''w''') = <sup>t</sup>''B''('''w''', '''v''')}}.

=== Adjoint ===
{{distinguish|Hermitian adjoint}}

If the vector spaces ''V'' and ''W'' have respective [[nondegenerate form|nondegenerate]] [[bilinear form]]s ''B''<sub>''V''</sub> and ''B''<sub>''W''</sub>, a concept closely related to the transpose – the ''adjoint'' – may be defined:

If {{nowrap|1=''f'' : ''V'' → ''W''}} is a [[linear map]] between [[vector space]]s ''V'' and ''W'', we define ''g'' as the ''adjoint'' of ''f'' if {{nowrap|1=''g'' : ''W'' → ''V''}} satisfies
:<math>B_V(v, g(w)) = B_W(f(v), w) \quad \forall\ v \in V, w \in W .</math>

These bilinear forms define an [[isomorphism]] between ''V'' and ''V''<sup>∗</sup>, and between ''W'' and ''W''<sup>∗</sup>, resulting in an isomorphism between the transpose and the adjoint of ''f''. The matrix of the adjoint of a map is the transposed matrix only if the [[basis (linear algebra)|bases]] are orthonormal with respect to their bilinear forms. In this context, many authors use the term transpose to refer to the adjoint as defined here.

The adjoint allows us to consider whether {{nowrap|1=''g'' : ''W'' → ''V''}} is equal to {{nowrap|1=''f''<sup> −1</sup> : ''W'' → ''V''}}. In particular, this allows the [[orthogonal group]] over a vector space ''V'' with a quadratic form to be defined without reference to matrices (or their components) as the set of all linear maps {{nowrap|''V'' → ''V''}} for which the adjoint equals the inverse.

Over a complex vector space, one often works with [[sesquilinear form]]s (conjugate-linear in one argument) instead of bilinear forms. The [[Hermitian adjoint]] of a map between such spaces is defined similarly, and the matrix of the Hermitian adjoint is given by the conjugate transpose matrix if the bases are orthonormal.

== Implementation of matrix transposition on computers ==
On a [[computer]], one can often avoid explicitly transposing a matrix in [[Random access memory|memory]] by simply accessing the same data in a different order. For example, [[software libraries]] for [[linear algebra]], such as [[BLAS]], typically provide options to specify that certain matrices are to be interpreted in transposed order to avoid the necessity of data movement.

However, there remain a number of circumstances in which it is necessary or desirable to physically reorder a matrix in memory to its transposed ordering. For example, with a matrix stored in [[row-major order]], the rows of the matrix are contiguous in memory and the columns are discontiguous. If repeated operations need to be performed on the columns, for example in a [[fast Fourier transform]] algorithm, transposing the matrix in memory (to make the columns contiguous) may improve performance by increasing [[memory locality]].

{{Main|In-place matrix transposition}}
Ideally, one might hope to transpose a matrix with minimal additional storage. This leads to the problem of transposing an ''n'' × ''m'' matrix [[in-place]], with [[Big O notation|O(1)]] additional storage, or at most with storage much less than ''mn''. For ''n'' ≠ ''m'', this involves a complicated [[permutation]] of the data elements that is non-trivial to implement in-place. Therefore, efficient [[in-place matrix transposition]] has been the subject of numerous research publications in [[computer science]], starting in the late 1950s, and several algorithms have been developed.

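The square case (''n'' = ''m'') is the easy one: in-place transposition is just a swap of each pair of entries across the main diagonal, using O(1) extra storage. A minimal sketch in Python, treating the matrix as a flat row-major list (the function name is an illustrative choice; the hard rectangular case discussed above is deliberately not attempted here):

```python
def transpose_square_inplace(a, n):
    """Transpose an n x n matrix stored as a flat row-major list, in place."""
    for i in range(n):
        for j in range(i + 1, n):
            # Swap the (i, j) and (j, i) entries across the main diagonal.
            a[i * n + j], a[j * n + i] = a[j * n + i], a[i * n + j]

m = [1, 2, 3,
     4, 5, 6,
     7, 8, 9]
transpose_square_inplace(m, 3)
print(m)  # [1, 4, 7, 2, 5, 8, 3, 6, 9]
```

For ''n'' ≠ ''m'' the index map ''k'' ↦ (''k'' mod ''n'')·''m'' + ⌊''k''/''n''⌋ decomposes into cycles of varying length, which is what makes the rectangular in-place problem non-trivial.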
== See also ==
*[[Invertible matrix]]
*[[Moore–Penrose pseudoinverse]]
*[[Projection (linear algebra)]]

== References ==
{{reflist}}

== External links ==
*[http://ocw.mit.edu/OcwWeb/Mathematics/18-06Spring-2005/VideoLectures/detail/lecture05.htm MIT Linear Algebra Lecture on Matrix Transposes]
*[http://mathworld.wolfram.com/Transpose.html Transpose], mathworld.wolfram.com
*[http://planetmath.org/encyclopedia/Transpose.html Transpose], planetmath.org
*[http://khanexercises.appspot.com/video?v=2t0003_sxtU Khan Academy introduction to matrix transposes]

{{linear algebra}}

[[Category:Matrices]]
[[Category:Abstract algebra]]
[[Category:Linear algebra]]

[[de:Matrix (Mathematik)#Die transponierte Matrix]]