In [[linear algebra]], a '''row vector''' or '''row matrix''' is a 1 × ''m'' [[matrix (mathematics)|matrix]], i.e. a matrix consisting of a single row of ''m'' elements:<ref>{{harvtxt|Meyer|2000}}, p. 8</ref>
:<math>\mathbf x = \begin{bmatrix} x_1 & x_2 & \dots & x_m \end{bmatrix}.</math>

The [[transpose]] of a row vector is a [[column vector]]:

:<math>\begin{bmatrix} x_1 \; x_2 \; \dots \; x_m \end{bmatrix}^{\rm T} = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_m \end{bmatrix}.</math>
The set of all row vectors forms a [[vector space]] that acts as the [[dual space]] to the set of all column vectors, in the sense that any linear functional on the space of column vectors (i.e. any element of the dual space) can be represented uniquely as a dot product with a specific row vector.
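This duality can be checked numerically. The following sketch (an illustration added here, with made-up values, not drawn from the article's references) uses NumPy two-dimensional arrays to represent row and column vectors as 1 × ''m'' and ''m'' × 1 matrices:

```python
import numpy as np

# A row vector is a 1 x m matrix.
x = np.array([[1.0, 2.0, 3.0]])   # shape (1, 3)

# Its transpose is a column vector, an m x 1 matrix.
x_col = x.T                       # shape (3, 1)

# A linear functional on column vectors is realized as left
# multiplication by a fixed row vector: f(v) = x v, a 1 x 1 result.
v = np.array([[4.0], [5.0], [6.0]])
f_v = x @ v                       # [[32.]] since 1*4 + 2*5 + 3*6 = 32
print(x_col.shape, f_v[0, 0])
```

Any linear functional on 3-dimensional column vectors arises this way from exactly one choice of the row vector `x`.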
== Notation ==

Row vectors are sometimes written using the following non-standard notation:

:<math>\mathbf x = \begin{bmatrix} x_1, x_2, \dots, x_m \end{bmatrix}.</math>
== Operations ==

* [[Matrix multiplication]] involves multiplying each row vector of the first matrix by each column vector of the second matrix.
* The [[dot product]] of two vectors '''a''' and '''b''' is equivalent to multiplying the row vector representation of '''a''' by the column vector representation of '''b''':

:<math>\mathbf{a} \cdot \mathbf{b} = \begin{bmatrix} a_1 & a_2 & a_3 \end{bmatrix}\begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix} = a_1 b_1 + a_2 b_2 + a_3 b_3.</math>
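A minimal numerical sketch of this identity (the particular entries are illustrative assumptions), again using NumPy: the matrix product of a 1 × 3 row and a 3 × 1 column is a 1 × 1 matrix whose single entry is the dot product.

```python
import numpy as np

a = np.array([[1, 2, 3]])        # row vector representation of a (1 x 3)
b = np.array([[4], [5], [6]])    # column vector representation of b (3 x 1)

# (1 x 3) times (3 x 1) gives a 1 x 1 matrix; its entry is a . b.
product = a @ b
print(product.shape, product[0, 0])   # (1, 1) 32
```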
== Preferred input vectors for matrix transformations ==
Frequently a row vector presents itself for an operation within ''n''-space expressed by an ''n'' × ''n'' matrix ''M'':

:''v M = p''.

Then ''p'' is also a row vector and may present to another ''n'' × ''n'' matrix ''Q'':

:''p Q = t''.
Conveniently, one can write ''t = p Q = v M Q'', which shows that the [[matrix product]] transformation ''M Q'' takes ''v'' directly to ''t''. Continuing with row vectors, further matrix transformations of ''n''-space can be applied to the right of previous outputs.
In contrast, when a column vector is transformed to become another column under the action of an ''n'' × ''n'' matrix, the operation occurs to the left:

:''p = M v'' and ''t = Q p'',
leading to the algebraic expression ''Q M v'' for the composed output from input ''v''. With a column vector as input, the matrix transformations accumulate to the left; the natural bias toward reading left to right, as subsequent transformations are applied in linear algebra, thus stands against column vector inputs.
Nevertheless, using the [[transpose]] operation, these differences between row and column inputs are resolved by an [[antihomomorphism]] between the groups arising on the two sides. The technical construction uses the [[dual space]] associated with a vector space to develop the [[Dual space#Transpose of a linear map|transpose of a linear map]].
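A small NumPy sketch, assuming illustrative random matrices, of how transposition converts between the two conventions: since (''v M Q'')<sup>T</sup> = ''Q''<sup>T</sup> ''M''<sup>T</sup> ''v''<sup>T</sup>, the transpose reverses the order of the factors, which is the antihomomorphism referred to above.

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.integers(-5, 5, size=(3, 3))
Q = rng.integers(-5, 5, size=(3, 3))
v = rng.integers(-5, 5, size=(1, 3))   # row vector input

# Row-vector convention: maps apply on the right, composing
# left to right: t = ((v M) Q) = v (M Q).
t_row = v @ M @ Q

# Column-vector convention: the same maps apply on the left of the
# column v^T, with transposed matrices, in the reverse order.
t_col = Q.T @ M.T @ v.T

# The two conventions agree up to transposition.
assert np.array_equal(t_row, t_col.T)
print(t_row)
```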
| | |
| For an instance where this row vector input convention has been used to good effect see Raiz Usmani,<ref>Raiz A. Usmani (1987) ''Applied Linear Algebra'' [[Marcel Dekker]] ISBN 0824776224. See Chapter 4: "Linear Transformations"</ref> where on page 106 the convention allows the statement "The product mapping ''ST'' of ''U'' into ''W'' [is given] by:
| |
| :<math>\alpha (ST) = (\alpha S) T = \beta T = \gamma</math>."
| |
| (The Greek letters represent row vectors).
| |
| | |
| [[Ludwik Silberstein]] used row vectors for spacetime events; he applied Lorentz transformation matrices on the right in his [[List of important publications in physics#The Theory of Relativity|Theory of Relativity]] in 1914 (see page 143).
| |
| In 1963 when [[McGraw-Hill]] published ''Differential Geometry'' by [[Heinrich Guggenheimer]] of the [[University of Minnesota]], he uses the row vector convention in chapter 5, "Introduction to transformation groups" (eqs. 7a,9b and 12 to 15). When [[H. S. M. Coxeter]] reviewed<ref>Coxeter [http://www.ams.org/mathscinet/pdf/188842.pdf Review of ''Linear Geometry''] from [[Mathematical Reviews]]</ref> ''Linear Geometry'' by [[Rafael Artzy]], he wrote, "[Artzy] is to be congratulated on his choice of the 'left-to-right' convention, which enables him to regard a point as a row matrix instead of the clumsy column that many authors prefer."
| |
== See also ==
* [[Covariance and contravariance of vectors]]

== Notes ==
<references/>

== References ==
{{see also|Linear algebra#Further reading}}
* {{Citation | last = Axler | first = Sheldon Jay | year = 1997 | title = Linear Algebra Done Right | publisher = Springer-Verlag | edition = 2nd | isbn = 0-387-98259-0}}
* {{Citation | last = Lay | first = David C. | date = August 22, 2005 | title = Linear Algebra and Its Applications | publisher = Addison Wesley | edition = 3rd | isbn = 978-0-321-28713-7}}
* {{Citation | last = Meyer | first = Carl D. | date = February 15, 2001 | title = Matrix Analysis and Applied Linear Algebra | publisher = Society for Industrial and Applied Mathematics (SIAM) | isbn = 978-0-89871-454-8 | url = http://www.matrixanalysis.com/DownloadChapters.html}}
* {{Citation | last = Poole | first = David | year = 2006 | title = Linear Algebra: A Modern Introduction | publisher = Brooks/Cole | edition = 2nd | isbn = 0-534-99845-3}}
* {{Citation | last = Anton | first = Howard | year = 2005 | title = Elementary Linear Algebra (Applications Version) | publisher = Wiley International | edition = 9th}}
* {{Citation | last = Leon | first = Steven J. | year = 2006 | title = Linear Algebra With Applications | publisher = Pearson Prentice Hall | edition = 7th}}
[[Category:Linear algebra]]
[[Category:Matrices]]
[[Category:Vectors]]