In [[linear algebra]], a '''row vector''' or '''row matrix''' is a 1 &times; ''m'' [[matrix (mathematics)|matrix]], i.e. a matrix consisting of a single row of ''m'' elements:<ref>{{harvtxt|Meyer|2000}}, p. 8</ref>
 
:<math>\mathbf x = \begin{bmatrix} x_1 & x_2 & \dots & x_m \end{bmatrix}.</math>
 
The [[transpose]] of a row vector is a [[column vector]]:
:<math>\begin{bmatrix} x_1 & x_2 & \dots & x_m \end{bmatrix}^{\rm T} = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_m \end{bmatrix}.</math>
 
The set of all row vectors forms a [[vector space]] which acts like the [[dual space]] to the set of all column vectors, in the sense that any linear functional on the space of column vectors (i.e. any element of the dual space) can be represented uniquely as a dot product with a specific row vector.
 
== Notation ==
 
Row vectors are sometimes written using the following non-standard notation:
:<math>\mathbf x = \begin{bmatrix} x_1, x_2, \dots, x_m \end{bmatrix}.</math>
 
== Operations ==
 
* [[Matrix multiplication]] involves multiplying each row vector of one matrix by each column vector of another matrix.
 
* The [[dot product]] of two vectors '''a''' and '''b''' is equivalent to multiplying the row vector representation of '''a''' by the column vector representation of '''b''':
 
:<math>\mathbf{a} \cdot \mathbf{b} = \begin{bmatrix}
    a_1  & a_2  & a_3
\end{bmatrix}\begin{bmatrix}
    b_1 \\ b_2 \\ b_3
\end{bmatrix}.</math>
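This identity between the dot product and the row-by-column matrix product can be checked numerically; the following NumPy sketch (not part of the original article, with example values chosen for illustration) treats '''a''' as a 1 &times; 3 matrix and '''b''' as a 3 &times; 1 matrix:

```python
import numpy as np

a = np.array([[1.0, 2.0, 3.0]])      # 1 x 3 row vector
b = np.array([[4.0], [5.0], [6.0]])  # 3 x 1 column vector

# The matrix product of the row with the column is a 1 x 1 matrix
# whose single entry equals the ordinary dot product a . b.
product = a @ b                      # shape (1, 1)
dot = np.dot(a.ravel(), b.ravel())   # 1*4 + 2*5 + 3*6 = 32

assert product.shape == (1, 1)
assert product[0, 0] == dot
```

The 1 &times; 1 result confirms that the dot product is the special case of matrix multiplication where the left factor has one row and the right factor has one column.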
 
==Preferred input vectors for matrix transformations==
Frequently a row vector presents itself for an operation within ''n''-space expressed by an ''n'' &times; ''n'' matrix ''M'':
:''v M = p''.
The result ''p'' is again a row vector, which may in turn be presented to another ''n'' &times; ''n'' matrix ''Q'':
:''p Q = t''.
Conveniently, one can write ''t = p Q = v MQ'', showing that the [[matrix product]] ''MQ'' transforms ''v'' directly to ''t''. Continuing with row vectors, further matrix transformations reconfiguring ''n''-space can be applied to the right of previous outputs.
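A short NumPy sketch (illustrative only, not from the article; the particular matrices are arbitrary examples) shows that applying ''M'' then ''Q'' on the right agrees with applying the single composed matrix ''MQ'':

```python
import numpy as np

v = np.array([[1.0, 2.0]])           # row vector in 2-space
M = np.array([[0.0, 1.0],
              [1.0, 0.0]])           # swaps the two coordinates
Q = np.array([[2.0, 0.0],
              [0.0, 3.0]])           # scales the coordinates by 2 and 3

p = v @ M            # p = v M is again a row vector: [2, 1]
t = p @ Q            # the next transformation is applied on the right

# Composing first gives the same result: t = v (MQ).
assert np.allclose(t, v @ (M @ Q))
```

Because matrix multiplication is associative, the grouping ''(v M) Q = v (MQ)'' holds for any conformable matrices, not just these examples.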
 
In contrast, when a column vector is transformed to become another column vector under the action of an ''n'' &times; ''n'' matrix, the matrix is applied on the left:
: ''p = M v''    and   ''t = Q p'',
leading to the algebraic expression ''QM v'' for the composed output from the input ''v''. With column vectors as input, the matrix transformations accumulate to the left. The natural left-to-right order in which successive transformations are read thus works against the column-vector convention.
 
Nevertheless, using the [[transpose]] operation these differences between inputs of a row or column nature are resolved by an [[antihomomorphism]] between the groups arising on the two sides. The technical construction uses the [[dual space]] associated with a vector space to develop the [[Dual space#Transpose of a linear map|transpose of a linear map]].
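The antihomomorphism is the familiar rule that transposition reverses the order of a product, ''(MQ)''<sup>T</sup> = ''Q''<sup>T</sup>''M''<sup>T</sup>. A NumPy sketch (illustrative, with random matrices; not part of the original article) verifies that the row-vector and column-vector conventions are carried into one another by transposition:

```python
import numpy as np

rng = np.random.default_rng(0)
v = rng.standard_normal((1, 3))   # row vector
M = rng.standard_normal((3, 3))
Q = rng.standard_normal((3, 3))

# Row-vector convention: transformations applied on the right.
t_row = v @ M @ Q

# Transposing converts this to the column-vector convention,
# reversing the order of the factors (an antihomomorphism):
# (v M Q)^T = Q^T M^T v^T.
t_col = Q.T @ M.T @ v.T

assert np.allclose(t_row.T, t_col)
```

Either convention therefore carries the same information; the choice only affects the side on which transformations compose.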
 
For an instance where this row-vector input convention has been used to good effect, see Raiz Usmani,<ref>Raiz A. Usmani (1987) ''Applied Linear Algebra'' [[Marcel Dekker]] ISBN 0824776224. See Chapter 4: "Linear Transformations"</ref> where on page 106 the convention allows the statement "The product mapping ''ST'' of ''U'' into ''W'' [is given] by:
:<math>\alpha (ST) = (\alpha S) T = \beta T = \gamma</math>."
(The Greek letters represent row vectors).
 
[[Ludwik Silberstein]] used row vectors for spacetime events; he applied Lorentz transformation matrices on the right in his [[List of important publications in physics#The Theory of Relativity|Theory of Relativity]] in 1914 (see page 143).
In 1963, when [[McGraw-Hill]] published ''Differential Geometry'' by [[Heinrich Guggenheimer]] of the [[University of Minnesota]], he used the row vector convention in chapter 5, "Introduction to transformation groups" (eqs. 7a, 9b, and 12 to 15). When [[H. S. M. Coxeter]] reviewed<ref>Coxeter [http://www.ams.org/mathscinet/pdf/188842.pdf Review of ''Linear Geometry''] from [[Mathematical Reviews]]</ref> ''Linear Geometry'' by [[Rafael Artzy]], he wrote, "[Artzy] is to be congratulated on his choice of the 'left-to-right' convention, which enables him to regard a point as a row matrix instead of the clumsy column that many authors prefer."
 
== See also ==
* [[Covariance and contravariance of vectors]]
 
== Notes ==
<references/>
 
== References ==
{{see also|Linear algebra#Further reading}}
* {{Citation
| last = Axler
| first = Sheldon Jay
| year = 1997
| title = Linear Algebra Done Right
| publisher = Springer-Verlag
| edition = 2nd
| isbn = 0-387-98259-0
}}
* {{Citation
| last = Lay
| first = David C.
| date = August 22, 2005
| title = Linear Algebra and Its Applications
| publisher = Addison Wesley
| edition = 3rd
| isbn = 978-0-321-28713-7
}}
* {{Citation
| last = Meyer
| first = Carl D.
| date = February 15, 2001
| title = Matrix Analysis and Applied Linear Algebra
| publisher = Society for Industrial and Applied Mathematics (SIAM)
| isbn = 978-0-89871-454-8
| url = http://www.matrixanalysis.com/DownloadChapters.html
}}
* {{Citation
| last = Poole
| first = David
| year = 2006
| title = Linear Algebra: A Modern Introduction
| publisher = Brooks/Cole
| edition = 2nd
| isbn = 0-534-99845-3
}}
* {{Citation
| last = Anton
| first = Howard
| year = 2005
| title = Elementary Linear Algebra (Applications Version)
| publisher = Wiley International
| edition = 9th
}}
* {{Citation
| last = Leon
| first = Steven J.
| year = 2006
| title = Linear Algebra With Applications
| publisher = Pearson Prentice Hall
| edition = 7th
}}
 
[[Category:Linear algebra]]
[[Category:Matrices]]
[[Category:Vectors]]

