Hankel matrix
In [[linear algebra]], a '''Hankel matrix''' (or '''catalecticant matrix'''), named after [[Hermann Hankel]], is a [[square matrix]] with constant skew-diagonals (positively sloping diagonals), e.g.:
:<math>\begin{bmatrix}
a & b & c & d & e \\
b & c & d & e & f \\
c & d & e & f & g \\
d & e & f & g & h \\
e & f & g & h & i \\
\end{bmatrix}.</math>
If the ''i'',''j'' element of ''A'' is denoted ''A''<sub>''i'',''j''</sub>, then we have
:<math>A_{i,j} = A_{i-1,j+1}.\ </math>
The Hankel matrix is closely related to the [[Toeplitz matrix]] (a Hankel matrix is an upside-down Toeplitz matrix). For a special case of this matrix, see the [[Hilbert matrix]].
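These two properties can be checked with a short numerical sketch in Python/NumPy (the helper <code>hankel_from_sequence</code> below is ad hoc rather than a library routine; SciPy offers <code>scipy.linalg.hankel</code> for the same construction):
<syntaxhighlight lang="python">
import numpy as np

def hankel_from_sequence(seq, n):
    """Return the n x n Hankel matrix A with A[i, j] = seq[i + j]."""
    return np.array([[seq[i + j] for j in range(n)] for i in range(n)])

# The 5 x 5 example above, with a, b, c, ... encoded as 0, 1, 2, ...
A = hankel_from_sequence(range(9), 5)

# Constant skew-diagonals: A[i, j] == A[i - 1, j + 1].
assert all(A[i, j] == A[i - 1, j + 1] for i in range(1, 5) for j in range(4))

# Flipping the rows turns the Hankel matrix into a Toeplitz matrix
# (constant diagonals instead of constant skew-diagonals).
T = np.flipud(A)
assert all(T[i, j] == T[i - 1, j - 1] for i in range(1, 5) for j in range(1, 5))
</syntaxhighlight>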
A Hankel [[operator (mathematics)|operator]] on a [[Hilbert space]] is one whose matrix with respect to an [[orthonormal basis]] is a (possibly infinite) Hankel matrix
<math>(A_{i,j})_{i,j \ge 1}</math>, where <math>A_{i,j}</math> depends only on <math>i+j</math>.

The determinant of a Hankel matrix is called a [[catalecticant]].

==Hankel transform==
The '''Hankel transform''' is the name sometimes given to the transformation of a [[sequence]], where the terms of the transformed sequence are the determinants of the Hankel matrices formed from the original sequence. That is, the sequence <math>\{h_n\}_{n\ge 0}</math> is the Hankel transform of the sequence <math>\{b_n\}_{n\ge 0}</math> when
:<math>h_n = \det (b_{i+j-2})_{1 \le i,j \le n+1}.</math>
Here, <math>a_{i,j}=b_{i+j-2}</math> is the Hankel matrix of the sequence <math>\{b_n\}</math>. The Hankel transform is invariant under the [[binomial transform]] of a sequence. That is, if one writes
:<math>c_n = \sum_{k=0}^n {n \choose k} b_k</math>
as the binomial transform of the sequence <math>\{b_n\}</math>, then one has
:<math>\det (b_{i+j-2})_{1 \le i,j \le n+1} = \det (c_{i+j-2})_{1 \le i,j \le n+1}.</math>
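This invariance can be verified numerically; the sketch below uses the [[Catalan number]]s, whose Hankel transform is the all-ones sequence (the helper <code>hankel_determinants</code> is ad hoc, not a standard library routine):
<syntaxhighlight lang="python">
import numpy as np
from math import comb

def hankel_determinants(b, N):
    """h_n = det of the (n+1) x (n+1) Hankel matrix (b[i+j]) for n = 0..N."""
    return [round(np.linalg.det([[b[i + j] for j in range(n + 1)]
                                 for i in range(n + 1)]))
            for n in range(N + 1)]

# Catalan numbers C_0, C_1, ...; their Hankel transform is 1, 1, 1, ...
b = [1, 1, 2, 5, 14, 42, 132, 429, 1430]

# Binomial transform: c_n = sum_k C(n, k) b_k.
c = [sum(comb(n, k) * b[k] for k in range(n + 1)) for n in range(len(b))]

N = 3  # h_N needs b[0], ..., b[2N]
print(hankel_determinants(b, N))  # [1, 1, 1, 1]
print(hankel_determinants(c, N))  # [1, 1, 1, 1] -- the same Hankel transform
</syntaxhighlight>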
== Hankel matrices for system identification ==
Hankel matrices are formed when, given a sequence of output data, a realization of an underlying state-space or hidden Markov model is desired. The [[singular value decomposition]] of the Hankel matrix provides a means of computing the ''A'', ''B'', and ''C'' matrices which define the state-space realization.
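A minimal sketch of this SVD-based construction for a single-input, single-output system follows; the system matrices, the model order <code>n = 2</code>, and the shift-invariance step are illustrative assumptions in the spirit of the Ho-Kalman/eigensystem-realization approach, not a prescribed algorithm:
<syntaxhighlight lang="python">
import numpy as np

# Known system, used here only to generate the impulse-response
# (Markov) parameters h_k = C A^k B that play the role of output data.
A = np.array([[0.9, 0.2], [0.0, 0.5]])
B = np.array([[1.0], [1.0]])
C = np.array([[1.0, 0.0]])

m = 10
h = [(C @ np.linalg.matrix_power(A, k) @ B).item() for k in range(2 * m)]

# Hankel matrix of the impulse response: H[i, j] = h[i + j].
H = np.array([[h[i + j] for j in range(m)] for i in range(m)])

# Truncated SVD at the model order n, splitting H into
# observability-like and controllability-like factors.
U, s, Vt = np.linalg.svd(H)
n = 2
O = U[:, :n] * np.sqrt(s[:n])
R = np.sqrt(s[:n])[:, None] * Vt[:n]

# Read off a realization: C from the first row of O, B from the first
# column of R, and A from the shift-invariance of O.
C_hat = O[:1, :]
B_hat = R[:, :1]
A_hat = np.linalg.pinv(O[:-1]) @ O[1:]

# The recovered (A_hat, B_hat, C_hat) reproduces the Markov parameters.
h_hat = [(C_hat @ np.linalg.matrix_power(A_hat, k) @ B_hat).item()
         for k in range(2 * m)]
assert np.allclose(h, h_hat)
</syntaxhighlight>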
== Orthogonal polynomials on the real line ==
{{Incomplete|section|date=February 2009}}
=== Positive Hankel matrices and the Hamburger moment problem ===
{{further|Hamburger moment problem}}
=== Orthogonal polynomials on the real line ===
=== Tridiagonal model of positive Hankel operators ===

==See also==
* [[Hamburger moment problem]]
* [[Toeplitz matrix]], an 'upside-down' Hankel matrix.
==References==
* {{cite book | title=An introduction to Hankel operators | author=J.R. Partington | authorlink=Jonathan Partington | series=LMS Student Texts | volume=13 | publisher=[[Cambridge University Press]] | year=1988 | isbn=0-521-36791-3 }}

[[Category:Matrices]]
[[Category:Transforms]]

{{Linear-algebra-stub}}