# Rank factorization

Every finite-dimensional matrix has a rank decomposition: let ${\displaystyle A}$ be an ${\displaystyle m\times n}$ matrix whose column rank is ${\displaystyle r}$. Then ${\displaystyle A}$ has ${\displaystyle r}$ linearly independent columns; equivalently, the dimension of the column space of ${\displaystyle A}$ is ${\displaystyle r}$. Let ${\displaystyle c_{1},c_{2},\ldots ,c_{r}}$ be any basis for the column space of ${\displaystyle A}$ and place the basis vectors as columns of the ${\displaystyle m\times r}$ matrix ${\displaystyle C=[c_{1}:c_{2}:\ldots :c_{r}]}$. Then every column of ${\displaystyle A}$ is a linear combination of the columns of ${\displaystyle C}$. To be precise, if ${\displaystyle A=[a_{1}:a_{2}:\ldots :a_{n}]}$ is an ${\displaystyle m\times n}$ matrix with ${\displaystyle a_{j}}$ as the ${\displaystyle j}$-th column, then

${\displaystyle a_{j}=f_{1j}c_{1}+f_{2j}c_{2}+\cdots +f_{rj}c_{r},}$

where the ${\displaystyle f_{ij}}$ are the scalar coefficients of ${\displaystyle a_{j}}$ in terms of the basis ${\displaystyle c_{1},c_{2},\ldots ,c_{r}}$. This implies that ${\displaystyle A=CF}$, where ${\displaystyle F}$ is the ${\displaystyle r\times n}$ matrix whose ${\displaystyle (i,j)}$-th element is ${\displaystyle f_{ij}}$.
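The decomposition above can be computed exactly with SymPy. The following is a minimal sketch using a hypothetical ${\displaystyle 3\times 4}$ matrix of rank 2 (not from the text): the pivot columns of ${\displaystyle A}$ serve as the basis ${\displaystyle C}$, and the nonzero rows of the reduced row echelon form supply the coefficients ${\displaystyle F}$.

```python
import sympy as sp

# Hypothetical 3x4 matrix of rank 2: the third row is the sum of the first two.
A = sp.Matrix([[1, 2, 0, 1],
               [0, 1, 1, 1],
               [1, 3, 1, 2]])

B, pivots = A.rref()          # reduced row echelon form and pivot column indices
C = A[:, list(pivots)]        # m x r: pivot columns of A, a basis of the column space
F = B[:len(pivots), :]        # r x n: nonzero rows of the RREF hold the coefficients

assert A.rank() == len(pivots) == 2
assert C * F == A             # the rank factorization A = C F
```

Exact rational arithmetic is used here so the factorization holds with no rounding error; a floating-point version would need a tolerance-based comparison instead.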

## rank(${\displaystyle A}$) = rank(${\displaystyle A^{\text{T}}}$)

An immediate consequence of rank factorization is that the rank of ${\displaystyle A}$ is equal to the rank of its transpose ${\displaystyle A^{\text{T}}}$. Since the columns of ${\displaystyle A}$ are the rows of ${\displaystyle A^{\text{T}}}$, the column rank of ${\displaystyle A}$ equals its row rank.

Proof: Define rank to mean column rank. Since ${\displaystyle A=CF}$, it follows that ${\displaystyle A^{\text{T}}=F^{\text{T}}C^{\text{T}}}$. From the definition of matrix multiplication, each column of ${\displaystyle A^{\text{T}}}$ is a linear combination of the columns of ${\displaystyle F^{\text{T}}}$. Therefore, the column space of ${\displaystyle A^{\text{T}}}$ is contained in the column space of ${\displaystyle F^{\text{T}}}$, and hence rank(${\displaystyle A^{\text{T}}}$) ≤ rank(${\displaystyle F^{\text{T}}}$). Now, ${\displaystyle F^{\text{T}}}$ is ${\displaystyle n\times r}$, so it has only ${\displaystyle r}$ columns, and hence rank(${\displaystyle A^{\text{T}}}$) ≤ ${\displaystyle r}$ = rank(${\displaystyle A}$). Applying the same argument to ${\displaystyle A^{\text{T}}}$ and using ${\displaystyle (A^{\text{T}})^{\text{T}}=A}$ gives the reverse inequality: rank(${\displaystyle A}$) = rank(${\displaystyle (A^{\text{T}})^{\text{T}}}$) ≤ rank(${\displaystyle A^{\text{T}}}$). Combining the two inequalities yields rank(${\displaystyle A}$) = rank(${\displaystyle A^{\text{T}}}$). (See also the first proof of column rank = row rank under rank.)
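As a quick numerical sanity check of the identity rank(${\displaystyle A}$) = rank(${\displaystyle A^{\text{T}}}$), one can construct a rectangular matrix of known rank and compare the two ranks; the matrix below is an arbitrary illustration, not taken from the text.

```python
import numpy as np

# Build a 5x7 matrix of rank (at most) 3 as a product of a 5x3 and a 3x7 factor;
# for generic Gaussian factors the rank is exactly 3.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3)) @ rng.standard_normal((3, 7))

assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T) == 3
```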

## Rank factorization from reduced row echelon forms

In practice, we can construct one specific rank factorization as follows: we can compute ${\displaystyle B}$, the reduced row echelon form of ${\displaystyle A}$. Then ${\displaystyle C}$ is obtained by removing from ${\displaystyle A}$ all non-pivot columns, and ${\displaystyle F}$ by eliminating all zero rows of ${\displaystyle B}$.

## Example

Consider the matrix

${\displaystyle A={\begin{bmatrix}1&3&1&4\\2&7&3&9\\1&5&3&1\\1&2&0&8\end{bmatrix}}\sim {\begin{bmatrix}1&0&-2&0\\0&1&1&0\\0&0&0&1\\0&0&0&0\end{bmatrix}}=B{\text{.}}}$

${\displaystyle B}$ is in reduced row echelon form. Then ${\displaystyle C}$ is obtained by removing the third column of ${\displaystyle A}$, the only one that is not a pivot column, and ${\displaystyle F}$ by deleting the last row of zeros from ${\displaystyle B}$, so

${\displaystyle C={\begin{bmatrix}1&3&4\\2&7&9\\1&5&1\\1&2&8\end{bmatrix}}{\text{,}}\qquad F={\begin{bmatrix}1&0&-2&0\\0&1&1&0\\0&0&0&1\end{bmatrix}}{\text{.}}}$

It is straightforward to check that

${\displaystyle A={\begin{bmatrix}1&3&1&4\\2&7&3&9\\1&5&3&1\\1&2&0&8\end{bmatrix}}={\begin{bmatrix}1&3&4\\2&7&9\\1&5&1\\1&2&8\end{bmatrix}}{\begin{bmatrix}1&0&-2&0\\0&1&1&0\\0&0&0&1\end{bmatrix}}=CF{\text{.}}}$
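The check above is easy to carry out numerically; the following reproduces the matrices of the worked example and verifies both the product ${\displaystyle CF=A}$ and that the rank of ${\displaystyle A}$ is 3, matching the three pivot columns.

```python
import numpy as np

# Matrices from the worked example.
A = np.array([[1, 3, 1, 4],
              [2, 7, 3, 9],
              [1, 5, 3, 1],
              [1, 2, 0, 8]])
C = np.array([[1, 3, 4],
              [2, 7, 9],
              [1, 5, 1],
              [1, 2, 8]])
F = np.array([[1, 0, -2, 0],
              [0, 1,  1, 0],
              [0, 0,  0, 1]])

assert (C @ F == A).all()                 # C F reproduces A exactly
assert np.linalg.matrix_rank(A) == 3      # rank equals the number of pivot columns
```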

## Proof

Let ${\displaystyle P}$ be an ${\displaystyle n\times n}$ permutation matrix such that ${\displaystyle AP=(C,D)}$ in block partitioned form, where the columns of ${\displaystyle C}$ are the ${\displaystyle r}$ pivot columns of ${\displaystyle A}$. Every column of ${\displaystyle D}$ is a linear combination of the columns of ${\displaystyle C}$, so there is a matrix ${\displaystyle G}$ such that ${\displaystyle D=CG}$, where the columns of ${\displaystyle G}$ contain the coefficients of each of those linear combinations. So ${\displaystyle AP=(C,CG)=C(I_{r},G)}$, ${\displaystyle I_{r}}$ being the ${\displaystyle r\times r}$ identity matrix. We will show now that ${\displaystyle (I_{r},G)=FP}$.

Transforming ${\displaystyle AP}$ into its reduced row echelon form amounts to left-multiplying by a matrix ${\displaystyle E}$ which is a product of elementary matrices, so ${\displaystyle EAP=BP=EC(I_{r},G)}$, where ${\displaystyle EC={\begin{pmatrix}I_{r}\\0\end{pmatrix}}}$. We can then write ${\displaystyle BP={\begin{pmatrix}I_{r}&G\\0&0\end{pmatrix}}}$, which allows us to identify ${\displaystyle (I_{r},G)=FP}$, i.e. the ${\displaystyle r}$ nonzero rows of the reduced row echelon form, with the same permutation applied to the columns as was applied to ${\displaystyle A}$. We thus have ${\displaystyle AP=CFP}$, and since ${\displaystyle P}$ is invertible this implies ${\displaystyle A=CF}$, and the proof is complete.
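The key identity of the proof, ${\displaystyle (I_{r},G)=FP}$, can be checked concretely on the matrix from the worked example. The sketch below builds the permutation ${\displaystyle P}$ that moves the pivot columns of ${\displaystyle A}$ to the front, recovers ${\displaystyle G}$ from ${\displaystyle D=CG}$ via the pseudoinverse (exact here, since ${\displaystyle C}$ has full column rank and the arithmetic is rational), and compares both sides.

```python
import sympy as sp

# Matrix from the worked example in the text.
A = sp.Matrix([[1, 3, 1, 4],
               [2, 7, 3, 9],
               [1, 5, 3, 1],
               [1, 2, 0, 8]])
B, pivots = A.rref()                       # pivots == (0, 1, 3)
r = len(pivots)
nonpivots = [j for j in range(A.cols) if j not in pivots]

# Permutation matrix P: column j of A*P is column order[j] of A,
# so the pivot columns come first.
order = list(pivots) + nonpivots
P = sp.zeros(A.cols, A.cols)
for new, old in enumerate(order):
    P[old, new] = 1

C = A[:, list(pivots)]                     # pivot columns of A
F = B[:r, :]                               # nonzero rows of the RREF
AP = A * P
assert AP[:, :r] == C                      # AP = (C, D) in block form

G = C.pinv() * AP[:, r:]                   # solve D = C G exactly
assert sp.Matrix.hstack(sp.eye(r), G) == F * P   # (I_r, G) = F P
assert A == C * F                          # the resulting rank factorization
```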
