3/28/2022

QR: Householder


So far, we have only shown existence, but how do we compute QR decompositions?


Householder Reflections


A Householder reflection is $H=I-2vv'$, where $\|v\|_2=1$; it reflects vectors across the hyperplane orthogonal to $v$.

Lemma. Householder reflections are orthogonal matrices

Proof. $~$ HOMEWORK
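As a quick numerical sanity check (a NumPy sketch, not a substitute for the homework proof), we can build $H=I-2vv'$ for a random unit vector $v$ and confirm that it is orthogonal and that it reflects across the hyperplane orthogonal to $v$:

```python
import numpy as np

rng = np.random.default_rng(0)
v = rng.standard_normal(5)
v /= np.linalg.norm(v)                  # ||v||_2 = 1
H = np.eye(5) - 2 * np.outer(v, v)      # H = I - 2vv'

print(np.allclose(H.T @ H, np.eye(5)))  # orthogonal: H'H = I
print(np.allclose(H @ v, -v))           # the normal direction v is flipped ...
w = rng.standard_normal(5)
w -= (w @ v) * v                        # ... while any w orthogonal to v
print(np.allclose(H @ w, w))            #     lies in the hyperplane and is fixed
```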

We use a series of Householder reflections to reduce $A\Pi$ to an upper triangular matrix (here $\Pi$ is a column permutation), and the resulting product of Householder matrices gives us $Q$, i.e. $$\underbrace{H_r\cdots H_1}_{Q'}A\Pi=R$$

First, find $H_1$ that makes every entry below $R_{11}$ in the first column zero:
\begin{align*}
H_1a_1&=R_{11}e_1\\
R_{11}e_1&=a_1-2v_1v_1'a_1\\
v_1(\underbrace{2v_1'a_1}_{\text{scalar}})&=a_1-R_{11}e_1\\
\implies~v_1&=\frac{a_1-R_{11}e_1}{\|a_1-R_{11}e_1\|_2}\\
\implies~R_{11}&=\pm\|a_1\|_2\quad\text{(an orthogonal $H_1$ preserves the 2-norm)}\\
v_1&=\frac{a_1-\|a_1\|_2\,e_1}{\bigl\|a_1-\|a_1\|_2\,e_1\bigr\|_2}\quad\text{(taking the $+$ sign)}\\
\implies~H_1&=I-2\,\frac{(a_1-\|a_1\|_2\,e_1)(a_1-\|a_1\|_2\,e_1)'}{\bigl\|a_1-\|a_1\|_2\,e_1\bigr\|_2^2}
\end{align*}

Then, we have
$$H_1A\Pi~=~\begin{bmatrix}R_{11}&\ast&\cdots&\ast\\0&&&\\ \vdots&&\tilde{A}_2&\\0&&&\end{bmatrix}$$
and we then repeat the process on the trailing block $\tilde{A}_2$.
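To make this first step concrete, here is a minimal NumPy sketch (random $A$, no column pivoting, and the $+$ sign for $\|a_1\|_2$ to match the derivation above; in practice the sign is chosen to avoid cancellation):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))          # stand-in full-column-rank A; Pi = I here

a1 = A[:, 0]
e1 = np.zeros(6); e1[0] = 1.0
w = a1 - np.linalg.norm(a1) * e1         # a_1 - ||a_1||_2 e_1
v1 = w / np.linalg.norm(w)
H1 = np.eye(6) - 2 * np.outer(v1, v1)    # H_1 = I - 2 v_1 v_1'

B = H1 @ A
print(np.round(B[:, 0], 12))             # first column: R_11 on top, zeros below
# B[1:, 1:] is the block Ã_2 on which the same step is repeated.
```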

Givens Rotations

A Givens rotation $G^{(i,j)}$ is the identity matrix except
- $G^{(i,j)}_{ii}=G^{(i,j)}_{jj}=\lambda$
- $G^{(i,j)}_{ij}=\sigma$
- $G^{(i,j)}_{ji}=-\sigma$

$$\text{for example,}\quad G^{(2,4)}=\begin{bmatrix}1&0&0&0\\0&\lambda&0&\sigma\\0&0&1&0\\0&-\sigma&0&\lambda\end{bmatrix}$$
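A small NumPy sketch of this example (taking $\lambda=\cos\theta$, $\sigma=\sin\theta$ for an arbitrary angle $\theta$, so that $\lambda^2+\sigma^2=1$) checks numerically that $G^{(2,4)}$ is orthogonal:

```python
import numpy as np

theta = 0.3                               # any angle gives lambda^2 + sigma^2 = 1
lam, sig = np.cos(theta), np.sin(theta)

G = np.eye(4)                             # identity except in rows/columns 2 and 4
G[1, 1] = G[3, 3] = lam                   # (0-indexed positions 1 and 3)
G[1, 3] = sig
G[3, 1] = -sig

print(np.allclose(G.T @ G, np.eye(4)))    # True: G'G = I
```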


HOMEWORK $~$ Show that a Givens rotation is orthogonal when $\sigma^2+\lambda^2=1$

We can also use Givens rotations to compute the decomposition, so that $$\underbrace{G_T\cdots G_1}_{Q'}A\Pi=R$$

To find $G_1$, we look for a Givens rotation $G^{(1,2)}$ that will make the entry in row 2, column 1 of the product equal to zero. Consider the product with the following matrix $M$:
$$\begin{bmatrix}\lambda&\sigma\\-\sigma&\lambda\end{bmatrix}\begin{bmatrix}M_{11}&M_{12}&\cdots&M_{1m}\\M_{21}&M_{22}&\cdots&M_{2m}\end{bmatrix}=\begin{bmatrix}\lambda M_{11}+\sigma M_{21}&\lambda M_{12}+\sigma M_{22}&\cdots&\lambda M_{1m}+\sigma M_{2m}\\-\sigma M_{11}+\lambda M_{21}&-\sigma M_{12}+\lambda M_{22}&\cdots&-\sigma M_{1m}+\lambda M_{2m}\end{bmatrix}$$

To find $G_1$, we solve
$$-\sigma M_{11}+\lambda M_{21}=0\quad\text{s.t.}\quad\sigma^2+\lambda^2=1\qquad\implies\qquad\lambda=\frac{M_{11}}{\sqrt{M_{11}^2+M_{21}^2}}\quad\text{and}\quad\sigma=\frac{M_{21}}{\sqrt{M_{11}^2+M_{21}^2}}$$

Now, we can repeat this process to successively make all entries below the diagonal zero, giving us a QR decomposition.
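As a sketch of a single step, take a random $2\times m$ block $M$ (a stand-in for the two rows being rotated); the $\lambda$ and $\sigma$ above zero out the $(2,1)$ entry:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((2, 5))           # the two rows being rotated

r = np.hypot(M[0, 0], M[1, 0])            # sqrt(M11^2 + M21^2)
lam, sig = M[0, 0] / r, M[1, 0] / r

G = np.array([[ lam, sig],
              [-sig, lam]])
print(np.round(G @ M, 12))                # the (2,1) entry is now exactly 0
```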

HOMEWORK

  1. Determine the computational complexity for QR decomposition using
    • Gram-Schmidt
    • Modified Gram-Schmidt
    • Householder reflections
    • Givens rotations
  2. Compare the complexity of Householder vs Givens for a sparse matrix
  3. Implement QR decomposition using Householder reflections (input: a matrix $A$ of full column rank; output: $Q$, $R$)
  4. Repeat 3 using Givens rotations

$$~$$

'Large' data least squares

$A\in\mathbb{R}^{n\times m}$, where $n\gg m$ (which is a different situation than $m\gg n$)

Optional reference: 'Incremental QR Decomposition', by Gentleman.

We can apply the QR decomposition to solving the least squares problem $Ax=b$, since
\begin{align*}
\hat{x}&=(A'A)^{-1}A'b\\
&=((QR)'(QR))^{-1}(QR)'b\\
&=(R'\cancel{Q'Q}R)^{-1}R'Q'b\\
&=(R'R)^{-1}R'Q'b\\
&=R^{-1}\cancel{(R')^{-1}R'}Q'b\\
&=R^{-1}Q'b
\end{align*}
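As a small NumPy illustration (random $A$ and $b$), the QR-based formula reproduces the usual least-squares solution; in practice we solve the triangular system $R\hat{x}=Q'b$ rather than forming $R^{-1}$ explicitly:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 5))          # n >> m, full column rank
b = rng.standard_normal(200)

Q, R = np.linalg.qr(A)                     # thin QR: Q is 200x5, R is 5x5
x_qr = np.linalg.solve(R, Q.T @ b)         # x_hat = R^{-1} Q'b via a triangular solve
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)

print(np.allclose(x_qr, x_ls))             # True: same least-squares solution
```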

We need to find $R^{-1}$ and $Q'b$. To do so, consider just the first $m+1$ rows of the matrix $\begin{bmatrix}A&b\end{bmatrix}$, and perform a QR decomposition to get $$\begin{bmatrix}\tilde{R}&\tilde{Q}'\tilde{b}\\0&\tilde{s}\end{bmatrix}$$ Then, we add one row at a time to the bottom and apply Givens rotations, so that $$\begin{bmatrix}\tilde{R}&\tilde{Q}'\tilde{b}\\0&\tilde{s}\\a_{m+2}&b_{m+2}\end{bmatrix}~~~\implies~~~\begin{bmatrix}\tilde{R}_{m+2}&\tilde{Q}_{m+2}'\tilde{b}_{m+2}\\0&\tilde{s}_{m+2}\\0&0\end{bmatrix}$$
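Below is a minimal sketch of this row-by-row update (the helper `add_row` and the plain two-row Givens loop are my own illustration under the setup above, not Gentleman's exact algorithm): start from the triangular factor of the first $m+1$ rows of $\begin{bmatrix}A&b\end{bmatrix}$, then fold in each new row with one Givens rotation per column and read off $\hat{x}=\tilde{R}^{-1}(\tilde{Q}'\tilde{b})$ at the end.

```python
import numpy as np

def add_row(T, row):
    """Fold one new row [a', b] into the triangular factor T = [R  Q'b; 0  s]
    with Givens rotations, so the result is upper triangular again."""
    T = np.vstack([T, row])
    last = T.shape[0] - 1                     # index of the appended row
    for k in range(T.shape[1]):               # zero the new row column by column
        r = np.hypot(T[k, k], T[last, k])
        if r == 0.0:
            continue
        lam, sig = T[k, k] / r, T[last, k] / r
        top, bot = T[k, :].copy(), T[last, :].copy()
        T[k, :] = lam * top + sig * bot
        T[last, :] = -sig * top + lam * bot   # makes T[last, k] = 0
    return T

rng = np.random.default_rng(0)
m = 4
A = rng.standard_normal((50, m))
b = rng.standard_normal(50)
Ab = np.column_stack([A, b])

# Triangular factor of the first m+1 rows, then stream in the remaining rows.
T = np.linalg.qr(Ab[:m + 1], mode='r')        # (m+1) x (m+1): [R  Q'b; 0  s]
for i in range(m + 1, 50):
    T = add_row(T, Ab[i])[:m + 1]             # appended row becomes all zeros; drop it

x_inc = np.linalg.solve(T[:m, :m], T[:m, m])  # x_hat = R^{-1} Q'b from the updated factor
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_inc, x_ls))               # True: matches the batch solution
```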

$$~$$

Householder reflections and QR decomposition (R package pracma)

The R package pracma provides a function householder(A), where A is a numeric matrix with nrow(A) >= ncol(A). The Householder method applies a succession of elementary unitary matrices to the left of the matrix A; these matrices are the so-called Householder reflections. The function returns a list with two matrices Q and R, Q orthonormal and R upper triangular, such that A = Q %*% R. (Documentation reproduced from package pracma, version 1.9.9, License: GPL (>= 3).)

Reference: Trefethen, L. N., and D. Bau III (1997). Numerical Linear Algebra. SIAM, Society for Industrial and Applied Mathematics, Philadelphia.
