Version 1.0.1
User documentation, 3rd May 2023
HSL_MP82: Tall-skinny QR factorization
HSL_MP82 computes a QR or SVD factorization of a tall-skinny matrix \(A\) distributed across a set of processors, using communication-avoiding algorithms.

For the QR factorization, \[A = QR\] is computed, where \(Q\) has orthonormal columns and \(R\) is an upper triangular matrix. Three methods are available, depending on the conditioning of \(A\):

(1) The TSQR method, which is unconditionally accurate and produces a \(Q\) factor that can either be formed explicitly (with orthonormal columns) or stored implicitly as a set of Householder transformations. The implemented TSQR is based on a butterfly tree reduction process.

(2) The CholQR2 method, which is stable as long as the condition number of \(A\) is smaller than \({\bf u}^{-1/2}\), where \({\bf u}\) is the computing precision.

(3) The shifted Cholesky QR method, which is stable as long as the condition number of \(A\) is smaller than \({\bf u}^{-1}\).

For the economy-size SVD, \[A = U \Sigma V^T\] is computed using the GramSVD method, where \(U\) and \(V\) have orthonormal columns and \(\Sigma\) is a diagonal matrix containing the singular values of \(A\). GramSVD provides a stable decomposition as long as the condition number of \(A\) is smaller than \({\bf u}^{-1/2}\), where \({\bf u}\) is the computing precision.
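To illustrate the CholQR2 idea, the following is a minimal serial sketch in NumPy (not the HSL_MP82 interface, and function names are hypothetical): a single Cholesky QR step obtains \(R\) from the Cholesky factor of the Gram matrix \(A^T A\), and a second pass restores the orthogonality lost in the first.

```python
import numpy as np

def cholesky_qr(A):
    # One Cholesky QR step: R is the (upper-triangular) Cholesky
    # factor of the Gram matrix G = A^T A, and Q = A R^{-1}.
    G = A.T @ A
    R = np.linalg.cholesky(G).T          # upper-triangular R
    Q = np.linalg.solve(R.T, A.T).T      # Q = A @ inv(R) via a triangular solve
    return Q, R

def cholesky_qr2(A):
    # CholQR2: repeating the step on Q1 restores orthogonality,
    # giving a stable factorization when cond(A) < u^{-1/2}.
    Q1, R1 = cholesky_qr(A)
    Q, R2 = cholesky_qr(Q1)
    return Q, R2 @ R1
```

In a distributed setting, only the small \(n \times n\) Gram matrix needs to be reduced across processors, which is what makes the method communication-avoiding.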
The package also provides a subroutine to apply the implicit \(Q\) factor computed by the TSQR algorithm to a matrix \(B\).
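The implicit representation can be sketched as follows. This serial NumPy emulation (function names are hypothetical, and it uses a single two-block reduction rather than the package's butterfly tree) stores the local Q factors of a two-block TSQR and applies them to form \(QC\) without ever assembling \(Q\) as a dense matrix:

```python
import numpy as np

def tsqr_two_blocks(A1, A2):
    # Local QR on each row block (one block per "processor").
    Q1, R1 = np.linalg.qr(A1)
    Q2, R2 = np.linalg.qr(A2)
    # Reduction step: QR of the stacked n x n local R factors.
    Qr, R = np.linalg.qr(np.vstack([R1, R2]))
    # Implicit representation of Q: the triple (Q1, Q2, Qr).
    return (Q1, Q2, Qr), R

def apply_q(factors, C):
    # Form Q @ C from the implicit factors, where C has n rows.
    Q1, Q2, Qr = factors
    n = C.shape[0]
    T = Qr @ C                       # combine with the reduction-step factor
    top, bot = T[:n], T[n:]
    # Each block's local factor acts on its own slice of rows.
    return np.vstack([Q1 @ top, Q2 @ bot])
```

Applying the stored factors in this order reproduces \(Q = \mathrm{diag}(Q_1, Q_2)\,Q_r\), so \(Q\) (or \(QB\)) is available without the memory and communication cost of the explicit factor.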