MADlib 1.1 User Documentation
In linear algebra, the singular value decomposition (SVD) is a factorization of a real or complex matrix, with many useful applications in signal processing and statistics.
The linalg module consists of useful utility functions for basic linear algebra operations. Several of these functions can be used while implementing new algorithms.
Refer to the file for documentation on each of the utility functions.
Linear-algebra functions.
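For instance, a couple of these utilities can be called directly from SQL; the examples below assume MADlib is installed in the madlib schema, and the availability and signatures of the functions should be checked against the linalg reference for the installed version:

    -- Euclidean norm of a vector: returns 5
    SELECT madlib.norm2(ARRAY[3, 4]::FLOAT8[]);

    -- Euclidean distance between two vectors: returns 5
    SELECT madlib.dist_norm2(ARRAY[1, 2]::FLOAT8[], ARRAY[4, 6]::FLOAT8[]);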
Let \(A\) be an \(m \times n\) matrix, where \(m \ge n\). Then \(A\) can be decomposed as follows:
\[ A = U \Sigma V^T, \]
where \(U\) is an \(m \times n\) orthonormal matrix, \(\Sigma\) is an \(n \times n\) diagonal matrix, and \(V\) is an \(n \times n\) orthonormal matrix. The diagonal elements of \(\Sigma\) are called the singular values.
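For example, the \(2 \times 2\) matrix below has singular values \(3\) and \(2\):

\[ \begin{bmatrix} 3 & 0 \\ 0 & -2 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 3 & 0 \\ 0 & 2 \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}, \]

with \(U = I\), \(\Sigma = \mathrm{diag}(3, 2)\), and \(V^T = \mathrm{diag}(1, -1)\).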
A dense matrix can be represented with one column per matrix column:

          row_id   col1   col2
 row1          1      1      0
 row2          2      0      1
It can also be represented with each matrix row stored as an array column:

          row_id   row_vec
 row1          1   {1, 0}
 row2          2   {0, 1}
A sparse matrix is represented with one non-zero entry per table row:

          row_id   col_id   value
 row1          1        1       1
 row2          2        2       1

The 'row_id' represents the row number, 'col_id' represents the column id, and 'value' represents the matrix value at ['row_id', 'col_id'].
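A sketch of how these representations might be created and decomposed follows. The table names are illustrative, the madlib schema is assumed, and the svd() argument order shown should be verified against the function reference for the installed version:

    -- Dense matrix in array format, matching the example above
    DROP TABLE IF EXISTS mat;
    CREATE TABLE mat (row_id INTEGER, row_vec FLOAT8[]);
    INSERT INTO mat VALUES (1, ARRAY[1, 0]::FLOAT8[]), (2, ARRAY[0, 1]::FLOAT8[]);

    -- The same matrix in sparse format: one non-zero entry per row
    DROP TABLE IF EXISTS mat_sparse;
    CREATE TABLE mat_sparse (row_id INTEGER, col_id INTEGER, value FLOAT8);
    INSERT INTO mat_sparse VALUES (1, 1, 1), (2, 2, 1);

    -- Decompose the dense matrix, keeping k = 2 singular values.
    -- Assumed argument order: (source_table, output_table_prefix, row_id, k);
    -- optional arguments (n_iterations, result_summary_table) may follow.
    SELECT madlib.svd('mat', 'svd_out', 'row_id', 2);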
Output for the singular vectors/values will be in the following 3 tables:
The result summary table will be in the following format:
In the result summary table, the reconstruction error is computed as \( \sqrt{mean((X - USV^T)_{ij}^2)} \), where the average is over all elements of the matrix. The relative reconstruction error is then computed as the ratio of the reconstruction error to \( \sqrt{mean(X_{ij}^2)} \).
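Written out explicitly for an \(m \times n\) matrix \(X\) (writing \(S\) as \(\Sigma\)), these two quantities are

\[ \text{reconstruction error} = \sqrt{\frac{1}{mn}\sum_{i,j}\left(X_{ij} - (U \Sigma V^T)_{ij}\right)^2}, \qquad \text{relative error} = \frac{\text{reconstruction error}}{\sqrt{\frac{1}{mn}\sum_{i,j} X_{ij}^2}}. \]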
Sparse matrices:
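A sparse-matrix call might look like the following sketch; the argument order (source table, output prefix, row/column/value column names, matrix dimensions, number of singular values) is an assumption to check against the svd_sparse() reference:

    -- Assumed order: (source_table, output_table_prefix, row_id, col_id, value,
    --                 row_dim, col_dim, k); optional arguments may follow.
    SELECT madlib.svd_sparse('mat_sparse', 'svd_sparse_out',
                             'row_id', 'col_id', 'value',
                             2, 2,   -- row_dim, col_dim of the full matrix
                             2);     -- number of singular values to keep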
Block matrices:
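For a matrix that has already been stored in block format, the call is shorter, since the block table carries the structure itself. In the sketch below, 'mat_block' is a hypothetical block-format table and the argument order is an assumption:

    -- Assumed order: (source_table, output_table_prefix, k);
    -- optional arguments (n_iterations, result_summary_table) may follow.
    SELECT madlib.svd_block('mat_block', 'svd_block_out', 2);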
Native implementation for sparse matrix:
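The native sparse implementation is assumed here to take the same arguments as svd_sparse(), operating on the sparse representation directly rather than densifying it first; a sketch under that assumption:

    -- Assumed to mirror svd_sparse(): (source_table, output_table_prefix,
    -- row_id, col_id, value, row_dim, col_dim, k)
    SELECT madlib.svd_sparse_native('mat_sparse', 'svd_native_out',
                                    'row_id', 'col_id', 'value',
                                    2, 2, 2);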
Let \(A\) be an \(m \times n\) matrix, where \(m \ge n\). Then \(A\) can be decomposed as follows:
\[ A = U \Sigma V^T, \]
where \(U\) is an \(m \times n\) orthonormal matrix, \(\Sigma\) is an \(n \times n\) diagonal matrix, and \(V\) is an \(n \times n\) orthonormal matrix. The diagonal elements of \(\Sigma\) are called the singular values.
It is possible to formulate the problem of computing the singular triplets \((\sigma_i, u_i, v_i)\) of \(A\) as an eigenvalue problem involving a Hermitian matrix related to \(A\). There are two possible ways of achieving this:

1. With the cross product matrix, \(A^*A\) or \(AA^*\).
2. With the cyclic matrix
\[ H(A) = \begin{bmatrix} 0 & A\\ A^* & 0 \end{bmatrix} \]
The singular values are the nonnegative square roots of the eigenvalues of the cross product matrix. This approach may imply a severe loss of accuracy in the smallest singular values. The cyclic matrix approach is an alternative that avoids this problem, but at the expense of significantly increasing the cost of the computation.
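Concretely, the two formulations relate the singular triplets to ordinary eigenpairs as follows:

\[ A^*A\,v_i = \sigma_i^2\,v_i, \qquad AA^*\,u_i = \sigma_i^2\,u_i, \qquad H(A)\begin{bmatrix} u_i \\ v_i \end{bmatrix} = \begin{bmatrix} A v_i \\ A^* u_i \end{bmatrix} = \sigma_i \begin{bmatrix} u_i \\ v_i \end{bmatrix}. \]

The cross product matrices therefore yield \(\sigma_i^2\), which is why the smallest singular values lose accuracy when recovered as square roots, while the cyclic matrix \(H(A)\) has eigenvalues \(\pm\sigma_i\) directly, at the cost of working with an \((m+n) \times (m+n)\) matrix.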
Computing the cross product matrix explicitly is not recommended, especially in the case of sparse \(A\). Bidiagonalization was proposed by Golub and Kahan [citation?] as a way of tridiagonalizing the cross product matrix without forming it explicitly.
Consider the following decomposition
\[ A = P B Q^T, \]
where \(P\) and \(Q\) are unitary matrices and \(B\) is an \(m \times n\) upper bidiagonal matrix. Then the tridiagonal matrix \(B^*B\) is unitarily similar to \(A^*A\). Additionally, specific methods exist that compute the singular values of \(B\) without forming \(B^*B\). Therefore, after computing the SVD of \(B\),
\[ B = X\Sigma Y^T, \]
the SVD of the original matrix follows immediately, with \(U = PX\) and \(V = QY\).
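Chaining the two factorizations together makes this explicit:

\[ A = P B Q^T = P (X \Sigma Y^T) Q^T = (P X)\, \Sigma\, (Q Y)^T = U \Sigma V^T. \]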