Matrix decomposition - Wikipedia

From Wikipedia, the free encyclopedia

In the mathematical discipline of linear algebra, a matrix decomposition or matrix factorization is a factorization of a matrix into a product of matrices. There are many different matrix decompositions; each finds use among a particular class of problems.

Example

In numerical analysis, different decompositions are used to implement efficient matrix algorithms.

For instance, when solving a system of linear equations $A\mathbf{x} = \mathbf{b}$, the matrix A can be decomposed via the LU decomposition. The LU decomposition factorizes a matrix into a lower triangular matrix L and an upper triangular matrix U. The systems $L(U\mathbf{x}) = \mathbf{b}$ and $U\mathbf{x} = L^{-1}\mathbf{b}$ require fewer additions and multiplications to solve, compared with the original system $A\mathbf{x} = \mathbf{b}$, though one might require significantly more digits in inexact arithmetic such as floating point.
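
A minimal sketch of this two-stage solve with SciPy (the matrix and right-hand side are illustrative):

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

A = np.array([[4.0, 3.0],
              [6.0, 3.0]])
b = np.array([10.0, 12.0])

# Factor A once (with partial pivoting); the factors can be
# reused for any number of right-hand sides.
lu, piv = lu_factor(A)

# Forward substitution with L, then back substitution with U.
x = lu_solve((lu, piv), b)

assert np.allclose(A @ x, b)
```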

Similarly, the QR decomposition expresses A as QR with Q an orthogonal matrix and R an upper triangular matrix. The system $Q(R\mathbf{x}) = \mathbf{b}$ is solved by $R\mathbf{x} = Q^{\mathsf{T}}\mathbf{b} = \mathbf{c}$, and the system $R\mathbf{x} = \mathbf{c}$ is solved by ‘back substitution’. The number of additions and multiplications required is about twice that of using the LU solver, but no more digits are required in inexact arithmetic because the QR decomposition is numerically stable.
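
The same system solved via QR, again as a sketch with SciPy (solve_triangular performs the back substitution):

```python
import numpy as np
from scipy.linalg import qr, solve_triangular

A = np.array([[4.0, 3.0],
              [6.0, 3.0]])
b = np.array([10.0, 12.0])

Q, R = qr(A)                 # A = QR: Q orthogonal, R upper triangular
c = Q.T @ b                  # R x = Q^T b = c
x = solve_triangular(R, c)   # back substitution

assert np.allclose(A @ x, b)
```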

Decompositions related to solving systems of linear equations

LU decomposition

LU reduction

Block LU decomposition

Rank factorization

Cholesky decomposition

QR decomposition

RRQR factorization

Interpolative decomposition

Eigendecomposition

  • Also called spectral decomposition.
  • Applicable to: square matrix A with linearly independent eigenvectors (not necessarily distinct eigenvalues).
  • Decomposition: $A = VDV^{-1}$, where D is a diagonal matrix formed from the eigenvalues of A, and the columns of V are the corresponding eigenvectors of A.
  • Existence: An n-by-n matrix A always has n (complex) eigenvalues, which can be ordered (in more than one way) to form an n-by-n diagonal matrix D and a corresponding matrix of nonzero columns V that satisfies the eigenvalue equation $AV = VD$. $V$ is invertible if and only if the n eigenvectors are linearly independent (that is, each eigenvalue has geometric multiplicity equal to its algebraic multiplicity). A sufficient (but not necessary) condition for this to happen is that all the eigenvalues are different (in this case geometric and algebraic multiplicity are both equal to 1).
  • Comment: One can always normalize the eigenvectors to have length one (see the definition of the eigenvalue equation).
  • Comment: Every normal matrix A (that is, a matrix for which $AA^{*} = A^{*}A$, where $A^{*}$ is the conjugate transpose) can be eigendecomposed. For a normal matrix A (and only for a normal matrix), the eigenvectors can also be made orthonormal ($VV^{*} = I$) and the eigendecomposition reads as $A = VDV^{*}$. In particular all unitary, Hermitian, or skew-Hermitian (in the real-valued case, all orthogonal, symmetric, or skew-symmetric, respectively) matrices are normal and therefore possess this property.
  • Comment: For any real symmetric matrix A, the eigendecomposition always exists and can be written as $A = VDV^{\mathsf{T}}$, where both D and V are real-valued.
  • Comment: The eigendecomposition is useful for understanding the solution of a system of linear ordinary differential equations or linear difference equations. For example, the difference equation $x_{t+1} = Ax_{t}$ starting from the initial condition $x_{0} = c$ is solved by $x_{t} = A^{t}c$, which is equivalent to $x_{t} = VD^{t}V^{-1}c$, where V and D are the matrices formed from the eigenvectors and eigenvalues of A. Since D is diagonal, raising it to the power t, $D^{t}$, just involves raising each element on the diagonal to the power t. This is much easier to do and understand than raising A to the power t, since A is usually not diagonal; a numerical sketch follows this list.
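
A minimal numerical sketch of the last comment using NumPy (the matrix A, the initial condition c, and the horizon t are illustrative; A is assumed diagonalizable):

```python
import numpy as np

# Solve the difference equation x_{t+1} = A x_t with x_0 = c
# via the eigendecomposition: x_t = V D^t V^{-1} c.
A = np.array([[0.9, 0.2],
              [0.1, 0.8]])
c = np.array([1.0, 0.0])
t = 50

eigenvalues, V = np.linalg.eig(A)      # columns of V are eigenvectors
Dt = np.diag(eigenvalues ** t)         # D^t: power the diagonal entries only
x_t = V @ Dt @ np.linalg.solve(V, c)   # V D^t V^{-1} c

# Cross-check against direct repeated multiplication by A.
assert np.allclose(x_t, np.linalg.matrix_power(A, t) @ c)
```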

Jordan decomposition

The Jordan normal form and the Jordan–Chevalley decomposition

  • Applicable to: square matrix A
  • Comment: the Jordan normal form generalizes the eigendecomposition to cases where there are repeated eigenvalues and the matrix cannot be diagonalized (see the sketch after this list); the Jordan–Chevalley decomposition does this without choosing a basis.
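
A small symbolic sketch with SymPy (the matrix is an illustrative defective example: its only eigenvalue, 2, has algebraic multiplicity 2 but geometric multiplicity 1, so no eigendecomposition exists):

```python
from sympy import Matrix

A = Matrix([[3, 1],
            [-1, 1]])

# jordan_form returns P and J with A = P J P^{-1},
# where J is in Jordan normal form.
P, J = A.jordan_form()
print(J)   # Matrix([[2, 1], [0, 2]]): a single 2x2 Jordan block

assert A == P * J * P.inv()
```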

Schur decomposition

Real Schur decomposition

QZ decomposition

  • Also called: generalized Schur decomposition
  • Applicable to: square matrices A and B
  • Comment: there are two versions of this decomposition: complex and real.
  • Decomposition (complex version): $A = QSZ^{*}$ and $B = QTZ^{*}$, where Q and Z are unitary matrices, the * superscript represents the conjugate transpose, and S and T are upper triangular matrices.
  • Comment: in the complex QZ decomposition, the ratios of the diagonal elements of S to the corresponding diagonal elements of T, $\lambda_{i} = S_{ii}/T_{ii}$, are the generalized eigenvalues that solve the generalized eigenvalue problem $A\mathbf{v} = \lambda B\mathbf{v}$ (where $\lambda$ is an unknown scalar and v is an unknown nonzero vector); see the sketch after this list.
  • Decomposition (real version): $A = QSZ^{\mathsf{T}}$ and $B = QTZ^{\mathsf{T}}$, where A, B, Q, Z, S, and T are matrices containing real numbers only. In this case Q and Z are orthogonal matrices, the T superscript represents transposition, and S and T are block upper triangular matrices. The blocks on the diagonal of S and T are of size 1×1 or 2×2.
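
A sketch of the complex version with SciPy (the random test matrices are illustrative):

```python
import numpy as np
from scipy.linalg import eigvals, qz

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# Complex QZ: A = Q S Z^*, B = Q T Z^*, with S and T upper triangular.
S, T, Q, Z = qz(A, B, output='complex')
assert np.allclose(Q @ S @ Z.conj().T, A)
assert np.allclose(Q @ T @ Z.conj().T, B)

# The diagonal ratios are the generalized eigenvalues of the pair (A, B).
lam = np.diag(S) / np.diag(T)
assert np.allclose(np.sort_complex(lam), np.sort_complex(eigvals(A, B)))
```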

Takagi’s factorization

Singular value decomposition

Scale-invariant decompositions

Refers to variants of existing matrix decompositions, such as the SVD, that are invariant with respect to diagonal scaling.

Analogous scale-invariant decompositions can be derived from other matrix decompositions; for example, to obtain scale-invariant eigenvalues.[3][4]

Hessenberg decomposition

Complete orthogonal decomposition

  • Also known as: UTV decomposition, ULV decomposition, URV decomposition.
  • Applicable to: m-by-n matrix A.
  • Decomposition: $A = UTV^{*}$, where T is a triangular matrix, and U and V are unitary matrices.
  • Comment: Similar to the singular value decomposition and to the Schur decomposition; a constructive sketch follows this list.
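
One common construction builds such a factorization from two QR decompositions: a column-pivoted QR of A reveals the numerical rank, and a second QR of the leading rows compresses them into a triangular block. The sketch below implements a thin, rank-revealing URV-type variant; the helper utv and its tolerance are assumptions for illustration, not a library routine:

```python
import numpy as np
from scipy.linalg import qr

def utv(A, tol=1e-12):
    """Thin URV-type decomposition A = U @ T @ V.conj().T with
    T lower triangular and U, V having orthonormal columns."""
    m, n = A.shape
    # Step 1: column-pivoted QR, A[:, piv] = Q1 @ R, reveals the rank.
    Q1, R, piv = qr(A, pivoting=True)
    r = int(np.sum(np.abs(np.diag(R)) > tol * abs(R[0, 0])))
    # Step 2: QR of R[:r].T folds the leading rows into an r-by-r block.
    Q2, R2 = qr(R[:r, :].conj().T, mode='economic')
    U = Q1[:, :r]            # m-by-r
    T = R2.conj().T          # r-by-r, lower triangular
    P = np.eye(n)[:, piv]    # permutation matrix from the pivoting
    V = P @ Q2               # n-by-r
    return U, T, V

A = np.outer([1.0, 2.0, 3.0], [4.0, 5.0])   # a rank-1 3x2 test matrix
U, T, V = utv(A)
assert np.allclose(U @ T @ V.conj().T, A)
```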

Other decompositions

Polar decomposition

Algebraic polar decomposition

Mostow’s decomposition

Sinkhorn normal form

  • Applicable to: square real matrix A with strictly positive elements.
  • Decomposition: $A = D_{1}SD_{2}$, where S is doubly stochastic and $D_{1}$ and $D_{2}$ are real diagonal matrices with strictly positive elements; a sketch of the computation follows this list.
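
The factors can be computed with the Sinkhorn–Knopp iteration, which alternately rescales rows and columns. A minimal sketch (the fixed iteration count is an assumption; production code would test for convergence):

```python
import numpy as np

def sinkhorn(A, iters=1000):
    """Return D1, S, D2 with A = D1 @ S @ D2 and S doubly stochastic;
    assumes A is square with strictly positive entries."""
    d1 = np.ones(A.shape[0])
    d2 = np.ones(A.shape[1])
    for _ in range(iters):
        d1 = 1.0 / (A @ d2)     # make the rows of diag(d1) A diag(d2) sum to 1
        d2 = 1.0 / (A.T @ d1)   # make the columns sum to 1
    S = np.diag(d1) @ A @ np.diag(d2)
    return np.diag(1.0 / d1), S, np.diag(1.0 / d2)

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
D1, S, D2 = sinkhorn(A)
assert np.allclose(D1 @ S @ D2, A)
assert np.allclose(S.sum(axis=0), 1.0) and np.allclose(S.sum(axis=1), 1.0)
```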

Sectoral decomposition

Williamson’s normal form

Matrix square root

Generalizations

There exist analogues of the SVD, QR, LU and Cholesky factorizations for quasimatrices and cmatrices (continuous matrices).[13] A ‘quasimatrix’ is, like a matrix, a rectangular scheme whose elements are indexed, but one discrete index is replaced by a continuous index. Likewise, a ‘cmatrix’ is continuous in both indices. As an example of a cmatrix, one can think of the kernel of an integral operator.

These factorizations are based on early work by Fredholm (1903), Hilbert (1904) and Schmidt (1907). For an account, and a translation to English of the seminal papers, see Stewart (2011).

See also

References

Notes

  1. ^ If a non-square matrix is used, however, then the matrix U will have the same rectangular shape as the original matrix A. Calling U upper triangular would therefore be incorrect; the correct statement is that U is the ‘row echelon form’ of A. Other than this, there are no differences in LU factorization for square and non-square matrices.

Citations

  1. ^ Lay, David C. (2016). Linear algebra and its applications. Steven R. Lay, Judith McDonald (Fifth Global ed.). Harlow. p. 142. ISBN 978-1-292-09223-2. OCLC 920463015.
  2. ^ Piziak, R.; Odell, P. L. (1 June 1999). “Full Rank Factorization of Matrices”. Mathematics Magazine. 72 (3): 193. doi:10.2307/2690882. JSTOR 2690882.
  3. ^ Uhlmann, J.K. (2018), “A Generalized Matrix Inverse that is Consistent with Respect to Diagonal Transformations”, SIAM Journal on Matrix Analysis and Applications, 39 (2): 781–800, doi:10.1137/17M113890X
  4. ^ Uhlmann, J.K. (2018), “A Rank-Preserving Generalized Matrix Inverse for Consistency with Respect to Similarity”, IEEE Control Systems Letters, 3: 91–95, arXiv:1804.07334, doi:10.1109/LCSYS.2018.2854240, ISSN 2475-1456, S2CID 5031440
  5. ^ Choudhury & Horn 1987, pp. 219–225
  6. ^ Bhatia, Rajendra (2013-11-15). “The bipolar decomposition”. Linear Algebra and Its Applications. 439 (10): 3031–3037. doi:10.1016/j.laa.2013.09.006.
  7. ^ Horn & Merino 1995, pp. 43–92
  8. ^ Mostow, G. D. (1955), Some new decomposition theorems for semi-simple groups, Mem. Amer. Math. Soc., vol. 14, American Mathematical Society, pp. 31–54
  9. ^ Nielsen, Frank; Bhatia, Rajendra (2012). Matrix Information Geometry. Springer. p. 224. arXiv:1007.4402. doi:10.1007/978-3-642-30232-9. ISBN 9783642302329. S2CID 118466496.
  10. ^ Zhang, Fuzhen (30 June 2014). “A matrix decomposition and its applications”. Linear and Multilinear Algebra. 63 (10): 2033–2042. doi:10.1080/03081087.2014.933219. S2CID 19437967.
  11. ^ Drury, S.W. (November 2013). “Fischer determinantal inequalities and Higham’s Conjecture”. Linear Algebra and Its Applications. 439 (10): 3129–3133. doi:10.1016/j.laa.2013.08.031.
  12. ^ Idel, Martin; Soto Gaona, Sebastián; Wolf, Michael M. (2017-07-15). “Perturbation bounds for Williamson’s symplectic normal form”. Linear Algebra and Its Applications. 525: 45–58. arXiv:1609.01338. doi:10.1016/j.laa.2017.03.013. S2CID 119578994.
  13. ^ Townsend & Trefethen 2015

Bibliography

External links