In linear algebra, a **diagonal matrix** is a square matrix in which the entries outside the main diagonal are all zero. The diagonal entries themselves may or may not be zero. Thus, the matrix D = (*d*_{i,j}) with *n* rows and *n* columns is diagonal if

*d*_{i,j} = 0 whenever *i* ≠ *j*.
For example, the following matrix is diagonal:

    | 1  0  0 |
    | 0  4  0 |
    | 0  0 -2 |

The term *diagonal matrix* may sometimes refer to a rectangular *m*-by-*n* matrix in which only the entries of the form *a*_{i,i} are non-zero. However, in this article we consider only square matrices. Any diagonal matrix is also a symmetric matrix, and if its entries come from the field **R** or **C**, it is a normal matrix as well.
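The definition can be checked mechanically. The following sketch, in pure Python with matrices as lists of rows, tests the off-diagonal condition and confirms that a diagonal matrix equals its own transpose; the helper name `is_diagonal` is illustrative, not from any library:

```python
def is_diagonal(M):
    """Return True if the square matrix M (a list of rows) has zeros
    everywhere off the main diagonal; diagonal entries may be anything."""
    n = len(M)
    return all(M[i][j] == 0 for i in range(n) for j in range(n) if i != j)

D = [[1, 0, 0],
     [0, 4, 0],
     [0, 0, -2]]

transpose = [[D[j][i] for j in range(3)] for i in range(3)]

print(is_diagonal(D))      # True: only the d_{i,i} entries are non-zero
print(D == transpose)      # True: every diagonal matrix is symmetric
```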
Equivalently, we can define a diagonal matrix as a matrix that is both upper- and lower-triangular.
The identity matrix *I*_{n} and any square zero matrix are diagonal. Any 1-by-1 matrix is diagonal.
A diagonal matrix with all its main diagonal entries equal is a **scalar matrix**, that is, a scalar multiple λ*I* of the identity matrix *I*. Its effect on a vector is scalar multiplication by λ. The scalar matrices are the center of the algebra of matrices: that is, they are precisely the matrices that commute with all other square matrices of the same size.
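The commuting property is easy to witness numerically. Here is a minimal pure-Python sketch (the `matmul` helper is illustrative) showing that a scalar matrix λ*I* commutes with an arbitrarily chosen square matrix:

```python
def matmul(A, B):
    """Multiply two n-by-n matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

lam = 3
S = [[lam if i == j else 0 for j in range(3)] for i in range(3)]  # scalar matrix 3*I
A = [[1, 2, 0],
     [5, -1, 4],
     [0, 7, 2]]                                                   # arbitrary example

print(matmul(S, A) == matmul(A, S))   # True: S commutes with A
```

A scalar matrix commutes with *every* same-size matrix, which is what makes the scalar matrices the center of the matrix algebra; a single example such as this one only illustrates, not proves, that fact.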
## Matrix operations
The operations of matrix addition and matrix multiplication are especially simple for diagonal matrices. Write diag(*a*_{1},...,*a*_{n}) for a diagonal matrix whose diagonal entries starting in the upper left corner are *a*_{1},...,*a*_{n}. Then, for addition, we have
- diag(*a*_{1},...,*a*_{n}) + diag(*b*_{1},...,*b*_{n}) = diag(*a*_{1}+*b*_{1},...,*a*_{n}+*b*_{n})

and for matrix multiplication,
- diag(*a*_{1},...,*a*_{n}) · diag(*b*_{1},...,*b*_{n}) = diag(*a*_{1}*b*_{1},...,*a*_{n}*b*_{n}).

The diagonal matrix diag(*a*_{1},...,*a*_{n}) is invertible if and only if the entries *a*_{1},...,*a*_{n} are all non-zero. In this case, we have
- diag(*a*_{1},...,*a*_{n})^{-1} = diag(*a*_{1}^{-1},...,*a*_{n}^{-1}).

In particular, the diagonal matrices form a subring of the ring of all *n*-by-*n* matrices.
Multiplying the matrix *A* from the *left* with diag(*a*_{1},...,*a*_{n}) amounts to multiplying the *i*-th *row* of *A* by *a*_{i} for all *i*; multiplying the matrix *A* from the *right* with diag(*a*_{1},...,*a*_{n}) amounts to multiplying the *i*-th *column* of *A* by *a*_{i} for all *i*.
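The row- and column-scaling effect is easy to see on a small example; this pure-Python sketch (illustrative `matmul` helper) multiplies a 3-by-3 matrix by diag(2, 3, 5) from each side:

```python
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

d = [2, 3, 5]
D = [[d[i] if i == j else 0 for j in range(3)] for i in range(3)]
A = [[1, 1, 1],
     [4, 0, 2],
     [7, 8, 9]]

left = matmul(D, A)    # D·A scales row i of A by d[i]
right = matmul(A, D)   # A·D scales column j of A by d[j]

print(left)    # [[2, 2, 2], [12, 0, 6], [35, 40, 45]]
print(right)   # [[2, 3, 5], [8, 0, 10], [14, 24, 45]]
```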
## Other properties

The eigenvalues of diag(*a*_{1}, ..., *a*_{n}) are *a*_{1}, ..., *a*_{n}. The unit vectors **e**_{1}, ..., **e**_{n} form a basis of eigenvectors. The determinant of diag(*a*_{1}, ..., *a*_{n}) is the product *a*_{1}···*a*_{n}.
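Both facts can be checked by direct computation. A pure-Python sketch for diag(2, -3, 5), verifying that each standard basis vector **e**_{i} is an eigenvector with eigenvalue *a*_{i} and that the determinant is the product of the diagonal entries:

```python
import math

a = [2, -3, 5]
D = [[a[i] if i == j else 0 for j in range(3)] for i in range(3)]

# Each standard basis vector e_i satisfies D e_i = a_i e_i:
for i in range(3):
    e = [1 if k == i else 0 for k in range(3)]
    De = [sum(D[r][c] * e[c] for c in range(3)) for r in range(3)]
    assert De == [a[i] * x for x in e]

det = math.prod(a)   # determinant of a diagonal matrix = product of entries
print(det)           # -30
```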
The adjugate of a diagonal matrix is again diagonal.
A square matrix is diagonal if and only if it is triangular and normal.
## Uses

Diagonal matrices occur in many areas of linear algebra. Because of the simple description of the matrix operations and eigenvalues/eigenvectors given above, it is often desirable to represent a given matrix or linear map by a diagonal matrix.
In fact, a given *n*-by-*n* matrix *A* is similar to a diagonal matrix (meaning that there is a matrix *X* such that *XAX*^{-1} is diagonal) if and only if it has *n* linearly independent eigenvectors. Such matrices are said to be diagonalizable.
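A small worked example, as a pure-Python sketch: the matrix below has eigenvalues 2 and 3 with linearly independent eigenvectors (1, 0) and (1, 1), so conjugating by the matrix of eigenvectors diagonalizes it (here the change-of-basis matrix and its inverse are written out by hand; the `matmul` helper is illustrative):

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[2, 1],
     [0, 3]]          # eigenvalues 2 and 3; eigenvectors (1, 0) and (1, 1)
X = [[1, 1],
     [0, 1]]          # eigenvectors as columns
X_inv = [[1, -1],
         [0, 1]]      # inverse of X, computed by hand

D = matmul(X_inv, matmul(A, X))
print(D)              # [[2, 0], [0, 3]] -- a diagonal matrix of eigenvalues
```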
Over the field of real or complex numbers, more is true. The spectral theorem says that every normal matrix is unitarily similar to a diagonal matrix (if *AA*^{*} = *A*^{*}*A* then there exists a unitary matrix *U* such that *UAU*^{*} is diagonal). Furthermore, the singular value decomposition implies that for any matrix *A*, there exist unitary matrices *U* and *V* such that *UAV*^{*} is diagonal with non-negative entries.
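For a real symmetric (hence normal) matrix, the unitary similarity of the spectral theorem can be exhibited concretely. In this pure-Python sketch the orthonormal eigenvectors of a 2-by-2 symmetric matrix are written out by hand, and conjugation by the resulting unitary (here real orthogonal) matrix yields a diagonal matrix of eigenvalues:

```python
import math

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

s = 1 / math.sqrt(2)
A = [[2.0, 1.0],
     [1.0, 2.0]]      # real symmetric, hence normal; eigenvalues 3 and 1
U = [[s, s],
     [s, -s]]         # orthonormal eigenvectors as columns, so U is unitary
Ut = [[U[j][i] for j in range(2)] for i in range(2)]   # U* = U^T for real U

D = matmul(Ut, matmul(A, U))
print([[round(x, 10) for x in row] for row in D])      # [[3.0, 0.0], [0.0, 1.0]]
```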
## See also

- Tridiagonal matrix
## References

- Roger A. Horn and Charles R. Johnson, *Matrix Analysis*, Cambridge University Press, 1985. ISBN 0-521-30586-1 (hardback), ISBN 0-521-38632-2 (paperback).