In linear algebra, a **positive-definite matrix** is a Hermitian matrix (a square complex matrix equal to its own conjugate transpose) that is in many ways analogous to a positive real number. The notion is closely related to a positive-definite symmetric bilinear form (or a sesquilinear form in the complex case).
## Equivalent formulations
Let *M* be an *n* × *n* Hermitian matrix. In the following we denote the transpose of a matrix or vector *a* by *a*^{T}, and the conjugate transpose by *a*^{*}. The matrix *M* is said to be **positive definite** if it has one (and therefore all) of the following equivalent properties:

1. *z*^{*}*Mz* > 0 for all non-zero complex vectors *z*. (Since *M* is Hermitian, *z*^{*}*Mz* is always real.)
2. All eigenvalues of *M* are positive.
3. The sesquilinear form ⟨*x*, *y*⟩ = *x*^{*}*My* defines an inner product on **C**^{n}.
4. All leading principal minors of *M* (the determinants of its upper-left *k* × *k* submatrices, 1 ≤ *k* ≤ *n*) are positive.
5. There exists an invertible matrix *B* such that *M* = *B*^{*}*B*.
Analogous statements hold if *M* is a real symmetric matrix: replace **C**^{n} by **R**^{n}, and the conjugate transpose by the transpose.
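These characterizations can be checked numerically. The following sketch (using NumPy, with an example matrix chosen purely for illustration) verifies several of them for one small positive definite matrix:

```python
import numpy as np

# Example Hermitian matrix (illustrative): eigenvalues 1 and 3, so positive definite.
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# 1. z* M z > 0 for a randomly chosen non-zero complex vector z.
rng = np.random.default_rng(0)
z = rng.standard_normal(2) + 1j * rng.standard_normal(2)
quad = (z.conj() @ M @ z).real
print(quad > 0)                            # True

# 2. All eigenvalues are positive (eigvalsh is specialized to Hermitian matrices).
print(np.all(np.linalg.eigvalsh(M) > 0))   # True

# 4. Sylvester's criterion: all leading principal minors are positive.
minors = [np.linalg.det(M[:k, :k]) for k in range(1, 3)]
print(all(m > 0 for m in minors))          # True

# 5. Cholesky factorization M = B* B succeeds only for positive definite input.
B = np.linalg.cholesky(M)                  # raises LinAlgError otherwise
print(np.allclose(B @ B.T.conj(), M))      # True
```

Property 2 (`eigvalsh`) is usually the most robust test in floating-point arithmetic; computing determinants of nested submatrices is both slower and less numerically stable.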
## Further properties

- Every positive definite matrix is invertible, and its inverse is also positive definite.
- If *M* is positive definite and *r* > 0 is a real number, then *rM* is positive definite.
- If *M* and *N* are positive definite, then *M* + *N* is also positive definite; if additionally *MN* = *NM*, then *MN* is also positive definite.
- Every positive definite matrix *M* has at least one square root matrix *N* such that *N*^{2} = *M*. In fact, *M* may have infinitely many square roots, but exactly one positive definite square root.
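The unique positive definite square root can be computed from the eigendecomposition *M* = *Q*Λ*Q*^{*} by taking the positive square root of each eigenvalue. A minimal NumPy sketch, with an illustrative example matrix:

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # positive definite: eigenvalues 1 and 3

# Eigendecomposition of a Hermitian matrix: M = Q diag(w) Q*.
w, Q = np.linalg.eigh(M)

# Taking the positive root of each eigenvalue gives the positive definite square root.
N = Q @ np.diag(np.sqrt(w)) @ Q.T.conj()

print(np.allclose(N @ N, M))               # True: N^2 = M
print(np.all(np.linalg.eigvalsh(N) > 0))   # True: N is itself positive definite
```

Flipping the sign of any eigenvalue's square root yields another matrix square root, which illustrates why only one of them is positive definite.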
## Negative-definite, semidefinite and indefinite matrices

The Hermitian matrix *M* is said to be **negative-definite** if *z*^{*}*Mz* < 0 for all non-zero *z* in **C**^{n} (or, equivalently for real symmetric *M*, *x*^{T}*Mx* < 0 for all non-zero *x* in **R**^{n}). It is called **positive-semidefinite** if *z*^{*}*Mz* ≥ 0 for all *z* (or *x*^{T}*Mx* ≥ 0 for all *x*) and **negative-semidefinite** if *z*^{*}*Mz* ≤ 0 for all *z* (or *x*^{T}*Mx* ≤ 0 for all *x*). A Hermitian matrix which is neither positive- nor negative-semidefinite is called **indefinite**.
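These classes correspond to the signs of the eigenvalues, which suggests a simple numerical classifier. The helper below and its example matrices are illustrative sketches, not part of the original article; the tolerance guards against floating-point noise around zero:

```python
import numpy as np

def classify(M, tol=1e-12):
    """Classify a Hermitian matrix by the signs of its eigenvalues."""
    w = np.linalg.eigvalsh(M)
    if np.all(w > tol):
        return "positive definite"
    if np.all(w < -tol):
        return "negative definite"
    if np.all(w >= -tol):
        return "positive semidefinite"
    if np.all(w <= tol):
        return "negative semidefinite"
    return "indefinite"

print(classify(np.array([[2.0, 0.0], [0.0, 3.0]])))    # positive definite
print(classify(np.array([[1.0, 0.0], [0.0, -1.0]])))   # indefinite
print(classify(np.array([[1.0, 1.0], [1.0, 1.0]])))    # positive semidefinite
```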
## Non-Hermitian matrices

A real matrix *M* may have the property that *x*^{T}*Mx* > 0 for all non-zero real vectors *x* without being symmetric. The matrix

    (  1  1 )
    ( −1  1 )

provides an example, since for it *x*^{T}*Mx* = *x*_{1}^{2} + *x*_{2}^{2}. In general, we have *x*^{T}*Mx* > 0 for all real non-zero vectors *x* if and only if the symmetric part, (*M* + *M*^{T}) / 2, is positive definite.

The situation for complex matrices may be different, depending on how one generalizes the inequality *z*^{*}*Mz* > 0. If *z*^{*}*Mz* is real for all complex vectors *z*, then the matrix *M* is necessarily Hermitian. So, if we require that *z*^{*}*Mz* be real and positive, then *M* is automatically Hermitian. On the other hand, Re(*z*^{*}*Mz*) > 0 for all complex non-zero vectors *z* if and only if the Hermitian part, (*M* + *M*^{*}) / 2, is positive definite. There is no agreement in the literature on the proper definition of *positive-definite* for non-Hermitian matrices.
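The symmetric-part criterion is easy to verify numerically. A short NumPy sketch using the non-symmetric matrix [[1, 1], [−1, 1]] as the test case:

```python
import numpy as np

# A non-symmetric real matrix with x^T M x > 0 for all non-zero real x:
# the antisymmetric part contributes nothing to the quadratic form.
M = np.array([[1.0, 1.0],
              [-1.0, 1.0]])

# x^T M x > 0 for all non-zero x  <=>  the symmetric part is positive definite.
sym = (M + M.T) / 2
print(np.all(np.linalg.eigvalsh(sym) > 0))   # True

# Spot-check the quadratic form on random non-zero vectors.
rng = np.random.default_rng(1)
for _ in range(5):
    x = rng.standard_normal(2)
    print(x @ M @ x > 0)                     # True each time
```

Here the symmetric part is the identity matrix, and indeed *x*^{T}*Mx* reduces to *x*_{1}^{2} + *x*_{2}^{2}.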
## Generalizations

Suppose *K* denotes the field **R** or **C**, *V* is a vector space over *K*, and *B* : *V* × *V* → *K* is a map (bilinear when *K* = **R**, sesquilinear when *K* = **C**) which is Hermitian in the sense that *B*(*x*, *y*) is always the complex conjugate of *B*(*y*, *x*); in particular, *B*(*x*, *x*) is always real. Then *B* is called *positive definite* if *B*(*x*, *x*) > 0 for every non-zero *x* in *V*.
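As a concrete instance (illustrative, not from the original article), take *V* = **C**^{2} and *B*(*x*, *y*) = *x*^{*}*My* for a Hermitian matrix *M*; this *B* is positive definite exactly when *M* is:

```python
import numpy as np

# Hermitian matrix (eigenvalues 1 and 3) defining the form B(x, y) = x* M y on C^2.
M = np.array([[2.0, 1.0j],
              [-1.0j, 2.0]])

def B(x, y):
    """Hermitian sesquilinear form induced by M."""
    return x.conj() @ M @ y

x = np.array([1.0 + 1.0j, 2.0])
y = np.array([0.5, -1.0j])

# B(x, y) is the complex conjugate of B(y, x) ...
print(np.isclose(B(x, y), np.conj(B(y, x))))   # True

# ... so B(x, x) is real, and it is positive for non-zero x
# because M's eigenvalues are positive.
print(np.isclose(B(x, x).imag, 0.0))           # True
print(B(x, x).real > 0)                        # True
```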