In mathematics, the spectrum of a matrix is the set of its eigenvalues.[1][2][3] More generally, if T is a linear operator on any finite-dimensional vector space, its spectrum is the set of scalars λ such that T − λI is not invertible. The determinant of a matrix equals the product of its eigenvalues, and the trace of a matrix equals the sum of its eigenvalues.[4][5][6] From this point of view, we can define the pseudo-determinant of a singular matrix to be the product of its nonzero eigenvalues (this quantity is needed, for instance, in the density of the multivariate normal distribution).
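For concreteness, the following NumPy sketch (the example matrices and the 1e-12 tolerance for "nonzero" are assumptions chosen for illustration, not part of the standard treatment) checks the determinant and trace identities numerically and forms a pseudo-determinant for a rank-deficient matrix.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigs = np.linalg.eigvals(A)          # spectrum of A: {1, 3}

assert np.isclose(np.prod(eigs), np.linalg.det(A))   # det = product of eigenvalues
assert np.isclose(np.sum(eigs), np.trace(A))         # trace = sum of eigenvalues

# Pseudo-determinant of a singular matrix: drop the (numerically) zero
# eigenvalues before taking the product.
S = np.array([[1.0, 1.0],
              [1.0, 1.0]])           # rank 1, eigenvalues {0, 2}
s_eigs = np.linalg.eigvals(S)
pseudo_det = np.prod(s_eigs[np.abs(s_eigs) > 1e-12])
print(pseudo_det)                    # 2.0
```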
In many applications, such as PageRank, one is interested in the dominant eigenvalue, i.e., the eigenvalue that is largest in absolute value. In other applications, the smallest eigenvalue is important; in general, though, the whole spectrum provides valuable information about a matrix.
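A minimal power-iteration sketch for estimating the dominant eigenvalue is given below. It assumes the matrix has a unique eigenvalue of largest absolute value (the function name and example matrix are illustrative, not from any cited source).

```python
import numpy as np

def dominant_eigenvalue(A, iters=1000, seed=0):
    """Estimate the eigenvalue of A largest in absolute value by power iteration.

    Sketch only: convergence relies on the dominant eigenvalue being strictly
    larger in absolute value than all others.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])
    for _ in range(iters):
        x = A @ x                    # repeatedly apply A ...
        x /= np.linalg.norm(x)       # ... and renormalize
    return (x @ A @ x) / (x @ x)     # Rayleigh quotient of the limiting vector

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
print(dominant_eigenvalue(A))        # ≈ 3.0, the eigenvalue of A largest in |·|
```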
Definition
Let V be a finite-dimensional vector space over some field K and suppose T : V → V is a linear map. The spectrum of T, denoted σT, is the multiset of roots of the characteristic polynomial of T. Thus the elements of the spectrum are precisely the eigenvalues of T, and the multiplicity of an eigenvalue λ in the spectrum equals the dimension of the generalized eigenspace of T for λ (also called the algebraic multiplicity of λ).
Now, fix a basis B of V over K and suppose M ∈ Mat_K(V) is a matrix. Define the linear map T : V → V pointwise by Tx = Mx, where on the right-hand side x is interpreted as a column vector and M acts on x by matrix multiplication. We now say that x ∈ V is an eigenvector of M if x is an eigenvector of T. Similarly, λ ∈ K is an eigenvalue of M if it is an eigenvalue of T, with the same multiplicity, and the spectrum of M, written σM, is the multiset of all such eigenvalues.
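The sketch below illustrates the definition numerically (the Jordan block used here is an assumed example): the spectrum is obtained as the multiset of roots of the characteristic polynomial, so a repeated root appears with its algebraic multiplicity, which can exceed the dimension of the ordinary eigenspace.

```python
import numpy as np

# Jordan block with the single eigenvalue 2 of algebraic multiplicity 2.
J = np.array([[2.0, 1.0],
              [0.0, 2.0]])

char_poly = np.poly(J)               # coefficients of det(xI - J) = x^2 - 4x + 4
spectrum = np.roots(char_poly)       # multiset of roots: [2., 2.]
print(char_poly, spectrum)

# The geometric multiplicity (dimension of the ordinary eigenspace) is only 1,
# while the algebraic multiplicity in the spectrum is 2.
geom_mult = J.shape[0] - np.linalg.matrix_rank(J - 2.0 * np.eye(2))
print(geom_mult)                     # 1
```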
Related notions
The eigendecomposition (or spectral decomposition) of a diagonalizable matrix is its decomposition into a specific canonical form whereby the matrix is represented in terms of its eigenvalues and eigenvectors.
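As a short illustration (the matrix is an assumed example), a diagonalizable matrix can be reconstructed from its eigenvalues and eigenvectors as A = V diag(w) V⁻¹:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
w, V = np.linalg.eig(A)              # eigenvalues w, eigenvectors as columns of V

reconstructed = V @ np.diag(w) @ np.linalg.inv(V)
assert np.allclose(reconstructed, A) # A is recovered from its eigendecomposition
```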
The spectral radius of a square matrix is the largest absolute value of its eigenvalues. In spectral theory, the spectral radius of a bounded linear operator is the supremum of the absolute values of the elements in the spectrum of that operator.
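In the finite-dimensional case this is simply the maximum of |λ| over the spectrum, as in the following sketch (example matrix assumed for illustration):

```python
import numpy as np

A = np.array([[0.0, -2.0],
              [1.0,  0.0]])          # eigenvalues ±i*sqrt(2)
spectral_radius = np.max(np.abs(np.linalg.eigvals(A)))
print(spectral_radius)               # ≈ 1.414
```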
Notes
- ^ Golub & Van Loan (1996, p. 310)
- ^ Kreyszig (1972, p. 273)
- ^ Nering (1970, p. 270)
- ^ Golub & Van Loan (1996, p. 310)
- ^ Herstein (1964, pp. 271–272)
- ^ Nering (1970, pp. 115–116)
References
- Golub, Gene H.; Van Loan, Charles F. (1996), Matrix Computations (3rd ed.), Baltimore: Johns Hopkins University Press, ISBN 0-8018-5414-8
- Herstein, I. N. (1964), Topics In Algebra, Waltham: Blaisdell Publishing Company, ISBN 978-1114541016
- Kreyszig, Erwin (1972), Advanced Engineering Mathematics (3rd ed.), New York: Wiley, ISBN 0-471-50728-8
- Nering, Evar D. (1970), Linear Algebra and Matrix Theory (2nd ed.), New York: Wiley, LCCN 76091646