The
product of two matrices shows the result of doing one transformation
followed by another (from right to left), and if the transformations are done
in reverse order the result is usually different. Thus, the product of two
matrices depends on the order of multiplication; if S and T are square
matrices (matrices with the same number of rows as columns) of the same
size, then ST and TS are rarely equal. The matrix for a given transformation
is found using coordinates. For example, in two dimensions a linear
transformation T can be completely determined simply by knowing its effect
on any two vectors v and w that have different directions. Their
images T(v) and T(w) are each given by two coordinates; therefore, only
four coordinates, two for T(v) and two for T(w), are needed to specify T.
These four coordinates are arranged in a 2-by-2 matrix. In three dimensions
three vectors u, v, and w are needed, and to specify T(u), T(v), and T(w) one
needs three coordinates for each. This results in a 3-by-3 matrix.
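As a concrete sketch of both points (assuming NumPy; the particular transformations are hypothetical examples), the columns of the matrix are the images of the two standard basis vectors, and composing two transformations in the two possible orders gives different products:

```python
import numpy as np

# A 2D linear map is determined by its effect on two independent
# vectors; the standard basis e1 = (1, 0), e2 = (0, 1) is convenient.
# Example transformation T: rotation by 90 degrees counterclockwise.
T_e1 = np.array([0.0, 1.0])   # T(e1)
T_e2 = np.array([-1.0, 0.0])  # T(e2)

# The four coordinates fill a 2-by-2 matrix, one image per column.
T = np.column_stack([T_e1, T_e2])

# A second example transformation S: stretch the x-axis by 2.
S = np.array([[2.0, 0.0],
              [0.0, 1.0]])

print(T @ np.array([1.0, 1.0]))   # apply T to a vector: [-1.  1.]
print(np.allclose(S @ T, T @ S))  # False: ST ("T then S") differs from TS
```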
Eigenvectors
When studying linear transformations, it is extremely useful to find nonzero
vectors whose direction is left unchanged by the transformation. These are
called eigenvectors (also known as characteristic vectors). If v is an
eigenvector for the linear transformation T, then T(v) = λv for some scalar
λ. This scalar is called an eigenvalue. The eigenvalue of greatest absolute
value, along with its associated eigenvector, has special significance for
many physical applications. This is because whatever process is represented
by the linear transformation often acts repeatedly, feeding the output of
each application back into the transformation. Under such repeated
application, almost every nonzero starting vector converges on the
eigenvector associated with the largest eigenvalue, rescaled by a power of
that eigenvalue. In other words, the long-term behaviour of the system is
determined by its eigenvectors.
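This convergence can be seen directly with a short power-iteration sketch (assuming NumPy; the matrix is a hypothetical example with eigenvalues 3 and 1):

```python
import numpy as np

# Example symmetric matrix; its eigenvalues are 3 and 1, and the
# eigenvector for the largest eigenvalue points along (1, 1).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

v = np.array([1.0, 0.0])  # almost any nonzero start works

# Feed each output back in as the next input, normalizing so the
# entries do not blow up by powers of the eigenvalue.
for _ in range(50):
    v = A @ v
    v /= np.linalg.norm(v)

print(v)          # ~ [0.707, 0.707], the dominant eigenvector direction
print(v @ A @ v)  # ~ 3.0, the associated (largest) eigenvalue
```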
Matrix
Matrix, a set of numbers arranged in rows and columns so as to form a
rectangular array. The numbers are called the elements, or entries, of the
matrix. Matrices have wide applications in engineering, physics, economics,
and statistics as well as in various branches of mathematics. Matrices also
have important applications in computer graphics, where they have been
used to represent rotations and other transformations of images.
Historically, it was not the matrix but a certain number associated with a
square array of numbers called the determinant that was first recognized.
Only gradually did the idea of the matrix as an algebraic entity emerge. The
term matrix was introduced by the 19th-century English
mathematician James Sylvester, but it was his friend the
mathematician Arthur Cayley who developed the algebraic aspect
of matrices in two papers in the 1850s. Cayley first applied them to the
study of systems of linear equations, where they are still very useful. They
are also important because, as Cayley recognized, certain sets of matrices
form algebraic systems in which many of the ordinary laws
of arithmetic (e.g., the associative and distributive laws) are valid but in
which other laws (e.g., the commutative law) are not valid.
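These laws are easy to check numerically; a minimal sketch (assuming NumPy, with arbitrary example matrices):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])  # arbitrary example matrices
B = np.array([[0.0, 1.0], [1.0, 0.0]])
C = np.array([[2.0, 0.0], [1.0, 1.0]])

print(np.allclose((A @ B) @ C, A @ (B @ C)))    # True: associative law holds
print(np.allclose(A @ (B + C), A @ B + A @ C))  # True: distributive law holds
print(np.allclose(A @ B, B @ A))                # False: commutative law fails
```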
A matrix O with all its elements 0 is called a zero, or null, matrix. A square
matrix A with 1s on the main diagonal (upper left to lower right) and 0s
everywhere else is called an identity, or unit, matrix. It
is denoted by I or Iₙ to show that its order is n. If B is any square matrix
and I and O are the unit and zero matrices of the same order, it is always
true that B + O = O + B = B and BI = IB = B. Hence O and I behave like the
0 and 1 of ordinary arithmetic. (In fact, ordinary arithmetic is the special
case of matrix arithmetic in which all matrices are 1 × 1.)
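A quick sketch of these identities (assuming NumPy; B holds arbitrary example values):

```python
import numpy as np

n = 2
B = np.array([[1.0, 2.0], [3.0, 4.0]])  # any square matrix of order n
O = np.zeros((n, n))                    # zero (null) matrix
I = np.eye(n)                           # identity (unit) matrix, I_n

print(np.allclose(B + O, B), np.allclose(O + B, B))  # True True: B + O = O + B = B
print(np.allclose(B @ I, B), np.allclose(I @ B, B))  # True True: BI = IB = B
```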
A square matrix A in which the elements aᵢⱼ are nonzero only when i = j is
called a diagonal matrix. Diagonal matrices have the special property that
multiplication of them is commutative; that is, for two diagonal
matrices A and B, AB = BA. The trace of a square matrix is the sum of the
elements on the main diagonal.
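Both properties are easy to verify; a short sketch (assuming NumPy, with arbitrary diagonal entries):

```python
import numpy as np

A = np.diag([2.0, 3.0, 5.0])  # diagonal matrices from example entries
B = np.diag([7.0, 1.0, 4.0])

print(np.allclose(A @ B, B @ A))  # True: diagonal matrices commute
print(np.trace(A))                # 10.0: sum of the main-diagonal elements
```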