Orthogonality and Vector Spaces
7.1 Orthogonality
We have already seen that when we define an inner product in a vector space we can
generalize the concepts of lengths and angles. In particular, we know that the length
of a vector x is given by
\[
\|x\| = \sqrt{x \cdot x},
\]
and the cosine of the angle θ between two nonzero vectors x and y is
\[
\cos\theta = \frac{x \cdot y}{\|x\|\,\|y\|}.
\]
Thus, the angle between two vectors is a right angle if and only if their inner product
is zero, and in that case we say that those vectors are orthogonal. Recall as well that
the projection of a vector v on a given vector u is the vector:
\[
P_u(v) = \frac{u \cdot v}{u \cdot u}\, u.
\]
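These formulas translate directly into a few lines of NumPy. A minimal sketch (the vectors x and u below are arbitrary illustrative choices, not taken from the notes):

```python
import numpy as np

# Two arbitrary example vectors (not taken from the notes).
x = np.array([3.0, 4.0])
u = np.array([1.0, 0.0])

length = np.sqrt(np.dot(x, x))        # ||x|| = sqrt(x . x) -> 5.0
cos_theta = np.dot(x, u) / (np.linalg.norm(x) * np.linalg.norm(u))   # -> 0.6
proj = (np.dot(u, x) / np.dot(u, u)) * u   # P_u(x) -> [3., 0.]

print(length, cos_theta, proj)
print(np.dot(x - proj, u))            # 0.0: the residual x - P_u(x) is orthogonal to u
```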
7.2 Orthogonal complement

Given a subspace W of Rn, its orthogonal complement W⊥ is the set of all vectors of Rn that are orthogonal to every vector in W.
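Numerically, a basis of W⊥ can be obtained as the null space of a matrix whose rows span W. A minimal NumPy sketch, where the spanning vectors and the 1e-10 tolerance are illustrative assumptions, not taken from these notes:

```python
import numpy as np

# Rows of W_rows span the subspace W (an illustrative example).
W_rows = np.array([[1.0, 1.0, 0.0],
                   [0.0, 1.0, 1.0]])

# W-perp is the null space of this matrix: the right singular vectors
# corresponding to (numerically) zero singular values.
_, s, Vt = np.linalg.svd(W_rows)
rank = int(np.sum(s > 1e-10))
W_perp = Vt[rank:]                        # rows form an orthonormal basis of W-perp

print(W_perp)                             # a multiple of (1, -1, 1)/sqrt(3)
print(np.round(W_rows @ W_perp.T, 10))    # zeros: orthogonal to every row of W_rows
```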
7.3 Orthogonal bases

If {u1, . . . , up} is an orthogonal basis of a subspace W of Rn, then each vector w in W can be written as w = c1 u1 + · · · + cp up, where the weights are
\[
c_j = \frac{w \cdot u_j}{u_j \cdot u_j}, \qquad j = 1, \ldots, p.
\]
Example: Suppose {u1, u2, u3} is an orthogonal basis of R3 and we want to express a vector v in that basis. Computing the inner products we obtain

v · u1 = 11;  v · u2 = −12;  v · u3 = −33;  u1 · u1 = 11;  u2 · u2 = 6;  u3 · u3 = 33/2.

Then
\[
v = \frac{11}{11}\, u_1 - \frac{12}{6}\, u_2 - \frac{33}{33/2}\, u_3 = u_1 - 2u_2 - 2u_3.
\]
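As a sanity check, the weights can be recomputed numerically. The specific vectors below are an assumption chosen to reproduce the inner products listed above (the section does not give them explicitly):

```python
import numpy as np

# Assumed orthogonal basis and vector, chosen so that the inner products match
# the ones above (v.u1 = 11, v.u2 = -12, v.u3 = -33, u1.u1 = 11, etc.).
u1 = np.array([3.0, 1.0, 1.0])
u2 = np.array([-1.0, 2.0, 1.0])
u3 = np.array([-0.5, -2.0, 3.5])
v  = np.array([6.0, 1.0, -8.0])

# Weights c_j = (v . u_j) / (u_j . u_j)
coeffs = [np.dot(v, u) / np.dot(u, u) for u in (u1, u2, u3)]
print(coeffs)                                       # [1.0, -2.0, -2.0]

# Reconstruct v from the weights
print(coeffs[0]*u1 + coeffs[1]*u2 + coeffs[2]*u3)   # [ 6.  1. -8.]
```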
7.4 Gram-Schmidt process
The Gram-Schmidt process is a simple algorithm for producing an orthogonal or orthonormal basis for any subspace of Rn . To illustrate how it works, consider the
following example:
Example: Let S = Span(v1 , v2 ), with v1 = (3, 6, 0)' and v2 = (1, 2, 2)', linearly independent, and not orthogonal, since their inner product does not equal zero. If we want to construct an orthogonal basis for S, we can consider
\[
p = P_{v_1}(v_2) = \frac{15}{45}\,(3, 6, 0)' = (1, 2, 0)',
\]
and v2 − p = (0, 0, 2)' is orthogonal to v1 (see the picture below). Thus {v1 , v2 − p} is an orthogonal basis for S.
(Figure: the vector v2, its projection p = Pv1(v2) onto v1, and the orthogonal component v2 − p.)
In general, given a basis {v1 , . . . , vp } of a subspace W of Rn, define
\[
\begin{aligned}
w_1 &= v_1,\\
w_2 &= v_2 - \frac{v_2 \cdot w_1}{w_1 \cdot w_1}\, w_1,\\
w_3 &= v_3 - \frac{v_3 \cdot w_1}{w_1 \cdot w_1}\, w_1 - \frac{v_3 \cdot w_2}{w_2 \cdot w_2}\, w_2,\\
&\;\;\vdots\\
w_p &= v_p - \frac{v_p \cdot w_1}{w_1 \cdot w_1}\, w_1 - \frac{v_p \cdot w_2}{w_2 \cdot w_2}\, w_2 - \cdots - \frac{v_p \cdot w_{p-1}}{w_{p-1} \cdot w_{p-1}}\, w_{p-1}.
\end{aligned}
\]
Then {w1 , . . . , wk } is an orthogonal basis of Span(v1 , . . . , vk ) for every 1 ≤ k ≤ p.
Once an orthogonal basis has been found, it is easy to obtain an orthonormal basis:
simply normalize all the vectors.
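The process translates directly into code; below is a minimal NumPy sketch of the algorithm described above (the function name gram_schmidt and the variable names are my own, and the input vectors are assumed linearly independent):

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthogonal basis for Span(vectors) via the Gram-Schmidt process."""
    ws = []
    for v in vectors:
        w = np.array(v, dtype=float)
        # Remove the component of w along each previously computed w_j.
        for wj in ws:
            w -= (np.dot(w, wj) / np.dot(wj, wj)) * wj
        ws.append(w)
    return ws

# Example from the text: v1 = (3, 6, 0), v2 = (1, 2, 2).
w1, w2 = gram_schmidt([[3, 6, 0], [1, 2, 2]])
print(w1, w2)            # [3. 6. 0.] [0. 0. 2.]
print(np.dot(w1, w2))    # 0.0 -- the basis is orthogonal
```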
7.4.1 QR factorization
If the columns of a matrix A are linearly independent, the Gram-Schmidt process yields a factorization A = QR, where the columns of Q form an orthonormal basis of C(A) and R = Qt A is upper triangular and invertible.

Example: Consider the matrix
\[
A = \begin{pmatrix} 1 & 0 & 0 \\ 1 & 1 & 0 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix},
\]
whose columns A1, A2, A3 are linearly independent, so let us find a basis of orthonormal vectors for Span(A1 , A2 , A3 ) = C(A). First set w1 = A1 = (1, 1, 1, 1)'. Then
\[
w_2 = A_2 - \frac{A_2 \cdot w_1}{w_1 \cdot w_1}\, w_1
    = \begin{pmatrix} 0 \\ 1 \\ 1 \\ 1 \end{pmatrix}
    - \frac{3}{4} \begin{pmatrix} 1 \\ 1 \\ 1 \\ 1 \end{pmatrix}
    = \begin{pmatrix} -3/4 \\ 1/4 \\ 1/4 \\ 1/4 \end{pmatrix}.
\]
Rescaling w2 does not change the projections, so to avoid fractions we replace w2 by 4w2 = (−3, 1, 1, 1)', and we obtain the third vector of the orthogonal basis by
\[
w_3 = A_3 - \frac{A_3 \cdot w_1}{w_1 \cdot w_1}\, w_1 - \frac{A_3 \cdot w_2}{w_2 \cdot w_2}\, w_2
    = \begin{pmatrix} 0 \\ 0 \\ 1 \\ 1 \end{pmatrix}
    - \frac{2}{4} \begin{pmatrix} 1 \\ 1 \\ 1 \\ 1 \end{pmatrix}
    - \frac{2}{12} \begin{pmatrix} -3 \\ 1 \\ 1 \\ 1 \end{pmatrix}
    = \begin{pmatrix} 0 \\ -2/3 \\ 1/3 \\ 1/3 \end{pmatrix}.
\]
Now we can normalize the basis, and the resulting vectors will be the columns of Q
in the factorization:
\[
Q = \begin{pmatrix}
1/2 & -3/\sqrt{12} & 0 \\
1/2 & 1/\sqrt{12} & -2/\sqrt{6} \\
1/2 & 1/\sqrt{12} & 1/\sqrt{6} \\
1/2 & 1/\sqrt{12} & 1/\sqrt{6}
\end{pmatrix}.
\]
Thus
\[
R = Q^t A = \begin{pmatrix}
1/2 & 1/2 & 1/2 & 1/2 \\
-3/\sqrt{12} & 1/\sqrt{12} & 1/\sqrt{12} & 1/\sqrt{12} \\
0 & -2/\sqrt{6} & 1/\sqrt{6} & 1/\sqrt{6}
\end{pmatrix}
\begin{pmatrix}
1 & 0 & 0 \\
1 & 1 & 0 \\
1 & 1 & 1 \\
1 & 1 & 1
\end{pmatrix}
= \begin{pmatrix}
2 & 3/2 & 1 \\
0 & 3/\sqrt{12} & 2/\sqrt{12} \\
0 & 0 & 2/\sqrt{6}
\end{pmatrix}.
\]
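The same factorization can be obtained numerically; here is a quick NumPy check of this example. Note that numpy.linalg.qr may return Q and R with some columns and rows negated, since the factorization is only unique up to those signs:

```python
import numpy as np

A = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0],
              [1.0, 1.0, 1.0]])

Q, R = np.linalg.qr(A)        # "reduced" QR: Q is 4x3 with orthonormal columns, R is 3x3
print(Q)
print(R)                                  # upper triangular; rows may differ from the text by a sign
print(np.allclose(Q @ R, A))              # True: the factorization reproduces A
print(np.allclose(Q.T @ Q, np.eye(3)))    # True: the columns of Q are orthonormal
```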
7.5 Linear transformations
The following pictures show the graphs of a few different mappings from R to R; two of them, f1 (x) = e^x and f2 (x) = x^2, are not linear, and it can be seen that they distort the domain when it is transformed into the range. On the other hand, f3 (x) = 2x and f4 (x) = −2x are linear transformations, and clearly they spread the domain evenly, always by the same factor.
(Graphs of f1(x) = e^x, f2(x) = x^2, f3(x) = 2x, and f4(x) = −2x.)
The only linear maps from R to R are multiplications by a scalar, but in higher dimensions more can happen. For instance, the linear transformation from R2 to R2 defined by
\[
\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} \mapsto
\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}
\begin{pmatrix} x_1 \\ x_2 \end{pmatrix}
\]
rotates every vector counterclockwise by an angle θ.

(Figure: rotation by an angle θ = π/3.)
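As a quick numerical illustration of this map (a minimal sketch, using the angle θ = π/3 from the picture):

```python
import numpy as np

theta = np.pi / 3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([1.0, 0.0])
print(R @ v)                     # [0.5, 0.866...]: e1 rotated counterclockwise by 60 degrees
print(np.linalg.norm(R @ v))     # 1.0 -- rotations preserve lengths
```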
7.5.1 Reflection
(Figure: reflection of a vector v = (x, y) about the x-axis, giving T(v) = (x, −y), and about the y-axis, giving T(v) = (−x, y).)
Reflection about the x-axis: This reflection transforms every vector in R2 with coordinates (x, y) into the vector (x, −y). The points in the x-axis are the fixed points. The matrix associated to this linear transformation is given by AT = (T (e1 ), T (e2 )). Clearly,
\[
A_T = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}.
\]
Reflection about the y-axis: This reflection transforms every vector in R2 with coordinates (x, y) into the vector (−x, y). The points in the y-axis are the fixed points. The matrix associated to this linear transformation is given by AT = (T (e1 ), T (e2 )). In this situation,
\[
A_T = \begin{pmatrix} -1 & 0 \\ 0 & 1 \end{pmatrix}.
\]
(Figure: reflection of v = (x, y) about the line y = x, giving T(v) = (y, x), and about the origin, giving T(v) = (−x, −y).)
Reflection about the line x = y: In this reflection, the set of fixed points are the points in the line y = x. The image of the vectors in the standard basis is given by T (e1 ) = e2 and T (e2 ) = e1. Thus, the associated matrix is
\[
A_T = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}.
\]
Reflection about the origin: The image of the vectors in the standard basis is T (e1 ) = −e1 and T (e2 ) = −e2. The only fixed point is the origin. Thus, the matrix associated to this isometry is
\[
A_T = \begin{pmatrix} -1 & 0 \\ 0 & -1 \end{pmatrix}.
\]
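The recipe AT = (T (e1 ), T (e2 )) is easy to carry out in code: apply T to the standard basis vectors and use the images as columns. A small sketch (the helper name matrix_of is my own):

```python
import numpy as np

def matrix_of(T, n=2):
    """Build the matrix of a linear map T: R^n -> R^n; its columns are T(e_j)."""
    return np.column_stack([T(e) for e in np.eye(n)])

# Reflection about the x-axis: (x, y) |-> (x, -y)
reflect_x = lambda v: np.array([v[0], -v[1]])
print(matrix_of(reflect_x))        # [[ 1.  0.] [ 0. -1.]]

# Reflection about the line y = x: (x, y) |-> (y, x)
reflect_diag = lambda v: np.array([v[1], v[0]])
print(matrix_of(reflect_diag))     # [[0. 1.] [1. 0.]]
```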
R3
In R3, when we reflect about a given plane, say the xy-plane, we move from above the plane to below the plane (or vice versa); this means simply changing the sign of the remaining variable (z in the case of the xy-plane).
Thus, the matrices associated to the reflections about the xy-plane, the xz-plane, and the yz-plane are, respectively,
\[
A_T = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & -1 \end{pmatrix}, \qquad
A_T = \begin{pmatrix} 1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & 1 \end{pmatrix}, \qquad
A_T = \begin{pmatrix} -1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}.
\]

7.5.2 Orthogonal Projections
(Figure: orthogonal projection of v = (x, y) onto the y-axis, giving T(v) = (0, y), and onto the x-axis, giving T(v) = (x, 0).)
Projection on the x-axis: This projection transforms every vector in R2 with coordinates (x, y) into the vector (x, 0); the associated matrix is
\[
A_T = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}.
\]
Projection on the y-axis: This projection transforms every vector in R2 with coordinates (x, y) into the vector (0, y); the associated matrix is
\[
A_T = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}.
\]
R3
In this vector space, the simplest projections are those in which we project onto one
of the three axes or onto one of the three coordinate planes.
Projection on the x-axis: This projection transforms every vector in R3 with coordinates (x, y, z) into the vector (x, 0, 0). The matrix associated to this linear transformation is given by AT = (T (e1 ), T (e2 ), T (e3 )):
\[
A_T = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}.
\]
Projection on the y-axis: This projection transforms every vector in R3 with
coordinates (x, y, z) into the vector (0, y, 0). The matrix associated to this linear
transformation is given by AT = (T (e1 ), T (e2 ), T (e3 )):
\[
A_T = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix}.
\]
Projection on the z-axis: This projection transforms every vector in R3 with
coordinates (x, y, z) into the vector (0, 0, z). The matrix associated to this linear
transformation is given by AT = (T (e1 ), T (e2 ), T (e3 )):
\[
A_T = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 1 \end{pmatrix}.
\]
Projection on the xy-plane: This projection transforms every vector in R3 with
coordinates (x, y, z) into the vector (x, y, 0). The matrix associated to this linear
transformation is given by AT = (T (e1 ), T (e2 ), T (e3 )):
\[
A_T = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix}.
\]
Projection on the xz-plane: This projection transforms every vector in R3 with coordinates (x, y, z) into the vector (x, 0, z). The matrix associated to this linear transformation is given by AT = (T (e1 ), T (e2 ), T (e3 )):
\[
A_T = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 1 \end{pmatrix}.
\]
Projection on the yz-plane: This projection transforms every vector in R3 with
coordinates (x, y, z) into the vector (0, y, z). The matrix associated to this linear
transformation is given by AT = (T (e1 ), T (e2 ), T (e3 )):
\[
A_T = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}.
\]
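Orthogonal projection matrices such as these are symmetric and satisfy A_T^2 = A_T (projecting twice changes nothing). A quick NumPy check for the projection on the xy-plane above:

```python
import numpy as np

# Projection onto the xy-plane
P = np.diag([1.0, 1.0, 0.0])

v = np.array([2.0, -3.0, 5.0])
print(P @ v)                        # [ 2. -3.  0.]
print(np.allclose(P @ P, P))        # True: projecting twice changes nothing
print(np.allclose(P.T, P))          # True: orthogonal projections are symmetric
```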
7.5.3 Dilations and contractions

A dilation or contraction scales every vector by a fixed factor c > 0, that is, T(v) = cv. In R2 the associated matrix is
\[
A_T = \begin{pmatrix} c & 0 \\ 0 & c \end{pmatrix}.
\]
Analogously, in R3 the matrix associated to a dilation/contraction is given by:
\[
A_T = \begin{pmatrix} c & 0 & 0 \\ 0 & c & 0 \\ 0 & 0 & c \end{pmatrix}.
\]
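For instance, with c = 2 every vector is stretched to twice its length (a minimal sketch):

```python
import numpy as np

c = 2.0
A = c * np.eye(3)                 # dilation matrix diag(c, c, c)

v = np.array([1.0, -2.0, 0.5])
print(A @ v)                                        # [ 2. -4.  1.]: every coordinate scaled by c
print(np.linalg.norm(A @ v) / np.linalg.norm(v))    # 2.0
```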
7.5.4 Rotations
R2
Let v be a vector with coordinates (x, y), and let T be the linear transformation that performs a counterclockwise rotation by an angle θ. The associated matrix is characterized by T (e1 ) = (cos θ, sin θ) and T (e2 ) = (−sin θ, cos θ); therefore
\[
A_T = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}.
\]
R3
In 3 dimensions, we will consider the following rotations about the three positive
coordinate axes.
Counterclockwise rotation by an angle of θ about the positive x-axis: The corresponding matrix associated to this linear transformation is given by
\[
A_T = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & -\sin\theta \\ 0 & \sin\theta & \cos\theta \end{pmatrix}.
\]
Counterclockwise rotation by an angle of θ about the positive y-axis: In this case we have
\[
A_T = \begin{pmatrix} \cos\theta & 0 & \sin\theta \\ 0 & 1 & 0 \\ -\sin\theta & 0 & \cos\theta \end{pmatrix}.
\]
Counterclockwise rotation by an angle of θ about the positive z-axis: Finally, we have
\[
A_T = \begin{pmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{pmatrix}.
\]
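Like every rotation matrix, these are orthogonal with determinant 1. A quick NumPy check for the rotation about the z-axis (a minimal sketch; the helper name rot_z is my own):

```python
import numpy as np

def rot_z(theta):
    """Counterclockwise rotation by theta about the positive z-axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c,  -s,  0.0],
                     [s,   c,  0.0],
                     [0.0, 0.0, 1.0]])

R = rot_z(np.pi / 2)
print(R @ np.array([1.0, 0.0, 0.0]))          # ~[0, 1, 0]: e1 maps to e2
print(np.allclose(R.T @ R, np.eye(3)))        # True: R is orthogonal
print(np.isclose(np.linalg.det(R), 1.0))      # True: determinant 1
```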