MATH231: Linear Algebra, Lecture 1. Tran Duc Anh, mail: ducanh@hnue.edu.vn. September 2021
Contents
A. Definition of vector space
Let V be a nonempty set equipped with two operations + and · such that:
• For each pair of elements ~u, ~v ∈ V, the sum ~u + ~v makes sense and is an element of V.
• For each real number λ and each element ~u ∈ V, the multiplication λ · ~u makes sense and is an element of V.
Definition 1 (Vector space). We say V is a vector space over the real number field R (or briefly, V is an R-vector space, or a vector space over R) if the operations + and · above satisfy the following properties:
• For all ~u, ~v, ~w ∈ V, we have (~u + ~v) + ~w = ~u + (~v + ~w) (associativity).
• For all ~u, ~v ∈ V, we have ~u + ~v = ~v + ~u (commutativity).
• There exists a "neutral" element in V with respect to the operation +, denoted by ~0, satisfying ~u + ~0 = ~0 + ~u = ~u for every ~u ∈ V.
• For each element ~u ∈ V, there exists an "opposite" element of ~u, denoted by −~u, satisfying ~u + (−~u) = (−~u) + ~u = ~0.
• For all λ, μ ∈ R and ~u ∈ V, we have (λ + μ) · ~u = λ · ~u + μ · ~u (distributivity over scalar addition).
• For all λ ∈ R and ~u, ~v ∈ V, we have λ · (~u + ~v) = λ · ~u + λ · ~v (distributivity over vector addition).
• For all λ, μ ∈ R and ~u ∈ V, we have (λμ) · ~u = λ · (μ · ~u) (compatibility of the two multiplications).
• For every ~u ∈ V, we have 1 · ~u = ~u (normalisation property).
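These axioms can be verified numerically for the standard example V = R^3 with componentwise operations. Below is a small Python sketch (using numpy; checking random sample vectors is only an illustration, not a proof):

import numpy as np

# Sanity-check the vector space axioms on R^3 with componentwise
# addition and scalar multiplication, on random test vectors.
rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 3))
lam, mu = 2.0, -0.5
zero = np.zeros(3)

assert np.allclose((u + v) + w, u + (v + w))          # associativity
assert np.allclose(u + v, v + u)                      # commutativity
assert np.allclose(u + zero, u)                       # neutral element
assert np.allclose(u + (-u), zero)                    # opposite element
assert np.allclose((lam + mu) * u, lam * u + mu * u)  # distributivity (scalar addition)
assert np.allclose(lam * (u + v), lam * u + lam * v)  # distributivity (vector addition)
assert np.allclose((lam * mu) * u, lam * (mu * u))    # compatibility
assert np.allclose(1.0 * u, u)                        # normalisation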
Note 2. If V is an R-vector space, we can briefly say V is a vector space. The elements of V will be called vectors. The element ~0 is called the zero vector. The multiplication · of V is called multiplication by scalars, or multiplication by real numbers.
Example 4. Denote by R[x] the set of all polynomials in one variable x with real coefficients. This means each element of R[x] is of the form f(x) = a0 + a1x + a2x^2 + ... + akx^k, with k a natural number and the ai real numbers. Let f(x) and g(x) be two polynomials. We can define the sum f(x) + g(x) in the usual way and obtain a polynomial. Similarly, if λ is a real number and f(x) is a polynomial, then λ · f(x) is also a polynomial. Therefore, (R[x], +, ·) is a vector space over R.
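These operations on R[x] can be made concrete by storing a polynomial as its coefficient list (a0, a1, ..., ak). The following Python sketch is only an illustration; the helper names poly_add and poly_scale are mine, not from the textbook:

import numpy as np

def poly_add(f, g):
    # Add two polynomials given as coefficient arrays (a0, a1, ..., ak);
    # the shorter one is implicitly padded with zero coefficients.
    n = max(len(f), len(g))
    out = np.zeros(n)
    out[:len(f)] += f
    out[:len(g)] += g
    return out

def poly_scale(lam, f):
    # Multiply every coefficient by the real scalar lam.
    return lam * np.asarray(f, dtype=float)

f = np.array([1.0, 2.0])        # f(x) = 1 + 2x
g = np.array([0.0, 0.0, 3.0])   # g(x) = 3x^2
print(poly_add(f, g))           # [1. 2. 3.]  i.e. 1 + 2x + 3x^2
print(poly_scale(2.0, f))       # [2. 4.]     i.e. 2 + 4x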
Definition 6. The vectors ~v1, ~v2, ..., ~vk ∈ V are said to be linearly independent if, whenever a linear combination satisfies a1~v1 + a2~v2 + ... + ak~vk = ~0, the coefficients must be a1 = a2 = ... = ak = 0. If these vectors are not linearly independent, we say they are linearly dependent.
Example 8. Consider a vector ~v1 ∈ V. Then ~v1 is linearly independent iff ~v1 ≠ ~0.
Example 9. Consider two vectors ~v1, ~v2 ∈ V. These two vectors are linearly independent iff ~v1 ∦ ~v2. Therefore, the two vectors ~v1, ~v2 are linearly dependent iff they are parallel, i.e. one is a scalar multiple of the other. This means that linear dependence generalizes the notion of parallel vectors from high school.
Example 10. Consider three vectors ~v1, ~v2, ~v3 ∈ R^3. These three vectors are linearly dependent iff they are coplanar.
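Examples 8-10 can be tested numerically: vectors are linearly independent iff the matrix having them as columns has full column rank. A Python sketch (the helper name linearly_independent is mine):

import numpy as np

def linearly_independent(*vectors):
    # Independence holds iff the column matrix has rank equal to
    # the number of vectors.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

# Example 9: two vectors are independent iff they are not parallel.
print(linearly_independent([1, 0], [1, 1]))    # True
print(linearly_independent([1, 2], [2, 4]))    # False (parallel)

# Example 10: three coplanar vectors in R^3 are dependent.
print(linearly_independent([1, 0, 0], [0, 1, 0], [1, 1, 0]))  # False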
Definition 11 (Vector subspace). A nonempty subset W ⊂ V is called a vector subspace of V if:
• For all ~u, ~v ∈ W, we have ~u + ~v ∈ W (we say W is closed under the addition of V).
• For all λ ∈ R and ~u ∈ W, we have λ·~u ∈ W (we say W is closed under the multiplication with real numbers).
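For instance, the plane W = {(x, y, z) ∈ R^3 : x + y + z = 0} satisfies both closure conditions. A quick numerical spot-check in Python (only an illustration on sample vectors, not a proof):

import numpy as np

def in_W(u, tol=1e-10):
    # Membership test for W = {(x, y, z) in R^3 : x + y + z = 0}.
    return abs(np.sum(u)) < tol

u = np.array([1.0, -1.0, 0.0])
v = np.array([2.0, 0.0, -2.0])
assert in_W(u) and in_W(v)
assert in_W(u + v)       # closed under addition
assert in_W(3.0 * u)     # closed under multiplication with real numbers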
Definition 12. Let A ⊂ V be a subset, and suppose the elements of A are ~v1, ~v2, ..., ~vk. Define the set span A to be the set of all linear combinations of vectors in A. Then span A is a vector subspace of V, and we call span A the vector subspace generated by A (or the linear hull of A).
Note 13. In the textbook, the authors use the notation ⟨A⟩ instead of span A. Both notations will be used in this course.
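For finitely many vectors in R^n, membership in span A can be tested numerically by solving a least-squares problem and checking that the residual vanishes. A Python sketch (the helper name in_span is mine):

import numpy as np

def in_span(v, A, tol=1e-10):
    # v lies in span A iff the linear system M x = v is solvable,
    # where M has the vectors of A as its columns.
    v = np.asarray(v, dtype=float)
    M = np.column_stack([np.asarray(a, dtype=float) for a in A])
    coeffs, *_ = np.linalg.lstsq(M, v, rcond=None)
    return bool(np.linalg.norm(M @ coeffs - v) < tol)

A = [[1, 0, 0], [0, 1, 0]]    # span A is the xy-plane in R^3
print(in_span([2, 3, 0], A))  # True
print(in_span([0, 0, 1], A))  # False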
From the definition of linear hull, we have the following simple proposition.
Proposition 14. Suppose ~v1, ~v2, ..., ~vk are k linearly independent vectors in the vector space V. Suppose ~vk+1 ∈ V is a vector with ~vk+1 ∉ span{~v1, ~v2, ..., ~vk}. Then the k + 1 vectors ~v1, ~v2, ..., ~vk, ~vk+1 are also linearly independent.
Proof. Suppose ~v1, ~v2, ..., ~vk+1 are linearly dependent. Then there exists a nontrivial linear combination
a1~v1 + a2~v2 + ... + ak+1~vk+1 = ~0,
where the ai are real numbers which are not all zero.
If ak+1 ≠ 0, then we obtain
~vk+1 = −(a1/ak+1)~v1 − (a2/ak+1)~v2 − ... − (ak/ak+1)~vk ∈ span{~v1, ~v2, ..., ~vk}.
This is a contradiction.
It means ak+1 = 0. Then we obtain a nontrivial linear combination
a1~v1 + a2~v2 + ... + ak~vk = ~0.
This is also impossible, since, according to the assumption, ~v1 , ~v2 , . . . , ~vk are linearly indepen-
dent.
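Numerically, Proposition 14 corresponds to the rank increasing by one when a vector outside the span is appended. A quick illustration in Python:

import numpy as np

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = np.array([0.0, 0.0, 1.0])   # v3 is not in span{v1, v2}
print(np.linalg.matrix_rank(np.column_stack([v1, v2])))      # 2
print(np.linalg.matrix_rank(np.column_stack([v1, v2, v3])))  # 3, so v1, v2, v3 remain independent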
Proposition 15. Let X, Y ⊂ V be two subsets of a vector space V. Suppose every vector of
Y is a linear combination of vectors in X. Then we have
span Y ⊂ span X.
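Assuming the in_span helper sketched after Note 13, Proposition 15 can be spot-checked: if every vector of Y lies in span X, then so does every linear combination of vectors in Y. For example:

import numpy as np

X = [[1, 0, 0], [0, 1, 0]]
Y = [[1, 1, 0], [2, -1, 0]]   # each vector of Y lies in span X
assert all(in_span(y, X) for y in Y)

rng = np.random.default_rng(2)
for _ in range(5):
    a, b = rng.standard_normal(2)
    combo = a * np.asarray(Y[0], dtype=float) + b * np.asarray(Y[1], dtype=float)
    assert in_span(combo, X)  # random combinations of Y stay in span X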
Definition 16. A subset B ⊂ A is called a maximal linearly independent subset of A if the vectors in B are linearly independent and if adding any vector of A\B to B makes the resulting set no longer linearly independent.
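For a finite set A of vectors in R^n, a maximal linearly independent subset can be extracted greedily: walk through A and keep a vector exactly when it is independent of those already kept. A Python sketch (this greedy routine is my illustration, not the textbook's construction):

import numpy as np

def maximal_independent_subset(A):
    # Keep a vector iff appending it preserves linear independence,
    # i.e. the column rank still equals the number of kept vectors.
    chosen = []
    for v in A:
        candidate = chosen + [np.asarray(v, dtype=float)]
        if np.linalg.matrix_rank(np.column_stack(candidate)) == len(candidate):
            chosen.append(candidate[-1])
    return chosen

A = [[1, 0, 0], [2, 0, 0], [0, 1, 0], [1, 1, 0]]
B = maximal_independent_subset(A)
print(len(B))   # 2, e.g. (1,0,0) and (0,1,0)

Every skipped vector already lies in the span of the kept ones, so the result is maximal by Proposition 14.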
Proposition 17. Suppose B is a maximal linearly independent subset of A.
(a) Then each vector ~v ∈ A is a linear combination of vectors in B, i.e. ~v ∈ span B.
(b) Therefore
span B = span A.
Proof. Proof of (a): Suppose B = {~v1, ~v2, ..., ~vm} and consider an arbitrary vector ~v ∈ A. If ~v ∉ span B, then {~v1, ~v2, ..., ~vm, ~v} is linearly independent by Proposition 14, so B is not a maximal linearly independent subset, a contradiction. Hence ~v ∈ span B.
Proof of (b): Since B ⊂ A, we have span B ⊂ span A. On the other hand, each vector of A is a linear combination of vectors in B by part (a), so span A ⊂ span B by Proposition 15.
Therefore, span B = span A.
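Combining the two helpers sketched above (in_span and maximal_independent_subset), Proposition 17 can be spot-checked on an example:

A = [[1, 0, 0], [2, 0, 0], [0, 1, 0], [1, 1, 0]]
B = maximal_independent_subset(A)
assert all(in_span(v, B) for v in A)   # every vector of A lies in span B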
Proposition 18 (Technical lemma). Let B and C be two maximal linearly independent subsets of A. Then B and C have the same number of elements.
Ideas of the proof. For a detailed proof, see Lemma 2.3.3, page 36, of the textbook. Here I would like to sum up the basic ideas.
Suppose B = {~u1, ~u2, ..., ~ur} and C = {~v1, ~v2, ..., ~vs}. Since B is a maximal linearly independent subset of A, each vector ~vi ∈ C is a linear combination of vectors in B by Proposition 17. Consider the vector ~v1 ∈ C. Suppose
~v1 = a1~u1 + a2~u2 + ... + ar~ur.
Since ~v1 ≠ ~0, there exists a coefficient ai ≠ 0. We may suppose a1 ≠ 0.
Then we obtain
~u1 = (1/a1)~v1 − (a2/a1)~u2 − ... − (ar/a1)~ur.
It means ~u1 is a linear combination of {~v1, ~u2, ..., ~ur}. From this, we can prove that {~v1, ~u2, ~u3, ..., ~ur} is a maximal linearly independent subset(1) of A.
It means we can successively take the vectors ~vi of C and, for each one, find a corresponding vector ~uj of B such that replacing ~uj by ~vi still gives a maximal linearly independent subset of r elements, i.e. with the same number of elements as B. This process can be repeated until all the elements of C have been used, so we deduce s ≤ r.
Since the roles of B and C are symmetric, we also obtain r ≤ s, so their numbers of elements are equal.
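Proposition 18 says this number of elements is an invariant of A. As a numerical illustration (reusing the greedy maximal_independent_subset sketch from above), shuffling A may change which vectors are selected, but never how many:

import random
import numpy as np

A = [np.asarray(v, dtype=float)
     for v in [[1, 0, 0], [0, 1, 0], [1, 1, 0], [2, 0, 0]]]
sizes = set()
for _ in range(10):
    random.shuffle(A)   # present the vectors in a different order
    sizes.add(len(maximal_independent_subset(A)))
print(sizes)            # {2}: always the same number of elements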
Definition 19. For each system of vectors ~v1, ~v2, ..., ~vk ∈ V, we call the number of elements of a maximal linearly independent subset of the system the rank of the system of vectors, and denote it by rank{~v1, ~v2, ..., ~vk} or, more briefly, rk{~v1, ~v2, ..., ~vk}.
Note 20. In some books, the notation for the rank of a system of vectors may be different: rang{~v1, ~v2, ..., ~vk} or rg{~v1, ~v2, ..., ~vk}. In the textbook, the authors use hạng{~v1, ~v2, ..., ~vk} ("hạng" is Vietnamese for "rank").
Corollary 21. From the definition of rank, we have
rank{~v1, ~v2, ..., ~vk} ≤ k.
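Numerically, rank{~v1, ..., ~vk} is the rank of the matrix whose columns are the ~vi, and Corollary 21 is then immediate, since a matrix with k columns has rank at most k. A Python sketch:

import numpy as np

vectors = [[1, 0, 1], [0, 1, 1], [1, 1, 2]]   # the third is the sum of the first two
k = len(vectors)
r = np.linalg.matrix_rank(np.column_stack(vectors))
print(r, r <= k)   # 2 True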
(1) This statement is left as an exercise for students.