Lecture 8
LINEAR VECTOR SPACES
A vector space (or linear space) is a collection of
objects (called vectors) that, informally speaking,
may be scaled and added.
More formally, a vector space is a set on which two
operations, called (vector) addition and (scalar)
multiplication, are defined and satisfy certain natural
axioms which are listed below.
Let F be a field (such as the real numbers or the complex
numbers), whose elements will be called scalars. A vector space
over the field F is a set V together with two operations:
1) vector addition: V × V → V, denoted v + w, where v, w ∈ V,
and such that V is closed under vector addition:
if u, v ∈ V, then (u + v) ∈ V;
2) scalar multiplication: F × V → V, denoted a v, where a ∈ F
and v ∈ V, and such that V is closed under scalar multiplication:
if a ∈ F and v ∈ V, then (a v) ∈ V,
satisfying the 8 axioms below (stated for all u, v, w ∈ V):
1. Vector addition is associative:
u + (v + w) = (u + v) + w.
2. Vector addition is commutative:
v + w = w + v.
3. Vector addition has an additive identity element:
There exists an element 0 ∈ V, called the zero vector, such that
v + 0 = v for all v ∈ V.
4. Vector addition has a negative (additive inverse) element:
For all v in V, there exists an element w ∈ V, called the negative of v,
such that v + w = 0; we denote w = -v.
5. Distributivity holds for scalar multiplication over vector addition:
For all a ∈ F and v, w ∈ V, a (v + w) = a v + a w.
6. Distributivity holds for scalar multiplication over field addition:
For all a, b ∈ F and v ∈ V, (a + b) v = a v + b v.
7. Scalar multiplication is compatible with multiplication in the field of
scalars:
For all a, b ∈ F and v ∈ V, a (b v) = (ab) v.
8. Scalar multiplication has an identity element:
For all v ∈ V, we have 1 v = v, where 1 denotes the multiplicative
identity in F.
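As a concrete illustration, R^n (the space of n-tuples of real numbers, introduced next) with componentwise addition and scalar multiplication satisfies all 8 axioms. The short NumPy sketch below spot-checks them on random vectors; it is only a numerical illustration (floating-point arithmetic, so equalities are tested up to rounding error), not a proof.

import numpy as np

rng = np.random.default_rng(0)
n = 4                                   # work in R^4 as an example
u, v, w = rng.standard_normal((3, n))   # three random vectors in R^n
a, b = rng.standard_normal(2)           # two random scalars

# Axiom 1: associativity of vector addition
assert np.allclose(u + (v + w), (u + v) + w)
# Axiom 2: commutativity of vector addition
assert np.allclose(v + w, w + v)
# Axioms 3 and 4: zero vector and negatives
zero = np.zeros(n)
assert np.allclose(v + zero, v) and np.allclose(v + (-v), zero)
# Axioms 5-7: distributivity and compatibility of scalar multiplication
assert np.allclose(a * (v + w), a * v + a * w)
assert np.allclose((a + b) * v, a * v + b * v)
assert np.allclose(a * (b * v), (a * b) * v)
# Axiom 8: the scalar 1 acts as the identity
assert np.allclose(1.0 * v, v)

print("All sampled axiom checks passed for R^%d." % n)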
In R^n, the n-tuple (ordered list) (x_1, x_2, x_3, ..., x_n), x_i ∈ R, is
identified with the vector connecting the point (x_1, x_2, x_3, ..., x_n)
with the origin (0, 0, 0, ..., 0).
[Figure: the vector from the origin to the point (x_1, x_2) in the plane,
drawn along the x_1 and x_2 axes.]
A vector is a quantity which has a direction and a magnitude (size);
it is drawn as an arrow from its tail to its head.
PROPERTIES
• The zero vector 0 in V is unique;
• 0 v = 0 for every v ∈ V;
• α 0 = 0 for every α ∈ R;
• α u = α v ⟹ u = v, for α ≠ 0;
• α v = β v ⟹ α = β, for v ≠ 0;
• (α - β) v = α v - β v;
• (-α) v = α (-v).
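These derived properties can be spot-checked numerically as well; the minimal sketch below tests the equalities (the cancellation laws are implications, so they are not tested directly) on a random vector, again only up to floating-point rounding.

import numpy as np

rng = np.random.default_rng(1)
v = rng.standard_normal(3)
alpha, beta = 2.5, -1.5

assert np.allclose(0.0 * v, np.zeros(3))               # 0 v = 0
assert np.allclose(alpha * np.zeros(3), np.zeros(3))   # alpha 0 = 0
assert np.allclose((alpha - beta) * v, alpha * v - beta * v)
assert np.allclose((-alpha) * v, alpha * (-v))
print("Derived properties hold on the sampled vector.")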
Linear Subspace
DEFINITION
A non-empty subset W ⊆ V is a subspace of V if it is
closed under addition and closed under scalar multiplication, i.e.:
(i) if v, w ∈ W, then v + w ∈ W;
(ii) if α ∈ R and w ∈ W, then α w ∈ W.
Property
The zero vector 0 always belongs to a linear subspace.
Proof
If v ∈ W then, from (ii), (-1) v ∈ W; then, from (i),
v + (-1) v = v - v = 0, so 0 ∈ W.
QED
[Figure: examples in R^2 - two vectors u and v and their sum u + v.]
Examples
1. The set of vectors { (x_1, x_2, x_3) : x_1 + x_2 + x_3 = 0 }
is a proper subspace of R^3.
2. The set of vectors { x ∈ R^7 : 3 x_1 + 7 x_7 = 0 }
is a proper subspace of R^7,
but the set { x ∈ R^7 : 3 x_1 + 7 x_7 = 1 }
is not a linear subspace of R^7.
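A quick numerical sanity check of these examples: the sketch below samples vectors from the plane x_1 + x_2 + x_3 = 0 and confirms closure under addition and scaling, and it shows that the set defined by 3 x_1 + 7 x_7 = 1 fails the closure test. The helper names in_plane and on_affine_set are ad hoc, introduced only for this illustration.

import numpy as np

rng = np.random.default_rng(2)

def in_plane(x, tol=1e-9):
    """Membership test for { x in R^3 : x1 + x2 + x3 = 0 }."""
    return abs(x.sum()) < tol

def on_affine_set(x, tol=1e-9):
    """Membership test for { x in R^7 : 3 x1 + 7 x7 = 1 } (not a subspace)."""
    return abs(3 * x[0] + 7 * x[6] - 1) < tol

# Sample two vectors from the plane x1 + x2 + x3 = 0:
# choose x1, x2 freely and set x3 = -(x1 + x2).
u12, v12 = rng.standard_normal((2, 2))
u = np.append(u12, -u12.sum())
v = np.append(v12, -v12.sum())
assert in_plane(u) and in_plane(v)
assert in_plane(u + v) and in_plane(2.7 * u)   # closed under + and scaling

# A point of the affine set 3 x1 + 7 x7 = 1 ...
p = np.zeros(7)
p[0] = 1 / 3
assert on_affine_set(p)
# ... but its doubling (and the zero vector) are not in the set.
assert not on_affine_set(2 * p) and not on_affine_set(np.zeros(7))
print("Closure holds for the plane; the affine set is not a subspace.")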
Example
Let us consider two linear subspaces:
U = { (x_1, x_2, x_3) : x_1 + 2 x_2 - 3 x_3 = 0 },
W = { (x_1, x_2, x_3) : x_1 - 2 x_2 = 0 }.
The intersection U ∩ W is a linear subspace.
The union U ∪ W is not a linear subspace.
A counterexample:
take the two vectors (1, 1, 1) ∈ U and (2, 1, 0) ∈ W;
their sum is the vector (1, 1, 1) + (2, 1, 0) = (3, 2, 1), which does
not belong to the set U ∪ W.
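The counterexample can be checked mechanically; the snippet below is a minimal NumPy sketch that tests membership in U and W by evaluating their defining equations (the helpers in_U and in_W are ad hoc names for this illustration).

import numpy as np

def in_U(x):   # x1 + 2 x2 - 3 x3 = 0
    return np.isclose(x[0] + 2 * x[1] - 3 * x[2], 0)

def in_W(x):   # x1 - 2 x2 = 0
    return np.isclose(x[0] - 2 * x[1], 0)

u = np.array([1.0, 1.0, 1.0])   # u is in U:  1 + 2 - 3 = 0
w = np.array([2.0, 1.0, 0.0])   # w is in W:  2 - 2 = 0
s = u + w                       # s = (3, 2, 1)

assert in_U(u) and in_W(w)
assert not in_U(s) and not in_W(s)   # s is in neither U nor W
print("The sum (3, 2, 1) lies outside U ∪ W, so the union is not a subspace.")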
Linear Combinations
Example: the linear combination 3 a + 2 b of vectors a and b.
[Figure: vectors a and b, their scalings 3a and 2b, and the sum 3a + 2b,
drawn in the (x_1, x_2) plane.]
Definition
A linear combination of vectors a_1, a_2, ..., a_k is a vector v of the form
v = c_1 a_1 + c_2 a_2 + ... + c_k a_k,
where c_1, c_2, ..., c_k are real numbers (c_i ∈ R).
Examples (space R^3)
1) The vector (4, 5, -8) is a linear combination of the vectors (0, 1, 0)
and (2, 2, -4), because:
(4, 5, -8) = 1 (0, 1, 0) + 2 (2, 2, -4).
2) A linear combination of the vectors (1, 2, 3), (-3, 4, 2) and (1, 0, 1)
is the vector (-15, 28, 18), because:
4 (1, 2, 3) + 5 (-3, 4, 2) + (-4) (1, 0, 1) = (-15, 28, 18).
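The coefficients of such combinations can be recovered by solving a linear system whose columns are the given vectors; the sketch below uses np.linalg.lstsq to confirm both examples (lstsq rather than solve, because the first system has only two columns). It is only an illustration of the computation, not part of the lecture material.

import numpy as np

# Example 1: find c with c1*(0,1,0) + c2*(2,2,-4) = (4,5,-8).
A1 = np.column_stack([(0, 1, 0), (2, 2, -4)]).astype(float)
b1 = np.array([4.0, 5.0, -8.0])
c1, *_ = np.linalg.lstsq(A1, b1, rcond=None)
assert np.allclose(A1 @ c1, b1) and np.allclose(c1, [1.0, 2.0])

# Example 2: the combination 4*(1,2,3) + 5*(-3,4,2) - 4*(1,0,1).
A2 = np.column_stack([(1, 2, 3), (-3, 4, 2), (1, 0, 1)]).astype(float)
c2 = np.array([4.0, 5.0, -4.0])
assert np.allclose(A2 @ c2, [-15.0, 28.0, 18.0])
print("Coefficients:", c2, "give", A2 @ c2)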
Space Spanned by Vectors
DEFINITION
For a set of vectors S = {v_1, v_2, ..., v_n}, the subspace
Lin(v_1, v_2, ..., v_n) = { k_1 v_1 + k_2 v_2 + ... + k_n v_n : k_1, k_2, ..., k_n ∈ R },
generated by forming all the linear combinations of vectors from S,
is called the space spanned by S.
Example - Space R^3
The subspace spanned by the vectors (4, 3, 1) and (1, 2, 0) is the set of
vectors of the form
a (4, 3, 1) + b (1, 2, 0) = (4a + b, 3a + 2b, a),
and it determines a plane which passes through the point (0, 0, 0)
and is parallel to the vectors (4, 3, 1) and (1, 2, 0).
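Whether a given vector lies in this spanned plane can be tested numerically: a vector b belongs to Lin(v_1, v_2) exactly when appending b as an extra column does not increase the rank. The sketch below (the helper in_span is an ad hoc name for this illustration) applies the test to one vector in the plane and one outside it.

import numpy as np

v1 = np.array([4.0, 3.0, 1.0])
v2 = np.array([1.0, 2.0, 0.0])
S = np.column_stack([v1, v2])            # 3 x 2 matrix of spanning vectors

def in_span(b):
    """b lies in Lin(v1, v2) iff rank([v1 v2 b]) == rank([v1 v2])."""
    return np.linalg.matrix_rank(np.column_stack([S, b])) == np.linalg.matrix_rank(S)

b_in = 2 * v1 - 3 * v2                   # a combination, so inside the plane
b_out = np.array([0.0, 0.0, 1.0])        # not parallel to the plane
print(in_span(b_in), in_span(b_out))     # True False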
Linear Dependence
Definition
Vectors a_1, a_2, ..., a_k are linearly dependent if
there exist numbers c_1, c_2, ..., c_k ∈ R, not all equal to zero,
such that
c_1 a_1 + c_2 a_2 + ... + c_k a_k = 0.
If such numbers do not exist, then the vectors are said
to be linearly independent.
Linearly dependent sets of vectors are those in which at least one vector is
a combination of the others.
The empty set is always linearly independent.
Example
Two vectors in R^2 are linearly independent if they are not collinear.
Three vectors in R^3 are linearly independent if they are not coplanar.
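In practice, linear (in)dependence of finitely many vectors in R^n is usually tested by computing the rank of the matrix whose columns are those vectors, as discussed in the next section. A short NumPy sketch (the helper independent is an ad hoc name for this illustration):

import numpy as np

def independent(*vectors):
    """Vectors are linearly independent iff the matrix having them as
    columns has rank equal to the number of vectors."""
    A = np.column_stack(vectors).astype(float)
    return np.linalg.matrix_rank(A) == A.shape[1]

print(independent((1, 0), (0, 1)))                    # True: not collinear
print(independent((1, 2), (2, 4)))                    # False: collinear in R^2
print(independent((1, 0, 0), (0, 1, 0), (1, 1, 0)))   # False: coplanar in R^3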
Null Space and the Rank of a Matrix
NULLSPACE
Definition. Let A be an m × n matrix; the set
N(A) = { x ∈ R^n : A x = 0 } (a subset of R^n)
is called the nullspace of A (note that here 0 ∈ R^m).
It is simply the set of all solutions of the
homogeneous system A x = 0.
A homogeneous system of equations
Let a_1, a_2, ..., a_n be the columns of matrix A, A = [a_1, a_2, ..., a_n],
and let c = [c_1, c_2, ..., c_n]^T collect the constants appearing in the
definition of linear dependence; then the equation
c_1 a_1 + c_2 a_2 + ... + c_n a_n = 0
is the same as
A c = 0,
so the coefficient vectors c of such relations form exactly the null space of A.
If the null space contains no elements other than the obvious zero
vector, then the column vectors are linearly independent.
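A basis of N(A) can be computed numerically, for example from the singular value decomposition: the right singular vectors belonging to (numerically) zero singular values span the null space. The sketch below is one standard way to do this with plain NumPy; the helper name null_space_basis and the tolerance are assumptions made for this illustration (SciPy offers the equivalent scipy.linalg.null_space).

import numpy as np

def null_space_basis(A, tol=1e-10):
    """Columns of the returned matrix form an orthonormal basis of N(A),
    taken from the right singular vectors with ~zero singular value."""
    A = np.atleast_2d(A).astype(float)
    _, s, vt = np.linalg.svd(A)
    # Only min(m, n) singular values are returned; the remaining rows of vt
    # (when n > m) also belong to the null space, so slicing from the rank works.
    rank = np.sum(s > tol * (s[0] if s.size else 1.0))
    return vt[rank:].T

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])        # rank 1, so N(A) has dimension 2
N = null_space_basis(A)
print(N.shape[1])                      # 2 = n - rank(A)
print(np.allclose(A @ N, 0))           # True: every basis vector solves A x = 0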
Let A be an m × n matrix. The columns of A form a linearly independent set:
• if N(A) = {0} - there is only the obvious solution 0: A 0 = 0;
• if rank(A) = n - the rank of A tells us how many independent column
vectors there are.
Let A be an m × n matrix. The rows of A form a linearly independent set:
• if N(A^T) = {0} - there is only the obvious solution 0: A^T 0 = 0;
• if rank(A^T) = m - the rank of A^T tells us how many independent row
vectors there are.
If A is a square n × n matrix, then it is nonsingular (det A ≠ 0) if and only if:
• the columns form a linearly independent set;
• equivalently, the rows form a linearly independent set.
Proof
(that any n + 1 vectors a_1, a_2, ..., a_{n+1} in R^n are linearly dependent)
The matrix A = [a_1, a_2, ..., a_{n+1}], whose columns are these vectors,
is of order n × (n + 1), so it cannot be of rank n + 1; hence its columns
are linearly dependent.
Example
The vectors (1, 2, 3), (2, 3, 4), (3, 4, 5) are linearly dependent,
since (3, 4, 5) = 2 (2, 3, 4) - (1, 2, 3); equivalently, the matrix
A = [ 1  2  3 ]
    [ 2  3  4 ]
    [ 3  4  5 ]
with these vectors as columns has det A = 0, so rank(A) < 3.
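This can be confirmed numerically; a short check of the rank, the determinant and the explicit dependence relation:

import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 3.0, 4.0],
              [3.0, 4.0, 5.0]])                      # columns (1,2,3), (2,3,4), (3,4,5)
print(np.linalg.matrix_rank(A))                      # 2 -> the columns are dependent
print(np.isclose(np.linalg.det(A), 0.0))             # True
print(np.allclose(A[:, 2], 2 * A[:, 1] - A[:, 0]))   # (3,4,5) = 2(2,3,4) - (1,2,3)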
Unit Vectors
Definition
The vectors
e_i = (0, ..., 1, ..., 0),
where the i-th coefficient is equal to 1 and the rest are zero, are called
versors or the unit vectors of the Cartesian coordinate system.
These vectors are linearly independent: in this case the matrix A built
from them is the unit matrix, A = I, thus det A = 1 ≠ 0.
Quite often in R^3, instead of e_1, e_2, e_3, we use the traditional
notation i, j, k.
[Figure: a vector a decomposed into components a_x i, a_y j, a_z k along
the x, y, z axes.]
Any vector a = (a_x, a_y, a_z) can be represented as a linear combination
of the unit vectors i, j, k:
a = a_x i + a_y j + a_z k,
where {e_1, e_2, e_3} = {i, j, k}.
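As a final illustration, the sketch below writes an arbitrary vector of R^3 in the basis {i, j, k} and checks the decomposition; the sample vector is chosen only for the example.

import numpy as np

i, j, k = np.eye(3)                  # the unit vectors e1, e2, e3
a = np.array([2.0, -1.0, 4.0])       # an arbitrary vector (a_x, a_y, a_z)
a_x, a_y, a_z = a

# a = a_x i + a_y j + a_z k
assert np.allclose(a, a_x * i + a_y * j + a_z * k)
print("Decomposition verified:", a_x, a_y, a_z)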