Section 3.2

Basic Concepts of Vector Spaces

>

> with(linalg):

Warning, the protected names norm and trace have been redefined and unprotected

>

Linear Combinations, Spans, and Subspaces

******************************************************

Definition 3.2

Given vectors v[1], v[2], ..., v[k] in a vector space V and scalars r[1], r[2], ..., r[k] in R,

the vector

r[1]*v[1] + r[2]*v[2] + ... + r[k]*v[k]

is a linear combination of the vectors v[1], v[2], ..., v[k] with scalar coefficients

r[1], r[2], ..., r[k].

*********************************************************

Recall linear combination in Section 1.1

Example 1: Given the set S = { x, x^2, 3*x^3+4*x, 5*x-6 }, a linear combination of

the vectors in S has the form r[1]*x + r[2]*x^2 + r[3]*(3*x^3+4*x) + r[4]*(5*x-6).
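For instance, choosing the illustrative coefficients r[1] = 2, r[2] = 3, r[3] = -1, r[4] = 2 (any scalars in R would do) produces one particular vector of this form:

>

> # Illustrative coefficient choice; expand collects the resulting polynomial.

> expand(2*x + 3*x^2 - (3*x^3+4*x) + 2*(5*x-6));

-3*x^3+3*x^2+8*x-12

>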

*********************************************************************

Definition 3.3

Let X be a subset of a vector space V. The span of X is the set of all linear combinations

of vectors in X, and is denoted by sp(X). If X is a finite set, so that X = { v[1], v[2], ..., v[k] }, then

we also write sp(X) as sp( v[1], v[2], ..., v[k] ). If W = sp(X), we say the vectors in X span or generate W.

*********************************************************************

Recall spans in Section 1.1

Example 2: Let

>

> W = `span`(matrix(2,2,[1,0,0,0]),matrix(2,2,[0,1,0,0]));

W = span(matrix([[1, 0], [0, 0]]),matrix([[0, 1], [0, 0]]))

>
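Every matrix in W has the form matrix([[a, b], [0, 0]]). As a quick check, here is an illustrative combination (the coefficients 3 and -2 are chosen arbitrarily):

> # Coefficients 3 and -2 are an arbitrary illustration of a vector in W.

> evalm(3*matrix(2,2,[1,0,0,0]) - 2*matrix(2,2,[0,1,0,0]));

matrix([[3, -2], [0, 0]])

>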

Example 3: Let S = { 1, x, x^2, x^3, ..., x^n }. Then sp(S) = P[n], i.e. the set of all polynomials of

degree <= n ( p(x) = a[n]*x^n + a[n-1]*x^(n-1) + ... + a[1]*x + a[0] ).
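For example, with n = 3 (an illustrative choice) a general element of sp(S) can be built in Maple as:

> # n = 3 for illustration; a[0], ..., a[3] are arbitrary scalars.

> sum(a[k]*x^k, k=0..3);

a[0]+a[1]*x+a[2]*x^2+a[3]*x^3

>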

**********************************************************************

Definition 3.4

A subset W of a vector space V is a subspace of V if W itself fulfills the requirements of a vector

space, where addition and scalar multiplication of vectors in W produce the same vectors

as these operations did in V.

************************************************************************

Recall subspaces in Section 1.6

**********************************************************************

Theorem 3.2 (Test for a Subspace)

A subset W of a vector space V is a subspace of V if and only if:

1.) W is non-empty.

2.) If v and w are in W, then v + w is in W. (Closure under vector addition.)

3.) If r is a scalar in R and v is in W, then r*v is in W. (Closure under scalar multiplication.)

***********************************************************************

Example 4: Let S be the set of all polynomials of degree <= n such that p(0) = 0. Then S is

a subspace of P[n]. Why? (Usual addition and scalar multiplication.)
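A quick sketch of the closure checks, using two illustrative polynomials p and q with p(0) = q(0) = 0 (assuming n >= 3 so both lie in S):

> # p and q are illustrative choices; both vanish at 0, and so do p+q and 7*p.

> p := x -> 2*x + x^2: q := x -> -x + 3*x^3:

> (p + q)(0); (7*p)(0);

0

0

>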

Example 5: Let S be the set of all continuous functions on [a,b] such that f(a)=f(b). Then S

is a subspace of C[a,b]. Why? (Usual addition and scalar multiplication.)
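For instance, on the illustrative interval [0, 2*Pi], both sin(x) and cos(x) - 1 take the same value at the two endpoints, and so does their sum:

> # f and g are illustrative members of S for [a, b] = [0, 2*Pi].

> f := x -> sin(x): g := x -> cos(x) - 1:

> (f + g)(0) - (f + g)(2*Pi);

0

>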

Example 6: Let S be the set of all polynomials of degree <= 3 with integer coefficients. Then

S is not a subspace of P[3] . Why? (Usual addition and scalar multiplication.)
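The failure is closure under scalar multiplication: multiplying an illustrative integer-coefficient polynomial by the scalar 1/2 leads outside S.

> # p has integer coefficients, but (1/2)*p does not.

> p := 2*x^3 + x + 1: expand((1/2)*p);

x^3+1/2*x+1/2

>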

Linear Independence

***************************************************

Definition 3.5

Let X be a set of vectors in a vector space V. A dependence relation in

this set X is an equation of the form

r[1]*v[1] + r[2]*v[2] + ... + r[k]*v[k] = 0, with some r[j] <> 0,

where v[j] is in V for j = 1, 2, ..., k. If such a dependence relation exists, then X

is a linearly dependent set of vectors. Otherwise, the set X of vectors is

linearly independent.

*******************************************************

Recall linearly independent vectors in Section 2.1

Example 7: Let

>

> W = `span`(matrix(2,2,[1,1,0,2]),matrix(2,2,[-1,1,0,2]),matrix(2,2,[1,1,1,1]));

W = span(matrix([[1, 1], [0, 2]]),matrix([[-1, 1], [0, 2]]),matrix([[1, 1], [1, 1]]))

Are the vectors in W linearly independent or linearly dependent?

We must solve the following to determine whether the only solution for r[1], r[2], r[3] is r[1] = r[2] = r[3] = 0.

>

> r[1]*matrix(2,2,[1,1,0,2])+r[2]*matrix(2,2,[-1,1,0,2])+r[3]*matrix(2,2,[1,1,1,1]) = matrix(2,2,[0,0,0,0]);

r[1]*matrix([[1, 1], [0, 2]])+r[2]*matrix([[-1, 1], [0, 2]])+r[3]*matrix([[1, 1], [1, 1]]) = matrix([[0, 0], [0, 0]])

>

Equating corresponding entries gives the following system of equations,

>

> r[1] -r[2] + r[3] = 0; r[1]+r[2]+r[3] = 0; r[3]=0; 2*r[1]+2*r[2]+r[3]=0;

r[1]-r[2]+r[3] = 0

r[1]+r[2]+r[3] = 0

r[3] = 0

2*r[1]+2*r[2]+r[3] = 0

>

or, in matrix form,

>

> matrix(4,3,[1,-1,1,1,1,1,0,0,1,2,2,1])*matrix(3,1,[r[1],r[2],r[3]]) = matrix(4,1,[0,0,0,0]);

matrix([[1, -1, 1], [1, 1, 1], [0, 0, 1], [2, 2, 1]])*matrix([[r[1]], [r[2]], [r[3]]]) = matrix([[0], [0], [0], [0]])

>

Augmented matrix,

>

> Aug:= matrix(4,4,[1,-1,1,0,1,1,1,0,0,0,1,0,2,2,1,0]);

Aug := matrix([[1, -1, 1, 0], [1, 1, 1, 0], [0, 0, 1, 0], [2, 2, 1, 0]])

>

Row reducing the augmented matrix gives,

>

> rref(Aug);

matrix([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 0]])

The only solution is r[1] = r[2] = r[3] = 0, so the three matrices are linearly independent.

>
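The same conclusion can be reached in one step with linsolve from the linalg package, which returns only the zero solution here:

> # linsolve solves A.r = 0; the unique solution is the zero vector.

> linsolve(matrix(4,3,[1,-1,1,1,1,1,0,0,1,2,2,1]), matrix(4,1,[0,0,0,0]));

matrix([[0], [0], [0]])

>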

Example 8: Let S = { 1+2*x^2, 4+x+5*x^2, 3+2*x }. Are the vectors in S linearly independent

or linearly dependent?

We must solve the following to determine whether the only solution for r1, r2, r3 is r1 = r2 = r3 = 0.

(NOTE: This equation must hold FOR ALL x!)

>

> r1*(1+2*x^2)+r2*(4+x+5*x^2)+r3*(3+2*x) = 0;

r1*(1+2*x^2)+r2*(4+x+5*x^2)+r3*(3+2*x) = 0

> p:= r1*(1+2*x^2)+r2*(4+x+5*x^2)+r3*(3+2*x):

>

Equating the coefficient of each power of x to zero gives,

>

> coeff(p,x,0)=0; coeff(p,x,1)=0; coeff(p,x,2)=0;

r1+4*r2+3*r3 = 0

r2+2*r3 = 0

2*r1+5*r2 = 0

>

This produces the following linear system,

>

> matrix(3,3,[1,4,3,0,1,2,2,5,0])*matrix(3,1,[r1,r2,r3]) = matrix(3,1,[0,0,0]);

matrix([[1, 4, 3], [0, 1, 2], [2, 5, 0]])*matrix([[r1], [r2], [r3]]) = matrix([[0], [0], [0]])

>

Augmented matrix,

>

> Aug := augment(matrix(3,3,[1,4,3,0,1,2,2,5,0]),matrix(3,1,[0,0,0]));

Aug := matrix([[1, 4, 3, 0], [0, 1, 2, 0], [2, 5, 0, 0]])

>

Row reducing the augmented matrix gives,

>

> rref(Aug);

matrix([[1, 0, -5, 0], [0, 1, 2, 0], [0, 0, 0, 0]])

Here r3 is a free variable: taking r3 = 1 gives r1 = 5 and r2 = -2, so the vectors in S are linearly dependent.

>
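Substituting this nonzero solution back in verifies the dependence relation:

> # Coefficients r1 = 5, r2 = -2, r3 = 1 read off from the rref above.

> expand(5*(1+2*x^2) - 2*(4+x+5*x^2) + (3+2*x));

0

>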

Example 9: Let S = {cos(x), sin(x)}. Are the vectors in S linearly independent

or linearly dependent?

The equation,

>

> r*cos(x) + s*sin(x) = 0;

r*cos(x)+s*sin(x) = 0

>

must be true for all values of x. In particular, it must hold at any specific values of x we choose, so

evaluating at two convenient values gives conditions on r and s. If the vectors are linearly

dependent, then some constants r and s, not both zero, must satisfy the equation for every x.

Let x = 0, then

>

> r*cos(0) + s*sin(0) = 0;

r = 0

>

Let x = Pi/2 , then

>

> r*cos(Pi/2) + s*sin(Pi/2) = 0;

s = 0

>

The only solution is r = s = 0, which implies that sin(x) and cos(x) are linearly independent.
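The two instantiated equations can also be handed to Maple's solve command in one step:

> # Solve the system obtained from x = 0 and x = Pi/2 simultaneously.

> solve({r*cos(0) + s*sin(0) = 0, r*cos(Pi/2) + s*sin(Pi/2) = 0}, {r, s});

{r = 0, s = 0}

>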

Basis

****************************************************

Definition 3.6

Let V be a vector space. A set of vectors in V is a basis for V if the following conditions

are met:

1.) The set of vectors spans V.

2.) The set is linearly independent.

******************************************************

Recall basis in Section 2.1

Example 10: Determine a basis for the vector space of all 2 x 2 matrices.

>

> S = `span`(matrix(2,2,[1,0,0,0]),matrix(2,2,[0,1,0,0]),matrix(2,2,[0,0,1,0]),matrix(2,2,[0,0,0,1]));

S = span(matrix([[1, 0], [0, 0]]),matrix([[0, 1], [0, 0]]),matrix([[0, 0], [1, 0]]),matrix([[0, 0], [0, 1]]))

>
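These four matrices span the space: an arbitrary 2 x 2 matrix is recovered as the linear combination with coefficients a, b, c, d.

> # a, b, c, d are arbitrary scalars; the combination reproduces a general 2 x 2 matrix.

> evalm(a*matrix(2,2,[1,0,0,0]) + b*matrix(2,2,[0,1,0,0]) + c*matrix(2,2,[0,0,1,0]) + d*matrix(2,2,[0,0,0,1]));

matrix([[a, b], [c, d]])

>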

************************************************************

Definition 3.7

Let V be a finitely generated vector space. The number of elements in a basis for V is the

dimension of V, and is denoted by dim(V).

*************************************************************

Recall dimension in Section 2.1

Example 10 (revisited): What is the dimension of S, the space of all 2 x 2 matrices?

Example 11: What is dim( P[n] )?

Exercises

1, 3, 5, 8, 11, 13, 25.