Tuesday, September 13, 2011

[Thinking Cap] Directed review of linear algebra

Your ability to appreciate the finer points of the following lectures depends on your background in linear algebra. To that end, here is a directed review of linear algebra.

0. Convince yourself that matrix multiplication is essentially a bunch of dot products between the row vectors of one matrix and the column vectors of the other matrix. This also gives another reason why the inner dimensions of the matrices must match before you can multiply them (you can't define the dot product between two vectors of differing dimensions).
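For instance, here is a quick numpy sketch of that view (the matrices are made up purely for illustration):

    import numpy as np

    A = np.array([[1., 2.], [3., 4.], [5., 6.]])    # 3x2
    B = np.array([[7., 8., 9.], [10., 11., 12.]])   # 2x3

    C = A @ B  # inner dimensions (2 and 2) match, so the product is defined; C is 3x3

    # entry (i, j) of C is the dot product of row i of A with column j of B
    i, j = 1, 2
    print(C[i, j], np.dot(A[i, :], B[:, j]))  # both print 75.0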

1. Remember the notion of linear dependence. A vector is considered linearly dependent on another set of vectors if it can be written as a linear combination of that set of vectors, specifically as c1*v1 + c2*v2 + ... + ck*vk, where the vi are the vectors and the ci are scalar constants. Show to yourself that the vector <6, 6> is linearly dependent on the vectors <2, 1> and <1, 2>.
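If you want to find the coefficients mechanically, you can solve a small linear system with numpy; a sketch:

    import numpy as np

    # columns of V are the vectors <2,1> and <1,2>
    V = np.array([[2., 1.],
                  [1., 2.]])
    target = np.array([6., 6.])

    # solve c1*<2,1> + c2*<1,2> = <6,6> for the coefficients
    c = np.linalg.solve(V, target)
    print(c)  # [2. 2.], i.e., <6,6> = 2*<2,1> + 2*<1,2>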

2. Remember the notion of a space spanned by a set of vectors. Given a set S of vectors, the space spanned by S is the set of all vectors that can be written as a linear combination of the vectors in S.
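Numerically, one way to test whether a vector lies in a span is to solve the least-squares problem and check whether the residual is (near) zero. A minimal sketch, with an illustrative in_span helper of my own naming:

    import numpy as np

    # columns of S span a plane inside R^3
    S = np.array([[1., 0.],
                  [0., 1.],
                  [0., 0.]])

    def in_span(S, u, tol=1e-10):
        # u is in span(S) iff some combination S @ c reproduces u exactly
        c, *_ = np.linalg.lstsq(S, u, rcond=None)
        return np.linalg.norm(S @ c - u) < tol

    print(in_span(S, np.array([3., 4., 0.])))  # True
    print(in_span(S, np.array([3., 4., 5.])))  # False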

3. Remember the notion of linear independence. A set of vectors is linearly independent if none of the vectors in the set can be written as a linear combination of the rest of the vectors.
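A handy numerical test: a set of vectors is linearly independent exactly when the rank (reviewed in item 7 below) of the matrix whose rows are those vectors equals the number of vectors. A sketch:

    import numpy as np

    def independent(vectors):
        M = np.array(vectors, dtype=float)  # one vector per row
        return np.linalg.matrix_rank(M) == len(vectors)

    print(independent([[2, 1], [1, 2]]))          # True
    print(independent([[2, 1], [1, 2], [6, 6]]))  # False: <6,6> depends on the other two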

4. Remember the notion of "basis" for a space. A set is considered a basis for a space if (a) the set is linearly independent and (b) every vector in that space can be written as a linear combination of the basis vectors. The "dimensionality" of a space is the size of its basis set.
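For n vectors in n-dimensional Euclidean space, both conditions can be checked at once: they form a basis iff the matrix with them as columns has rank n (equivalently, is invertible). A sketch (the vectors here foreshadow Q2 below):

    import numpy as np

    B = np.array([[1., 1.],
                  [1., 2.]])  # columns are <1,1> and <1,2>

    # 2 independent vectors in R^2 form a basis
    print(np.linalg.matrix_rank(B) == B.shape[0])  # True

    # every vector then has unique coordinates in this basis
    u = np.array([5., 7.])
    coords = np.linalg.solve(B, u)  # c1*<1,1> + c2*<1,2> = <5,7>
    print(coords, B @ coords)       # [3. 2.] [5. 7.]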

5. Remember the notion of "orthogonal basis" for a space--which is a set of vectors that is a basis *and* whose vectors are orthogonal to each other (i.e., their pairwise dot product vi*vj is 0 if i != j).

6. Remember the notion of "orthonormal basis" for a space--which is a set of vectors that forms an orthogonal basis *and* the vectors are all unit vectors.
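Both properties can be read off the matrix of pairwise dot products (the Gram matrix): all off-diagonal entries zero means orthogonal, and the whole matrix being the identity means orthonormal. A sketch:

    import numpy as np

    V = np.array([[1., 1.],
                  [1., -1.]]) / np.sqrt(2.)  # rows are the candidate basis vectors

    G = V @ V.T  # G[i, j] is the dot product vi*vj
    print(np.allclose(G - np.diag(np.diag(G)), 0))  # True: pairwise orthogonal
    print(np.allclose(G, np.eye(len(V))))           # True: unit vectors too, so orthonormal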

7. Remember that the row rank of a matrix is the size of the basis set of the space spanned by the row vectors of the matrix. The column rank of a matrix is the size of the basis of the space spanned by the column vectors. The row and column ranks of a matrix are always equal, and this common number is the rank of the matrix. The matrix is considered "full rank" if its rank is equal to both its number of rows and its number of columns (so a full-rank matrix has to be a square matrix).
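You can verify the row rank = column rank fact numerically with numpy's matrix_rank; a sketch:

    import numpy as np

    A = np.array([[1., 2., 3.],
                  [4., 5., 6.]])  # 2x3

    # rank of A equals rank of A transpose: row rank == column rank
    print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(A.T))  # 2 2

    # a 2x3 matrix can't be full rank under the definition above; a square one can be
    S = np.array([[1., 1.],
                  [1., 2.]])
    print(np.linalg.matrix_rank(S) == S.shape[0] == S.shape[1])  # True: full rank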

Now use this knowledge to answer the following questions for yourself (feel free to enter answers on the blog).


Q1. For three-dimensional Euclidean space, is the set [<1, 0, 0>, <0, 1, 0>] linearly independent? Is it a basis?

Q2. For two-dimensional Euclidean space, is the set [<1, 1>, <1, 2>] linearly independent? Is it a basis? Is it an orthogonal basis? Is it an orthonormal basis?

Q3. For two-dimensional Euclidean space, how many different basis sets can you get? How many of them are orthogonal bases? How many of them are orthonormal bases?

Q4: If a matrix is full rank, are its column vectors linearly independent? How about its row vectors? Are they also orthogonal?
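If you want to sanity-check your answers numerically before posting, the same numpy calls from the sketches above suffice (spoiler warning: the comments below work through Q1, Q2, and Q4):

    import numpy as np

    # Q1: two vectors in R^3
    Q1 = np.array([[1., 0., 0.], [0., 1., 0.]])
    print(np.linalg.matrix_rank(Q1) == 2)  # independent?
    print(np.linalg.matrix_rank(Q1) == 3)  # enough to span R^3?

    # Q2: the set [<1,1>, <1,2>] in R^2
    Q2 = np.array([[1., 1.], [1., 2.]])
    print(np.linalg.matrix_rank(Q2) == 2)  # independent (hence a basis for R^2)?
    print(np.dot(Q2[0], Q2[1]))            # 0 would mean orthogonal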


That is it for now..

Rao






2 comments:

  1. Q1:
    Let v1=<1,0,0> and v2= <0,1,0>

    let S={v1,v2}

    If the vector equation c1v1+c2v2=0, where c1 and c2 are scalars, has exactly one solution, then the vectors in set S are said to be linearly independent.


    we have c1(1,0,0) + c2(0,1,0) = 0
    => c1*1 + c2*0 = 0 and c1*0 + c2*1 = 0
    => c1 = 0 and c2 = 0

    This is the only solution and so these vectors are linearly independent.

    Let S={v1,v2} be a set of vectors in a vector space V; then S is called a basis for V if

    (a) span(S)=V, i.e. S spans the vector space V.

    (b) S is a linearly independent set of vectors.

    Now if these two vectors v1 and v2 are to span V, then for each u=(u1,u2,u3) in V there must be scalars c1 and c2 satisfying the following equation:

    c1v1 + c2v2= (u1,u2,u3)

    c1(1,0,0) + c2(0,1,0) = (u1,u2,u3)

    Here the third component of each of these vectors is zero, and hence the linear combination will never have a nonzero third component. Therefore, if we choose u=(u1,u2,u3) to be any vector in V with u3 not equal to 0, we will not be able to find scalars c1 and c2 to satisfy the equation above.

    Therefore these two vectors cannot span V and hence can't be a basis for V.

  2. Q1. Set [ <1, 0, 0>, <0, 1, 0>] is linearly independent
    Proof: Vectors x, y are linearly independent if ax + by = 0 implies a = 0 and b = 0.

    a(1,0,0)+b(0,1,0)=(0,0,0)

    This forces a=0 and b=0. Hence, they are linearly independent.

    Is it a basis? No, it is not.

    Proof: The determinant of the matrix (taking the third vector as <0,0,0>)
    1 0 0
    0 1 0
    0 0 0

    is equal to zero; hence they can't form a basis.

    Q2) Is the set [<1, 1>, <1, 2>] linearly independent?

    a(1,1)+b(1,2)=(0,0)

    a+b=0
    a+2b=0

    Substituting a = -b from the first equation into the second:
    -b+2b=0
    b=0
    a=0

    Hence, linearly independent.

    Is it a basis?
    det of matrix
    1 1
    1 2

    2-1 = 1, which is not equal to zero; hence they form a basis.

    Is it an orthogonal basis?
    To form an orthogonal set, the dot product of every possible pair must be zero.
    Since there are only two vectors, check 1*1 + 1*2 = 3, which is not equal to zero; hence not an orthogonal basis.

    Is it an orthonormal basis?
    No; the set doesn't even form an orthogonal basis, and the vectors are not unit vectors either.

    Q4. If a matrix is full rank, then both its column vectors and its row vectors are linearly independent. They may or may not be orthogonal.
