So today we will discuss some properties of the matrix of a linear transformation. Okay. Let me recall the important result that we proved last time towards the end. V and W are finite dimensional vector spaces over the same field, let us say the real field. T from V to W is linear. I have two ordered bases of V and W respectively; I am using the notation BV, BW for them. Then we had seen last time that for every X in V, if you look at the matrix of TX (TX is in W, so this is the matrix of TX relative to BW), it equals the matrix of T relative to the bases BV, BW into the matrix of X relative to BV. This is the converse of the statement that if A is a matrix, then TX equal to AX is a linear transformation, okay. Let us look at a few more properties of the matrix of a linear transformation,

especially we will look first at how it behaves under composition. What is the matrix of the composition of linear transformations? What we will see is that it is the product of the corresponding matrices, and this is really the defining place for matrix multiplication, something that we should always remember. Matrix multiplication, with the peculiar way it is defined, comes really from looking at matrices as linear transformations, okay. So the next result really defines matrix multiplication. Then we will also look at the question: if T is invertible, how can you compute the matrix of the inverse transformation T inverse? The natural answer is that the matrix of T inverse relative to the same basis will be the inverse of the matrix of the transformation T, okay. And finally we will establish the relationship between matrices corresponding to different bases, okay. These 3 results we will discuss today.

So the first is composition. Let me set up the framework. I will use a slightly different notation for this theorem; you will see it is because of its simplicity. I have U, V, W, finite dimensional vector spaces, real vector spaces say. These come with ordered bases; I will write down the basis BU of U explicitly, since I will need it. I need to talk about composition of maps, so I have 2 maps: T is from U to V and S is from V to W. Suppose T and S are linear. Then I am looking at the composition. First I want to write down the formula for the matrix of the composition S circle T. Remember, S circle T is a map from U to W: T takes a vector X from U into V, and S takes that vector TX into W. So for its matrix I must write the bases BU, BW. This is a linear transformation, a function in the first place from U into W. What we would like to demonstrate is that its matrix is the matrix of S (S is from V to W, so relative to BV, BW) into the matrix of T (T is from U to V, so relative to BU, BV). Here the composition S circle T is defined by (S circle T)(X) equals S(T(X)) for X in U. This is the formula for the composition.

Okay, so let us prove this result, and you will see that this really defines matrix multiplication. If S and T are linear transformations, it is easily seen that S circle T is a linear transformation, so we know what its matrix is. What this formula says is that the product of the matrix of S and the matrix of T is the matrix of the composition; the product is given by the left-hand side matrix, okay? So this defines matrix multiplication really.
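This identity can be checked numerically. The following sketch is a hypothetical example, not from the lecture; the particular maps, the bases, and the helper `mat_of` are my own choices. It builds the matrix of a map relative to arbitrary ordered bases and verifies that the matrix of S circle T is the product of the matrices of S and T:

```python
import numpy as np

def mat_of(A, B_dom, B_cod):
    """Matrix of the map x -> A x relative to ordered bases given as
    the columns of B_dom (domain) and B_cod (codomain).  Column j holds
    the B_cod-coordinates of the image of the j-th domain basis vector,
    i.e. the solution of B_cod @ col = A @ B_dom[:, j]."""
    return np.linalg.solve(B_cod, A @ B_dom)

# T: R^2 -> R^3 and S: R^3 -> R^2, acting by matrix multiplication.
A_T = np.array([[1., 2.], [0., 1.], [3., 0.]])
A_S = np.array([[1., 0., 1.], [2., 1., 0.]])

# Ordered bases BU, BV, BW as columns (any invertible matrices work).
BU = np.array([[1., 1.], [0., 1.]])
BV = np.array([[1., 1., 0.], [0., 1., 1.], [0., 0., 1.]])
BW = np.array([[2., 0.], [1., 1.]])

lhs = mat_of(A_S @ A_T, BU, BW)                  # matrix of S∘T rel. BU, BW
rhs = mat_of(A_S, BV, BW) @ mat_of(A_T, BU, BV)  # product of the two matrices
ok_composition = np.allclose(lhs, rhs)
```

Changing any of the three bases changes both sides of the equation, but the equality itself persists, which is exactly the content of the theorem.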

Given 2 matrices A and B, I can always find linear transformations S and T such that the matrix of S is A and the matrix of T is B; then if I want to know what AB is, it is given by the matrix of the composition S circle T relative to these bases, okay? So this is really the definition of matrix multiplication. Okay. So let us prove this. Before proving the formula, I should show that S circle T is linear, but I am going to leave that as an exercise. So, proof: clearly the composition of linear transformations is again a linear transformation. We need to observe that S circle T is a linear transformation from U

into W. Okay. Remember, this is a formula asserting that 2 matrices are equal. Say I want to show that a matrix P is equal to a matrix Q; then I will have to show that the corresponding entries are the same. A little more generally, I will show that the Jth column of P is equal to the Jth column of Q for every J; then it follows that P is equal to Q. That is what I will do. Okay. So let me start. I have written down the basis BU explicitly, and I am going to exploit that. Consider (S circle T)(UJ). This is a vector in W, and I want to look at its matrix relative to BW. I will appeal to the recalled result and to the definition of composition. By the definition of composition, this is equal to the matrix of S(T(UJ)) relative to BW; I have simply expanded what is inside the bracket. Now I have S of some vector; what is the matrix of S of some vector? It is the matrix of S into the matrix of that vector, with the appropriate bases. S is a function from V to W, so this is the matrix of S relative to BV, BW into the matrix of T(UJ). Now, which basis for T(UJ)? T is a function from U into V, so T(UJ) is in V and we take its matrix relative to BV. Apply the formula once again: the matrix of T(UJ) relative to BV is the matrix of T relative to BU, BV into the matrix of UJ relative to BU. Is that okay?

Now I will expand the left-hand side. S circle T is another linear transformation; I can call it R if you want. R is a transformation from U to W and UJ is in U, so the left-hand side, the matrix of R(UJ), is the matrix of R relative to BU, BW into the matrix of UJ relative to BU. This is because R, that is S circle T, is linear. Okay. The only thing that remains is to observe what the matrix of UJ relative to the basis BU is: Jth component 1, all other entries 0. That is a column vector. How do you write the matrix of UJ relative to BU? You must write UJ as a linear combination of U1, U2, et cetera, UN. The unique such combination is 0 times U1 plus 0 times U2, et cetera, plus 1 times UJ, plus 0 times the rest. So the matrix of UJ relative to BU is the column vector EJ, with the 1 in the Jth coordinate. This is what we call EJ, the Jth standard basis vector of RN. This is the matrix of UJ relative to BU; with respect to some other basis you will not get this. Are we through with the proof? You need to make one more observation, one that was made much earlier. On the left I now have a matrix times the vector EJ: the matrix of S circle T relative to BU, BW times EJ is equal

to the matrix of S times the matrix of T times EJ, and this is true for all J as J varies from 1 to N. What you observe is that if A is a matrix of order M cross N and EJ is the Jth standard basis vector, then A EJ is the Jth column of A. So the Jth column of the matrix on the left is the Jth column of the product on the right. And if A EJ equals B EJ for all J, then A is equal to B. That is what we have: if you call the left matrix M and the right matrix N, then M EJ equals N EJ for all J, so the Jth column of M equals the Jth column of N, with J arbitrary, so M equals N. These 2 matrices must be the same, and so I have the formula: the matrix of S circle T relative to BU, BW equals the matrix of S relative to BV, BW times the matrix of T relative to BU, BV. Okay?

Okay. I told you this defines matrix multiplication, and we know that matrix multiplication is associative. That can be shown by using this result and one of the previous theorems. So let me just give it as a corollary; to fill in the details is an exercise for you. Matrix multiplication is associative, as a corollary of this result.

Okay? So, one of the consequences: matrix multiplication is associative. Just a few lines of this proof. Given 3 matrices A, B, C such that the product ABC is possible, associativity means that A into BC equals AB into C; that is what we want to show. You are given A, B, C. Construct 3 transformations TA, TB, TC such that TA X equals AX, TB X equals BX, TC X equals CX. What is the matrix of the transformation TA corresponding to the standard basis? That will be A. The matrix of TB corresponding to the standard basis will be B, and the matrix of TC corresponding to the standard basis will be C, with the standard bases in the appropriate spaces. See, the product ABC must be defined, so the number of columns of A must be the same as the number of rows of B, and the number of columns of B must be the same as the number of rows of C; the order of ABC will be the number of rows of A times the number of columns of C. So you need to choose appropriate bases and appropriate spaces, of dimensions K, L, M, whatever. Then use the fact that composition of functions is associative together with this formula, and show that AB into C is A into BC, okay? So just take the matrices, write down the natural linear transformations defined through these matrices, look at the matrices of these linear transformations in turn relative to the standard bases, and apply the formula, okay? You can show that matrix multiplication is associative. Okay.
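A quick numerical sanity check of associativity; a minimal sketch, with the particular matrices my own choice rather than anything from the lecture:

```python
import numpy as np

# Three matrices whose product A @ B @ C is defined:
# A is 2x3, B is 3x4, C is 4x2, so ABC is 2x2.
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 4))
C = rng.standard_normal((4, 2))

# Associativity: A(BC) == (AB)C, up to floating-point rounding.
ok_assoc = np.allclose(A @ (B @ C), (A @ B) @ C)
```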

One of the other consequences concerns the matrix of the inverse of a linear transformation which is known to be invertible. Okay? The answer has been given; let us prove this result. But before that, let us look at a specific case. Since we are talking about inverse transformations, we will in particular look at a linear transformation over a vector space, that is, from the vector space to itself. Consider T from V to V linear; such a transformation will be called an operator. If the domain and the co-domain are the same, then T will be called an operator. If the spaces V and W are different, then you obviously have to choose different bases, but if the spaces are the same, then it is comfortable to deal with only one basis, okay? So let script B be a basis of V; I will not use BV, since there is only one space here. Then I will use the notation TB, with a single subscript, to denote TBB. See, I know how to write down the matrix of a linear transformation when 2 bases are given; in particular, if the bases coincide, then I know what the right-hand side is. Instead of writing BB, I will simplify the notation by writing TB. This I can do when I know that T is an operator, from the space V to itself. So I will use this notation. This is just a notation, okay, just a terminology: TB will be this matrix, which we know how to compute. Let us look at the

a terminology. TB will be this matrix which we know how to compute. Let us look at the

particular case. So what is the matrix of the identity transformation on V? What is

the matrix of the identity transformation on V? Can you see that it is identity matrix?

But if it is between 2 different bases, then it is not identity matrix? Why is this identity

matrix? That is because you must look at the 1st basis vector, write it as a linear combination

of the same basis. The only choice is 1st co-ordinate is 1, all other co all other coefficients

are 0. So the 1st column is 100, 2nd column is 010, et cetera. So the identity matrix sorry the for the linear

transformation I on V with respect to a particular fixed bases B, reading that as 2 different

bases, this is the identity matrix. The identity matrix of order the same as the dimension

of V. So for the identity transformation, if the 2 bases are the same, then it is identity

matrix. If the basis are different, then it is not identity. You can verify easily by

simple examples. Also, what is the matrix of the 0 transformation relative to a single

basis? That is a 0 matrix. Okay. Left-hand side 0 transformation, right-hand side is

a 0 matrix. This is an equation involving matrices. Okay, in this case let us go back

to the formula that we derived just now. Let T and S from V to itself be linear operators. What happens in this particular situation? For S circle T there is only one basis: the matrix of S circle T relative to B is the matrix of S relative to B into the matrix of T relative to B. Okay, so this is the simplified formula when you are dealing with linear operators. Now, a little more abstraction can be brought in here. Recall that I defined a function phi on linear transformations, phi from LVW to the space of all real matrices of the corresponding order, by phi of T equals the matrix of T relative to the 2 bases. This time, in the case when W is V, I will have only one basis, B, the one that I started with. LVV can be shortened to LV, but I leave it as it is. We had observed that this phi is an isomorphism: it is linear, 1 to 1 and onto. And we used this isomorphism to compute the dimension of LVW: if V is M dimensional and W is N dimensional, then the dimension of LVW is MN. Now, in the light of this formula, what also follows is that this phi preserves products. What is the meaning of this? phi of S circle T equals phi of S into phi of T. It is like F(XY) equals F(X) into F(Y); such a function is called multiplicative. Phi preserves products. What is the consequence of this? The consequence

of this formula is really the formula that I told you for the inverse transformation, when you know that the inverse exists. So let us derive that next. I want to show the following result: let T from V to V be an invertible linear operator and let script B be a basis of V. Then T inverse is also a linear operator (we had seen this before), and what is the matrix of T inverse relative to B? What we want to show is that it is equal to the inverse of the matrix of T relative to B; so I introduce a bracket around the matrix and write the minus 1 outside. Okay? Remember, this is again an equation involving matrices; what is inside the brackets are linear transformations. Okay, proof. I will make use of what we have just seen: the composition formula, and the fact that the matrix of the identity transformation relative to a fixed basis is the identity matrix. Since T is invertible, there exists S from V to V such that T circle S equals S circle T equals the identity transformation. This is a formula for transformations; there are no matrices here. S and T are linear transformations, and on the right-hand side I is the identity transformation. So I am using the same notation I for both; the context must make it clear whether it is a matrix or a linear transformation. So I will apply the composition formula to this equation.

So if you look at T circle S relative to the fixed basis B that I started with, that will be equal to the matrix of the identity relative to B, using the 1st equation, T circle S equals I. And the matrix of the identity relative to B is the identity matrix. This time it is an equation involving matrices; on the right-hand side, I is the identity matrix. For T circle S relative to B, I invoke the composition formula: that is the matrix of T relative to B into the matrix of S relative to B, and this is equal to the identity matrix. I can do a similar thing for the 2nd equation, S circle T equals I, and write it down on the other side: the matrix of S into the matrix of T is also the identity matrix. So I have 2 matrices, let us say A and B, with A into B equal to the identity and B into A equal to the identity. In fact one of them is enough, because these are square matrices; the rank-nullity theorem could be used. Okay, in any case, I have an equation like A into B equals identity, where A and B are square matrices, so A and B must both be invertible. What it means in particular is that the matrix of S relative to B is the inverse of the matrix of T relative to B: AB equals identity with A and B square, so B equals A inverse. That is what I have written down. But S is T inverse, and so I have the result: the matrix of T inverse relative to B is the inverse of the matrix of T relative to B. So you do not have to separately compute the matrix of the inverse transformation if you know the matrix of the original transformation.
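A numerical sketch of this fact, with a hypothetical operator and basis of my own choosing; the helper `mat_rel` computes the matrix of an operator relative to a basis given as columns:

```python
import numpy as np

def mat_rel(A, B):
    """Matrix of the operator x -> A x relative to the ordered basis
    whose vectors are the columns of B; algebraically B^{-1} A B."""
    return np.linalg.solve(B, A @ B)

# An invertible operator T on R^2 and a non-standard basis B.
A = np.array([[2., 1.], [1., 1.]])          # det = 1, so T is invertible
B = np.array([[1., 1.], [0., 1.]])

T_B = mat_rel(A, B)                          # matrix of T relative to B
Tinv_B = mat_rel(np.linalg.inv(A), B)        # matrix of T^{-1} relative to B

# The matrix of the inverse is the inverse of the matrix.
ok_inverse = np.allclose(Tinv_B, np.linalg.inv(T_B))
```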

You take the inverse of the matrix of the transformation that you started with; that will be the matrix of the inverse transformation relative to the same basis that you started with, okay. If you change the basis, then this matrix changes. Okay. That brings us to the next, probably the most crucial, question: how do the matrices of the same transformation corresponding to different bases behave? So let us answer that question. The answer is given in the next result; let me state the theorem. I have a single linear transformation, a linear operator really, and 2 bases. Let T from V to V be linear and let B1 and B2 be bases of V. Look at the identity transformation

and then look at the matrix of the identity transformation relative to these 2 bases. Let me call that the matrix M. So I am computing the matrix of the identity transformation relative to these 2 bases; we know that this is not the identity matrix, and I am calling it M. Then we have the following. First, for every X in V, the matrix of X relative to B2, the 2nd basis, is M times the matrix of X relative to B1. This really gives us the other formula: how are the matrices of a particular linear transformation corresponding to different bases related? Let me write that formula here. The matrix of T relative to B2 (by which I mean B2, B2) is M into the matrix of T relative to B1 times M inverse, okay? This is probably the most important relationship. Remember that it involves M inverse, so we must show that M is invertible, okay, but I am going to leave that little part as an exercise. It is similar to what we did earlier; use the earlier result on composition to show that M is invertible. So I am going to leave this part as an exercise.
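Both statements of this theorem can be checked numerically. In the sketch below, a hypothetical example of my own, the columns of B1 and B2 are the two ordered bases; the change-of-basis matrix M is the matrix of the identity relative to B1, B2, so its jth column holds the B2-coordinates of the jth vector of B1:

```python
import numpy as np

# Operator T on R^2 (as a matrix acting on coordinates) and two bases.
A  = np.array([[0., -1.], [1., 0.]])         # a rotation, say
B1 = np.array([[1., 1.], [0., 1.]])
B2 = np.array([[2., 0.], [1., 1.]])

# M = matrix of the identity relative to B1, B2: solve B2 @ M = B1.
M = np.linalg.solve(B2, B1)

# Matrix of T relative to a single basis B is B^{-1} A B.
T_B1 = np.linalg.solve(B1, A @ B1)
T_B2 = np.linalg.solve(B2, A @ B2)

# First formula: coordinates transform by M, [x]_{B2} = M [x]_{B1}.
x = np.array([3., -2.])
ok_coords = np.allclose(np.linalg.solve(B2, x), M @ np.linalg.solve(B1, x))

# Second formula: T_{B2} = M T_{B1} M^{-1}.
ok_change = np.allclose(T_B2, M @ T_B1 @ np.linalg.inv(M))
```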

Exercise: show that M is invertible. Remember, M is a matrix. To show that M is invertible, one could for instance show that the system MX equal to 0, for X in RN where N is the dimension of V, has X equal to 0 as the only solution. You can use this idea to prove that M is invertible. So I assume

this formula. So we need to prove this 1st. So let us prove this. Let us start with the matrix of X relative

to B2. I can write this as the matrix of identity X, identity transformation. X relative to

B2. And then use this formula that we proved earlier. Let me recall that here. Matrix of

TX relative to B2, BW actually, that is the matrix of T relate of B1B2 into matrix of

X relative to B1. This is what we proved earlier. We used BV BW. This is BW. BV BW BV. I am

using B1 B2 here. Vector space V is the same. Two different basis now. So let me use this

result here. This is the matrix of I relative to B1 B2 into the matrix of X relative to

B1. I have to 1st formula immediately. This is what we are denoting by M. So XB2

is M XB1. That has proved the 1st formula. Okay? XB 2 is identity transformation I am

applying and then I am up appealing to this formula. Matrix of I relative to B1 B2 and

then matrix of X relative to B1. This is what we are calling as M. So I have the 1st formula.

That is the 1st formula: XB2 is M XB1. Now I need to prove the 2nd one. So let me start with TX relative to B2; I am going to appeal to the previous formula. Let me call TX as Y; then I am looking at Y relative to B2, and Y relative to B2 is M times Y relative to B1, that is, M times TX relative to B1. For TX relative to B1 I will apply the recalled formula with B1 equal to B2, that is, with a single basis; the formula holds for any 2 bases, in particular for the same basis twice. Is this statement clear? Looking at a single basis, the matrix of TX relative to B1 is the matrix of T relative to B1, B1 into the matrix of X relative to B1. So on the right I have M into TB1 into XB1; it is actually M into the matrix of T relative to B1, B1 into the matrix of X relative to B1, and the matrix of T relative to B1, B1 is what we are denoting as TB1. On the left-hand side, I have TX relative to B2, and I will use the same single-basis formula for B2: TX relative to B2 is TB2 (that is, the matrix of T relative to B2, B2) into XB2, and then I invoke the 1st formula, XB2 is M XB1. I hope it is clear: I started with TX relative to B2, applied the single-basis formula for B2, and then formula 1, XB2 is M times XB1. So finally

what do I have? On the left, the expanded form is TB2 M XB1. On the right, I have M TB1 XB1. I want to show that two matrices are equal, and I will show that their Jth columns are equal. This identity is true for all X in V. In particular, look at XB1: write down B1 explicitly, say U1, U2, et cetera, UN, and replace X by UJ. If I replace X by UJ and write its matrix relative to that same basis, I pick out the Jth column; this is something we did just now. Substitute for X the basis elements that are present in B1; then I will get TB2 M equals M TB1. Again, it is something like P EJ equals Q EJ for all J, so the Jth column of P equals the Jth column of Q, so P equals Q. The whole product TB2 M is my P (remember, M is a matrix and TB2 is a matrix, so this is a product of matrices), and the whole product M TB1 is my Q. I apply the equation to the basis elements of B1: their coordinate vectors relative to B1 are the 1st column of the identity matrix, the 2nd column of the identity matrix, et cetera. So these 2 matrices are the same: TB2 M equals M TB1. Now invoke the fact that M is invertible and post-multiply by M inverse to get the required formula. Since M is invertible, I can post-multiply this equation, which is now an equation involving matrices, by M inverse. So I have TB2 equals M TB1 M inverse. Okay? That

is, if A is the matrix of a linear transformation corresponding to one basis and B is the matrix of the same linear transformation corresponding to another basis, then A and B are related by the formula A equals M times B times M inverse for some invertible matrix M. How one could determine M is a different question, but there is an invertible matrix M that satisfies this equation. Can you see that if A equals M B M inverse, then B equals M inverse A M? That is, B equals M inverse A (M inverse) inverse; writing N for M inverse, if A equals M B M inverse, then B equals N A N inverse, okay? So if A is related to B by means of this formula, we say that B is similar to A. By the observation that we have just made, it follows that if B is similar to A, then A is similar to B. Okay. Also, A is similar to itself: A equals identity times A times identity inverse. And if A is similar to B and B is similar to C, you can use the product of the invertible matrices to show that A is similar to C. So similarity of matrices is an equivalence relation. What does it preserve? That is something we cannot discuss right now: it preserves what are called eigenvalues. Okay, that is something we will discuss later. Let me stop here.
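That closing claim, that similar matrices share eigenvalues, can already be checked numerically. A minimal sketch, with example matrices of my own choosing; the eigenvalues are compared after sorting, since numpy returns them in no fixed order:

```python
import numpy as np

A = np.array([[2., 1.], [0., 3.]])           # eigenvalues 2 and 3
M = np.array([[1., 2.], [1., 1.]])           # invertible: det = -1
B = M @ A @ np.linalg.inv(M)                 # B is similar to A

# Similar matrices have the same eigenvalues.
eig_A = np.sort_complex(np.linalg.eigvals(A))
eig_B = np.sort_complex(np.linalg.eigvals(B))
ok_similar = np.allclose(eig_A, eig_B)
```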

Thank you.
