“Unfortunately, no one can be told what the Matrix is. You have to see it for yourself.”

– Morpheus

Surprisingly apt words on the importance of
understanding matrix operations visually.

Hey everyone! If I had to choose just one topic that makes all of the others in linear algebra start to click, and which too often goes unlearned the first time a student takes linear algebra, it would be this one: the idea of a linear transformation and its relation to matrices. For this video, I’m just going to focus on what these transformations look like in the case of two dimensions, and how they relate to the idea of matrix-vector multiplication. In particular, I want to show you a way to think about matrix-vector multiplication that doesn’t rely on memorization.

To start, let’s just parse this term “linear transformation”. “Transformation” is essentially a fancy word for “function”: it’s something that takes in inputs and spits out an output for each one. Specifically, in the context of linear algebra, we like to think about transformations that take in some vector and spit out another vector.

So why use the word “transformation” instead of “function” if they mean the same thing? Well, it’s to be suggestive of a certain way to visualize this input-output relation. You see, a great way to understand functions of vectors is to use movement. If a transformation takes some input vector to some output vector, we imagine that input vector moving over to the output vector. Then, to understand the transformation as a whole, we might imagine watching every possible input vector move over to its corresponding output vector.

It gets really crowded to think about all of the vectors all at once, each one as an arrow. So, as I mentioned last video, a nice trick
is to conceptualize each vector not as an arrow, but as a single point: the point where its tip sits. That way, to think about a transformation taking every possible input vector to some output vector, we watch every point in space moving to some other point.

In the case of transformations in two dimensions, to get a better feel for the whole “shape” of the transformation, I like to do this with all of the points on an infinite grid. I also sometimes like to keep a copy of the grid in the background, just to help keep track of where everything ends up relative to where it starts.

The effect, for various transformations, of moving around all of the points in space is, you’ve got to admit, beautiful. It gives the feeling of squishing and morphing space itself. As you can imagine, arbitrary transformations can look pretty complicated, but luckily linear algebra limits itself to a special type of transformation, one that’s easier to understand, called
“linear” transformations. Visually speaking, a transformation is linear if it has two properties: all lines must remain lines, without getting curved, and the origin must remain fixed in place. For example, this right here would not be a linear transformation, since the lines get all curvy; and this one right here, although it keeps the lines straight, is not a linear transformation, because it moves the origin. This one here fixes the origin, and it might look like it keeps lines straight, but that’s just because I’m only showing the horizontal and vertical grid lines; when you see what it does to a diagonal line, it becomes clear that it’s not at all linear, since it turns that line all curvy. In general, you should think of linear transformations as keeping grid lines parallel and evenly spaced.

Some linear transformations are simple to think about, like rotations about the origin. Others are a little trickier to describe with words.

So how do you think you could describe these transformations numerically? If you were, say, programming some animations to make a video teaching the topic, what formula do you give the computer so that if you give it the coordinates of a vector, it can give you the coordinates of where that
vector lands? It turns out that you only need to record where the two basis vectors, i-hat and j-hat, each land; everything else will follow from that. For example, consider the vector v with coordinates (-1, 2), meaning that it equals -1 times i-hat plus 2 times j-hat. If we play some transformation and follow where all three of these vectors go, the property that grid lines remain parallel and evenly spaced has a really important consequence: the place where v lands will be -1 times the vector where i-hat landed, plus 2 times the vector where j-hat landed. In other words, it started off as a certain linear combination of i-hat and j-hat, and it ends up as that same linear combination of where those two vectors landed. This means you can deduce where v must go based only on where i-hat and j-hat each land.

This is why I like keeping a copy of the original grid in the background; for the transformation shown here, we can read off that i-hat lands on the coordinates (1, -2), and j-hat lands on the x-axis over at the coordinates (3, 0). This means that the vector represented by -1 times i-hat plus 2 times j-hat ends up at -1 times the vector (1, -2) plus 2 times the vector (3, 0). Adding that all together, you can deduce that it has to land on the vector (5, 2). This is a good point to pause and ponder,
because it’s pretty important. Now, given that I’m actually showing you the full transformation, you could have just looked to see that v has the coordinates (5, 2), but the cool part here is that this gives us a technique to deduce where any vector lands, so long as we have a record of where i-hat and j-hat each land, without needing to watch the transformation itself.

Write the vector with more general coordinates x and y, and it will land on x times the vector where i-hat lands, (1, -2), plus y times the vector where j-hat lands, (3, 0). Carrying out that sum, you see that it lands at (1x + 3y, -2x + 0y). I give you any vector, and you can tell me where that vector lands using this formula. What all of this is saying is that a two-dimensional
linear transformation is completely described by just four numbers: the two coordinates for where i-hat lands, and the two coordinates for where j-hat lands. Isn’t that cool? It’s common to package these coordinates into a two-by-two grid of numbers, called a two-by-two matrix, where you can interpret the columns as the two special vectors where i-hat and j-hat each land.

If you’re given a two-by-two matrix describing a linear transformation, and some specific vector, and you want to know where that linear transformation takes that vector, you can take the coordinates of the vector, multiply them by the corresponding columns of the matrix, then add together what you get. This corresponds with the idea of adding the scaled versions of our new basis vectors.

Let’s see what this looks like in the most general case, where your matrix has entries a, b, c, d. And remember, this matrix is just a way of packaging the information needed to describe a linear transformation. Always remember to interpret that first column, (a, c), as the place where the first basis vector lands, and that second column, (b, d), as the place where the second basis vector lands. When we apply this transformation to some vector (x, y), what do you get? Well, it’ll be x times (a, c) plus y times (b, d). Putting this together, you get a vector (ax + by, cx + dy). You can even define this as matrix-vector
multiplication, when you put the matrix on the left of the vector, like it’s a function. Then, you could make high schoolers memorize this, without showing them the crucial part that makes it feel intuitive. But isn’t it more fun to think about these columns as the transformed versions of your basis vectors, and to think about the result as the appropriate linear combination of those vectors?

Let’s practice describing a few linear transformations with matrices. For example, if we rotate all of space 90° counterclockwise, then i-hat lands on the coordinates (0, 1), and j-hat lands on the coordinates (-1, 0). So the matrix we end up with has columns (0, 1) and (-1, 0). To figure out what happens to any vector after a 90° rotation, you could just multiply its coordinates by this matrix.

Here’s a fun transformation with a special name, called a “shear”. In it, i-hat remains fixed, so the first column of the matrix is (1, 0), but j-hat moves over to the coordinates (1, 1), which becomes the second column of the matrix. And, at the risk of being redundant here, figuring out how a shear transforms a given vector comes down to multiplying this matrix by that vector.

Let’s say we want to go the other way around, starting with a matrix, say with columns
(1, 2) and (3, 1), and we want to deduce what its transformation looks like. Pause and take a moment to see if you can imagine it. One way to do this is to first move i-hat to (1, 2), then move j-hat to (3, 1), always moving the rest of space in a way that keeps grid lines parallel and evenly spaced.

If the vectors that i-hat and j-hat land on are linearly dependent, which, if you recall from last video, means that one is a scaled version of the other, then the linear transformation squishes all of 2D space onto the line where those two vectors sit, also known as the one-dimensional span of those two linearly dependent vectors.

To sum up, linear transformations are a way to move around space such that grid lines remain parallel and evenly spaced, and such that the origin remains fixed. Delightfully, these transformations can be described using only a handful of numbers: the coordinates of where each basis vector lands. Matrices give us a language to describe these transformations, where the columns represent those coordinates, and matrix-vector multiplication is just a way to compute what that transformation does to a given vector.

The important takeaway here is that every time you see a matrix, you can interpret it as a certain transformation of space. Once you really digest this idea, you’re in a great position to understand linear algebra deeply. Almost all of the topics coming up, from matrix multiplication to determinants, change of basis, and eigenvalues, will become easier to understand once you start thinking about matrices as transformations of space. Most immediately, in the next video I’ll be talking about multiplying two matrices together. See you then!
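The rule above, multiplying each coordinate of the vector by the corresponding column of the matrix and then adding, can be sketched in a few lines of Python. This is my own illustration, not code from the video; the function name `apply_matrix` is made up. It replays the video's three examples.

```python
def apply_matrix(col_i, col_j, v):
    """Apply the linear transformation whose matrix has columns
    col_i (where i-hat lands) and col_j (where j-hat lands) to v = (x, y)."""
    x, y = v
    # v lands on x * (where i-hat lands) + y * (where j-hat lands)
    return (x * col_i[0] + y * col_j[0],
            x * col_i[1] + y * col_j[1])

# The transformation from the video: i-hat -> (1, -2), j-hat -> (3, 0)
print(apply_matrix((1, -2), (3, 0), (-1, 2)))   # v = (-1, 2) lands on (5, 2)

# 90-degree counterclockwise rotation: i-hat -> (0, 1), j-hat -> (-1, 0)
print(apply_matrix((0, 1), (-1, 0), (1, 0)))    # (1, 0) rotates to (0, 1)

# Shear: i-hat stays at (1, 0), j-hat moves to (1, 1)
print(apply_matrix((1, 0), (1, 1), (0, 1)))     # (0, 1) shears to (1, 1)
```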

Instructions unclear. Have applied transformation, now stuck fighting The Machines.

I'm thinking.. a balloon with a tiny dot on it using a sharpie, and stretching the balloon out horizontally. The ink'll undergo a linear transformation? Is that right?

Best effort explanation. Nice illustrations.

The pi’s are adorable

Who is your maths teacher?

I don't get it – what's the transformation then?

If

[ 1 0 ] [x] = [1 3]

[ 0 1 ] [y] = [2 1]

Then what's x and y (the transformation)? What can we multiply 0 by to get 2? o. O

Edit: Nevermind – the function that gets applied to i hat and j hat isn't the x and y, which just change the "magnitude" of these arrows (span?)
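A code sketch of the resolution the edit above arrives at (my own illustration; `matvec` is a made-up helper): multiplying a matrix by i-hat = (1, 0) simply reads off its first column, and multiplying by j-hat = (0, 1) reads off its second, so the columns are where the basis vectors land.

```python
def matvec(m, v):
    """Multiply a 2x2 matrix (given as a list of rows) by a vector (x, y)."""
    a, b = m[0]
    c, d = m[1]
    x, y = v
    return (a * x + b * y, c * x + d * y)

m = [[1, 3],
     [2, 1]]              # columns are (1, 2) and (3, 1)

print(matvec(m, (1, 0)))  # i-hat lands on the first column: (1, 2)
print(matvec(m, (0, 1)))  # j-hat lands on the second column: (3, 1)
```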

Good

7:40 This is it. It appears to me that maths is only taught in schools because the school has to teach it, when really they don't want anyone to learn. So they do the absolute minimum to qualify as having taught the subject.

We're all going to sob tears mixed with joy and anger if Grant ever makes a series on complex analysis.

Much better than my high school teachers explanation, and in a very small fraction of the time! Bless his heart.

Oh my god!!!!!!!!

It is worthwhile to start at the beginning.

2:14 – Gray is the domain of the transformation, blue is the range and white is the transformation of the standard basis

3:16 – Some linear transformations are simple (counterclockwise rotation of the space) then a scaling of the space with a mirroring

Go up until 5:38. You can continue watching but we are really interested in the transformation animations.

Your videos and their quality are out of this world. Appreciate the great effort.

Elegant… almost sexy! Makes you want to fall in love again… this time with maths!! Many, many thanks 🙂

GOD of Mathematics. <3

Thank you. I will never look at the matrix again as an ordinary table with several (almost random) numbers.

You are simply a legend, almost a god. Don't y'all think Grant would make an absolutely fantastic complex analysis series?

I made my own version of the transformation animations using python: https://github.com/GOMMB/Linear_Transformation

can someone explain how the transformed basis vectors i and j lands on [1 -2] and [3 0]???

5:12 I pause and ponder

hm…. that makes so much more sense now

2 minutes later

unpause to keep watching and 5:14

LOL

Amazingly clear explanations that are making me understand this topic of linear algebra which I never thought I would and had pretty much given up on..

Thank you!

Congratulations, the best explanation of vectors and their transformations

Thank you very much, this video has cleared up the concepts I had misunderstood.

thank you so much for the resources

Oh matrices do make sense after all.

Wow! If only all high school teachers taught this way…..

A curve γ:[a,b]–>ℝ is defined "regular" if (physically speaking) a point runs the path traced by that curve (actually, one of its parametrizations) without stopping or going backwards during all its journey. I was wondering… if instead of plotting the curve on the xOy plane, where x=t and y=γ(t) I use the coordinates [t, s(t)], with s = ∫ |γ'(x)|dx … I should obtain a straight line since the whole system of reference gets "distorted" precisely as the curve was in the xOy plane. But I'd like to check it out. Is there a way to transform coordinate systems as you do in your videos? thanks <3

Masterful

thanks high school for never showing me this

Wow quoting movies!

I know I'm late to the train, but thank you so so much; you made this subject so intuitive for me. I'm so glad you're the first to teach me linear algebra (I'm just a curious 17.25-year-old guy). You made it seem so natural that I was able to make an educated guess that the matrices give the coordinates of the i^ and j^ vectors, and to derive the formula shown here at 7:23 before you explained it in the video.

We should just fire all the teachers and just show these videos in schools..

God bless you! Thanks for the video, man!

Why isn't it taught like this in schools, Why????

God damn. I'm an undergrad math major, and you make more sense than all the Algebra related professors that I've had. I naively thought of matrices as "squares of numbers", it makes so much more sense to think of them as linear transformations. Please do a series explaining group theory/abstract algebra at some point.

I love seeing the visualisations of these types of transformations. And I'm addicted to watching these videos even though I study maths as it is (first year). But I seem to find the visual + maths a lot more confusing than just the maths for some reason. I did well in physics last year but I found the same thing there. Actually imagining nuclear energy or electron exchanges etc. was way harder (and less fun for me) than just the maths. Hopefully I'll improve my ability to apply it as I carry on studying.

I'm studying symmetry operations and space groups, and oftentimes I come back to these videos because they can show me how those transformations take place in a very intuitive way… Thank you very much man, keep up the good work!

This is beautiful :')

Your video literally transformed my mind's vectors from negative to positive. <3

I really want to understand this but how do you figure out where transformed i hat and j hat move to at (4:14) in the video? I have sat for about an hour trying to figure this out and I know it must be really simple. Thanks for any help. (And calling me a thicky is not helping!)

Why don’t engineering books like Statics rely on the method of matrices and linear algebra, instead of the polar form, which includes the magnitude and direction (angle) of vectors?

If they did, that would’ve been nice and very organized.

We only use matrices for solving multiple systems of equations and for problems related to the cross product, like body moments.

Amazing!

Any game devs here learning this to understand vector transformations in their 3D games?

I'm trying to teach myself this, but I still don't understand.

Wish I could give you a hug, brother. This is so satisfying that it makes me cry; beautiful.

I literally have a first class engineering degree, but it took watching this video to understand the concepts behind matrices…

Great… You are the GOD of maths teaching

Amazing.

The most important video in the history of mankind.

My final exam is coming and I'm still confused about linear transformations. I saw this video a few weeks ago but I was too lazy to watch it. Now, after watching it, I finally have some confidence to sit for my exam. (Watching 3Blue1Brown = revision better than reading my lecturer's notes.) XDDD

It all makes sense now… thank you so much. I intuitively derived x' = x(cosθ) - y(sinθ), y' = x(sinθ) + y(cosθ) with your explanation. Now I don't need to memorize those formulae anymore.
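The formulas in the comment above drop straight out of the column picture: a rotation by θ sends i-hat to (cos θ, sin θ) and j-hat to (-sin θ, cos θ), and those columns are exactly the derived x' and y'. A quick numerical check (my own sketch, not from the video):

```python
import math

def rotate(v, theta):
    """Rotate v by theta, using the columns of the rotation matrix:
    i-hat -> (cos t, sin t), j-hat -> (-sin t, cos t)."""
    x, y = v
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))

x, y = rotate((1.0, 0.0), math.pi / 2)  # 90 degrees counterclockwise
print(round(x, 10), round(y, 10))       # i-hat lands on (0, 1), as in the video
```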

With a linear transformation, each input has only one output. But is the reverse true? So, does every input-output pair have one unique transformation? I’m not talking about all vectors. What I mean is, for 2 specific vectors, can you come up with more than 1 transformation for that specific pair?
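To the question above: no, a single input-output pair does not pin down the transformation. Many different matrices send one specific vector to the same place; they only agree once you also fix where both basis vectors go. A sketch with two hand-picked matrices (my own example; `matvec` is a made-up helper):

```python
def matvec(m, v):
    """2x2 matrix (given as rows) times a vector (x, y)."""
    (a, b), (c, d) = m
    x, y = v
    return (a * x + b * y, c * x + d * y)

A = [[1, 3], [1, 2]]
B = [[2, 2], [0, 3]]

v = (1, 1)
print(matvec(A, v), matvec(B, v))            # both send (1, 1) to (4, 3)...
print(matvec(A, (1, 0)), matvec(B, (1, 0)))  # ...but they disagree on i-hat
```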

The first time i realised the beauty of maths

You are the best teacher I've ever seen.

Thank you.

You're an excellent educator.

Why the hell wasn't I taught this in MATH 212? C'mon, man.

when I see this video , every time , I think how nice if I were Joseph Stalin and I could put all those unimaginative people in Gulags!

So clear…so beautiful

I love you so much; you're the best teacher I have ever seen in my life.

I passed linear algebra with an A and still had no idea what the actual fuck I was doing with anything.

I made it about 6 and a half minutes into this video and suddenly everything clicked and fell into place. You are a god amongst men!

Not sure which is more impressive, your intuitive knowledge of math or your programming skills. I think people don't appreciate the latter as much as the former, although they're not mutually exclusive, but I see it!

3Blue1Brown(Essence of Linear Algebra) + Mit 18.06(Linear Algebra by Prof. Gilbert Strang) = <3

Finally understood linear dependence in matrices.

Are there any online graphing tools which allow you to punch in the coordinates of a shape and a transforming matrix, showing you the shape and its image accordingly?

Just perfect!

How is -1(-2) + 2(0) = 2? Every time I do it I get 3.

(I'm not used to this notation, so what I did was -1 - (-2) + 2 + 0.)

Can someone explain to me how I should have interpreted the notation?

Edit: N(X) means times (the same as N * X).
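For the thread above: the juxtaposition means multiplication, so the second coordinate is (-1)·(-2) + 2·0 = 2 + 0 = 2, not a chain of subtractions. A short sanity check (my own):

```python
# v = (-1, 2); i-hat lands on (1, -2), j-hat lands on (3, 0).
first = (-1) * 1 + 2 * 3        # first coordinate of where v lands
second = (-1) * (-2) + 2 * 0    # second coordinate: multiplication, not subtraction
print(first, second)            # 5 2
```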

Lost you at 4:10; perhaps a little more explanation?

I'm sorry, but this cannot be free content. You're providing knowledge worth its weight in gold. I spent 6 months of college studying linear algebra and you've just surpassed everything and everyone. Heck, I am still ready to pay you my entire tuition fees if it means learning from you.

Sir you are a Legend.

Thank you, my best professor.

My life has been linearly transformed by 'shear' joy of watching this video!

Thank you

Jesus Christ I had my mind blown watching this!

Never knew there was such an intuitive way to think about matrices!

They always teach a matrix as being a rectangular array of numbers with no purpose whatsoever in school, when really it’s just a way to numerically describe a linear transformation! That’s crazy!

Really appreciate your stuff 3b1b

Oh my god. I understand linear transformations now.

Yeah, maybe you should stick to pen and paper for this. That looks like trying to read a book that's 2 feet underwater with shadows from a tree obfuscating the letters. I think a simple analogy with an explanation would have been more effective…

I love you?

I'm pretty confused by the matrix notation. Could anyone explain why we call a matrix:

[ a b ]

[ c d ]

and not:

[ a c ]

[ b d ] ?

As the columns of the matrix represent the transformations of î and ĵ, shouldn't they have somehow independent notation from each other? What's the thing that makes rows more connected than columns?
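One way to see the answer to the question above in code (my own sketch; the function names are made up): under the convention that a matrix acts on column vectors written to its right, the rows are just how the entries sit on the page, while the columns are where î and ĵ land, which is what makes the column grouping the meaningful one.

```python
def columns(m):
    """The columns of [[a, b], [c, d]] are (a, c) and (b, d)."""
    (a, b), (c, d) = m
    return (a, c), (b, d)

def matvec(m, v):
    """x * (first column) + y * (second column)."""
    col1, col2 = columns(m)
    x, y = v
    return (x * col1[0] + y * col2[0], x * col1[1] + y * col2[1])

m = [[1, 3],
     [-2, 0]]                  # i-hat -> (1, -2), j-hat -> (3, 0), as in the video
print(columns(m))              # ((1, -2), (3, 0)): the landing spots
print(matvec(m, (-1, 2)))      # (5, 2), matching the worked example
```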

YOU'RE BREATHTAKING! Please don't ever stop making videos. World needs teachers like you!

My heartfelt thanks to whoever translated this video; it really helped a lot!

One note that was made in the translations of the later videos concerns the translation of "shear", which is "cisalhamento".

Since it isn't possible to edit the translation to correct it, the note stays here.

Oh my god, this is just amazing! Being a Master's student in physics and having pretty decent general knowledge of linear algebra and calculus, I was just literally AMAZED at how intuitively you've explained this topic, and I've just realized that I was totally lacking this understanding! Thank you a lot; you've just opened my eyes and made me love linear algebra again! I am definitely gonna finish watching your series!

You're doing god's job, thank you!

Great video , i wish our teachers could be like you.

I wish I had stumbled across this video when I was starting my adventure with computer graphics. Up until now, matrix-vector multiplication for me was just this mysterious, counter-intuitive way of adding rotation, movement, scale and perspective to objects in 3D space… you changed that.

Your videos fill in the gaps of many a past reading on maths. Things transform from labored comprehension to intuitive knowledge. 🙂

I could have given you a dozen awards.

How did the person who made the video even get access to this crazy information about the grids moving in space?? I also wonder how long and complicated it is to animate this…

Wtf, I am not able to understand this video 🤯

I take back my words

I do not understand how the space is moved around. It is clearly not just moved; it's zoomed and squished as well. Also, is it moved clockwise or anticlockwise?

I count myself as one of the luckiest people to have stumbled on this video just before taking linear algebra in college.

So what I'm gettin for this is that linear transformations are basically "if I (linearly) fuck up my axes, where will this vector (which was previously on standard axes) will land? (When compared to standard axes"

Dude, I literally love you. Well figuratively. Or, like, you are the 2×2 matrix of my understanding of vectors.

This, this is how you do it.

BTW, I have used a portion of a still from your video here as an example of transformations in a virtual cork-board presentation of a conceptual design for a graphics-based ontology modeling tool.

see figure 16 in

https://padlet.com/rlwbeachbum/dxlnakc90jga

2 minutes of this video helped me understand something that a whole semester from Dr. Ukrainian-guy and about 3 hours of various YT videos could not. God bless you, sir!

I just wanna throw some "congrats" too.

Thanks for that over-simplified version + visualization of matrices and stuff. Understanding those concepts feels great!

I might pass my Linear Algebra exams now..

So can you view matrix addition as a linear transformation? That is, when one is adding two vectors, is their sum a linear transformation of both of those vectors? The resulting vector is still rooted at the origin, and it is still a straight line.

Shout out to all the translators and 3Blue1Brown.

I don't know if, after watching this, this is still a relevant question or not. But let me ask! … I understood matrix multiplication conceptually! But the quirky way it comes up during the actual calculation… why is it so? Because of the way the coordinates are originally written, right?

I mean: what made them write the coordinates in the matrix in that vertical way, so that multiplication turns out to be this weird?!

If the calculation of matrix multiplication was known, then the trick of writing coordinate numbers in the form of a matrix could/should have been arranged in a different way, right?? So that, during the actual picking of elements from the 2 matrices, it feels like simple element-by-element multiplication?

Why are the vector's coordinates written vertically? Or, let's say, during multiplication, why not transpose the matrix and then multiply in a straight way?
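To the questions above: the row-by-column rule and the column-combination rule are the same arithmetic grouped two ways, so no transpose is needed; writing vectors vertically is simply the convention that lines each coordinate up with its matrix column. A small sketch (my own, in Python) comparing the two groupings:

```python
m = [[1, 3],
     [-2, 0]]   # rows of the matrix; its columns are (1, -2) and (3, 0)
v = (-1, 2)

# Row picture: dot each row with the vector.
row_picture = tuple(row[0] * v[0] + row[1] * v[1] for row in m)

# Column picture: scale each column by the matching coordinate, then add.
col1 = (m[0][0], m[1][0])
col2 = (m[0][1], m[1][1])
col_picture = (v[0] * col1[0] + v[1] * col2[0],
               v[0] * col1[1] + v[1] * col2[1])

print(row_picture, col_picture)  # both give (5, 2)
```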

Linear dependence at 9:25 couldn't have been explained any better than this!

I tried to reproduce the 4:00 explanation on paper. On my iPad, I created î, ĵ and v̂. Then I copied and rotated the entire drawing so that v̂ was positioned somewhere else. When I tried to figure out mathematically the location of the transformed v̂ through the equation, the result and the position didn't match. Then I visually figured it out and realised that I had made a mistake in the equation. In other words, I used the vector to see what I did wrong and re-solved the equation. That was fucking awesome.

Sorry, I’m 13 and taking Algebra 1, but I wanted to learn eigenvalues and eigenvectors. What is an input (and output) vector, and how does an input vector correspond to an output vector?

Genuine question: without this video, what are the other means by which one can acquire such an intuition for the topic? Any specific textbooks or lab exercises? I want to continue developing such intuition after these videos end, because unfortunately they do.