Notes on The Essence of Linear Algebra. Source video: www.bilibili.com/video/BV1ys…
Contents

- Matrices and linear transformations
- Matrix multiplication and composite transformations
Unfortunately, no one can be told what the Matrix is. You have to see it for yourself.
—— Morpheus
What is a matrix? A matrix is a rectangular array of real or complex numbers. A matrix whose elements are all real numbers is called a real matrix, and a matrix whose elements are complex numbers is called a complex matrix. A matrix whose number of rows and columns are both equal to n is called a square matrix (or matrix of order n).
One of the most overlooked but important things in linear algebra is the notion of linear transformations and how they relate to matrices.
Matrices and linear transformations
A transformation is just another way of saying "function": it takes an input and produces a corresponding output. In linear algebra in particular, we think about transformations that take a vector as input and output a vector.
Why use the word "transformation"? Because it suggests visualizing the input-output relationship in a particular way: one good way to understand functions of vectors is through motion.
For example, in two dimensions, we imagine each input vector moving to the position of its corresponding output vector; picturing this motion is how we understand the transformation.
In two-dimensional space, we can transform every point of the infinite grid at once, while keeping a copy of the original coordinate grid in the background to track where each point starts and where it ends up.
So what is a linear transformation? The transformation needs to meet the following two conditions:
- Straight lines remain straight after the transformation; they cannot bend
- The origin must remain fixed
A transformation that satisfies these two conditions is a linear transformation.
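These two geometric conditions amount to the algebraic definition of linearity: the transformation must preserve linear combinations. A minimal sketch in Python, using a 90-degree counterclockwise rotation as the example transformation (the rotation and the test vectors are my own illustration, not from the source):

```python
# Example linear transformation: rotate the plane 90 degrees counterclockwise.
def rot(v):
    x, y = v
    return (-y, x)

def combo(a, v, b, w):
    """Compute a*v + b*w for 2D vectors stored as tuples."""
    return (a * v[0] + b * w[0], a * v[1] + b * w[1])

# Condition 2: the origin stays fixed.
assert rot((0, 0)) == (0, 0)

# Keeping grid lines straight, parallel, and evenly spaced is equivalent to
# preserving linear combinations: rot(a*v + b*w) == a*rot(v) + b*rot(w).
v, w = (1, 2), (3, -1)
assert rot(combo(2, v, -3, w)) == combo(2, rot(v), -3, rot(w))
```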
For example, when we apply a linear transformation to a vector, we actually only need to record where the two basis vectors i-hat and j-hat land; everything else follows from that.
Take the vector v = (-1, 2). The transformed v is (-1) times the transformed i-hat plus 2 times the transformed j-hat. In other words, v is a linear combination of i-hat and j-hat, and the transformed v is the same linear combination of the transformed i-hat and the transformed j-hat.
So we only need to know where the transformation sends the i-hat and j-hat vectors.
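Numerically, this looks as follows. Suppose the transformation sends i-hat to (1, -2) and j-hat to (3, 0) (these landing spots are an assumed example chosen for illustration); then the transformed v is the same combination, -1 times the transformed i-hat plus 2 times the transformed j-hat:

```python
# Assumed landing spots of the basis vectors after the transformation.
i_hat = (1, -2)   # where i-hat lands
j_hat = (3, 0)    # where j-hat lands

# v = (-1, 2) means "-1 copies of i-hat plus 2 copies of j-hat";
# the transformed v is the same combination of the transformed basis vectors.
x, y = -1, 2
transformed_v = (x * i_hat[0] + y * j_hat[0],
                 x * i_hat[1] + y * j_hat[1])
print(transformed_v)  # (5, 2)
```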
For example, for a transformation like the following, we can calculate it like this:
We only need four numbers to determine a linear transformation!
Here the matrix is just a way of packaging the information of a linear transformation. For the more general case, we get the following transformation:
With the above foundation, we can make a summary:
Strictly speaking, a linear transformation is a function that takes vectors as inputs and produces vectors as outputs.
You can think of a linear transformation as a squishing and stretching of space that keeps grid lines parallel and evenly spaced and keeps the origin fixed. The key point is that a linear transformation is completely determined by what it does to the basis vectors of the space.
In two-dimensional space these are the two unit basis vectors, i-hat and j-hat, because any other vector can be represented as a linear combination of the basis vectors; this means that if you write down where i-hat and j-hat land, you can work out where any vector (x, y) goes.
Conventionally, the coordinates of the transformed i-hat and j-hat are taken as the columns of a matrix, and the matrix-vector product is defined as x times the first column plus y times the second column.
So a matrix represents a particular linear transformation, and multiplying a matrix by a vector means applying that linear transformation to the vector.
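As a sketch of that rule (plain Python, no libraries; the particular 2x2 matrix is a hypothetical example), the matrix-vector product is exactly "x times the first column plus y times the second column":

```python
def mat_vec(matrix, vec):
    """Apply a 2x2 matrix (list of rows) to a 2D vector.
    The columns of the matrix are the landing spots of i-hat and j-hat."""
    (a, c), (b, d) = matrix   # rows [a, c] and [b, d]; columns are (a, b) and (c, d)
    x, y = vec
    # x * first column + y * second column
    return (x * a + y * c, x * b + y * d)

# Example matrix: i-hat lands on (1, -2), j-hat lands on (3, 0).
M = [[1, 3],
     [-2, 0]]
print(mat_vec(M, (-1, 2)))  # (5, 2)
```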
Matrix multiplication and composite transformations
In many cases we apply several transformations in combination, for example rotating the plane 90 degrees clockwise and then applying a shear: two transformations in sequence.
Applying the two transformations one after the other gives the same result as applying the single transformation that records the total effect of the rotation followed by the shear.
In this case we call the matrix that captures the two transformations at once the composite transformation.
But you have to read it from right to left, because that is how function composition is defined: f(g(x)).
So we need to figure out where the final i-hat and j-hat vectors end up, which requires a calculation. Think of the first column of M1 as the transformed i-hat (the vector (1, 2) in the example) and the second column as the transformed j-hat.
Applying M2 to this transformed i-hat gives the first column of the product, and doing the same for the transformed j-hat gives the second column. That is where the formula for matrix multiplication, and its meaning as composition, comes from.
Rather than memorizing the formula for matrix multiplication, it is more important to understand what matrix multiplication means, and only then carry out the calculation.
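That meaning can be sketched directly in code (plain Python; the specific rotation and shear matrices are my own illustration of the example above): each column of the product M2·M1 is simply M2 applied to the corresponding column of M1, and applying the product to a vector matches applying the two transformations in sequence, right to left.

```python
def mat_vec(m, v):
    """Apply a 2x2 matrix (list of rows) to a 2D vector."""
    (a, c), (b, d) = m
    x, y = v
    return (x * a + y * c, x * b + y * d)

def mat_mul(m2, m1):
    """Compose two transformations: m1 acts first, then m2 (read right to left).
    Each column of the product is m2 applied to the corresponding column of m1."""
    new_i = mat_vec(m2, (m1[0][0], m1[1][0]))  # where i-hat ends up after both
    new_j = mat_vec(m2, (m1[0][1], m1[1][1]))  # where j-hat ends up after both
    return [[new_i[0], new_j[0]],
            [new_i[1], new_j[1]]]

rotation = [[0, 1],   # 90 degrees clockwise: i-hat -> (0, -1), j-hat -> (1, 0)
            [-1, 0]]
shear = [[1, 1],      # shear: i-hat stays put, j-hat -> (1, 1)
         [0, 1]]

composite = mat_mul(shear, rotation)   # rotate first, then shear
v = (1, 0)
# Applying the composite equals applying the transformations one by one.
assert mat_vec(composite, v) == mat_vec(shear, mat_vec(rotation, v))
```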
Now we’re going to learn the determinant.