Let's look at what the transpose of a matrix means intuitively. We'll see how the transpose arises when we try to find pairs of vectors whose dot product is the same before and after a linear transformation. We'll also use the Singular Value Decomposition to build a geometric intuition for what these transformations look like. #linearalgebra #transpose #svd #SoMEpi
Correction: Around 13:20, when I say that Sigma-transpose = Sigma, this is only true if A (and therefore Sigma) is a square matrix.
Prerequisites: you should already understand how matrices are linear transformations, matrix inverses and the identity matrix, and vector dot products. Knowing about the Singular Value Decomposition would help too, but isn't strictly required.
Chapters:
0:00 Introduction
0:48 Prerequisites
1:19 How to Take the Transpose
1:50 Properties of the Transpose
3:56 Motivating Question
4:56 Linear Transformations Do Not Necessarily Preserve the Dot Product
6:21 Linear Transformations and Dot Products, Visually
7:04 How Can We Preserve the Dot Product?
8:30 Preserved Dot Products, Visually
9:41 Orthogonal Matrices
11:10 Singular Value Decomposition Introduction
12:39 Using the SVD on the Inverse-Transpose
15:28 Additional Examples with the SVD
16:31 What if A is not invertible?
18:25 Main Equation
19:13 Visualization Revisited
19:43 Transpose vs. Inverse
20:38 SVD of the Inverse and Transpose
21:39 SVD of Each Matrix, Visualized
23:34 Symmetric Matrices
24:30 Summary
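
Below is a minimal NumPy sketch (not part of the video) that numerically checks the identity the video builds toward, (Ax) . y = x . (A^T y), the fact that orthogonal matrices preserve dot products, and the correction above about Sigma-transpose. The specific matrices and random seed are arbitrary.

# Numerical checks of the transpose/dot-product facts discussed in the video.
import numpy as np

rng = np.random.default_rng(0)

# Main equation: moving A across the dot product turns it into A^T.
A = rng.standard_normal((3, 3))
x = rng.standard_normal(3)
y = rng.standard_normal(3)
print(np.allclose((A @ x) @ y, x @ (A.T @ y)))   # True

# Orthogonal matrices preserve dot products, since Q^T Q = I.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
print(np.allclose((Q @ x) @ (Q @ y), x @ y))     # True

# Correction at 13:20: for a non-square A, Sigma is non-square,
# so Sigma^T is not even the same shape as Sigma.
B = rng.standard_normal((2, 3))
U, s, Vt = np.linalg.svd(B)
Sigma = np.zeros(B.shape)
Sigma[:len(s), :len(s)] = np.diag(s)
print(Sigma.shape, Sigma.T.shape)                # (2, 3) (3, 2)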