1 minute read

This post is based on Mathematics for Machine Learning on Coursera.

How Matrices Transform Space

We previously studied the idea of solving simultaneous equation problems using matrices, and we know that the columns of a matrix are the transformed basis vectors. Now we'll see how matrices transform space.


We assume that \(e_1=(1,0),\ e_2=(0,1),\ \hat e_1=(2,10),\ \hat e_2=(3,1)\).

We can write any vector as a vector sum of \(\hat e_1\) and \(\hat e_2\). That means the result of the transformation is also going to be a vector sum.

Rules

Here are the rules we defined.

\[A \cdot r = \hat r\] \[A \cdot (nr) = n \cdot \hat r\] \[A \cdot (r + s) = Ar + As\]

Then we deduce that \(A (n_1 \cdot e_1 + n_2 \cdot e_2) = n_1 \cdot A \cdot e_1 + n_2 \cdot A \cdot e_2 = n_1 \cdot \hat e_1 + n_2 \cdot \hat e_2\).

We see that the vector sum rule (writing any vector as a vector sum) works well. Let's see a concrete example.

\[\begin{pmatrix}2&3\\10&1\end{pmatrix} \begin{pmatrix}3\\2\end{pmatrix} = \begin{pmatrix}12\\32\end{pmatrix}\] \[3 \cdot \begin{pmatrix}2\\10\end{pmatrix} + 2 \cdot \begin{pmatrix}3\\1\end{pmatrix} = \begin{pmatrix}12\\32\end{pmatrix}\]

So, multiplying a matrix by a vector is just taking the vector sum of the transformed basis vectors.
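A quick NumPy sketch of this example (the variable names are my own) checks that applying the matrix and summing the transformed basis vectors give the same result:

```python
import numpy as np

# Columns of A are the transformed basis vectors e1_hat = (2, 10), e2_hat = (3, 1)
A = np.array([[2, 3],
              [10, 1]])
r = np.array([3, 2])

# Applying A directly vs. taking the vector sum of the transformed basis vectors
direct = A @ r
as_sum = 3 * A[:, 0] + 2 * A[:, 1]

print(direct)  # [12 32]
print(as_sum)  # [12 32]
```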

Types of Matrix Transformation

For \(n, m > 0\):
  • \(\begin{pmatrix}3&0\\0&2\end{pmatrix} \to\) scale the x-axis by a factor of 3 and the y-axis by a factor of 2 (in general, \(\begin{pmatrix}n&0\\0&m\end{pmatrix}\))
  • \(\begin{pmatrix}-1&0\\0&2\end{pmatrix} \to\) flip the x-axis and scale the y-axis by a factor of 2
  • \(\begin{pmatrix}-1&0\\0&-1\end{pmatrix} \to\) flip both axes (an inversion, i.e. rotation by 180°)
  • \(\begin{pmatrix}0 & 1\\1 &0\end{pmatrix} \to\) reflection about \(y = x\)
  • \(\begin{pmatrix}0 & -1\\-1 &0\end{pmatrix} \to\) reflection about \(y = -x\)
  • \(\begin{pmatrix}1 &1\\0 &1 \end{pmatrix} \to\) shear
  • \(\begin{pmatrix} \cos \theta & \sin \theta \\ - \sin \theta & \cos \theta \end{pmatrix} \to\) rotation by \(\theta\)
  • \(\begin{pmatrix} \cos \theta & \sin \theta & 0 \\ - \sin \theta & \cos \theta & 0 \\ 0 & 0 & 1 \end{pmatrix} \to\) rotation in 3-D that preserves the \(z\)-axis
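As a quick sanity check, here is a NumPy sketch applying a few of these matrices to the vector \((1, 2)\) (the variable names are my own):

```python
import numpy as np

v = np.array([1.0, 2.0])

scale   = np.array([[3, 0], [0, 2]])  # stretch x by 3, y by 2
flip_yx = np.array([[0, 1], [1, 0]])  # reflection about y = x
shear   = np.array([[1, 1], [0, 1]])  # shear

theta = np.pi / 2
rot = np.array([[np.cos(theta),  np.sin(theta)],
                [-np.sin(theta), np.cos(theta)]])  # rotation matrix as above

print(scale @ v)    # [3. 4.]
print(flip_yx @ v)  # [2. 1.]
print(shear @ v)    # [3. 2.]
print(rot @ v)      # approximately [ 2. -1.]
```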

Combination and Composition of Matrix Transformations

\[A_1 = \begin{pmatrix}0 & 1\\-1 &0\end{pmatrix} A_2 = \begin{pmatrix}-1 & 0\\0 &1\end{pmatrix}\]

Let's compose these two matrix transformations in both orders.

\[A_1 \cdot A_2 = \begin{pmatrix}0 & 1\\-1 &0\end{pmatrix} \cdot \begin{pmatrix}-1 & 0\\0 &1\end{pmatrix} = \begin{pmatrix}0 & 1\\1 &0\end{pmatrix}\] \[A_2 \cdot A_1 = \begin{pmatrix}-1 & 0\\0 &1\end{pmatrix} \cdot \begin{pmatrix}0 & 1\\-1 &0\end{pmatrix} = \begin{pmatrix}0 & -1\\-1 &0\end{pmatrix}\]

\(A_1 \cdot A_2 \to\) reflection about \(y = x\)

\(A_2 \cdot A_1 \to\) reflection about \(y = -x\)

Matrix multiplication is associative but not commutative

So we should be careful about the order of matrix multiplication.
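The example above can be verified in a few lines of NumPy, which also shows the associativity claim (the variable names are my own):

```python
import numpy as np

A1 = np.array([[0, 1], [-1, 0]])  # rotation
A2 = np.array([[-1, 0], [0, 1]])  # flip the x-axis

# Not commutative: the two orders give different reflections
print(A1 @ A2)  # [[0 1], [1 0]]   -> reflection about y = x
print(A2 @ A1)  # [[0 -1], [-1 0]] -> reflection about y = -x

# Associative: grouping does not matter
print(np.array_equal((A1 @ A2) @ A1, A1 @ (A2 @ A1)))  # True
```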
