\section{The Matrix Representation of a Linear Transformation}

\begin{definition} \hfill\\
Let $V$ be a finite-dimensional vector space. An \textbf{ordered basis} for $V$ is a basis for $V$ endowed with a specific order; that is, an ordered basis for $V$ is a finite sequence of linearly independent vectors in $V$ that generates $V$.\\
For the vector space $\F^n$, we call $\{e_1, e_2, \dots, e_n\}$ the \textbf{standard ordered basis} for $\F^n$. Similarly, for the vector space $P_n(\F)$, we call $\{1, x, \dots, x^n\}$ the \textbf{standard ordered basis} for $P_n(\F)$.
\end{definition}

\begin{definition} \hfill\\
Let $\beta = \{v_1, v_2, \dots, v_n\}$ be an ordered basis for a finite-dimensional vector space $V$. For $x \in V$, let $a_1, a_2, \dots, a_n$ be the unique scalars such that
\[x = \sum_{i=1}^{n}a_iv_i.\]
We define the \textbf{coordinate vector of $x$ relative to $\beta$}, denoted by $[x]_\beta$, by
\[[x]_\beta = \begin{pmatrix} a_1 \\ a_2 \\ \vdots \\ a_n\end{pmatrix}.\]
Notice that $[v_i]_\beta = e_i$ in the preceding definition. It can be shown that the correspondence $x \to [x]_\beta$ provides us with a linear transformation from $V$ to $\F^n$.
\end{definition}

\begin{notation} \hfill\\
The following notation is used to construct the matrix representation of a linear transformation in the definition that follows.\\
Suppose that $V$ and $W$ are finite-dimensional vector spaces with ordered bases $\beta = \{v_1, v_2, \dots, v_n\}$ and $\gamma = \{w_1, w_2, \dots, w_m\}$, respectively. Let $T: V \to W$ be linear. Then for each $j$, $1 \leq j \leq n$, there exist unique scalars $a_{ij} \in \F$, $1 \leq i \leq m$, such that
\[T(v_j) = \sum_{i=1}^{m}a_{ij}w_i\ \ \text{for}\ 1 \leq j \leq n.\]
\end{notation}

\begin{definition} \hfill\\
Using the notation above, we call the $m \times n$ matrix $A$ defined by $A_{ij} = a_{ij}$ the \textbf{matrix representation of $T$ in the ordered bases $\beta$ and $\gamma$} and write $A = [T]_\beta^\gamma$.
If $V = W$ and $\beta = \gamma$, then we write $A = [T]_\beta$. Notice that the $j$th column of $A$ is simply $[T(v_j)]_\gamma$. Also observe that if $U: V \to W$ is a linear transformation such that $[U]_\beta^\gamma = [T]_\beta^\gamma$, then $U = T$ by the corollary to Theorem 2.6 (\autoref{Corollary 2.1}).
\end{definition}

\begin{definition} \hfill\\
Let $T, U: V \to W$ be arbitrary functions, where $V$ and $W$ are vector spaces over $\F$, and let $a \in \F$. We define $T + U: V \to W$ by $(T+U)(x) = T(x) + U(x)$ for all $x \in V$, and $aT: V \to W$ by $(aT)(x) = aT(x)$ for all $x \in V$.
\end{definition}

\begin{theorem} \hfill\\
Let $V$ and $W$ be vector spaces over a field $\F$, and let $T, U: V \to W$ be linear.
\begin{enumerate}
\item For all $a \in \F$, $aT + U$ is linear.
\item Using the operations of addition and scalar multiplication in the preceding definition, the collection of all linear transformations from $V$ to $W$ is a vector space over $\F$.
\end{enumerate}
\end{theorem}

\begin{definition} \hfill\\
Let $V$ and $W$ be vector spaces over $\F$. We denote the vector space of all linear transformations from $V$ to $W$ by $\LL(V, W)$. In the case that $V = W$, we write $\LL(V)$ instead of $\LL(V, W)$.
\end{definition}

\begin{theorem} \hfill\\
Let $V$ and $W$ be finite-dimensional vector spaces with ordered bases $\beta$ and $\gamma$, respectively, and let $T, U: V \to W$ be linear transformations. Then
\begin{enumerate}
\item $[T+U]_\beta^\gamma = [T]_\beta^\gamma + [U]_\beta^\gamma$ and
\item $[aT]_\beta^\gamma = a[T]_\beta^\gamma$ for all scalars $a$.
\end{enumerate}
\end{theorem}
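
To illustrate how a matrix representation is computed, consider the standard example of the formal derivative (the particular transformation and bases below are chosen only for illustration). Let $T: P_3(\F) \to P_2(\F)$ be the linear transformation defined by $T(f(x)) = f'(x)$, and let $\beta = \{1, x, x^2, x^3\}$ and $\gamma = \{1, x, x^2\}$ be the standard ordered bases. Since
\[T(1) = 0, \quad T(x) = 1, \quad T(x^2) = 2x, \quad T(x^3) = 3x^2,\]
writing each image in terms of $\gamma$ and placing $[T(v_j)]_\gamma$ in the $j$th column gives the $3 \times 4$ matrix
\[[T]_\beta^\gamma = \begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 3 \end{pmatrix}.\]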