# How to think about matrix multiplication

There is more than one way to think about matrix multiplication. Just by looking at the definition

$\begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{n1} & \dots & a_{nn} \end{bmatrix} \begin{bmatrix} b_{11} & \dots & b_{1n} \\ \vdots & \ddots & \vdots \\ b_{n1} & \dots & b_{nn} \end{bmatrix} = \begin{bmatrix} \sum_{i=1}^{n} a_{1i}b_{i1} & \dots & \sum_{i=1}^{n} a_{1i}b_{in} \\ \vdots & \ddots & \vdots \\ \sum_{i=1}^{n} a_{ni}b_{i1} & \dots & \sum_{i=1}^{n} a_{ni}b_{in} \end{bmatrix},$ matrix multiplication is not easy to understand. However, there are multiple ways of looking at it, each one revealing invaluable insights.

Let's take a look at them! First, let's unravel the definition and visualize what happens.

For instance, the element in the 2nd row and 1st column of the product matrix is obtained from the 2nd row of the left matrix and the 1st column of the right matrix by summing their elementwise products.
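We can check this rule directly in NumPy. The arrays below are arbitrary small examples, not anything from the text:

```python
import numpy as np

# Small example matrices, chosen arbitrarily for illustration.
A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# Entry (2nd row, 1st column) of the product: take the 2nd row of A
# and the 1st column of B, multiply elementwise, and sum.
entry = np.sum(A[1, :] * B[:, 0])  # 3*5 + 4*7 = 43

print(entry)          # 43
print((A @ B)[1, 0])  # 43, matching NumPy's built-in matrix product
```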

To move beyond the definition, let's first introduce some notation. A matrix is built from rows and columns, each of which can be viewed as a vector in its own right. You can think of a matrix as a horizontal stack of column vectors or a vertical stack of row vectors:

\begin{align*} A = \begin{bmatrix} a_{11} & a_{12} & \dots & a_{1n} \\ a_{21} & a_{22} & \dots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \dots & a_{nn} \end{bmatrix} = \begin{bmatrix} \mathbf{a}^1 \\ \mathbf{a}^2 \\ \vdots \\ \mathbf{a}^n \end{bmatrix} = \begin{bmatrix} \mathbf{a}_1 | \mathbf{a}_2 | \dots | \mathbf{a}_n \end{bmatrix} \end{align*},

where $\mathbf{a}^i$ denotes the $i$-th row and $\mathbf{a}_i$ denotes the $i$-th column.

## Matrix multiplication as the mixture of column vectors

Now, let's start by multiplying a matrix with a column vector.

By writing out the definition, it turns out that the product is just a linear combination of the columns, where the coefficients are determined by the vector we are multiplying with!

That is, we get

\begin{align*} \begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{n1} & \dots & a_{nn} \end{bmatrix} \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} &= \begin{bmatrix} \sum_{i=1}^{n} a_{1i}x_i \\ \vdots \\ \sum_{i=1}^{n} a_{ni}x_i \\ \end{bmatrix} \\ &= \sum_{i=1}^n x_i \begin{bmatrix} a_{1i} \\ \vdots \\ a_{ni} \end{bmatrix}. \end{align*}
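This identity is easy to verify numerically. Here is a quick NumPy sketch with an arbitrary matrix and vector:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
x = np.array([5, 6])

# Linear combination of the columns of A, weighted by the entries of x.
combo = sum(x[i] * A[:, i] for i in range(A.shape[1]))

print(combo)  # [17 39]
print(A @ x)  # [17 39], the same vector via the built-in product
```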

Taking this one step further, we can stack another vector.

This way, we can see that the product of an $n \times n$ and an $n \times 2$ matrix equals the product of the left matrix and the columns of the right matrix, horizontally stacked:

\begin{align*} \begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{n1} & \dots & a_{nn} \end{bmatrix} \begin{bmatrix} b_{11} & b_{12} \\ \vdots & \vdots \\ b_{n1} & b_{n2} \end{bmatrix} = \begin{bmatrix} A \mathbf{b}_1 | A \mathbf{b}_2 \end{bmatrix}. \end{align*}

Applying the same logic, we can finally see that the product matrix is nothing other than the left matrix times the columns of the right matrix, horizontally stacked. This is an extremely powerful way of thinking about matrix multiplication:

\begin{align*} \begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{n1} & \dots & a_{nn} \end{bmatrix} \begin{bmatrix} b_{11} & \dots & b_{1n} \\ \vdots & \ddots & \vdots \\ b_{n1} & \dots & b_{nn} \end{bmatrix} = \begin{bmatrix} A \mathbf{b}_1 | \dots | A \mathbf{b}_n \end{bmatrix}. \end{align*}
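The column-by-column view translates directly into code. A minimal NumPy check, with matrices chosen arbitrarily:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# The i-th column of A @ B is A times the i-th column of B;
# stacking these columns horizontally rebuilds the full product.
cols = np.column_stack([A @ B[:, i] for i in range(B.shape[1])])

print(cols)   # [[19 22], [43 50]]
print(A @ B)  # identical
```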

By switching our viewpoint a bit, we can also get the product as vertically stacked row vectors, as shown below:

\begin{align*} \begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{n1} & \dots & a_{nn} \end{bmatrix} \begin{bmatrix} b_{11} & \dots & b_{1n} \\ \vdots & \ddots & \vdots \\ b_{n1} & \dots & b_{nn} \end{bmatrix} = \begin{bmatrix} \mathbf{a}^1 B \\ \vdots \\ \mathbf{a}^n B \end{bmatrix}. \end{align*}
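The row-wise viewpoint can be checked the same way, again with arbitrary example matrices:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# The i-th row of A @ B is the i-th row of A times B;
# stacking these rows vertically rebuilds the full product.
rows = np.vstack([A[i, :] @ B for i in range(A.shape[0])])

print(rows)   # [[19 22], [43 50]]
print(A @ B)  # identical
```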

## Matrix multiplication as dot products

There is another interpretation of matrix multiplication.

Let's rewind and go back to the beginning, studying the product of a matrix $\textstyle A$ and a column vector $\textstyle x$.

Do the sums in the result look familiar?

\begin{align*} \begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{n1} & \dots & a_{nn} \end{bmatrix} \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} \sum_{i=1}^{n} a_{1i} x_i \\ \vdots \\ \sum_{i=1}^{n} a_{ni} x_i \end{bmatrix} \end{align*}

These sums are just the dot product of the row vectors of $\textstyle A$, taken with the column vector $\textstyle x$! That is, we have

\begin{align*} \begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{n1} & \dots & a_{nn} \end{bmatrix} \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} \langle \mathbf{a}^1, x \rangle \\ \vdots \\ \langle \mathbf{a}^n, x \rangle \end{bmatrix}. \end{align*}

In general, the product of $\textstyle A$ and $\textstyle B$ is simply the matrix whose entries are the dot products of the row vectors of $\textstyle A$ with the column vectors of $\textstyle B$!

\begin{align*} \begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{n1} & \dots & a_{nn} \end{bmatrix} \begin{bmatrix} b_{11} & \dots & b_{1n} \\ \vdots & \ddots & \vdots \\ b_{n1} & \dots & b_{nn} \end{bmatrix} = \begin{bmatrix} \langle \mathbf{a}^1, \mathbf{b}_1 \rangle & \dots & \langle \mathbf{a}^1, \mathbf{b}_n \rangle \\ \vdots & \ddots & \vdots \\ \langle \mathbf{a}^n, \mathbf{b}_1 \rangle & \dots & \langle \mathbf{a}^n, \mathbf{b}_n \rangle \\ \end{bmatrix} \end{align*}
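A short NumPy sketch confirms the dot-product view entry by entry, using arbitrary example matrices:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# Entry (i, j) of A @ B is the dot product of row i of A and column j of B.
dots = np.array([[np.dot(A[i, :], B[:, j]) for j in range(B.shape[1])]
                 for i in range(A.shape[0])])

print(dots)   # [[19 22], [43 50]]
print(A @ B)  # identical
```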

## Conclusion

To sum up, we have three interpretations: matrix multiplication as

1. vertically stacking row vectors,
2. horizontally stacking column vectors,
3. and as dot products of row vectors with column vectors.

\begin{align*} \begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{n1} & \dots & a_{nn} \end{bmatrix} \begin{bmatrix} b_{11} & \dots & b_{1n} \\ \vdots & \ddots & \vdots \\ b_{n1} & \dots & b_{nn} \end{bmatrix} &= \begin{bmatrix} \mathbf{a}^1 B \\ \vdots \\ \mathbf{a}^n B \end{bmatrix} \\ &= \begin{bmatrix} A \mathbf{b}_1 | \dots | A \mathbf{b}_n \end{bmatrix} \\ &= \begin{bmatrix} \langle \mathbf{a}^1, \mathbf{b}_1 \rangle & \dots & \langle \mathbf{a}^1, \mathbf{b}_n \rangle \\ \vdots & \ddots & \vdots \\ \langle \mathbf{a}^n, \mathbf{b}_1 \rangle & \dots & \langle \mathbf{a}^n, \mathbf{b}_n \rangle \\ \end{bmatrix} \end{align*}

When studying matrices, each of these viewpoints is immensely useful.