Matrices and Graphs


To study structure,
tear away the flesh till
only the bone shows.

This haiku came to my mind yesterday while I was walking in the streets thinking about matrices and graphs. I am supposed to give an introductory lecture about them for undergraduate students in a few weeks, and it seemed like a good reason to finally write a post about mathematics.

1. Matrices and their graphs. Our goal for today is to introduce a connection between matrices and graphs and to illustrate the strength of graph-theoretic methods in linear algebra and vice versa. What is shown here is only the tip of the iceberg. These methods have turned out to be very fruitful and have many applications inside and outside mathematics. My favourite book on this subject is [1]; I strongly recommend it for further reading.

Definition 1. Let $A = (a_{i,j})_{i,j=1}^{n} \in \mathbb{R}^{n \times n}$ be a real matrix. The directed graph (or digraph for short) of $A$ is a directed and weighted graph $\mathcal{G}(A) = (V, E, W)$, where $V$ denotes its vertices, $E$ its edges, $W$ its weights, and they are given by
(i) $V = \{1, 2, \dots, n\}$,
(ii) $(i, j) \in E$ for all $i, j \in V$,
(iii) $W(i, j) = a_{i,j}$ for all $i, j \in V$.

Although $\mathcal{G}(A)$ is a complete graph, an edge with zero weight can, in a sense, be omitted. Let $S$ be a walk on $\mathcal{G}(A)$ containing the vertices $v_1, v_2, \dots, v_k$. Its weight is defined as

$$ W(S) = \prod_{l=1}^{k-1} W(v_l, v_{l+1}).$$

The weight of an edge can be thought of as some kind of cost for traversing it, and similarly, the weight of a walk $S$ is the total cost of traversing $S$. By defining $W$ multiplicatively as we did, the powers of $A$ can be interpreted in a nice way.
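To make this concrete, here is a minimal sketch in Python with NumPy. (Vertices are 0-indexed here, unlike the 1-indexed convention above, and `walk_weight` is a helper name made up for this post.)

```python
import numpy as np

def walk_weight(A, walk):
    """Weight of a walk given as a list of vertices: the product
    of the edge weights along consecutive pairs."""
    return np.prod([A[u, v] for u, v in zip(walk, walk[1:])])

A = np.array([[0.0, 2.0],
              [3.0, 1.0]])
# W = a_{0,1} * a_{1,1} * a_{1,0} = 2 * 1 * 3 = 6
print(walk_weight(A, [0, 1, 1, 0]))
```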

Proposition 1. Let $A \in \mathbb{R}^{n \times n}$ be a square matrix and let $A^k = \big( a_{i,j}^{(k)} \big)_{i,j=1}^{n}$ denote its $k$-th power. Then

$$ a_{i,j}^{(k)} = \sum_{S \in \Gamma_{i,j,k}} W(S), $$

where

$$ \Gamma_{i,j,k} = \{ S: S \text{ is a walk of length } k \text{ from } i \text{ to } j \}. $$

Proof. The proof goes by induction with respect to $k$. For $k = 2$, the situation looks something like this.

[Figure: 2-step walks between vertices $i$ and $j$]

The weight of a $2$-step walk from $i$ to $j$ through $l$ is always $a_{i,l} a_{l,j}$, therefore the sum of all weights equals $\sum_{l=1}^{n} a_{i,l} a_{l,j}$, which is exactly $a_{i,j}^{(2)}$.

If the statement holds for $k$, then the case $k + 1$ follows by the same argument, noticing that every $(k+1)$-step walk can be decomposed into a $k$-step walk and a $1$-step walk.
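Proposition 1 is easy to check numerically: enumerate every $k$-step walk from $i$ to $j$ by brute force, sum their weights, and compare with the corresponding entry of $A^k$. A sketch (again 0-indexed; `power_entry_via_walks` is a made-up helper name):

```python
import numpy as np
from itertools import product

def power_entry_via_walks(A, i, j, k):
    """Sum of the weights of all k-step walks from i to j."""
    n = A.shape[0]
    total = 0.0
    # a k-step walk from i to j is determined by its k - 1 intermediate vertices
    for mid in product(range(n), repeat=k - 1):
        walk = (i, *mid, j)
        total += np.prod([A[u, v] for u, v in zip(walk, walk[1:])])
    return total

rng = np.random.default_rng(0)
A = rng.random((4, 4))
assert np.allclose(np.linalg.matrix_power(A, 3)[1, 2],
                   power_entry_via_walks(A, 1, 2, 3))
```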

2. Permutation graphs. A large role is played by the so-called permutation matrices.

Definition 2. A matrix $P \in \mathbb{R}^{n \times n}$ is a permutation matrix if its elements are $0$-s and $1$-s, arranged in a way such that each row and each column contains precisely one $1$.

If $\pi$ is a permutation of $\{1, 2, \dots, n\}$, then there is a permutation matrix $P_\pi$ such that the graph of $P_\pi$ is the graph of $\pi$ as a permutation. Of course, this correspondence is an isomorphism between the group of permutations and the group of permutation matrices. In this case, $P_\pi$ can be written as $P_\pi = (\delta_{i, \pi(j)})_{i,j=1}^{n}$, where $\delta$ is the Kronecker delta symbol defined as

$$ \delta_{i,j} =\begin{cases}1 & \text{if } i = j \\0 & \text{otherwise}.\end{cases} $$

Multiplying a matrix by a permutation matrix yields a matrix with the same elements but with its columns or rows permuted, that is,

$$ A P_\pi = (a_{i \pi(j)})_{i,j=1}^{n} $$

and

$$ P_\pi^T A = (a_{\pi(i) j})_{i,j=1}^{n} $$

holds for all $A \in \mathbb{R}^{n \times n}$. The properties of permutation matrices are summarized in the following proposition.

Proposition 2. Let $P_\pi$ be a permutation matrix. Then
(i) $P_\pi^{-1} = P_\pi^T = P_{\pi^{-1}}$,
(ii) $P_\pi$ can be factored into a product of matrices $T_{i,j}$, where $T_{i,j}$ is obtained from the identity matrix by transposing its $i$-th row with its $j$-th row,
(iii) for all $A \in \mathbb{R}^{n \times n}$, the directed graph $\mathcal{G}(P_\pi^T A P_\pi)$ is isomorphic with $\mathcal{G}(A)$.

Proof. (i) This identity can be calculated easily by using the identities describing $A P_\pi$ and $P_\pi^T A$, or by noticing that if $y = P_\pi x$, then $x = P_\pi^T y$.
(ii) Since the matrix $T_{i,j}$ corresponds to the transposition $(i \; j)$ and every permutation is a product of transpositions, this statement is immediate.
(iii) Because $P_\pi$ can be factored into a product of matrices $T_{i,j}$ (and $T_{i,j}^T = T_{i,j}$), it is enough to show that $\mathcal{G}(T_{i,j} A T_{i,j})$ is isomorphic to $\mathcal{G}(A)$. First note that $\mathcal{G}(A T_{i,j})$ can be obtained by selecting every edge in $\mathcal{G}(A)$ going into $i$ and redirecting it into $j$, and vice versa.

[Figure: the graph before and after redirecting the edges going into $i$ and $j$]

Similarly, $\mathcal{G}(T_{i,j} A)$ is obtained from $\mathcal{G}(A)$ by selecting every edge going out from $i$ and repositioning it so that it goes out from $j$, and vice versa. Combining these two observations, it is clear that $\mathcal{G}(T_{i,j} A T_{i,j})$ can be obtained from $\mathcal{G}(A)$ by simply switching the labels of the vertices $i$ and $j$.
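These identities are easy to verify numerically. A minimal sketch in NumPy, using the convention $P_\pi = (\delta_{i, \pi(j)})_{i,j=1}^{n}$ from above (0-indexed):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
pi = rng.permutation(n)      # a random permutation of {0, ..., n-1}
P = np.eye(n)[:, pi]         # P[i, j] = delta_{i, pi(j)}
A = rng.random((n, n))

assert np.allclose(A @ P, A[:, pi])      # A P permutes the columns
assert np.allclose(P.T @ A, A[pi, :])    # P^T A permutes the rows
assert np.allclose(P.T @ P, np.eye(n))   # P^{-1} = P^T

# conjugation only relabels the graph: entry (i, j) of P^T A P is
# the weight of the edge (pi(i), pi(j)) of the original graph
assert np.allclose(P.T @ A @ P, A[np.ix_(pi, pi)])
```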

3. Nonnegative matrices. Now we direct our attention to nonnegative and positive matrices. A matrix is nonnegative (positive) if it is nonnegative (positive) elementwise. Many important results were proven by Oskar Perron at the beginning of the 20th century for positive matrices, and some of them were extended to nonnegative matrices by Georg Frobenius. These results are now known as ‘‘Perron-Frobenius theory’’ and they play a very important role in applications, for example in Markov chains and online search engines. Our aim is to show that every nonnegative matrix can be represented in a so-called Frobenius normal form, which is quite useful in applications.

Definition 3. Let $A \in \mathbb{R}^{n \times n}$ be a nonnegative matrix.
(i) $A$ is reducible if there is a permutation matrix $P$ such that

$$ P^T A P = \begin{pmatrix} B & C \\ 0 & D \end{pmatrix}, $$

where $B \in \mathbb{R}^{p \times p}$, $C \in \mathbb{R}^{p \times (n-p)}$ and $D \in \mathbb{R}^{(n-p) \times (n-p)}$ for some $1 \leq p < n$.
(ii) $A$ is irreducible if it is not reducible.

The main question is: how can we describe the structure of such matrices with the aid of graph theory?

Definition 4. Let $\mathcal{G}$ be a directed graph and let $u$ and $v$ be two of its vertices.
(i) $u$ and $v$ are weakly connected if there is a directed path from $u$ to $v$.
(ii) $u$ and $v$ are strongly connected if there is a directed path from $u$ to $v$ and there is a directed path from $v$ to $u$.
(iii) The equivalence classes of the ‘‘$u$ and $v$ are strongly connected’’ equivalence relation are called the strongly connected components of $\mathcal{G}$.
(iv) $\mathcal{G}$ is strongly connected if it has only one strongly connected component.

It turns out that Definition 3 and Definition 4 are strongly connected. (Both in the literal and in the figurative sense.)

Proposition 3. A nonnegative matrix $A$ is irreducible if and only if its directed graph $\mathcal{G}(A)$ is strongly connected. Equivalently, $A$ is reducible if and only if $\mathcal{G}(A)$ is not strongly connected.

Proof. We shall show the latter statement. Suppose that $A$ is reducible and for some permutation matrix $P$ we have

$$ P^T A P = \begin{pmatrix} B & C \\ 0 & D \end{pmatrix}, \quad B \in \mathbb{R}^{p \times p}, C \in \mathbb{R}^{p \times (n - p)}, D \in \mathbb{R}^{(n-p) \times (n-p)}. $$

In this case, the vertices $1, \dots, p$ correspond to the first $p$ rows of $P^T A P$ and the remaining vertices correspond to the final $n - p$ rows. From the structure of $P^T A P$ it can be seen that there are no edges going from $\{p+1, \dots, n\}$ to $\{1, \dots, p\}$.

[Figure: the graph of a reducible matrix]

This graph is clearly not strongly connected, since there are no edges going from the second group of vertices to the first. The other direction goes the same way. That is, if $\mathcal{G}(A)$ looks something like the picture above, then the vertices of a strongly connected component with no incoming edges should be labeled with $1, \dots, p$ and the remaining vertices with the rest.
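Proposition 3 also gives a practical test for irreducibility: form the digraph of the nonzero pattern and count its strongly connected components. A sketch with SciPy (`is_irreducible` is a helper name made up here):

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

def is_irreducible(A):
    """A nonnegative matrix is irreducible iff its digraph
    (an edge (i, j) wherever a_{i,j} > 0) is strongly connected."""
    n_comp, _ = connected_components(csr_matrix((A > 0).astype(int)),
                                     directed=True, connection='strong')
    return n_comp == 1

cycle = np.array([[0, 1, 0],
                  [0, 0, 1],
                  [1, 0, 0]])    # a directed 3-cycle: strongly connected
triangular = np.array([[1, 1],
                       [0, 1]])  # block triangular: reducible
print(is_irreducible(cycle), is_irreducible(triangular))  # True False
```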

Now we are ready to prove the main theorem, which guarantees the existence of the so-called Frobenius normal form for nonnegative matrices.

Theorem 1. (Frobenius normal form) Let $A \in \mathbb{R}^{n \times n}$ be a nonnegative matrix. Then there exists a permutation matrix $P$ such that

$$ P^T A P = \begin{pmatrix} A_1 & A_{1,2} & A_{1,3} & \dots & A_{1,k} \\ 0 & A_2 & A_{2,3} & \dots & A_{2,k} \\ 0 & 0 & A_3 & \dots & A_{3,k} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \dots & A_k \end{pmatrix}, $$

where $A_1, A_2, \dots, A_k$ are square irreducible matrices. This form is called the Frobenius normal form.

Proof. With the concepts we have developed so far, the proof is easy; in fact, it is clearer to demonstrate it on an example than to give a formal proof. Consider the graph $\mathcal{G}(A)$ and its strongly connected components $C_1, \dots, C_k$. For example, it might look like this.

[Figure: a graph decomposed into strongly connected components]

The components $C_1, \dots, C_k$ are ordered in a way such that for all $i < j$, there are no edges going from vertices of $C_j$ into vertices of $C_i$. Such an ordering exists because contracting each component to a single vertex leaves an acyclic graph, which can be ordered topologically. Labeling the vertices in a way that preserves this order yields a matrix in the desired form. It is also clear that each $A_i$ is irreducible, since its graph, the component $C_i$, is strongly connected.
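The proof is effectively an algorithm: find the strongly connected components, order them topologically, and relabel the vertices. A sketch with SciPy and NumPy; `frobenius_normal_form` is a name made up here, and this is one possible implementation rather than a canonical one:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

def frobenius_normal_form(A):
    """Permute a nonnegative A into block upper triangular form
    with irreducible diagonal blocks; returns (P^T A P, perm)."""
    n = A.shape[0]
    n_comp, labels = connected_components(csr_matrix((A > 0).astype(int)),
                                          directed=True, connection='strong')
    # edges of the condensation: component of i -> component of j
    succ = [set() for _ in range(n_comp)]
    indeg = [0] * n_comp
    for i, j in zip(*np.nonzero(A)):
        ci, cj = labels[i], labels[j]
        if ci != cj and cj not in succ[ci]:
            succ[ci].add(cj)
            indeg[cj] += 1
    # Kahn's algorithm; the condensation is acyclic, so this terminates
    ready = [c for c in range(n_comp) if indeg[c] == 0]
    order = []
    while ready:
        c = ready.pop()
        order.append(c)
        for d in succ[c]:
            indeg[d] -= 1
            if indeg[d] == 0:
                ready.append(d)
    # vertices of earlier components get the smaller labels
    perm = [v for c in order for v in range(n) if labels[v] == c]
    return A[np.ix_(perm, perm)], perm

A = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 0, 1]])
B, perm = frobenius_normal_form(A)
print(perm)  # [1, 0, 2]
print(B)     # block upper triangular
```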

4. The spectrum of graphs.

Definition 5. Let $\mathcal{G}$ be a simple graph with vertices $1, 2, \dots, n$. Its adjacency matrix is defined as the matrix $A = (a_{i,j})_{i,j=1}^{n}$, where $a_{i,j}$ is given by

$$ a_{i,j} = \begin{cases} 1 & \text{if there is an edge in } \mathcal{G} \text{ connecting } i \text{ and } j \\ 0 & \text{otherwise}. \end{cases} $$

The set

$$ \sigma(\mathcal{G}) = \{ \lambda: \lambda \text{ is an eigenvalue of } A \} $$

is called the spectrum of the graph.

Note that since an adjacency matrix is always symmetric, the spectrum of a graph is a subset of the real numbers. Two graphs $\mathcal{G}_1$ and $\mathcal{G}_2$ are said to be isospectral if $\sigma(\mathcal{G}_1) = \sigma(\mathcal{G}_2)$, that is, they have the same spectrum. Two graphs with the same spectrum are not necessarily isomorphic; for example, the two graphs

[Figure: two isospectral but non-isomorphic graphs]

are isospectral but not isomorphic. Although it is not obvious that the spectrum of a graph contains useful information, it is very much so, as the following theorem illustrates.
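A classic concrete pair to play with (whether or not it is the pair in the picture): the star $K_{1,4}$ and the disjoint union of the $4$-cycle $C_4$ with an isolated vertex both have spectrum $\{-2, 0, 0, 0, 2\}$. A quick check in NumPy:

```python
import numpy as np

# star K_{1,4}: center 0 joined to the leaves 1, ..., 4
star = np.zeros((5, 5))
star[0, 1:] = star[1:, 0] = 1

# C_4 on vertices 0, ..., 3 plus the isolated vertex 4
cycle = np.zeros((5, 5))
for u, v in [(0, 1), (1, 2), (2, 3), (3, 0)]:
    cycle[u, v] = cycle[v, u] = 1

print(np.round(np.linalg.eigvalsh(star), 6))   # [-2. 0. 0. 0. 2.]
print(np.round(np.linalg.eigvalsh(cycle), 6))  # [-2. 0. 0. 0. 2.]
```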

Theorem 2. The graph $\mathcal{G}$ is bipartite (that is, its chromatic number is $2$) if and only if for every eigenvalue $\lambda$ of $\mathcal{G}$, $-\lambda$ is also an eigenvalue.

Proof. We only show one direction. Suppose that $\mathcal{G}$ is bipartite. Then, labeling the vertices of one color class first, its adjacency matrix looks like

$$ A = \begin{pmatrix} 0 & B \\ B^T & 0 \end{pmatrix} $$

for some (not necessarily square) matrix $B \in \mathbb{R}^{k \times (n-k)}$. If $\lambda$ is an eigenvalue of $A$ with eigenvector $x$, then $x$ can be split up into two blocks

$$ x = \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}, \quad x_1 \in \mathbb{R}^k, x_2 \in \mathbb{R}^{n - k}. $$

It is easy to see that the vector

$$ \tilde{x} = \begin{pmatrix} x_1 \\ -x_2 \end{pmatrix} $$

is an eigenvector of $A$ with the eigenvalue $-\lambda$: since $Ax = \lambda x$ means $B x_2 = \lambda x_1$ and $B^T x_1 = \lambda x_2$, we have

$$ A \tilde{x} = \begin{pmatrix} -B x_2 \\ B^T x_1 \end{pmatrix} = \begin{pmatrix} -\lambda x_1 \\ \lambda x_2 \end{pmatrix} = -\lambda \tilde{x}. $$

The other direction is more difficult. It utilizes the theory developed by Perron and Frobenius. Although it is not complicated, it requires a lengthy buildup. For details, see [3], Chapter 11 Problem 19.
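The direction we did prove is easy to check numerically: the sorted spectrum of a bipartite graph is symmetric about zero. A minimal sketch with a random bipartite adjacency matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
k, m = 3, 4
B = (rng.random((k, m)) < 0.5).astype(float)  # random bipartite block
A = np.block([[np.zeros((k, k)), B],
              [B.T, np.zeros((m, m))]])

spec = np.linalg.eigvalsh(A)           # eigenvalues in ascending order
assert np.allclose(spec, -spec[::-1])  # symmetric about zero
print(np.round(spec, 4))
```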

As one can easily see, there are only finitely many subsets of $\mathbb{R}$ which can be obtained as the spectrum of some graph on $n$ vertices. As such, one would expect (at least I did) that isospectral graphs are not that common. The following theorem says exactly the opposite.

Theorem 3. Let $\mathcal{T}_n$ denote the set of trees on $n$ vertices and define the set

$$ \mathcal{I}_n = \{ T \in \mathcal{T}_n: \text{ there is a } T^* \in \mathcal{T}_n \text{ which is isospectral with } T \}. $$

Then

$$ \lim_{n \to \infty} \frac{|\mathcal{I}_n|}{|\mathcal{T}_n|} = 1. $$

We shall not prove this theorem here; the proof can be found in the conference proceedings [2]. The main idea of the proof is that certain subgraphs of trees can be replaced with a different subgraph without changing the spectrum.

References.

[1] R. A. Brualdi and D. Cvetkovic, A Combinatorial Approach to Matrix Theory and Its Applications, Discrete Mathematics and Its Applications, Chapman & Hall, 2009

[2] F. Harary (editor), New Directions in the Theory of Graphs, New York, Academic Press, 1973

[3] L. Lovász, Combinatorial Problems and Exercises, 2nd edition, North-Holland, 1992
