Algebraic Combinatorics
by djmathman, Dec 28, 2015, 2:01 AM
aka a way to use algebra to mask the fact that actual combo is hard
I haven't done a true math post in a while, so here goes....
Chapter 1 Key Idea: Let $G$ be a finite graph on $n$ vertices (not necessarily simple), and let $A(G)$ denote its adjacency matrix. Then the number of closed walks on $G$ of length $\ell$ is \[f_G(\ell)=\sum_{i=1}^n(A(G)^\ell)_{i,i}=\operatorname{tr}\left(A(G)^\ell\right)=\lambda_1^\ell+\lambda_2^\ell+\cdots+\lambda_n^\ell,\] where $\lambda_1,\lambda_2,\ldots,\lambda_n$ is the sequence of eigenvalues of $A(G)$. (Note that all the $\lambda_i$ are real by the Spectral Theorem, since $A(G)$ is symmetric.)
This isn't too hard to prove (and is probably made easier based on the wording of the statement). Now on to the problems I guess?
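As a quick numerical sanity check of the key idea (a toy example of my own, not from the book): the triangle $K_3$ has eigenvalues $2,-1,-1$, so the number of closed walks of length $\ell$ should be $2^\ell+2\cdot(-1)^\ell$. The helper names below are my own.

```python
# Sanity check of f_G(l) = tr(A^l) = sum of l-th powers of the eigenvalues,
# illustrated on the triangle K_3 (eigenvalues 2, -1, -1).

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def trace_power(A, l):
    """Trace of A^l, i.e. the number of closed walks of length l."""
    P = A
    for _ in range(l - 1):
        P = mat_mul(P, A)
    return sum(P[i][i] for i in range(len(A)))

# Adjacency matrix of the triangle K_3.
K3 = [[0, 1, 1],
      [1, 0, 1],
      [1, 1, 0]]

for l in range(1, 8):
    assert trace_power(K3, l) == 2**l + 2 * (-1)**l
```

For $\ell=3$ this gives $2^3+2(-1)^3=6$, matching the six directed traversals of the triangle.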
Stanley Chapter 1 Exercise 2 wrote:
Suppose that the graph $G$ has $15$ vertices and that the number of closed walks of length $\ell$ in $G$ is \[8^\ell+2\cdot 3^\ell+3\cdot(-1)^\ell+(-6)^\ell+5\] for all $\ell\geq 1$. Let $G'$ be the graph obtained from $G$ by adding a loop at each vertex (in addition to whatever loops are already there). How many closed walks of length $\ell$ are there in $G'$?
Solution
To kill this problem, we need a somewhat useful lemma.

LEMMA: Suppose $\alpha_1$, $\alpha_2$, $\ldots$, $\alpha_r$ and $\beta_1$, $\beta_2$, $\ldots$, $\beta_s$ are nonzero complex numbers such that for all positive integers $\ell$, we have \[\alpha_1^\ell+\cdots+\alpha_r^\ell=\beta_1^\ell+\cdots+\beta_s^\ell.\] Then $r=s$ and the $\beta_j$'s are just a permutation of the $\alpha_i$'s.

Proof. The following proof is due to Richard Stanley (the author of the textbook I'm working from). We use generating functions. Let $x$ be a complex number whose modulus is close to $0$. Multiply the displayed equation by $x^\ell$ and sum on all $\ell\geq 1$. The geometric series we obtain will converge, and we get \[\dfrac{\alpha_1x}{1-\alpha_1x}+\cdots+\dfrac{\alpha_rx}{1-\alpha_rx}=\dfrac{\beta_1x}{1-\beta_1x}+\cdots+\dfrac{\beta_sx}{1-\beta_sx}.\] This is an identity valid for complex numbers $x$ of sufficiently small modulus. By clearing denominators we obtain a polynomial identity. But if two polynomials in $x$ agree for infinitely many values, then they are the same polynomial, so the above is actually valid for all $x$ (excluding ones which give rise to a zero denominator).

Fix a complex number $\gamma\neq 0$. Multiply the above by $1-\gamma x$ and let $x\to 1/\gamma$. The left hand side becomes the number of $\alpha_i$'s which are equal to $\gamma$, while the right hand side becomes the number of $\beta_j$'s which are equal to $\gamma$. Hence these numbers agree for all $\gamma$, so the lemma is proved. $\blacksquare$
From here, the problem is simple. Remark that since $G'$ is obtained by adding a loop to each vertex of $G$, the adjacency matrix of $G'$ is the same as the adjacency matrix of $G$ but with $1$ added to each diagonal entry. Thus $A(G')=A(G)+I$, meaning that the eigenvalues of $G'$ are one greater than the eigenvalues of $G$. (Note: the eigenvalues of a graph are just the eigenvalues of its adjacency matrix.) From the lemma, the given condition on $f_G(\ell)$, and the fact that $G$ has $15$ vertices, we see that the eigenvalues of $G$ are $8$, $3$, $3$, $1$, $1$, $1$, $1$, $1$, $0$, $0$, $0$, $-1$, $-1$, $-1$, $-6$. (Do you see why?) Hence the eigenvalues of $G'$ are $9$, $4$, $4$, $2$, $2$, $2$, $2$, $2$, $1$, $1$, $1$, $0$, $0$, $0$, $-5$, meaning that the number of closed walks of length $\ell$ in $G'$ is \[\boxed{9^\ell+2\cdot 4^\ell+5\cdot 2^\ell+(-5)^\ell+3}.\]
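The loop-adding step can be sanity-checked numerically on a toy example (my own, much smaller than the 15-vertex graph from the problem): for the triangle $K_3$, the eigenvalues $2,-1,-1$ shift to $3,0,0$ after adding loops, so the closed walk counts should jump from $2^\ell+2(-1)^\ell$ to $3^\ell$.

```python
# Check that adding a loop at each vertex (A -> A + I) shifts every
# eigenvalue up by 1, by comparing closed-walk counts on the triangle K_3.

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def closed_walks(A, l):
    """tr(A^l) = number of closed walks of length l."""
    P = A
    for _ in range(l - 1):
        P = mat_mul(P, A)
    return sum(P[i][i] for i in range(len(A)))

K3 = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]          # eigenvalues 2, -1, -1
K3_loops = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]    # A + I: eigenvalues 3, 0, 0

for l in range(1, 7):
    assert closed_walks(K3, l) == 2**l + 2 * (-1)**l
    assert closed_walks(K3_loops, l) == 3**l
```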
Stanley Chapter 1 Exercise 3 wrote:
A bipartite graph $G$ with vertex bipartition $(A,B)$ is a graph whose vertex set is the disjoint union $A\cup B$ of $A$ and $B$ such that every edge of $G$ is incident to one vertex in $A$ and one vertex in $B$. Show that the nonzero eigenvalues of $G$ come in pairs $\pm\lambda$. Equivalently, prove that the characteristic polynomial of $A(G)$ has the form $g(\lambda^2)$ if $G$ has an even number of vertices or $\lambda g(\lambda^2)$ if $G$ has an odd number of vertices for some polynomial $g$.
Solution
Note that since $G$ is bipartite, a walk must consist of an alternating sequence of vertices $a_1$, $b_1$, $a_2$, $b_2$, $\ldots$, where $a_i\in A$ and $b_i\in B$ for all $i$ (or vice versa). As a result, no closed walk in $G$ can be of odd length. This means that if $\lambda_1$, $\lambda_2$, $\ldots$, $\lambda_n$ are the eigenvalues of $G$, then \[\lambda_1^\ell+\cdots+\lambda_n^\ell=0\qquad\text{for all odd }\ell.\]

Now for each positive integer $i$, let $P_i$ denote the sum of the $i^{\text{th}}$ powers of the eigenvalues and $S_i$ denote their $i^{\text{th}}$ elementary symmetric sum. We now prove a lemma.

LEMMA: $S_i=0$ for all odd $i$.

Proof. We use strong induction and Newton's Sums. *looks up what Newton's sums are again* The base case, $S_1=P_1=0$, is trivial. Now assume $S_j=0$ for all odd $j\leq i$, where by assumption $i$ is odd. Recall that \[P_{i+2}-S_1P_{i+1}+S_2P_i-\cdots+S_{i+1}P_1-(i+2)S_{i+2}=0.\] Consider the expression $S_kP_{i+2-k}$ for some arbitrary $k$ with $1\leq k\leq i+1$. Remark that since $i+2$ is odd, exactly one of $k$ or $i+2-k$ must be odd. If $k$ is odd, then $S_k=0$ by assumption, while otherwise $P_{i+2-k}=0$ from the original problem statement. Thus $S_kP_{i+2-k}=0$ for each $k$, and since $P_{i+2}=0$ as well, the entire equality reduces to \[-(i+2)S_{i+2}=0\quad\implies\quad S_{i+2}=0.\] Hence we're done by induction. $\blacksquare$

This allows us to finish off the problem easily. Let \[\chi(\lambda)=\sum_{i=0}^na_i\lambda^i\] be the characteristic polynomial of $A(G)$ for some sequence of coefficients $a_0,\ldots,a_n$. Then by Vieta's Formulas we know $a_{n-i}=\pm S_i\,a_n$, and so it is not hard to see that \[\chi(\lambda)=\begin{cases}a_n\lambda^n+a_{n-2}\lambda^{n-2}+\cdots+a_0,&n\text{ even},\\a_n\lambda^n+a_{n-2}\lambda^{n-2}+\cdots+a_1\lambda,&n\text{ odd}\end{cases}\] as desired.
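A quick check of the "no odd closed walks" observation on a toy example of my own: the path $1\text{-}2\text{-}3$ is bipartite, so $\operatorname{tr}(A^\ell)$ should vanish for every odd $\ell$.

```python
# For a bipartite graph, every closed walk has even length, so
# tr(A^l) = 0 whenever l is odd.  Checked here on the path graph P_3.

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def closed_walks(A, l):
    P = A
    for _ in range(l - 1):
        P = mat_mul(P, A)
    return sum(P[i][i] for i in range(len(A)))

# Path on 3 vertices: 1 -- 2 -- 3 (bipartition {1,3} | {2}).
P3 = [[0, 1, 0],
      [1, 0, 1],
      [0, 1, 0]]

assert all(closed_walks(P3, l) == 0 for l in (1, 3, 5, 7))
assert closed_walks(P3, 2) == 4   # eigenvalues are sqrt(2), 0, -sqrt(2)
```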
Stanley Chapter 1 Exercise 5 wrote:
Let $H_n$ be the complete bipartite graph $K_{nn}$ with $n$ vertex-disjoint edges removed. Thus $H_n$ has $2n$ vertices and $n(n-1)$ edges, each vertex of degree $n-1$. Show that the eigenvalues of $H_n$ are $\pm 1$ ($n-1$ times each) and $\pm(n-1)$ (once each).
Solution
We first seek to gather information about the adjacency matrix $A(H_n)$. Denote by $(X,Y)$ the vertex bipartition of $H_n$. WLOG, give the vertices in $X$ labels $1$ through $n$ and the vertices in $Y$ labels $n+1$ through $2n$ such that $\{i,n+i\}\notin E(H_n)$ (where $E(H_n)$ denotes the edge set of $H_n$) for each $1\leq i\leq n$. (The basic idea behind this is to introduce some consistency in the labeling. The advantages of such a system will become apparent later.) Note that the $(i,n+i)$ and $(n+i,i)$ entries of $A(H_n)$ vanish for all $1\leq i\leq n$ by our labeling system. Furthermore, note that with respect to the remaining entries of $A(H_n)$, we will only get a nonzero entry in the $(i,n+j)$ and $(n+i,j)$ positions (with $i\neq j$). Thus, we can conclude that our adjacency matrix has the form \[A(H_n)=\begin{bmatrix}0&J-I\\J-I&0\end{bmatrix},\] where $J$ is the $n\times n$ matrix with all ones. For example, when $n=3$, we get \[A(H_3)=\begin{bmatrix}0&0&0&0&1&1\\0&0&0&1&0&1\\0&0&0&1&1&0\\0&1&1&0&0&0\\1&0&1&0&0&0\\1&1&0&0&0&0\end{bmatrix}.\]

To compute the eigenvalues of this matrix, recall that if $A$, $B$, $C$, and $D$ are square blocks with $C$ and $D$ commuting, then \[\det\begin{bmatrix}A&B\\C&D\end{bmatrix}=\det(AD-BC).\] (I seriously need to find the proof for this....) Now for any $\lambda$, note that \[\det[A(H_n)-\lambda I]=\det\begin{bmatrix}-\lambda I&J-I\\J-I&-\lambda I\end{bmatrix}=\det(\lambda^2 I-(J-I)^2).\] Using the fact that $I$ and $J$ commute and that $J^2=nJ$ (why?), remark that \[(J-I)^2=J^2-2IJ+I^2=nJ-2J+I=(n-2)J+I.\] Thus \[\det[\lambda^2I-(J-I)^2]=0\iff \det[(n-2)J-(\lambda^2-1)I]=(n-2)^n\det\left[J-\left(\dfrac{\lambda^2-1}{n-2}\right)I\right]=0.\] (This obviously cannot be done for $n=2$, but this is an easy special case to deal with.)

Now we use a lemma. (The reason I'm putting this part off as a lemma is because its proof was mentioned within the chapter.)

LEMMA: If $J$ is the $n\times n$ all-ones matrix, then the eigenvalues of $J$ are $0$ with multiplicity $n-1$ and $n$ with multiplicity $1$.

Proof. Note that $\operatorname{rank}(J)=1$, so $n-1$ of the eigenvalues of $J$ must be zero (proof: row reductions). To compute the last eigenvalue, note that the trace of $J$ is $n$. Since the trace of a matrix is the sum of the eigenvalues of that matrix, we see that the last eigenvalue is $n$ as desired. $\blacksquare$

Now let $\mu=\frac{\lambda^2-1}{n-2}$. Then $\det(J-\mu I)=0$, so $\mu$ is an eigenvalue of $J$. By the lemma, this means that $\mu=0$ with multiplicity $n-1$ or $\mu=n$ with multiplicity $1$. Now plugging back in, we see that either \[0=\dfrac{\lambda^2-1}{n-2}\quad\implies\quad \lambda=\pm 1\] or \[n=\dfrac{\lambda^2-1}{n-2}\quad\implies\quad \lambda=\pm (n-1),\] and it's not hard to show that these roots have the desired multiplicities, so we're done.
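If the claimed spectrum is right, the traces of powers of $A(H_n)$ must match the corresponding power sums. Here is a brute-force check of that (my own sketch; the helper names `H`, `closed_walks` are mine, not from the text): for eigenvalues $\pm 1$ ($n-1$ times each) and $\pm(n-1)$, we need $\operatorname{tr}(A(H_n)^\ell)=(n-1)^\ell+(1-n)^\ell+(n-1)\bigl(1+(-1)^\ell\bigr)$.

```python
# If the eigenvalues of H_n are +-1 (n-1 times each) and +-(n-1) (once each),
# then tr(A(H_n)^l) must equal the corresponding power sum for every l.

def mat_mul(X, Y):
    m = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(m)) for j in range(m)]
            for i in range(m)]

def closed_walks(A, l):
    P = A
    for _ in range(l - 1):
        P = mat_mul(P, A)
    return sum(P[i][i] for i in range(len(A)))

def H(n):
    """K_{n,n} minus a perfect matching: the block matrix [[0, J-I], [J-I, 0]]."""
    A = [[0] * (2 * n) for _ in range(2 * n)]
    for i in range(n):
        for j in range(n):
            if i != j:
                A[i][n + j] = A[n + i][j] = 1
    return A

for n in (3, 4, 5):
    for l in range(1, 6):
        power_sum = (n - 1)**l + (1 - n)**l + (n - 1) * (1 + (-1)**l)
        assert closed_walks(H(n), l) == power_sum
```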
Stanley Chapter 1 Problem 11 wrote:
Let $K^\circ_n$ denote the complete graph with $n$ vertices, with one loop at each vertex. Let $K^\circ_{21}-K^\circ_{18}$ denote $K^\circ_{21}$ with the edges of $K^\circ_{18}$ removed, i.e. choose $18$ vertices of $K^\circ_{21}$ and remove all edges between these vertices (including loops). Find the number of closed walks in $\Gamma=K^\circ_{21}-K^\circ_{18}$ of length $\ell\geq 1$.
Solution
Once again, it first suffices to number the vertices of $\Gamma$ in an advantageous manner and then consult its adjacency matrix. WLOG number the vertices $1$ through $21$ such that the three vertices not chosen in $K^\circ_{18}$ are $1$, $2$, $3$. Then the vertices labeled $1$, $2$, $3$ are connected to every vertex in $\Gamma$ (including themselves), and furthermore these are the only edges in $\Gamma$ since all others were removed. Hence, the adjacency matrix is \[A(\Gamma)=\begin{bmatrix}1&1&1&1&\cdots&1\\1&1&1&1&\cdots&1\\1&1&1&1&\cdots&1\\1&1&1&0&\cdots&0\\\vdots&\vdots&\vdots&\vdots&\ddots&\vdots\\1&1&1&0&\cdots&0\end{bmatrix}.\] In other words, $A(\Gamma)$ consists of two bands of $1$s (the first three rows and the first three columns) and zeroes everywhere else.

In order to compute the number of closed walks of arbitrary length, it suffices to find the eigenvalues of $A(\Gamma)$. It is most likely possible to do this by actually determining the characteristic polynomial, but this is quite arduous and does not give rise to any meaning behind the problem; instead, we choose an alternate route. Remark that by inspection $\operatorname{rank}A(\Gamma)=2$ (there are only two distinct rows), so all but two of the eigenvalues are zero. To compute the values of the nonzero eigenvalues, we look to trace. Recall that the eigenvalues of the square $A^2$ of a matrix are the squares of the eigenvalues of the original matrix $A$. With this in mind, we attempt to determine the diagonal entries of $A(\Gamma)^2$. This is actually not so hard to do! Note that by the definition of matrix multiplication \[(A(\Gamma)^2)_{ii}=\sum_jA(\Gamma)_{ij}A(\Gamma)_{ji}.\] If $i\leq 3$, then all entries of the form $A(\Gamma)_{ij}$ or $A(\Gamma)_{ji}$ are equal to $1$, and so the corresponding entry equals $21$. Otherwise, only the three nonzero entries in the corresponding row/column count toward the final sum, and so the corresponding entry equals $3$. Hence \[A(\Gamma)^2=\begin{bmatrix}21&*&*&*&\cdots&*\\ *&21&*&*&\cdots&*\\ *&*&21&*&\cdots&*\\ *&*&*&3&\cdots&*\\\vdots&\vdots&\vdots&\vdots&\ddots&\vdots\\ *&*&*&*&\cdots&3\end{bmatrix},\] meaning that $\operatorname{tr}A(\Gamma)^2=3\cdot 21+18\cdot 3=117$.

Finally, let $\lambda_1$ and $\lambda_2$ be the two nonzero eigenvalues of $A(\Gamma)$. Since $\lambda_1+\lambda_2=\operatorname{tr}A(\Gamma)=3$ trivially, we get the system of equations \[\begin{cases}\lambda_1+\lambda_2&=3,\\\lambda_1^2+\lambda_2^2&=117.\end{cases}\] Solving yields $\lambda_1=9$ and $\lambda_2=-6$ as the solutions to this system. As a result, from the main idea the number of closed walks of length $\ell$ is $9^\ell+(-6)^\ell$.
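These eigenvalues are easy to double-check by brute force (my own verification sketch, with my own helper names): build $A(\Gamma)$ directly and compare $\operatorname{tr}(A(\Gamma)^\ell)$ against $9^\ell+(-6)^\ell$.

```python
# Brute-force check that the number of closed walks of length l in
# Gamma = K°_21 - K°_18 equals 9^l + (-6)^l.

def mat_mul(X, Y):
    m = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(m)) for j in range(m)]
            for i in range(m)]

def closed_walks(A, l):
    P = A
    for _ in range(l - 1):
        P = mat_mul(P, A)
    return sum(P[i][i] for i in range(len(A)))

n = 21
# Entry (i, j) is 1 exactly when i or j is one of the three kept vertices.
Gamma = [[1 if i < 3 or j < 3 else 0 for j in range(n)] for i in range(n)]

for l in range(1, 5):
    assert closed_walks(Gamma, l) == 9**l + (-6)**l
```

For $\ell=2$ this reproduces the trace computation above: $\operatorname{tr}A(\Gamma)^2=117=81+36$.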
Stanley Chapter 1 Exercise 12 wrote:
- Let $G$ be a finite graph and let $\Delta$ be the maximum degree of any vertex of $G$. Let $\lambda_1$ be the largest eigenvalue of the adjacency matrix $A(G)$. Show that $\lambda_1\leq\Delta$.
- Suppose that $G$ is simple (no loops or multiple edges) and has a total of $q$ edges. Show that $\lambda_1\leq\sqrt{2q}$.
Solution
(a)
Note that the degree of any vertex is the sum of the entries in the corresponding row of the adjacency matrix. With this in mind, the problem is not so hard. Denote by $A$ the adjacency matrix of $G$ for ease of typing. Let $\vec{x}$ be an eigenvector of $A$ with associated eigenvalue $\lambda$, and let $x_i$ be the largest entry in $\vec{x}$ (WLOG $x_i>0$; scale $\vec{x}$ by $-1$ if necessary). Note that by the definition of matrix multiplication and our condition of maximality \[(A\vec{x})_i=\sum_jA_{ij}x_j\leq\sum_jA_{ij}x_i=x_i\sum_jA_{ij}\leq\Delta x_i.\] But since $\vec{x}$ is an eigenvector we have $A\vec{x}=\lambda\vec{x}$. Hence equating $i^{\text{th}}$ entries yields $\lambda x_i\leq\Delta x_i$, i.e. $\lambda\leq\Delta$ for any eigenvalue $\lambda$, from which the result follows.
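To see the bound in action, here is a small self-contained check (my own toy example, not from the text): estimate $\lambda_1$ of the "paw" graph (a triangle with one pendant vertex) by a few hundred steps of power iteration and compare it with $\Delta=3$. Power iteration behaves well here because the graph is connected and non-bipartite, so the largest eigenvalue strictly dominates in absolute value.

```python
import math

# Estimate the largest adjacency eigenvalue by power iteration and
# verify lambda_1 <= Delta on the "paw" graph: triangle 0-1-2 plus a
# pendant vertex 3 attached to vertex 0.

def mat_vec(A, x):
    return [sum(a * xj for a, xj in zip(row, x)) for row in A]

def largest_eigenvalue(A, steps=300):
    x = [1.0] * len(A)
    for _ in range(steps):
        y = mat_vec(A, x)
        norm = math.sqrt(sum(v * v for v in y))
        x = [v / norm for v in y]
    # Rayleigh quotient x^T A x  (x has unit length at this point)
    return sum(xi * yi for xi, yi in zip(x, mat_vec(A, x)))

paw = [[0, 1, 1, 1],
       [1, 0, 1, 0],
       [1, 1, 0, 0],
       [1, 0, 0, 0]]

Delta = max(sum(row) for row in paw)   # maximum degree = 3
lam1 = largest_eigenvalue(paw)         # roughly 2.17

assert lam1 <= Delta
assert 2 < lam1 < 3   # contains a triangle, so lambda_1 > 2; not 3-regular, so < 3
```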
(b)
This problem is a bit harder (or at least I think it is - perhaps I'm overcomplicating things like I usually do...). The key is to note that since $G$ is simple, all entries of $A$ are either $0$ or $1$ and that the sum of the entries in the matrix $A$ is $2q$.

Note that from $A\vec{x}=\lambda\vec{x}$ we can take norms to get $\|A\vec{x}\|=|\lambda|\,\|\vec{x}\|$, which implies $\|A\vec{x}\|^2=\lambda^2\|\vec{x}\|^2$.

Now denote by $x_1$, $x_2$, $\ldots$, $x_n$ the entries of $\vec{x}$. I will show that \[\sum_{i=1}^n\left(\sum_jA_{ij}x_j\right)^2\leq 2q\sum_{j=1}^nx_j^2.\] Once this is proven, the LHS becomes $\lambda^2\|\vec{x}\|^2$ while the right becomes $2q\|\vec{x}\|^2$, from which the conclusion is easy to reach.

To do this, we use Cauchy-Schwarz. Specifically, note that \[\sum_{j=1}^nA_{ij}^2\sum_{j=1}^nx_j^2\geq\left(\sum_{j=1}^nA_{ij}x_j\right)^2.\] Summing over all $i$ yields \[\sum_{i=1}^n\left(\sum_{j=1}^nA_{ij}x_j\right)^2\leq\sum_{i=1}^n\left(\sum_{j=1}^nA_{ij}^2\sum_{j=1}^nx_j^2\right)=\left(\sum_{i,j}A_{ij}^2\right)\left(\sum_{j=1}^nx_j^2\right).\] Finally, remark that since all entries in $A$ are either $0$ or $1$, $\sum_{i,j}A_{ij}^2$ simply equals the sum of the entries in $A$, which equals $2q$. Thus we get the desired inequality, which proves the statement we set out to prove.
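The displayed inequality $\|A\vec{x}\|^2\leq 2q\,\|\vec{x}\|^2$ is easy to test numerically (my own sketch, with arbitrary test vectors of my choosing): take any simple graph and compare both sides directly.

```python
# Numerical check of ||A x||^2 <= 2q ||x||^2 for a simple graph
# (here the 4-cycle, q = 4 edges) and a few arbitrary test vectors.

def mat_vec(A, x):
    return [sum(a * xj for a, xj in zip(row, x)) for row in A]

# 4-cycle: 0-1-2-3-0.
C4 = [[0, 1, 0, 1],
      [1, 0, 1, 0],
      [0, 1, 0, 1],
      [1, 0, 1, 0]]
q = sum(sum(row) for row in C4) // 2   # number of edges = 4

for x in ([1, 2, 3, 4], [1, -1, 1, -1], [5, 0, 0, 1]):
    Ax = mat_vec(C4, x)
    lhs = sum(v * v for v in Ax)       # ||A x||^2
    rhs = 2 * q * sum(v * v for v in x)
    assert lhs <= rhs
```

Since the inequality holds for every vector, it holds in particular for an eigenvector of $\lambda_1$, which is exactly how the proof above uses it.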
This post has been edited 3 times. Last edited by djmathman, Dec 28, 2015, 5:03 AM