Chapter 37 Generalized Adjacency Matrix

Friday, April 30, 1993

Lemma 37.1 Let \(\Gamma = (X, E)\) be a distance-regular graph of diameter \(D\geq 3\), and \(Q\)-polynomial with respect to \(E_0, E_1, \ldots, E_D\). Fix a vertex \(x\in X\), and write \(E^*_i\equiv E^*_i(x)\), and \(T\equiv T(x)\). Let \(W\) be an irreducible \(T\)-module of endpoint \(1\). If \(\dim E^*_2W = 1\), then \(W\) is thin.

Proof. Pick \(0\neq v\in E^*_1W\).

We want to show that

  • \(FR^iv \in \mathrm{Span}(R^iv)\) for \(i\in \{0, \ldots, D-1\}\).
  • \(LR^iv \in \mathrm{Span}(R^{i-1}v)\) for \(i\in \{1, \ldots, D-1\}\).

We have that

\((1)\) \(FR^2E^*_j \in \mathrm{Span}(RFRE^*_j, R^2FE^*_j, R^2E^*_j)\) for \(j\in \{0, \ldots, D-3\}\),
\((2)\) \(LR^2E^*_j \in \mathrm{Span}(RLRE^*_j, R^2LE^*_j, F^2RE^*_j, FRFE^*_j, RF^2E^*_j, RFE^*_j, FRE^*_j, RE^*_j)\) for \(j\in \{0, \ldots, D-3\}\)

by Corollary 30.1.

Claim \((a)\) \(FR^iv \in \mathrm{Span}(R^iv)\) for \(i\in \{0, \ldots, D-2\}\),
\(\quad (b)\) \(LR^iv \in \mathrm{Span}(R^{i-1}v)\) for \(i\in \{1, \ldots, D-2\}\).

HS MEMO

Proof of Claim.

\((a)\) By Lemma 34.2 and our assumption,

\[\dim E^*_1W = \dim E^*_2W = 1.\] Hence \(Rv\neq 0\), \(E^*_1W = \mathrm{Span}(v)\), and \(E^*_2W = \mathrm{Span}(Rv)\).

We proceed by induction on \(i\). For \(i\in\{0, 1\}\), \(FR^iv\in E^*_{i+1}W = \mathrm{Span}(R^iv)\), since \(\dim E^*_1W = \dim E^*_2W = 1\). So we may assume \(2\leq i\leq D-2\). Then \(R^{i-2}v \in E^*_{i-1}W\) with \(i-1\leq D-3\), and \[\begin{align} FR^iv & = FR^2R^{i-2}v\\ & \in \mathrm{Span}(RFR^{i-1}v,\; R^2FR^{i-2}v,\; R^iv) \quad \text{by } (1)\\ & \subseteq \mathrm{Span}(R^iv), \end{align}\] by the induction hypothesis.

\((b)\) For \(i = 1\), \(LRv\in E^*_1W = \mathrm{Span}(v)\). If \(2\leq i\leq D-2\), then \(R^{i-2}v\in E^*_{i-1}W\) with \(i-1\leq D-3\). Hence,

\[\begin{align} LR^iv &= LR^2R^{i-2}v\\ & \in \mathrm{Span}(RLR^{i-1}v,\; R^2LR^{i-2}v,\; F^2R^{i-1}v,\; FRFR^{i-2}v,\; RF^2R^{i-2}v,\; RFR^{i-2}v,\; FR^{i-1}v,\; R^{i-1}v) \quad \text{by } (2)\\ & \subseteq \mathrm{Span}(R^{i-1}v), \end{align}\] by induction and \((a)\).

Suppose \(R^{D-1}v = 0\). Then, \[\mathrm{Span}(v, Rv, \ldots, R^{D-2}v) = \widetilde{W}\] is invariant under \(M\) and \(M^*\), hence, under \(T\).

Since \(W\) is irreducible, \(W = \widetilde{W}\), and \(W\) is thin in this case.

Suppose \(R^{D-1}v \neq 0\).

Observe: \(v, Av, \ldots, A^{D-1}v \in \mathrm{Span}(v, Rv, \ldots, R^{D-1}v)\).

Hence, each \(R^iv\) is a polynomial of degree \(i\) in \(A\) applied to \(v\), and \[\mathrm{Span}(v, Av, \ldots, A^{D-1}v) = \mathrm{Span}(v, Rv, \ldots, R^{D-1}v) = \mathrm{Span}(v, A_1v, \ldots, A_{D-1}v).\] Also, \[A_Dv = Jv - \left(\sum_{h=0}^{D-1}A_h\right)v \in \mathrm{Span}(v, A_1v, \ldots, A_{D-1}v).\] Thus, \[Mv = \mathrm{Span}(v, Rv, \ldots, R^{D-1}v).\] Therefore, \[\mathrm{Span}(v, Rv, \ldots, R^{D-1}v) = \widetilde{W}\] is invariant under \(M\), \(M^*\), and hence \(T\). We have \(W = \widetilde{W}\) and \(W\) is thin.

Definition 37.1 Let \(\Gamma = (X,E)\) be any regular graph (not necessarily connected).

Let \(A\) be the adjacency matrix of \(\Gamma\), and let \(J\) be the all \(1\)’s matrix.

Pick \(0\neq B \in \mathrm{Mat}_X(\mathbb{C})\).

\(B\) is a generalized adjacency matrix if

\((i)\) for all vertices \(x,y\in X\), \(B_{xy}\neq 0\) implies \(A_{xy}\neq 0\) or \(x = y\),
\((ii)\) \(B\) is in the subalgebra of \(\mathrm{Mat}_X(\mathbb{C})\) generated by \(A\) and \(J\).

Example 37.1 Any nonzero matrix of form \[\alpha A + \beta I \quad (\alpha, \beta\in \mathbb{C})\] is a generalized adjacency matrix.

If \(\Gamma\) is distance-regular, all generalized adjacency matrices are of this form.
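As a small numerical illustration (my own sketch, not part of the notes), one can verify condition \((i)\) of Definition 37.1 for \(B = 2A + 3I\) on the 5-cycle; condition \((ii)\) holds by construction, since \(B\) is a polynomial in \(A\).

```python
import numpy as np

# Illustrative sketch: on the 5-cycle C_5, check condition (i) of
# Definition 37.1 for B = 2A + 3I.  Condition (ii) is automatic,
# since B lies in the algebra generated by A (and J).
n = 5
A = np.zeros((n, n), dtype=int)
for v in range(n):
    A[v, (v + 1) % n] = A[v, (v - 1) % n] = 1  # cycle adjacency

B = 2 * A + 3 * np.eye(n, dtype=int)

# Condition (i): B_xy != 0 implies A_xy != 0 or x == y.
for x in range(n):
    for y in range(n):
        if B[x, y] != 0:
            assert A[x, y] != 0 or x == y
print("condition (i) holds for B = 2A + 3I")
```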

Let \(\Gamma = (X, E)\) be a distance-regular graph of diameter \(D\geq 3\). Assume \(\Gamma\) is thin, and \(Q\)-polynomial.

Pick a vertex \(x\in X\), and write \(E^*_i \equiv E^*_i(x)\), \(T\equiv T(x)\). Then, \[E^*_1TE^*_1 = \mathrm{Span}(\tilde{J}, E^*_1, \tilde{A}, \tilde{A}^2, \tilde{A}^3),\] and \(\dim E^*_1TE^*_1\leq 5\).

We will produce a ‘nice’ spanning set \[E^*_1TE^*_1 = \mathrm{Span}(\tilde{J}, E^*_1, \tilde{A}, A^+ \;(= R^{-1}E^*_2AE^*_1), A^+\tilde{A}).\]

Lemma 37.2 Let \(\Gamma = (X, E)\) be a thin distance-regular graph of diameter \(D\geq 4\).

Fix a vertex \(x\in X\), and write \(E^*_i \equiv E^*_i(x)\) and \(R \equiv R(x)\).

Let \(\Gamma_1\) denote the subgraph of \(\Gamma\) induced on the first subconstituent of \(\Gamma\) with respect to \(x\). Then, \[\Delta = (R^{-1})^{i-1}E^*_iA_iE^*_1\] is a generalized adjacency matrix for \(\Gamma_1\) for all \(i\in \{1, \ldots, D-3\}\).

Proof. Write \(T \equiv T(x)\). Fix \(i\in \{1, \ldots, D-3\}\).

Recall \(R^{-1}\in T\) by Lemma 31.1 \((iv)\).

Since \(E^*_{i-1}R^{-1}E^*_i = R^{-1}E^*_i\) by Lemma 31.1 \((ii)\), \[\Delta \in E^*_1TE^*_1 = \mathrm{Span}(\tilde{J}, E^*_1, \tilde{A}, \tilde{A}^2, \ldots)\] by Lemma 34.3 \((iv)\).

Hence, \(\Delta\) satisfies condition \((ii)\) of Definition 37.1.

To show \((i)\), pick vertices \(y,z\in X\) such that \[\partial(x,y) = \partial(x,z) = 1, \quad \partial(y,z) = 2.\] We need to show \[\Delta_{yz} = 0.\] Suppose \(\Delta_{yz}\neq 0\). Then, \[\langle \Delta\hat{y},\hat{z}\rangle \neq 0.\] We will show this cannot occur.

Notation: Set \[E^*_{ij} = E^*_i(x)E^*_j(y), \; i,j\in \{0, 1, \ldots, D\}.\] Then, \[E^*_{ij}V = \mathrm{Span}(\hat{w}\mid w\in X, \partial(x,w)=i, \partial(y,w)=j) \text{ for } i,j\in \{0, 1, \ldots, D\}.\] Let \(\delta\) denote the all \(1\)’s vector in \(V\). Let \[\delta_{ij} = E^*_{ij}\delta = \sum_{w\in X, \partial(x,w)=i, \partial(y,w)=j}\hat{w}.\]
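To make the \(E^*_{ij}\) notation concrete, here is a small check (my own, using the 8-cycle \(C_8\), a distance-regular graph of diameter \(4\)): with adjacent base vertices \(x, y\), the triangle inequality forces \(\delta_{ij} = 0\) unless \(|i-j|\leq 1\).

```python
# Illustrative sketch: supports of the vectors delta_{ij} on the 8-cycle C_8,
# with adjacent base vertices x = 0, y = 1.  Since d(x,y) = 1, only cells
# with |i - j| <= 1 can be nonempty.

def dist(u, v, n=8):
    """Graph distance in the n-cycle."""
    d = abs(u - v) % n
    return min(d, n - d)

x, y = 0, 1
cells = {}
for w in range(8):
    cells.setdefault((dist(x, w), dist(y, w)), []).append(w)

for (i, j), ws in sorted(cells.items()):
    assert abs(i - j) <= 1  # triangle inequality with d(x,y) = 1
    print((i, j), ws)
```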

Now, \[\Delta\hat{y} \in E^*_1(x)V = E^*_{10}V + E^*_{11}V + E^*_{12}V \quad \text{(orthogonal direct sum)}.\] So, there exist \(\delta^+_{10}\in E^*_{10}V\), \(\delta^+_{11}\in E^*_{11}V\), and \(\delta^+_{12}\in E^*_{12}V\) such that \[\Delta \hat{y} = \delta^+_{10} + \delta^+_{11} + \delta^+_{12}.\]

Observe: \(\hat{z}\in E^*_{12}V\) is not orthogonal to \(\Delta \hat{y}\).

So, \(\delta^+_{12}\neq 0\).

Observe: \[\begin{align} R^{i-1}(\delta^+_{10} + \delta^+_{11} + \delta^+_{12}) & = R^{i-1}\Delta\hat{y}\\ & = R^{i-1}(R^{-1})^{i-1}E^*_iA_iE^*_1\hat{y}\\ & = E^*_iA_iE^*_1\hat{y}\\ & = \delta_{ii}\\ & \in E^*_{ii}V. \end{align}\]

HS MEMO

This is because, on each irreducible thin module with standard basis \(w_r, w_{r+1}, \ldots, w_{r+d}\), \[R^{-1}w_i = w_{i-1} \quad (i>r), \qquad R^{-1}w_r = 0,\] and \(E^*_1V\) lies in an orthogonal direct sum of irreducible modules with endpoint \(r\leq 1\).

But we can also control \(R^{i-1}\delta^+_{10}\) and \(R^{i-1}\delta^+_{11}\).

Claim. \(RE^*_{jj}V \subseteq E^*_{j+1,j+1}V + E^*_{j+1, j}V, \; j\in \{1, \ldots, D-1\}.\)

Proof of Claim. If \(w\in X\) satisfies \(\partial(x,w) = \partial(y,w) = j\), and \(u\) is a neighbor of \(w\) with \(\partial(x,u) = j+1\), then \(\partial(y,u)\geq \partial(x,u) - \partial(x,y) = j\), so \(\partial(y,u)\in\{j, j+1\}\).

By Claim \[\begin{align} R^{i-1}\delta^+_{10} &\in E^*_{i,i-1}V, \quad \text{and}\\ R^{i-1}\delta^+_{11} & \in E^*_{i,i-1}V + E^*_{i,i}V. \end{align}\] Hence, we conclude that \[R^{i-1}\delta^+_{12} = R^{i-1}\Delta \hat{y} - R^{i-1}\delta^+_{10} - R^{i-1}\delta^+_{11}\in E^*_{i,i-1}V + E^*_{ii}V.\] But now \[\begin{equation} 0 = E^*_{i,i+1}R^{i-1}\delta^+_{12} = E^*_{i,i+1}A^{i-1}E^*_{12}\delta^+_{12} = R(y)^{i-1}\delta^+_{12}. \tag{37.1} \end{equation}\] By Lemma 32.1 \((ii)\), \[R(y)^{i-1}: E^*_2(y)V \longrightarrow E^*_{i+1}(y)V\] is one-to-one, since \(\Gamma\) is thin, and \(i-1\leq D-4\).

So, \(\delta^+_{12}=0\) by (37.1).

But this contradicts \(\delta^+_{12}\neq 0\). Hence our assumption \(\Delta_{yz}\neq 0\) is false, and condition \((i)\) of Definition 37.1 is satisfied.

This proves the lemma.