Chapter 15 Parameters of Thin Modules, II
Monday, February 22, 1993
Proof (of Lemma 14.2, continued).
Also by Lemma 11.1, \(E_0W = 0\).
(As otherwise \(\langle \delta \rangle = E_0V \subseteq W\) and \(r(W) = 0\).)
Hence, \(\lambda - \theta_0 = \lambda - k\) is a factor of \(f_W\) by Lemma 14.1 \((iii)\).
Let \(p_0, p_1, \ldots, p_D\) denote the polynomials for the trivial \(T(x)\)-module from Lemma 9.1.
Recall, \[\begin{align} \sum_{\theta\in \mathbb{R}}m(\theta)p_i(\theta)p_j(\theta) & = \delta_{ij}x_1x_2\cdots x_i \quad (0\leq i,j\leq D)\\ & = \delta_{ij}b_0b_1\cdots b_{i-1}c_1c_2\cdots c_i. \end{align}\] Note that \(x_i = b_{i-1}c_i\), as in the proof of Theorem 13.1.
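As a quick numerical aside (not from the lecture), the sketch below verifies this orthogonality relation for the Petersen graph, whose intersection numbers and spectrum (\(k = 3\), \(a_1 = 0\), \(b_1 = 2\), \(c_2 = 1\); eigenvalues \(3, 1, -2\) with multiplicities \(1, 5, 4\)) are standard; the choice of graph is purely illustrative.

```python
# Check the orthogonality relation
#   sum_theta m(theta) p_i(theta) p_j(theta)
#     = delta_ij * b_0 ... b_{i-1} c_1 ... c_i
# for the Petersen graph (illustrative choice, not from the lecture).
# Here m(theta) = multiplicity(theta) / |X| with |X| = 10.

k, a1, b1, c2 = 3, 0, 2, 1
spectrum = {3: 1 / 10, 1: 5 / 10, -2: 4 / 10}       # theta -> m(theta)

# p_0, p_1, p_2 as defined in the text.
p = [lambda t: 1, lambda t: t, lambda t: t * t - a1 * t - k]

# Right-hand sides: x_1 = b_0 c_1 = k and x_2 = b_1 c_2.
rhs = [1, k, k * b1 * c2]

for i in range(3):
    for j in range(3):
        lhs = sum(m * p[i](t) * p[j](t) for t, m in spectrum.items())
        expected = rhs[i] if i == j else 0
        assert abs(lhs - expected) < 1e-12, (i, j, lhs)
print("orthogonality verified for i, j <= 2")
```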
By construction, \[\begin{align} p_0(\lambda) & = 1,\\ p_1(\lambda) & = \lambda,\\ p_2(\lambda) & = \lambda^2 - a_1\lambda - k. \end{align}\]
Since \(\deg f_W \leq 2\) and \(p_0, p_1, p_2\) form a basis for the polynomials of degree at most two, \[f_W = \sigma_0 p_0 + \sigma_1 p_1 + \sigma_2 p_2\] for some \(\sigma_0, \sigma_1, \sigma_2\in \mathbb{C}\).
Claim: \[\begin{align} \sigma_0 & = 1,\\ \sigma_1 & = \frac{a_0(W)}{k},\\ \sigma_2 & = -\frac{1+a_0(W)}{kb_1}. \end{align}\]
Proof of Claim. \[\begin{align} 1 & = \sum_{\theta\in \mathbb{R}}m_W(\theta)\\ & = \sum_{\theta\in \mathbb{R}}m(\theta)f_W(\theta)\\ & = \sum_{j=0}^2 \sigma_j\left(\sum_{\theta\in \mathbb{R}}m(\theta)p_j(\theta)\right)\\ & = \sigma_0. \end{align}\] Here we applied Lemma 10.1 \((ib)\), Lemma 14.1 \((ib)\), and Lemma 10.1 \((i)\), in that order.
Next by Lemma 10.1 \((ii)\) and \(p_1(\theta) = \theta\), \[\begin{align} a_0(W) & = \sum_{\theta\in \mathbb{R}}m_W(\theta)\theta\\ & = \sum_{\theta\in \mathbb{R}}m(\theta)f_W(\theta)\theta\\ & = \sum_{j = 0}^2\sigma_j\sum_{\theta\in \mathbb{R}}m(\theta)p_j(\theta)p_1(\theta)\\ & = \sigma_1 x_1(T\delta)\\ & = \sigma_1b_0c_1\\ & = \sigma_1 k. \end{align}\] So far, \[f_W(\lambda) = 1 + \frac{a_0(W)}{k}\lambda + \sigma_2(\lambda^2 - a_1\lambda-k).\] But since \(\lambda - k\) is a factor of \(f_W\), \[\begin{align} 0 & = f_W(k)\\ & = 1 + a_0(W) + \sigma_2k(k-a_1-1)\\ & = 1 + a_0(W) + \sigma_2kb_1, \end{align}\] using \(b_1 = k - a_1 - 1\). Thus, \[\sigma_2 = -\frac{1+a_0(W)}{kb_1}.\] This proves the Claim.
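As an aside (not part of the lecture), the following sympy sketch confirms the Claim symbolically: with these \(\sigma_j\), the combination \(\sigma_0 p_0 + \sigma_1 p_1 + \sigma_2 p_2\) vanishes at \(\lambda = k\), as the factor \(\lambda - k\) requires.

```python
# Symbolic check of the Claim (illustrative, not from the lecture):
# with sigma_0 = 1, sigma_1 = a0W/k, sigma_2 = -(1 + a0W)/(k b1),
# f_W = sigma_0 p_0 + sigma_1 p_1 + sigma_2 p_2 satisfies f_W(k) = 0,
# where b1 = k - a1 - 1.
import sympy as sp

lam, k, a1, a0W = sp.symbols("lam k a1 a0W")
b1 = k - a1 - 1

p0, p1, p2 = sp.Integer(1), lam, lam**2 - a1 * lam - k
fW = p0 + (a0W / k) * p1 - (1 + a0W) / (k * b1) * p2

assert sp.simplify(fW.subs(lam, k)) == 0
print("f_W(k) = 0, so lambda - k divides f_W")
```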
Case: \(a_0(W) = -1\).
Here, \(\sigma_2 = 0\) and \[f_W(\lambda) = 1 + \frac{a_0(W)\lambda}{k} = 1-\frac{\lambda}{k}.\] Since \(f_W\) vanishes only at \(\theta = k\), \[d+1 = |\{\theta \mid \theta \;\text{ is an eigenvalue of $\Gamma$}, \; f_W(\theta)\neq 0\}| = D,\] and we have \((iii)\).
Case: \(a_0(W) \neq -1\).
Here, \(\sigma_2\neq 0\), and \(\deg f_W = 2\). So, \[f_W(\lambda) = (\lambda - k)(\lambda - \beta)\alpha\] for some \(\alpha, \beta\in \mathbb{C}, \; \alpha \neq 0\).
Comparing the coefficients in \[(\lambda-k)(\lambda - \beta)\alpha = 1 + \frac{a_0(W)}{k}\lambda - \frac{a_0(W)+1}{kb_1}(\lambda^2 - a_1\lambda -k),\] we find \[\begin{align} \alpha & = -\frac{a_0(W) + 1}{kb_1},\\ -(k+\beta)\alpha & = \frac{a_0(W)}{k} + \frac{a_0(W)+1}{kb_1}a_1,\\ k\beta\alpha & = 1 + \frac{a_0(W)+1}{b_1}. \end{align}\] Hence, \[-\beta(a_0(W) + 1) = b_1 + (a_0(W) + 1).\] Thus, we have \[\begin{equation} (1+a_0(W))(1+\beta) = -b_1. \tag{15.1} \end{equation}\] (A symbolic check of this computation appears after the case analysis below.) In particular, \(\beta \neq -1\), and \[\alpha = -\frac{1+a_0(W)}{kb_1} = \frac{1}{k(\beta+1)}.\] Also, by Definition 9.2, \[\begin{align} 0 &\leq m_W(\theta) \\ & = m(\theta)f_W(\theta) \quad (\text{for all} \; \theta\in \mathbb{R}). \end{align}\] But if \(\theta\) is an eigenvalue of \(\Gamma\), \[0 < m(\theta).\] So, \[\begin{align} 0 & \leq f_W(\theta)\\ & = \frac{(\theta-k)(\theta-\beta)}{k(\beta+1)}. \end{align}\] Since \(\theta - k < 0\) for every eigenvalue \(\theta \neq k\), either \[\beta+1 > 0, \quad\text{in which case}\quad \theta-\beta \leq 0 \;\text{ for all eigenvalues }\theta\neq k, \;\text{ i.e., }\; \beta \geq \theta_1,\] or \[\beta+1 < 0, \quad\text{in which case}\quad \theta-\beta \geq 0 \;\text{ for all eigenvalues }\theta\neq k, \;\text{ i.e., }\; \beta \leq \theta_D.\] If \(\beta = \theta_1\), \[\begin{align} a_0(W) & = - \frac{b_1}{\beta+1}-1 = -\frac{b_1}{\theta_1+1}-1,\\ f_W(\lambda) & = \frac{(\lambda - k)(\lambda - \theta_1)}{k(\theta_1+1)}, \end{align}\] and we have \((i)\).
If \(\beta = \theta_D\), \[\begin{align} a_0(W) & = -\frac{b_1}{\theta_D+1}-1\\ f_W(\lambda) & = \frac{(\lambda - k)(\lambda - \theta_D)}{k(\theta_D+1)}, \end{align}\] and we have \((ii)\).
If \(\beta \not\in \{\theta_1, \theta_D\}\), then \[\beta \in (-\infty, \theta_D) \cup (\theta_1, \infty),\] and we have \((iv)\).
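Returning to the coefficient comparison above: as an aside (not from the lecture), the sketch below substitutes \(a_0(W) = -b_1/(\beta+1) - 1\) (i.e., (15.1)) and \(\alpha = 1/(k(\beta+1))\) into both sides and confirms they agree as polynomials in \(\lambda\).

```python
# Symbolic check of (15.1) (illustrative, not from the lecture):
# with b1 = k - a1 - 1, a0W = -b1/(beta + 1) - 1, and
# alpha = 1/(k (beta + 1)), the two expressions for f_W,
#   (lam - k)(lam - beta) alpha   and
#   1 + (a0W/k) lam - (a0W + 1)/(k b1) (lam^2 - a1 lam - k),
# are identical polynomials in lam.
import sympy as sp

lam, k, a1, beta = sp.symbols("lam k a1 beta")
b1 = k - a1 - 1
a0W = -b1 / (beta + 1) - 1
alpha = 1 / (k * (beta + 1))

lhs = (lam - k) * (lam - beta) * alpha
rhs = 1 + (a0W / k) * lam - (a0W + 1) / (k * b1) * (lam**2 - a1 * lam - k)

assert sp.simplify(lhs - rhs) == 0
print("coefficient comparison and (15.1) are consistent")
```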
Note. Using (15.1), \(a_0(W)\) determines \(\beta\), hence \(f_W\), hence \(m_W\), hence the isomorphism class of \(W\): \[a_0(W) \to \beta \to f_W \to m_W \to \text{isomorphism class of $W$}.\]
Note on Lemma 14.2. In fact, \(\theta_1 > -1\) and \(\theta_D < -1\) if \(D\geq 2\); see Lemma 15.1 below.
Definition 15.1 The complete graph \(K_n\) has \(n\) vertices and diameter \(D = 1\); i.e., \(xy\in E\) for all distinct vertices \(x, y\).
\(K_n\) is distance-regular with diameter \(D = 1\), valency \(k = n-1\), and \(a_1 = n-2\). Moreover, it has two distinct eigenvalues \(\theta_0, \theta_1\).
Recall, \(\theta_0, \ldots, \theta_D\) are the roots of \(p_{D+1}\), the \((D+1)\)st polynomial for the trivial module. For \(K_n\), \[\begin{align} p_0 & = 1,\\ p_1 & = \lambda,\\ p_2 & = \lambda^2 - a_1\lambda - k\\ & = \lambda^2 - (n-2)\lambda - (n-1)\\ & = (\lambda - (n-1))(\lambda +1). \end{align}\] The roots are \(\theta_0 = n-1 = k\) and \(\theta_1 = -1\).
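For a concrete check (an aside, not from the lecture), the snippet below builds the adjacency matrix of \(K_n\) for a small \(n\) and confirms the spectrum \(\{(n-1)^1, (-1)^{n-1}\}\).

```python
# Numerical check (illustrative): the adjacency matrix of K_n has
# eigenvalues n - 1 (multiplicity 1) and -1 (multiplicity n - 1).
import numpy as np

n = 6                                  # any small n works here
A = np.ones((n, n)) - np.eye(n)        # adjacency matrix of K_n: J - I
eigs = np.sort(np.linalg.eigvalsh(A))

expected = np.array([-1.0] * (n - 1) + [n - 1.0])
assert np.allclose(eigs, expected)
print("spectrum of K_6:", np.round(eigs, 6))
```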
Lemma 15.1 Let \(\Gamma = (X, E)\) be distance-regular of diameter \(D\geq 1\) with distinct eigenvalues \[k = \theta_0 > \theta_1 > \cdots > \theta_D.\] Then the following hold. \((i)\) If \(\theta_D \geq -1\), then \(D = 1\). \((ii)\) If \(\theta_1 \leq -1\), then \(D = 1\).
Proof. \((i)\) Suppose \(\theta_D \geq -1\).
Then \(I + A\) is positive semi-definite, since its eigenvalues \(\theta_i + 1\) are all nonnegative.
By Lemma 2.1, there exist vectors \(\{v_x\mid x\in X\}\) in a Euclidean space such that \[\begin{align} \langle v_x,v_y\rangle & = (I+A)_{xy}\\ & = \begin{cases} 1 & \text{if $x = y$ or $xy\in E$,}\\0 & \text{otherwise}. \end{cases} \end{align}\] For every \(xy\in E\), \[\langle v_x, v_y\rangle = 1 = \|v_x\|\|v_y\|.\] Hence \(v_x = v_y\) by equality in the Cauchy–Schwarz inequality, and since \(\Gamma\) is connected, \(v_x\) is independent of \(x\in X\).
Thus \(\langle v_x,v_y\rangle = 1\) for all \(x,y\in X\).
We have \(I + A = J\) (the all-ones matrix), so \(D = 1\).
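As a numerical aside (not from the lecture), the sketch below carries out the Gram-vector factorization for \(K_4\), where \(I + A = J\): the vectors \(v_x\) all coincide, as the argument predicts.

```python
# Illustration (not from the lecture) of the Gram-vector argument:
# when theta_D >= -1, I + A is positive semidefinite, so it is the
# Gram matrix of vectors {v_x}.  For K_4 we get I + A = J and all
# the v_x coincide, matching the proof of (i).
import numpy as np

n = 4
A = np.ones((n, n)) - np.eye(n)     # adjacency matrix of K_4
M = np.eye(n) + A                   # = J, positive semidefinite

w, U = np.linalg.eigh(M)
w = np.clip(w, 0, None)             # clear tiny negative round-off
V = U * np.sqrt(w)                  # rows of V are the vectors v_x

assert np.allclose(V @ V.T, M)      # <v_x, v_y> = (I + A)_{xy}
assert np.allclose(V, V[0])         # v_x is independent of x
print("all Gram vectors coincide; I + A = J")
```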
\((ii)\) Suppose \(\theta_1 \leq -1\), and let \(m\) be the trivial measure, so that \(\sum_{\theta\in\mathbb{R}}m(\theta) = 1\) and \(\sum_{\theta\in\mathbb{R}}m(\theta)\theta = 0\). Then, \[\begin{align} 1 & = \sum_{\theta\in \mathbb{R}}m(\theta) + \sum_{\theta\in \mathbb{R}}m(\theta)\theta \\ & = \sum_{\theta\in \mathbb{R}}m(\theta)(\theta+1)\\ & = m(k)(k+1) + \sum_{\theta\neq k}m(\theta)(\theta+1)\\ & \leq (k+1)|X|^{-1}, \end{align}\] since \(m(\theta)(\theta+1)\leq 0\) for every \(\theta \neq k\) and \(m(k) = |X|^{-1}\dim E_0V = |X|^{-1}\).
So \(k+1 \geq |X|\), i.e., \(k = |X|-1\). Thus \(xy\in E\) for every pair of distinct vertices \(x,y\in X\), and \(D = 1\).
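As a quick numerical aside (not from the lecture), the check below evaluates the chain above for \(K_n\), where \(\theta_1 = -1\): the non-principal terms vanish and the bound holds with equality.

```python
# Numerical illustration (not from the lecture): for K_n the trivial
# measure is m(n-1) = 1/n and m(-1) = (n-1)/n, so
#   sum_theta m(theta)(theta + 1) = m(k)(k + 1) = 1 = (k + 1)/|X|,
# i.e., the bound holds with equality and k = |X| - 1.
n = 6
measure = {n - 1: 1 / n, -1: (n - 1) / n}    # theta -> m(theta)
k = n - 1

total = sum(m * (theta + 1) for theta, m in measure.items())
assert abs(total - 1) < 1e-12
assert abs(total - (k + 1) / n) < 1e-12
print("sum m(theta)(theta + 1) =", total, "= (k + 1)/|X|")
```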
Note. Lemma 15.1 does not require the distance-regularity assumption.