Quadratic forms and the eigenvector problem

**Acknowledgement**: Constrained Extrema of Quadratic Forms. (2020, November 17). Trinity University. https://math.libretexts.org/@go/page/17318.

Suppose $A$ is an $m$-by-$m$ *real symmetric matrix*. For any vector $\mathbf{x} \in \mathbb{R}^m$, we can define the quadratic form by

$$ Q(\mathbf{x}) = \mathbf{x}^T A\mathbf{x} = \sum_{i=0}^{m-1} \sum_{j=0}^{m-1} a_{ij}x_i x_j. $$
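As a quick sanity check, the matrix form and the double-sum form of $Q$ agree; a minimal sketch, assuming NumPy (the matrix and vector here are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2            # make A real symmetric

x = rng.standard_normal(4)

# Matrix form: Q(x) = x^T A x
Q_matrix = x @ A @ x

# Double-sum form: sum_ij a_ij * x_i * x_j
Q_sum = sum(A[i, j] * x[i] * x[j] for i in range(4) for j in range(4))

assert np.isclose(Q_matrix, Q_sum)
```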

To find $Q$'s extrema (maximum and minimum) subject to the restriction $g(\mathbf{x}) = ||\mathbf{x}||_2^2 = \sum_{i=0}^{m-1}x_i^2 = 1$, we construct the Lagrangian form

$$ L(\mathbf{x}) = Q(\mathbf{x}) - \lambda g(\mathbf{x}) = Q(\mathbf{x}) - \lambda\sum_{i=0}^{m-1}x_i^2 $$

for some scalar multiplier $\lambda$. Then, taking the gradient of $L(\mathbf{x})$, we obtain a system of $m$ linear equations

$$ L_{x_i} = 2\left(\sum_{j=0}^{m-1}a_{ij}x_j\right) - 2\lambda x_i = 0, \quad \textrm{for }i = 0, 1, \dots, (m-1), $$

where the factor of $2$ in front of the sum arises because each cross term $a_{ij}x_i x_j$ appears twice, owing to the symmetry of $A$. Simplifying this system at a critical point $\mathbf{w} = [w_0, w_1, \dots, w_{m-1}]^T$, we get

$$ \sum_{j=0}^{m-1}a_{ij}w_j = \lambda w_i, \quad \textrm{for }i = 0, 1, \dots, (m-1). $$

Therefore, $\mathbf{w}$ is a constrained **critical point** of $Q$ subject to $||\mathbf{w}||_2^2 = 1$ if and only if

$$ A\mathbf{w} = \lambda\mathbf{w} $$

for some $\lambda$; that is, if and only if $\lambda$ is an eigenvalue and $\mathbf{w}$ is an associated unit eigenvector of $A$.

If $A\mathbf{w} = \lambda\mathbf{w}$ and $\sum_{i=0}^{m-1}w_i^2 = 1$, then

$$ \begin{split} Q(\mathbf{w}) &= \sum_{i=0}^{m-1}\left(\sum_{j=0}^{m-1}a_{ij}w_j\right) w_i \\ &= \sum_{i=0}^{m-1}(\lambda w_i)w_i \\ &= \lambda \sum_{i=0}^{m-1} w_i^2 \\ &= \lambda. \end{split} $$

Therefore, the largest and smallest eigenvalues of $A$ are the **_maximum_ and _minimum_ values** of $Q$, attained at their respective unit eigenvectors subject to $||\mathbf{w}||_2 = 1$.
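This claim is easy to verify numerically; a minimal sketch, assuming NumPy, where `np.linalg.eigh` returns eigenvalues in ascending order:

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2                      # real symmetric matrix

vals, vecs = np.linalg.eigh(A)         # ascending eigenvalues, orthonormal eigenvectors

def Q(x):
    return x @ A @ x

# Q attains its constrained extrema at the corresponding unit eigenvectors.
assert np.isclose(Q(vecs[:, 0]),  vals[0])    # smallest eigenvalue
assert np.isclose(Q(vecs[:, -1]), vals[-1])   # largest eigenvalue

# Random unit vectors never beat these bounds.
for _ in range(1000):
    u = rng.standard_normal(5)
    u /= np.linalg.norm(u)
    assert vals[0] - 1e-9 <= Q(u) <= vals[-1] + 1e-9
```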

Eigen and SVD decompositions of SP(S)D matrices

When we work with symmetric positive (semi)definite (SPSD) matrices, the eigendecomposition and the SVD coincide: the singular values equal the eigenvalues, merely listed in the reverse (descending) order.

For example, the graph Laplacian matrices that we need for computing Laplacian quadratic forms are SPSD, so it makes no difference whether we compute their SVD or their eigenpairs.
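A minimal sketch of this equivalence, assuming NumPy and using the Laplacian of a small path graph as an illustrative SPSD matrix:

```python
import numpy as np

# Laplacian of the path graph 0 - 1 - 2 - 3 (SPSD by construction)
L = np.array([[ 1., -1.,  0.,  0.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [ 0.,  0., -1.,  1.]])

evals, evecs = np.linalg.eigh(L)       # ascending order
U, svals, Vt = np.linalg.svd(L)        # descending order

# The singular values are the eigenvalues, listed in reverse order.
assert np.allclose(svals, evals[::-1])

# The singular vectors match the eigenvectors up to a sign per column.
assert np.allclose(np.abs(U), np.abs(evecs[:, ::-1]))
```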

The mesh graph

Spectral drawing

**Acknowledgement**: The following information is a summary of the notes at https://www.cis.upenn.edu/~cis515/cis515-15-graph-drawing.pdf

If $G = (V,E)$ is our undirected graph, the idea of drawing $G$ consists of assigning a coordinate (say in two dimensions) to every node $v_i \in V$ and drawing a line segment between $v_i$ and $v_j$ whenever the edge $e_{ij} = (i,j)$ exists in $E$.

Spectral drawing originates from the goal of finding an $m$-by-$n$ drawing matrix $X$, where $m = |V|$, $n \ll m$, and the $i$th row vector contains the coordinates for $v_i$. To find a reasonable $X$, we can imagine building a physical model of $G$ by replacing the edges in $E$ with identical springs. A good drawing of $G$ thus requires these springs to be stretched as little as possible.

Formally, the energy of a drawing $X$ is

$$ \mathcal{E}(X) = \sum_{(i,j)\in E}||\mathbf{x}_i - \mathbf{x}_j||_2^2, $$

where $\mathbf{x}_i$ is the $i$th row of $X$ corresponding to node $v_i$. A good drawing seeks to minimize $\mathcal{E}(X)$.

As seen in lecture this week, it turns out we can model $\mathcal{E}(X)$ by using the Laplacian matrix of $G$. That is, if $\mathbf{e}_{ij} = [0, \cdots, 0, 1, 0, \cdots, 0, -1, 0, \cdots, 0]^T$ is a vector with zeros except for a $1$ at the $i$th location and a $-1$ at the $j$th location (representing the edge $e_{ij}$), then

$$ X^T L X = X^T \left(\sum_{(i,j)\in E}\mathbf{e}_{ij}\mathbf{e}_{ij}^T\right) X = \sum_{(i,j)\in E}(X^T\mathbf{e}_{ij})(\mathbf{e}_{ij}^T X) = \sum_{(i,j)\in E}\begin{pmatrix} (x_i - x_j)^2 & (x_i - x_j)(y_i - y_j) \\ (x_i - x_j)(y_i - y_j) & (y_i - y_j)^2 \end{pmatrix}, $$

where $L$ is the graph Laplacian and $x_i$ and $y_i$ are the two coordinates of the $i$th node (the $i$th row of $X$, here with $n = 2$). From this Laplacian quadratic form, we can see that the energy function is equivalent to

$$ \mathcal{E}(X) = \textrm{trace}(X^T L X). $$

Furthermore, $X^T L X$ is symmetric positive semidefinite, and its trace equals the sum of its (nonnegative) eigenvalues.
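Both identities above (the Laplacian as a sum of edge outer products, and the energy as a trace) can be checked on a small example; a minimal sketch, assuming NumPy, with a 4-cycle chosen for illustration:

```python
import numpy as np

m = 4
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]   # a 4-cycle, for illustration

# Build L as the sum of e_ij e_ij^T over edges.
L = np.zeros((m, m))
for i, j in edges:
    e = np.zeros(m)
    e[i], e[j] = 1.0, -1.0
    L += np.outer(e, e)

# L agrees with the degree matrix minus the adjacency matrix.
A = np.zeros((m, m))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
assert np.allclose(L, np.diag(A.sum(axis=1)) - A)

# Energy of a random 2-D drawing, computed two ways.
rng = np.random.default_rng(2)
X = rng.standard_normal((m, 2))
energy_direct = sum(np.sum((X[i] - X[j])**2) for i, j in edges)
assert np.isclose(energy_direct, np.trace(X.T @ L @ X))
```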

Now, if we choose $X$ to be the second and third columns of $W$ (i.e., $\mathbf{w}_1$ and $\mathbf{w}_2$, where $L = W S W^T$ is the eigendecomposition of $L$), our drawing satisfies $X^T X = I$, and we rule out the trivial minimizer of $\mathcal{E}(X)$, namely the constant eigenvector $\mathbf{w}_0$, which collapses all nodes to a single point. In this case, the drawing given by $X$ is called an orthogonal drawing.

The following result tells us how to find minimum energy drawings, assuming that $G$ is connected.

Theorem. Let $G = (V, E)$ be an undirected graph with $|V| = m$. If $L$ is the Laplacian of $G$, and if the eigenvalues of $L$ are $0 = \lambda_0 \leqslant \lambda_1 \leqslant \cdots \leqslant \lambda_{m-1}$, then the minimal energy of any balanced orthogonal graph drawing of $G$ in $\mathbb{R}^n$ is equal to $\lambda_1 + \cdots + \lambda_{n}$ (of course, implying that $n \ll m$). The $m$-by-$n$ matrix $X$ consisting of any unit eigenvectors $\mathbf{w}_1, \cdots, \mathbf{w}_n$ associated with $\lambda_1 \leqslant \cdots \leqslant \lambda_n$ yields a balanced orthogonal graph drawing of minimal energy and satisfies $X^T X = I$.

Note: A representation is balanced if and only if the sum of the entries of every column is zero: $\mathbf{1}^T X = \mathbf{0}$.
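The theorem can be verified numerically on a small connected graph; a minimal sketch, assuming NumPy, again using a 4-cycle for illustration:

```python
import numpy as np

m = 4
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]

# Graph Laplacian of the 4-cycle.
L = np.zeros((m, m))
for i, j in edges:
    L[i, i] += 1.0
    L[j, j] += 1.0
    L[i, j] -= 1.0
    L[j, i] -= 1.0

vals, vecs = np.linalg.eigh(L)     # 0 = lambda_0 <= lambda_1 <= ... (ascending)

# Drawing matrix from eigenvectors w_1, w_2 (skipping the constant w_0).
X = vecs[:, 1:3]

assert np.allclose(X.T @ X, np.eye(2))                        # orthogonal drawing
assert np.allclose(np.ones(m) @ X, 0)                         # balanced: 1^T X = 0
assert np.isclose(np.trace(X.T @ L @ X), vals[1] + vals[2])   # minimal energy
```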

Coordinate cut