| Topic | Status | Icon |
| --- | --- | --- |
| normalized graph Laplacian | complete | 💡 |
| normalized adjacency matrix | complete | 💡 |
| network diffusion process | complete | 💡 |
| neighborhood (graph) | complete | 💡 |
| graph | complete | 💡 |
| graph signals | complete | 💡 |
| graph shift operator | complete | 💡 |
| graph Laplacian | empty | 💡 |
| degree matrix | complete | 💡 |
| adjacency matrix | complete | 💡 |
| random walk Laplacian | complete | 💡 |
| random walk matrix | complete | 💡 |
| undirected graph | complete | 💡 |
| unweighted graph | complete | 💡 |
| interpretation of the graph Laplacian | complete | 💡 |
| graph Fourier transform | complete | 💡 |
| interpretation of the symmetric graph Laplacian | complete | 💡 |
| symmetric Laplacian | complete | 💡 |
| total variation energy | complete | 💡 |
| linear graph filter | complete | 💡 |
| inverse graph Fourier transform | complete | 💡 |
| graph convolution | complete | 💡 |
| convolutional graph filters are shift equivariant | complete | 🧮 |
| convolutional graph filters are permutation equivariant | complete | 🧮 |
| convolutional graph filters are local | complete | 🧮 |
| conditions for finding a convolutional graph filter | complete | 🧮 |
| spectral representation of a convolutional graph filter | complete | 💡 |
| the spectral representation of a graph filter is independent of the graph | complete | 💡 |
| the spectral graph filter operates on a signal pointwise | complete | 🧮 |
| analytic function | complete | 💡 |
| empirical risk minimization problem | empty | 💡 |
| node-level task | complete | 💡 |
| inductive learning | complete | 💡 |
| hypothesis class | complete | 💡 |
| graph-level problem | complete | 💡 |
| graph signal processing problem | complete | 💡 |
| approximation of Heaviside functions using convolutional graph filters | empty | 🧮 |
| spectral graph filter | complete | 💡 |
| statistical risk minimization problem | complete | 💡 |
| supervised learning | empty | 💡 |
| we can represent any analytic function with convolutional graph filters | complete | 🧮 |
| multi-layer graph perceptron | complete | 💡 |
| graph perceptron | complete | 💡 |
| convolutional filter bank | complete | 💡 |
| Graph Neural Networks | complete | 💡 |
| balanced stochastic block model | complete | 💡 |
| almost exact recovery | complete | 💡 |
| almost exact recovery is impossible when the signal-to-noise ratio is below the threshold | complete | 🧮 |
| signal-to-noise ratio | complete | 💡 |
| spectral clustering | empty | 💡 |
| stochastic block model | complete | 💡 |
| unsupervised | empty | 💡 |
| information-theoretic threshold | complete | 💡 |
| feature-aware spectral embeddings | complete | 💡 |
| coordinate representation | complete | 💡 |
| contextual stochastic block model | empty | 💡 |
| compressed sparse row representation | complete | 💡 |
| sometimes spectral algorithms fail | complete | 💡 |
| spectral embedding | complete | 💡 |
| we can use GNNs to solve feature-aware semi-supervised learning problems | complete | 💡 |
| message passing neural network | complete | 💡 |
| MPNNs can be expressed as graph convolutions | complete | 💡 |
| leaky ReLU | complete | 💡 |
| graph attention model | complete | 💡 |
| fully connected readout layer | complete | 💡 |
| readout layer | complete | 💡 |
| graph convolutional network | complete | 💡 |
| GraphSAGE | complete | 💡 |
| Chebyshev polynomials are orthogonal | complete | 💡 |
| Chebyshev equioscillation theorem | complete | 🧮 |
| GCN layers can be written as graph convolutions | complete | 💡 |
| graph isomorphism | complete | 💡 |
| color refinement algorithm | in progress | 💡 |
| aggregation readout layer | complete | 💡 |
| Weisfeiler-Leman Graph Isomorphism Test | in progress | 💡 |
| we can verify that graphs without node features but with different Laplacian eigenvalues are not isomorphic | complete | 🧮 |
| graph isomorphism is not known to be solvable in polynomial time | complete | 💡 |
| graph homomorphism | complete | 💡 |
| computational graph | complete | 💡 |
| Graph Isomorphism Network | complete | 💡 |
| GINs are maximally powerful for anonymous input graphs | complete | 💡 |
| homomorphism density | complete | 💡 |
| cycle homomorphism density is given by the trace of powers of the adjacency matrix | complete | 💡 |
| random graphs in a GIN are good for graph isomorphism | complete | 💡 |
| quasi-symmetry | complete | 💡 |
| operator distance modulo permutations | complete | |
| operator dilation | complete | 💡 |
| graph convolutions are stable to perturbations in the data and coefficients | complete | 💡 |
| graph automorphism | complete | 💡 |
| Lipschitz continuous | complete | 💡 |
| integral Lipschitz filters are stable to dilations | complete | 🧮 |
| integral Lipschitz filter | complete | 💡 |
| filter permutation invariance | complete | 💡 |
| eigenvector misalignment | complete | 💡 |
| discriminability of a graph filter | in progress | 💡 |
| Lipschitz filters are stable to additive perturbations | complete | 🧮 |
| relative perturbation edge changes are tied to node degree | complete | 💡 |
| relative perturbations | complete | 💡 |
| stability-size tradeoff under realistic sparsity patterns | in progress | 💡 |
| stability-discriminability tradeoff for Lipschitz filters | complete | 💡 |
| stable graph filter | complete | 💡 |
| GNNs perform better than their constituent filters | complete | 💡 |
| GNNs inherit stability from their layers | complete | 🧮 |
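Several of the matrix objects in the list above (adjacency matrix, degree matrix, graph Laplacian, symmetric normalized Laplacian, random walk matrix) can be computed in a few lines. A minimal numpy sketch for a toy 4-cycle graph; the variable names are my own, not from the notes:

```python
import numpy as np

# Toy undirected, unweighted graph: a 4-node cycle (0-1-2-3-0).
A = np.array([
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
], dtype=float)

# Degree matrix: diagonal matrix of node degrees.
deg = A.sum(axis=1)
D = np.diag(deg)

# (Combinatorial) graph Laplacian: L = D - A. Its rows sum to zero.
L = D - A

# Symmetric normalized Laplacian: L_sym = I - D^{-1/2} A D^{-1/2}.
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
L_sym = np.eye(len(deg)) - D_inv_sqrt @ A @ D_inv_sqrt

# Random walk matrix: P = D^{-1} A. Each row is a probability distribution.
P = np.diag(1.0 / deg) @ A

# Eigenvalues of L_sym lie in [0, 2]; for the 4-cycle they are 0, 1, 1, 2.
eigvals = np.linalg.eigvalsh(L_sym)
```

Any of these matrices can serve as the graph shift operator appearing in the filter and convolution entries above; they differ only in normalization.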