| Note | Status | Icon |
| --- | --- | --- |
| we can use GNNs to solve feature-aware semi-supervised learning problems | complete | 💡 |
| we can represent any analytic function with convolutional graph filters | complete | 🧮 |
| unweighted graph | complete | 💡 |
| unsupervised learning | empty | 💡 |
| undirected graph | complete | 💡 |
| total variation energy | complete | 💡 |
| the spectral representation of a graph filter is independent of the graph | complete | 💡 |
| the spectral graph filter operates on a signal pointwise | complete | 🧮 |
| symmetric Laplacian | complete | 💡 |
| supervised learning | empty | 💡 |
| stochastic block model | complete | 💡 |
| statistical risk minimization problem | complete | 💡 |
| stable graph filter | complete | 💡 |
| stability-discriminability tradeoff for Lipschitz filters | complete | 💡 |
| stability and size tradeoff under realistic sparsity patterns | in progress | 💡 |
| spectral representation of a convolutional graph filter | complete | 💡 |
| spectral graph filter | complete | 💡 |
| spectral embedding | complete | 💡 |
| spectral clustering | empty | 💡 |
| sometimes spectral algorithms fail | complete | 💡 |
| signal-to-noise ratio | complete | 💡 |
| relative perturbations | complete | 💡 |
| relative perturbation edge changes are tied to node degree | complete | 💡 |
| readout layer | complete | 💡 |
| random walk matrix | complete | 💡 |
| random walk Laplacian | complete | 💡 |
| random graphs in a GIN are good for graph isomorphism | complete | 💡 |
| quasi-symmetry | complete | 💡 |
| operator distance modulo permutations | complete | 💡 |
| operator dilation | complete | 💡 |
| normalized graph Laplacian | complete | 💡 |
| normalized adjacency matrix | complete | 💡 |
| node-level task | complete | 💡 |
| neighborhood (graph) | complete | 💡 |
| network diffusion process | complete | 💡 |
| multi-layer graph perceptron | complete | 💡 |
| message passing neural network | complete | 💡 |
| linear graph filter | complete | 💡 |
| leaky ReLU | complete | 💡 |
| inverse graph Fourier transform | complete | 💡 |
| interpretation of the symmetric graph Laplacian | complete | 💡 |
| interpretation of the graph Laplacian | complete | 💡 |
| integral Lipschitz filters are stable to dilations | complete | 🧮 |
| integral Lipschitz filter | complete | 💡 |
| inductive learning | complete | 💡 |
| information-theoretic threshold | complete | 💡 |
| hypothesis class | complete | 💡 |
| homomorphism density | complete | 💡 |
| graph | complete | 💡 |
| graph-level problem | complete | 💡 |
| graph signals | complete | 💡 |
| graph signal processing problem | complete | 💡 |
| graph shift operator | complete | 💡 |
| graph perceptron | complete | 💡 |
| graph isomorphism | complete | 💡 |
| graph Laplacian | empty | 💡 |
| graph isomorphism is not solvable in polynomial time | complete | 💡 |
| graph homomorphism | complete | 💡 |
| graph Fourier transform | complete | 💡 |
| graph convolutions are stable to perturbations in the data and coefficients | complete | 💡 |
| graph convolutional network | complete | 💡 |
| graph convolution | complete | 💡 |
| graph automorphism | complete | 💡 |
| graph attention model | complete | 💡 |
| GraphSAGE | complete | 💡 |
| fully connected readout layer | complete | 💡 |
| filter permutation invariance | complete | 💡 |
| feature-aware spectral embeddings | complete | 💡 |
| eigenvector misalignment | complete | 💡 |
| discriminability of a graph filter | in progress | 💡 |
| degree matrix | complete | 💡 |
| cycle homomorphism density is given by the trace of adjacency matrix powers | complete | 💡 |
| coordinate representation | complete | 💡 |
| convolutional graph filters are shift equivariant | complete | 🧮 |
| convolutional graph filters are permutation equivariant | complete | 🧮 |
| convolutional graph filters are local | complete | 🧮 |
| convolutional filter bank | complete | 💡 |
| contextual stochastic block model | empty | 💡 |
| conditions for finding a convolutional graph filter | complete | 🧮 |
| computational graph | complete | 💡 |
| compressed sparse row representation | complete | 💡 |
| color refinement algorithm | in progress | 💡 |
| Chebyshev polynomials are orthogonal | complete | 💡 |
| Chebyshev equioscillation theorem | complete | 🧮 |
| balanced stochastic block model | complete | 💡 |
| approximation of Heaviside functions using convolutional graph filters | empty | 🧮 |
| analytic function | complete | 💡 |
| almost exact recovery | complete | 💡 |
| almost exact recovery is impossible when the signal-to-noise ratio is less than the threshold | complete | 🧮 |
| aggregation readout layer | complete | 💡 |
| adjacency matrix | complete | 💡 |
| Weisfeiler-Leman Graph Isomorphism Test | in progress | 💡 |
| We can verify that graphs without node features and with different Laplacian eigenvalues are not isomorphic | complete | 🧮 |
| MPNNs can be expressed as graph convolutions | complete | 💡 |
| Lipschitz filters are stable to additive perturbations | complete | 🧮 |
| Lipschitz continuous | complete | 💡 |
| Graph Neural Networks | complete | 💡 |
| Graph Isomorphism Network | complete | 💡 |
| GNNs perform better than their constituent filters | complete | 💡 |
| GNNs inherit stability from their layers | complete | 🧮 |
| GINs are maximally powerful for anonymous input graphs | complete | 💡 |
| GCN layers can be written as graph convolutions | complete | 💡 |
| Empirical risk minimization problem | empty | 💡 |
|
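Several entries in the table (graph shift operator, graph convolution, convolutional filter bank, graph perceptron, convolutional graph filters are local) revolve around one core operation. As a quick reference, here is a minimal numpy sketch of the convolutional graph filter H(S)x = Σₖ hₖ Sᵏ x; the graph, signal, and filter taps below are illustrative placeholders, not values taken from the notes themselves.

```python
import numpy as np

def graph_convolution(S, x, h):
    """Apply the graph convolution H(S) x = sum_k h[k] * S^k x.

    S: graph shift operator (e.g., adjacency or Laplacian matrix).
    x: graph signal, one value per node.
    h: filter taps h[0], ..., h[K-1].
    """
    z = np.zeros_like(x, dtype=float)
    shifted = x.astype(float)          # S^0 x = x
    for k, h_k in enumerate(h):
        if k > 0:
            shifted = S @ shifted      # build S^k x from S^(k-1) x
        z += h_k * shifted
    return z

# Toy example: undirected 4-node path graph, adjacency matrix as S.
S = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
x = np.array([1.0, 0.0, 0.0, 0.0])     # impulse signal at node 0
h = [0.5, 0.3, 0.2]                    # illustrative filter taps (K = 3)

y = graph_convolution(S, x, h)
# A graph perceptron composes the filter with a pointwise nonlinearity:
y_relu = np.maximum(y, 0.0)
print(y)        # filtered signal
print(y_relu)   # graph perceptron output
```

With K = 3 taps the output stays confined to the 2-hop neighborhood of the impulse at node 0, which is the locality property the "convolutional graph filters are local" entry refers to.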