2025-02-05 graphs lecture 5

Date: 2025-02-05


1. Graph Signals and Graph Signal Processing

Graph filters work well for many tasks, but they are limited to linear representations. Therefore, they lack expressivity when the task is complicated: since these operations are only linear, we are not able to represent more complex relationships.

Thus, we need to add nonlinearities (e.g., ReLU) to improve the expressivity of our model. These techniques have been implemented in CNNs for image processing and other domains. GNNs extend the ideas seen in CNNs to the graph domain via graph convolution.

Simple CNN

2025-02-05_graph-1.png

We can also think of any image as a graph, where the graph signals are the RGB values at each pixel, and the graph structure is determined by the pixel layout/grid of the image.
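As a concrete illustration of this view, here is a minimal NumPy sketch that builds the pixel-grid shift operator and the corresponding RGB graph signal; `grid_adjacency` and the 4-neighbor connectivity are illustrative assumptions, not from the lecture.

```python
import numpy as np

def grid_adjacency(height, width):
    """Adjacency matrix of the 4-neighbor pixel grid (hypothetical helper)."""
    n = height * width
    A = np.zeros((n, n))
    for r in range(height):
        for c in range(width):
            i = r * width + c
            if c + 1 < width:          # right neighbor
                A[i, i + 1] = A[i + 1, i] = 1
            if r + 1 < height:         # bottom neighbor
                A[i, i + width] = A[i + width, i] = 1
    return A

# A 2x3 RGB image as a graph: 6 nodes, 3 features (R, G, B) per node.
image = np.random.rand(2, 3, 3)        # (height, width, channels)
X = image.reshape(-1, 3)               # graph signal: one row per pixel
S = grid_adjacency(2, 3)               # graph shift operator (adjacency)
```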

Graph Perceptron

Let $\sigma : \mathbb{R} \to \mathbb{R}$. For $x \in \mathbb{R}^n$, the graph perceptron is defined as

$$u = \underbrace{\sum_{k=0}^{K-1} h_k S^k x}_{\text{graph convolution}}, \qquad y = \underbrace{\sigma(u)}_{\text{pointwise nonlinearity}}$$

i.e., a graph convolution followed by a pointwise nonlinear function, applied entrywise:

$$y_i = \sigma(u_i)$$
Examples

  • $\sigma(x) = \max(0, x)$ (ReLU)
  • $\sigma(x) = \dfrac{1}{1 + e^{-x}}$ (sigmoid)
  • $\sigma(x) = \tanh(x) = \dfrac{e^x - e^{-x}}{e^x + e^{-x}}$ (hyperbolic tangent)

(see graph perceptron)
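To make the definition concrete, the following is a minimal NumPy sketch of a graph perceptron with a ReLU default; the function name and the random test graph are illustrative assumptions.

```python
import numpy as np

def graph_perceptron(x, S, h, sigma=lambda u: np.maximum(0.0, u)):
    """y = sigma(sum_k h[k] * S^k @ x); sigma is ReLU by default."""
    u = np.zeros_like(x)
    Skx = x.copy()                  # S^0 x
    for hk in h:
        u += hk * Skx               # accumulate h_k S^k x
        Skx = S @ Skx               # advance to S^{k+1} x
    return sigma(u)

# Example on a random 5-node graph with K = 3 filter taps.
n, K = 5, 3
S = np.random.rand(n, n)
x = np.random.rand(n)
h = np.random.rand(K)
y = graph_perceptron(x, S, h)
```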

Question

Is a graph perceptron local?

Answer

Yes. Convolutional graph filters are local, and the nonlinearity is applied pointwise, so each output $y_i = \sigma(u_i)$ depends only on values within the $(K-1)$-hop neighborhood of node $i$. The perceptron can therefore still be computed with local information.

Multi-Layer Graph Perceptron

An $L$-layer multi-layer graph perceptron (MLGP) is simply several graph perceptrons in sequence. This forms a function with input $x_0 = x$ and output $y = x_L$,

where at each layer, $$x_{\ell} = \sigma(u_{\ell}) = \sigma\left( \sum_{k=0}^{K-1} h_{\ell,k} S^k x_{\ell-1} \right), \quad 1 \leq \ell \leq L$$
We call $x_\ell$ the $\ell$-th layer embedding.

For conciseness, we define our weights $H = \{ h_{\ell,k} \mid 0 \leq k \leq K-1,\ 1 \leq \ell \leq L \}$ and represent our MLGP as $$y = \Phi(x, S; H)$$

Illustration

2025-02-05_graph-2.png

(see multi-layer graph perceptron)
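Below is a minimal NumPy sketch of $y = \Phi(x, S; H)$, assuming $H$ is stored as a list of per-layer tap vectors; the function name and the tanh default are illustrative choices.

```python
import numpy as np

def mlgp(x, S, H, sigma=np.tanh):
    """y = Phi(x, S; H): L graph perceptrons in sequence.
    H is a list of length L; H[l] holds the K taps of layer l+1."""
    for h_layer in H:               # one graph perceptron per layer
        u = np.zeros_like(x)
        Skx = x.copy()              # S^0 x_{l-1}
        for hk in h_layer:
            u += hk * Skx
            Skx = S @ Skx
        x = sigma(u)                # x_l = sigma(u_l)
    return x                        # y = x_L

n, K, L = 5, 3, 2
S = np.random.rand(n, n)
x0 = np.random.rand(n)
H = [np.random.rand(K) for _ in range(L)]
y = mlgp(x0, S, H)
```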

Full-Fledged GNNs

Real-world graph data is often high dimensional.

Example

2025-02-05_graph-3.png
Suppose $a, b, c$ are drones communicating via WiFi, and $x$ collects their spatial coordinates in 3D: $x \in \mathbb{R}^{3 \times 3}$ (one row per drone, one column per coordinate).

To process data like this, we need graph convolutions that take in multi-dimensional inputs.

Convolutional Filter Bank

Let $X \in \mathbb{R}^{n \times d}$ and $H_k \in \mathbb{R}^{d \times d'}$ for $k = 0, \ldots, K-1$. Then a convolutional filter bank (or multiple-input multiple-output graph convolution) is a multi-dimensional graph convolution given by

$$Y = \sum_{k=0}^{K-1} S^k X H_k$$
  • $S^k \in \mathbb{R}^{n \times n}$ is still a graph diffusion/shift
  • $H_k$ is a linear transformation mapping the $d$ input features to $d'$ output features

(see convolutional filter bank)
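A minimal NumPy sketch of the filter bank $Y = \sum_k S^k X H_k$ follows; each shifted signal $S^k X$ is computed incrementally rather than by forming matrix powers. Names and shapes are illustrative assumptions.

```python
import numpy as np

def filter_bank(X, S, H):
    """MIMO graph convolution: Y = sum_k S^k @ X @ H[k].
    X: (n, d) input signal; H[k]: (d, d_out) feature mixing."""
    SkX = X.copy()                       # S^0 X
    Y = SkX @ H[0]
    for Hk in H[1:]:
        SkX = S @ SkX                    # advance to S^{k+1} X
        Y += SkX @ Hk
    return Y

n, d, d_out, K = 5, 3, 4, 3
S = np.random.rand(n, n)
X = np.random.rand(n, d)
H = [np.random.rand(d, d_out) for _ in range(K)]
Y = filter_bank(X, S, H)                 # (n, d_out)
```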

GNN

A "full-fledged" GNN layer is given by

$$X_{\ell} = \sigma(U_{\ell}) = \sigma\left( \sum_{k=0}^{K-1} S^k X_{\ell-1} H_{\ell,k} \right)$$

with $H_{\ell,k} \in \mathbb{R}^{d_{\ell-1} \times d_{\ell}}$ for $1 \leq \ell \leq L$, where

  • $X_0 = X \in \mathbb{R}^{n \times d_0}$
  • $Y = X_L \in \mathbb{R}^{n \times d_L}$

Here, $X_\ell$ is still called the $\ell$-th layer embedding.

(see Graph Neural Networks)
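Putting the pieces together, here is a minimal NumPy sketch of a full GNN forward pass $Y = \Phi(X, S; H)$ built from per-layer filter banks; the layer widths and the ReLU choice are illustrative assumptions.

```python
import numpy as np

def gnn_forward(X, S, H, sigma=lambda u: np.maximum(0.0, u)):
    """Full GNN: L layers of filter bank + pointwise nonlinearity.
    H[l][k]: (d_l, d_{l+1}) matrix, the k-th tap of layer l+1."""
    for H_layer in H:                    # X_l = sigma(sum_k S^k X_{l-1} H_{l,k})
        SkX = X.copy()                   # S^0 X_{l-1}
        U = SkX @ H_layer[0]
        for Hk in H_layer[1:]:
            SkX = S @ SkX
            U += SkX @ Hk
        X = sigma(U)
    return X                             # Y = X_L

# Layer widths d0 -> d1 -> d2 on a 5-node graph, K = 3 taps per layer.
n, K, dims = 5, 3, [3, 8, 2]
S = np.random.rand(n, n)
X0 = np.random.rand(n, dims[0])
H = [[np.random.rand(dims[l], dims[l + 1]) for _ in range(K)]
     for l in range(len(dims) - 1)]
Y = gnn_forward(X0, S, H)                # (n, 2)
```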