[[concept]] fully connected readout layer
In a fully connected readout layer we define
$$\hat{y} = \sigma\big(\Theta \,\mathrm{vec}(X_L)\big)$$
where $X_L \in \mathbb{R}^{N \times F}$ is the node-feature matrix from the last GNN layer ($N$ nodes, $F$ features per node) and $\Theta$ is a learnable weight matrix (a short sketch follows the list below). Here
- $\mathrm{vec}(\cdot)$ in general vectorizes matrices into vectors
- $\sigma(\cdot)$ can be the identity or some other pointwise nonlinearity (ReLU, softmax, etc.)
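A minimal NumPy sketch of this readout, assuming $N=5$ nodes, $F=3$ features per node, $C=2$ outputs, and ReLU as $\sigma$ (these sizes and the choice of nonlinearity are illustrative assumptions, not from the source):

```python
import numpy as np

rng = np.random.default_rng(0)
N, F, C = 5, 3, 2                        # nodes, features per node, outputs (assumed sizes)
X_L = rng.standard_normal((N, F))        # node features from the last GNN layer
Theta = rng.standard_normal((C, N * F))  # learnable readout weights: note the N*F columns

def relu(z):
    return np.maximum(z, 0.0)            # sigma: a pointwise ReLU in this sketch

y_hat = relu(Theta @ X_L.flatten(order="F"))  # vec(X_L) stacks columns, then a dense map
print(y_hat.shape)                            # (C,)
```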
Note: there are some downsides of a fully connected readout layer:
- The number of parameters depends on $N$: $\Theta$ needs $NF$ weights per output, which grows with the graph size. This is not amenable to large graphs.
No longer permutation invariant because of the operation
fully connected readout layers are no longer permutation invariant
Verify that
- No longer transferable across graphs.
    - Unlike the parameters of the GNN layers, $\Theta$ depends on $N$. So if the number of nodes changes, we have to relearn $\Theta$.
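A quick numerical check of the non-invariance and the $N$-dependence, reusing the assumed shapes from the sketch above:

```python
import numpy as np

rng = np.random.default_rng(0)
N, F, C = 5, 3, 2
X_L = rng.standard_normal((N, F))
Theta = rng.standard_normal((C, N * F))

def fc_readout(X):
    return Theta @ X.flatten(order="F")  # sigma = identity for this check

# Relabel the nodes: same features, different ordering.
perm = rng.permutation(N)
print(np.allclose(fc_readout(X_L), fc_readout(X_L[perm])))  # False: not permutation invariant

# Theta is tied to N: for a graph with 2N nodes, vec(X_L) has 2*N*F entries,
# so Theta (shape C x N*F) cannot even be applied, let alone transferred.
X_bigger = rng.standard_normal((2 * N, F))
# fc_readout(X_bigger)  # would raise a ValueError (shape mismatch)
```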
These drawbacks make the fully connected readout a not-so-attractive option, so we usually use an aggregation readout layer instead (sketched below).
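For contrast, a sketch of a simple aggregation readout (here a mean over nodes followed by a dense map; the choice of mean and the shapes are assumptions): its parameters do not depend on $N$, the output is permutation invariant, and the same weights apply to graphs of any size.

```python
import numpy as np

rng = np.random.default_rng(0)
N, F, C = 5, 3, 2
X_L = rng.standard_normal((N, F))
W = rng.standard_normal((C, F))          # only C*F parameters: independent of N

def agg_readout(X):
    return W @ X.mean(axis=0)            # aggregate over nodes, then a small dense map

perm = rng.permutation(N)
print(np.allclose(agg_readout(X_L), agg_readout(X_L[perm])))  # True: permutation invariant
print(agg_readout(rng.standard_normal((2 * N, F))).shape)     # (C,): same W works for 2N nodes
```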
Mentions
```dataview
TABLE
FROM [[]]
FLATTEN choice(contains(artist, this.file.link), 1, "") + choice(contains(author, this.file.link), 1, "") + choice(contains(director, this.file.link), 1, "") + choice(contains(source, this.file.link), 1, "") as direct_source
WHERE !direct_source
```