Improving GNNs
Using Subgraphs for More Expressive GNNs
The expressive power of Message-Passing Graph Neural Networks is inherently limited due to their equivalence to the Weisfeiler-Lehman graph isomorphism test. Several recent concurrent works show that this limitation can be overcome by applying a GNN to a collection of subgraphs obtained by removing nodes or edges from the input graph. In this post, we review the common themes and nuances of these different approaches.

This post was co-authored with Leonardo Cotta, Fabrizio Frasca, Haggai Maron, Christopher Morris, and Lingxiao Zhao. See also an introduction to the Weisfeiler-Lehman test and structural encoding.
Message-passing Graph Neural Networks (MPNNs) [1] consist of several layers performing node-wise aggregation of information from neighbouring nodes. The appealing characteristics of such architectures are their locality (i.e., every computation requires only the immediate neighbours of a node) and their linear complexity in the number of edges [2]. The drawback, however, is their limited expressive power: it was shown in [3–4] that MPNNs are at most as expressive as the Weisfeiler-Lehman (WL) graph isomorphism test [5], a classical method attempting to determine whether two graphs are structurally equivalent (“isomorphic”) by means of iterative colour refinement [6].
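To make the locality and linear-edge-complexity points concrete, here is a minimal sketch of a single message-passing layer. The graph representation (an edge list), the function name, and the sum aggregator are our own illustrative choices, not the API of any particular GNN library:

```python
# A minimal sketch of one message-passing layer (sum aggregation),
# assuming the graph is given as a list of directed (src, dst) edges
# and node features as a NumPy array. Names here are illustrative.
import numpy as np

def mpnn_layer(x, edges, w_self, w_neigh):
    # x: (num_nodes, d) node features.
    # Each node aggregates messages from its immediate neighbours only;
    # the single pass over the edge list is linear in the number of edges.
    agg = np.zeros_like(x)
    for src, dst in edges:
        agg[dst] += x[src]
    # Combine aggregated messages with the node's own features (ReLU).
    return np.maximum(x @ w_self + agg @ w_neigh, 0.0)
```

One layer updates each node from its 1-hop neighbourhood; stacking k such layers lets information propagate k hops, which is what the “several layers” above refers to.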
Passing the Weisfeiler-Lehman test is a necessary but insufficient condition for graph isomorphism: some non-isomorphic graphs produce identical colourings and are therefore indistinguishable by the test [7]. One such example is shown in the following figure:
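The colour-refinement procedure behind the WL test, and its failure mode, can be sketched in a few lines. This is a simplified illustration under our own assumptions (adjacency-dict graphs, hash-based colour relabelling), not a reference implementation:

```python
# A sketch of 1-WL colour refinement, assuming graphs are given as
# adjacency lists (dict: node -> list of neighbours).
from collections import Counter

def wl_colours(adj, iterations=3):
    # Start from a uniform colour for every node.
    colours = {v: 0 for v in adj}
    for _ in range(iterations):
        # Refine: a node's new colour encodes its old colour together
        # with the multiset of its neighbours' colours.
        colours = {
            v: hash((colours[v], tuple(sorted(colours[u] for u in adj[v]))))
            for v in adj
        }
    # The colour histogram is the invariant the test compares.
    return Counter(colours.values())

def wl_test(adj_a, adj_b, iterations=3):
    # False means "certainly non-isomorphic"; True only means
    # "possibly isomorphic" -- the test can fail to distinguish.
    return wl_colours(adj_a, iterations) == wl_colours(adj_b, iterations)
```

For instance, a 6-cycle and two disjoint triangles are both 2-regular, so every node keeps the same colour at every step and `wl_test` wrongly reports them as possibly isomorphic, illustrating the insufficiency above.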

In graph theory, the graph isomorphism test originally introduced by Boris Weisfeiler and Andrei Lehman in the 1960s was extended in the late 1970s [8] to a hierarchy of increasingly powerful higher-dimensional k-WL tests that operate on k-tuples of nodes. The authors of this post, Christopher Morris and Haggai Maron — along with their…