Thoughts and Theory, Rethinking GNNs
Graph Neural Networks as Neural Diffusion PDEs
Graph neural networks (GNNs) are intimately related to the differential equations governing information diffusion on graphs. Thinking of GNNs as partial differential equations (PDEs) leads to a broad new class of GNNs that can address, in a principled way, some of the prominent issues of current Graph ML models, such as depth, oversmoothing, bottlenecks, and graph rewiring.
This blog post was co-authored with Ben Chamberlain and James Rowbottom, and is based on our paper B. Chamberlain, J. Rowbottom, et al., GRAND: Graph Neural Diffusion, ICML (2021).

In March 1701, the Philosophical Transactions of the Royal Society published an anonymous note in Latin titled “A Scale of the Degrees of Heat” [1]. Though no name was indicated, it was no secret that Isaac Newton was the author (he would become “Sir Isaac” four years later). In a series of experiments, Newton observed that
the temperature a hot body loses in a given time is proportional to the temperature difference between the body and its environment.
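In modern notation, this observation, now known as Newton's law of cooling, can be written as a simple ordinary differential equation; the symbols below (T for the body's temperature, T_env for the ambient temperature, and k for the proportionality coefficient) are our shorthand rather than Newton's own:

$$\frac{dT(t)}{dt} = -k\,\bigl(T(t) - T_{\text{env}}\bigr)$$

This proportionality between the rate of change and a difference in temperature is the starting point for the diffusion equations discussed in the rest of this post.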