Abstract: Message passing algorithms are at the core of most neural networks that process information on graphs. Conventionally, such methods are invariant under permutation of the incoming messages and hence forget how information flows through the network. By analyzing the local symmetries of the graph, we show that a more general message passing network can in fact be sensitive to the flow of information by using different kernels on different edges. This yields an equivariant message passing algorithm that is more expressive than conventional invariant message passing, overcoming fundamental limitations of the latter. We derive the weight sharing and kernel constraints by modeling the symmetries using elementary category theory and show that equivariant kernels are “just” natural transformations between two functors. This general formulation, which we call Natural Networks, provides a unified theory that captures many distinct forms of equivariant neural networks.
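To make the contrast concrete, here is a minimal sketch in standard message passing notation; the symbols $h_v$, $N(v)$, $\sigma$, and the kernels $K$, $K_{uv}$ are illustrative and not taken from the paper. Conventional invariant message passing aggregates neighbour features with a single shared kernel,
$$h_v' = \sigma\!\Big(\sum_{u \in N(v)} K\, h_u\Big),$$
whereas an equivariant scheme of the kind described above assigns each edge its own kernel, subject to constraints derived from the local symmetries of the graph,
$$h_v' = \sigma\!\Big(\sum_{u \in N(v)} K_{uv}\, h_u\Big).$$
Because $K_{uv}$ can differ across edges, the update is no longer blind to which neighbour a message came from, which is the source of the added expressivity.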