Neural Message Passing for Quantum Chemistry

A unified framework for graph neural networks applied to molecular property prediction

Neural Message Passing for Quantum Chemistry introduced the Message Passing Neural Network (MPNN) framework, unifying various graph neural network architectures under a single formalism. It achieved state-of-the-art results on molecular property prediction.

The Problem

Molecules are naturally graphs: atoms are nodes, bonds are edges. Predicting properties like energy or toxicity requires learning from this graph structure.

Message Passing Framework

MPNNs operate in two phases:

1. Message Passing Phase

For T time steps, each node collects messages from its neighbors:

m_v^{t+1} = \sum_{u \in N(v)} M_t(h_v^t, h_u^t, e_{vu})

Then updates its hidden state:

h_v^{t+1} = U_t(h_v^t, m_v^{t+1})
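
A minimal sketch of one such step, assuming a toy three-node graph with dense features. The single-linear-layer message and update functions here are illustrative placeholders, not the paper's architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 3 nodes, edges 0-1 and 1-2 (stored in both directions).
neighbors = {0: [1], 1: [0, 2], 2: [1]}
hidden_dim, edge_dim = 4, 2

h = rng.normal(size=(3, hidden_dim))                   # node hidden states h_v^t
e = {(v, u): rng.normal(size=edge_dim)                 # edge features e_vu
     for v in neighbors for u in neighbors[v]}

# Illustrative message function M_t: one linear layer over [h_v, h_u, e_vu].
W_m = rng.normal(size=(2 * hidden_dim + edge_dim, hidden_dim))
def message(h_v, h_u, e_vu):
    return np.concatenate([h_v, h_u, e_vu]) @ W_m

# Illustrative update function U_t: one linear layer over [h_v, m_v], plus tanh.
W_u = rng.normal(size=(2 * hidden_dim, hidden_dim))
def update(h_v, m_v):
    return np.tanh(np.concatenate([h_v, m_v]) @ W_u)

# One step: m_v^{t+1} = sum_{u in N(v)} M(h_v, h_u, e_vu), then h_v^{t+1} = U(h_v, m_v^{t+1}).
m = np.stack([sum(message(h[v], h[u], e[(v, u)]) for u in neighbors[v])
              for v in neighbors])
h = np.stack([update(h[v], m[v]) for v in neighbors])
```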

2. Readout Phase

Aggregate node features into a graph-level prediction:

\hat{y} = R(\{h_v^T \mid v \in G\})
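
A matching sketch of the readout, assuming the final node states produced by the message-passing phase above. The sum-then-linear head is one common, illustrative choice for R, not the paper's exact readout:

```python
import numpy as np

rng = np.random.default_rng(1)
hidden_dim = 4

# Final node states h_v^T for a 3-atom graph (in practice, the output of T message-passing steps).
h_final = rng.normal(size=(3, hidden_dim))

# Readout R: permutation-invariant sum over nodes, followed by a linear regression head.
W_r = rng.normal(size=(hidden_dim, 1))
graph_embedding = h_final.sum(axis=0)        # invariant to the ordering of atoms
y_hat = (graph_embedding @ W_r).item()       # scalar molecular property prediction
print(y_hat)
```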

Interactive Demo

Watch messages flow through a molecular graph:

[Animated demo: a formaldehyde (CH₂O) molecular graph steps through the message-collection phase, m_v = Σ_{u∈N(v)} M(h_v, h_u, e_vu), over three layers; legend: M = message function, U = update function, R = readout function]

Key Components

Component | Function | Common Choices
--- | --- | ---
M (Message) | Computes edge messages | MLP, attention
U (Update) | Updates node states | GRU, LSTM
R (Readout) | Graph-level output | Sum, attention
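
One way these common choices might be wired together in PyTorch is sketched below; the dimensions, the three propagation steps, and the sum-plus-linear readout are assumptions for illustration rather than the paper's exact model:

```python
import torch
import torch.nn as nn

hidden_dim, edge_dim = 16, 4

message_fn = nn.Sequential(                        # M: MLP over [h_v, h_u, e_vu]
    nn.Linear(2 * hidden_dim + edge_dim, hidden_dim), nn.ReLU(),
    nn.Linear(hidden_dim, hidden_dim),
)
update_fn = nn.GRUCell(hidden_dim, hidden_dim)     # U: GRU, with the aggregated message as input
readout_fn = nn.Linear(hidden_dim, 1)              # R: sum pooling followed by a linear head

def mpnn_forward(h, edge_index, edge_attr, steps=3):
    # h: [num_nodes, hidden_dim]; edge_index: [2, num_edges] as (senders, receivers);
    # edge_attr: [num_edges, edge_dim].
    senders, receivers = edge_index
    for _ in range(steps):
        msgs = message_fn(torch.cat([h[receivers], h[senders], edge_attr], dim=-1))
        m = torch.zeros_like(h).index_add_(0, receivers, msgs)   # m_v = sum of incoming messages
        h = update_fn(m, h)                                      # h_v <- U(h_v, m_v)
    return readout_fn(h.sum(dim=0))                              # graph-level prediction

# Tiny usage example: a 3-node path graph with edges in both directions.
h0 = torch.randn(3, hidden_dim)
edge_index = torch.tensor([[0, 1, 1, 2], [1, 0, 2, 1]])
edge_attr = torch.randn(4, edge_dim)
print(mpnn_forward(h0, edge_index, edge_attr))
```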

Unifying Prior Work

The MPNN framework encompasses:

  • Convolutional Networks on Graphs (Duvenaud et al.)
  • Gated Graph Neural Networks (Li et al.)
  • Interaction Networks (Battaglia et al.)
  • Deep Tensor Neural Networks (Schütt et al.)

Each is an MPNN with specific M, U, and R functions.
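
For example, the Gated Graph Neural Network of Li et al. corresponds, in the paper's taxonomy, to a per-edge-type linear message function combined with a GRU update:

M_t(h_v^t, h_u^t, e_{vu}) = A_{e_{vu}} h_u^t, \qquad U_t(h_v^t, m_v^{t+1}) = \mathrm{GRU}(h_v^t, m_v^{t+1})

where A_{e_{vu}} is a learned matrix for each edge (bond) type.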

Virtual Graph Elements

The paper introduced “virtual edges” connecting all atom pairs:

e_{vu} = (d_{vu}, \text{bond type})

where d_{vu} is the 3D interatomic distance. This allows the network to reason about non-bonded interactions.
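
A small sketch of how such all-pairs edge features might be assembled, assuming 3D atom coordinates and an integer bond-type matrix are available; giving non-bonded pairs their own "no bond" category is the illustrative stand-in for the paper's virtual edges:

```python
import numpy as np

def build_edge_features(coords, bond_types, num_bond_types=4):
    # coords: [n, 3] atom positions; bond_types: [n, n] ints, 0 meaning "no covalent bond".
    # Returns e[v, u] = (d_vu, one-hot bond type), so every atom pair carries a feature
    # and non-bonded pairs fall into a dedicated "virtual edge" category.
    dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)  # d_vu
    one_hot = np.eye(num_bond_types + 1)[bond_types]                          # [n, n, num_bond_types + 1]
    return np.concatenate([dists[..., None], one_hot], axis=-1)

# Formaldehyde-like toy input: C, O, H, H with one double bond (2) and two single bonds (1).
coords = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 1.2], [0.9, 0.0, -0.6], [-0.9, 0.0, -0.6]])
bonds = np.array([[0, 2, 1, 1], [2, 0, 0, 0], [1, 0, 0, 0], [1, 0, 0, 0]])
print(build_edge_features(coords, bonds).shape)   # (4, 4, 6)
```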

Results on QM9

Predicting molecular properties on the QM9 dataset:

Property | Units | MPNN Error
--- | --- | ---
HOMO | eV | 0.043
LUMO | eV | 0.038
Gap | eV | 0.066
μ | Debye | 0.030

The best MPNN variant reached chemical accuracy on 11 of the 13 QM9 targets.

Why This Matters

  1. Unified framework: Clarified the design space of graph neural networks
  2. Practical impact: Enabled ML-accelerated drug discovery
  3. Architectural insight: Showed that message passing is the key inductive bias

Legacy

MPNNs became the foundation for:

  • SchNet — Continuous-filter convolutions
  • DimeNet — Directional message passing
  • Equivariant GNNs — Respecting 3D symmetries

Key Paper

Gilmer, J., Schoenholz, S. S., Riley, P. F., Vinyals, O., & Dahl, G. E. (2017). Neural Message Passing for Quantum Chemistry. Proceedings of the 34th International Conference on Machine Learning (ICML). arXiv:1704.01212.