Implicit Neural Representations for Deformable Image Registration
Jelmer M. Wolterink, Jesse C. Zwienenberg, Christoph Brune
Deformable medical image registration has in recent years been revolutionized by deep learning with convolutional neural networks. These methods surpass conventional image registration techniques in speed but not in accuracy. Here, we present an alternative approach to leveraging neural networks for image registration. Instead of using a neural network to predict the transformation between images, we optimize a neural network to represent this continuous transformation. Using recent insights from differentiable rendering, we show how such an implicit deformable image registration (IDIR) model can be naturally combined with regularization terms based on standard automatic differentiation techniques. We demonstrate the effectiveness of this model on 4D chest CT registration in the DIR-LAB data set and find that a single three-layer multi-layer perceptron with periodic activation functions outperforms all published deep learning-based methods, without any folding and without the need for training data. The model is flexible enough to be extended to include different losses, regularizers, and optimization schemes, and is implemented using standard deep learning libraries.
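To make the core idea concrete, the following is a minimal sketch of an implicit deformation field: a small multi-layer perceptron with periodic (sine) activations, in the style of SIREN networks, that maps a 3D coordinate x to a displacement u(x), so the transformed coordinate is x + u(x). This is an illustrative NumPy stand-in, not the authors' implementation; the layer sizes, the frequency factor omega_0, and the initialization scheme are assumptions based on common SIREN conventions, and in practice the network would be built in a deep learning framework so that regularizers (e.g. on the Jacobian of the deformation) can be computed by automatic differentiation.

```python
import numpy as np

rng = np.random.default_rng(0)
omega_0 = 30.0  # frequency scaling commonly used in SIREN-style networks

def siren_layer(in_dim, out_dim, first=False):
    # SIREN-style uniform initialization; the first layer uses a wider range.
    c = 1.0 / in_dim if first else np.sqrt(6.0 / in_dim) / omega_0
    W = rng.uniform(-c, c, size=(in_dim, out_dim))
    b = np.zeros(out_dim)
    return W, b

# Three layers mapping a 3D coordinate to a 3D displacement vector.
layers = [siren_layer(3, 64, first=True),
          siren_layer(64, 64),
          siren_layer(64, 3)]

def deformation(x):
    """Map coordinates x of shape (N, 3) in [-1, 1]^3 to displacements u(x)."""
    h = x
    for i, (W, b) in enumerate(layers):
        h = h @ W + b
        if i < len(layers) - 1:      # sine activation on hidden layers only
            h = np.sin(omega_0 * h)
    return h

coords = rng.uniform(-1.0, 1.0, size=(5, 3))  # sampled voxel coordinates
displaced = coords + deformation(coords)      # transformed coordinates x + u(x)
print(displaced.shape)
```

During registration, the weights of such a network would be optimized directly per image pair (no training data needed), with an image similarity loss evaluated at sampled coordinates and regularization terms obtained by differentiating the network output with respect to its input.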
Wednesday 6th July
Oral Session 1.3 - 16:20 - 17:20 (UTC+2)