InvertibleNetworks.jl – Memory efficient deep learning in Julia

  • Philipp A. Witte,
  • Mathias Louboutin,
  • Ali Siahkoohi,
  • Felix J. Herrmann,
  • Gabrio Rizzuti,
  • Bas Peters

JuliaCon 2021

We present InvertibleNetworks.jl, an open-source package for invertible neural networks and normalizing flows with memory-efficient backpropagation. InvertibleNetworks.jl uses manually implemented gradients that exploit the invertibility of its building blocks, which allows the package to scale to large problem sizes. We present the architecture and features of the library and demonstrate its application to a variety of problems, ranging from loop unrolling to uncertainty quantification.
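
To illustrate the idea behind memory-efficient backpropagation, the sketch below implements an additive coupling layer in plain Julia. Because the layer is exactly invertible, the backward pass recomputes the layer's input from its output instead of storing activations during the forward pass. This is an illustrative sketch under stated assumptions: the `AdditiveCoupling` type, the `tanh`-based transform, and the `forward`/`inverse`/`backward` names are made up for this example and are not the actual InvertibleNetworks.jl API.

```julia
# Additive coupling layer: split the input into halves (x1, x2),
# apply y1 = x1 and y2 = x2 + t(x1). The map is exactly invertible.
struct AdditiveCoupling
    W::Matrix{Float64}
end

# Hypothetical learned transform t; here a single tanh-activated dense map.
t(layer::AdditiveCoupling, x1) = tanh.(layer.W * x1)

# Forward pass: the input does not need to be kept around afterwards.
function forward(layer::AdditiveCoupling, x1, x2)
    return x1, x2 .+ t(layer, x1)
end

# Inverse: recover the input exactly from the output.
function inverse(layer::AdditiveCoupling, y1, y2)
    return y1, y2 .- t(layer, y1)
end

# Backward pass: given output gradients (Δy1, Δy2) and the *output* (y1, y2),
# first recompute the input via the inverse, then apply the hand-derived
# vector-Jacobian products. No forward activations are stored.
function backward(layer::AdditiveCoupling, Δy1, Δy2, y1, y2)
    x1, _ = inverse(layer, y1, y2)        # recompute the input, don't store it
    u = layer.W * x1
    s = (1 .- tanh.(u).^2) .* Δy2         # gradient through tanh
    Δx1 = Δy1 .+ layer.W' * s             # x1 feeds both y1 and y2
    Δx2 = Δy2                             # y2 depends linearly on x2
    ΔW = s * x1'                          # weight gradient (outer product)
    return Δx1, Δx2, ΔW
end

# Usage: only the output is needed to backpropagate.
layer = AdditiveCoupling(0.1 * randn(8, 8))
y1, y2 = forward(layer, randn(8), randn(8))   # inputs can now be discarded
Δx1, Δx2, ΔW = backward(layer, randn(8), randn(8), y1, y2)
```

Chaining layers of this form keeps the memory footprint of backpropagation constant in network depth, since each layer's input is reconstructed on the fly from the next layer's recomputed state; this is the property that lets invertible networks scale to large problem sizes.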