Jack Richter-Powell

Hi, I’m Jack Richter-Powell. Currently, I’m a PhD student at the Massachusetts Institute of Technology under the supervision of Justin Solomon. Before that, I was a visiting researcher under the supervision of David Duvenaud at the Vector Institute from 2022 to 2023. Concurrently, I worked out of Mila in the group headed by Yoshua Bengio for the summer of 2023.

My research involves developing principled, structured learning algorithms with strong inductive biases drawn from differential geometry, PDE theory and, more recently, physics. Put simply: don’t learn principles you already know – save the learning for the hard parts.

An example of this can be seen in our NeurIPS 2022 paper, Neural Conservation Laws (with Ricky Chen and Yaron Lipman). A brief synopsis: we exploited the classical identity d² = 0 to parameterize exact solutions of the continuity equation. This lets us enforce conservation of mass by construction, for applications like variable-density fluid simulation and dynamical optimal transport.
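The idea can be sketched in a few lines (the notation here is mine, as an illustration, and not necessarily the paper’s):

```latex
% The continuity equation, viewed in spacetime coordinates (t, x),
% is a divergence-free condition on the combined field (\rho, J):
\[
  \partial_t \rho + \nabla_x \cdot J = 0
  \quad\Longleftrightarrow\quad
  \operatorname{div}_{(t,x)}\,(\rho, J) = 0 .
\]
% Via Hodge duality, a divergence-free field corresponds to a closed
% differential form \omega. Taking \omega to be the exterior derivative
% of a learned potential form A makes closedness automatic:
\[
  \omega = dA
  \quad\Longrightarrow\quad
  d\omega = d^2 A = 0 ,
\]
% so any neural parameterization of A yields an exact solution of the
% continuity equation, i.e. mass conservation holds by construction.
```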

Before that, I collaborated with Rustum Choksi and Carola-Bibiane Schönlieb on a data-driven extension of the Maximum Entropy on the Mean method for image deblurring and denoising.


Research Interests

  • Optimal Transport
  • Differential Geometry
  • Computational Physics / Chemistry
  • Differentiable Programming


Education

McGill University
2017 - 2021
B.A. Joint Hons. Math & CS
Completed nine courses at the graduate level, including the core graduate math curriculum


Recent Publications

Neural Conservation Laws, NeurIPS 2022
Jack Richter-Powell, Yaron Lipman, Ricky T. Q. Chen
Input Convex Gradient Networks, NeurIPS 2021 OTML Workshop
Jack Richter-Powell, Jonathan Lorraine, Brandon Amos




Some other things I worked on
Aphelion - a minigame based on 2D orbital mechanics, written in JS
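The core of a 2D orbital-mechanics game loop is a gravity integrator. Below is a minimal sketch of the kind of update such a game might use (this is illustrative; the names and the choice of semi-implicit Euler are my assumptions, not necessarily what Aphelion does):

```javascript
// Nondimensional constants for a single fixed central body at the origin.
const G = 1.0;     // gravitational constant (assumed units)
const M = 1000.0;  // mass of the central body

// One semi-implicit (symplectic) Euler step for a body { x, y, vx, vy }.
// Updating velocity before position keeps orbits stable over long runs,
// which is why simple space games often prefer it over explicit Euler.
function step(body, dt) {
  const r2 = body.x * body.x + body.y * body.y;
  const r = Math.sqrt(r2);
  const k = -G * M / (r2 * r);   // acceleration = k * (x, y), magnitude GM/r^2
  body.vx += k * body.x * dt;    // velocity update first
  body.vy += k * body.y * dt;
  body.x += body.vx * dt;        // then position, using the new velocity
  body.y += body.vy * dt;
  return body;
}

// A circular orbit at radius r needs speed sqrt(G*M/r).
const sat = { x: 100, y: 0, vx: 0, vy: Math.sqrt(G * M / 100) };
for (let i = 0; i < 1000; i++) step(sat, 0.01);
```

With those initial conditions the satellite should stay close to radius 100 as it sweeps along its orbit, which is an easy sanity check on the integrator.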