Jack Richter-Powell
(they/them)

Hi, I’m Jack Richter-Powell. I’ve just finished my undergraduate degree in Mathematics and Computer Science at McGill. I’ll be starting a PhD next fall, though I’m not yet sure where.

Currently, my research involves applying differentiable programming to problems from Differential Geometry and Optimal Transport. Some baby steps in this direction were detailed in our recent paper “Input Convex Gradient Networks”, but this is just one of many avenues I hope to explore within this paradigm. We are working on a follow-up paper that will hopefully go deeper in this direction.
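To give a flavour of what differentiable programming buys here, below is a minimal JAX sketch of the adjacent potential-based approach: an input convex neural network (in the style of Amos et al., 2017) whose gradient, obtained by autodiff, is a monotone map of the kind used as candidate transport maps. This is an illustration, not the construction from the ICGN paper, and all names and sizes are made up.

    import jax
    import jax.numpy as jnp

    def init_params(key, dim, hidden):
        k1, k2, k3 = jax.random.split(key, 3)
        return {
            "Wx0": 0.1 * jax.random.normal(k1, (hidden, dim)),
            "b0":  jnp.zeros(hidden),
            "Wz1": 0.1 * jax.random.normal(k2, (hidden,)),  # made non-negative below
            "Wx1": 0.1 * jax.random.normal(k3, (dim,)),
        }

    def potential(params, x):
        # One-hidden-layer input convex network: the hidden-to-output
        # weights are forced non-negative via softplus and the activation
        # is convex and non-decreasing, so the output is convex in x.
        z = jax.nn.softplus(params["Wx0"] @ x + params["b0"])
        return jax.nn.softplus(params["Wz1"]) @ z + params["Wx1"] @ x

    # The gradient of a convex potential is a monotone map, the usual
    # candidate for an optimal transport map (Brenier).
    transport_map = jax.grad(potential, argnums=1)

    key = jax.random.PRNGKey(0)
    params = init_params(key, dim=2, hidden=16)
    print(transport_map(params, jnp.array([0.5, -1.0])))  # a vector in R^2

The ICGN idea, by contrast, is to model the gradient map directly rather than differentiating a convex potential; see the paper for the actual construction.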

Concurrently, I am collaborating with Rustum Choksi and Carola-Bibiane Schönlieb on extending the Maximum Entropy on the Mean Method for image deblurring and denoising. We have developed a new method that allows us to incorporate high-level features from a dataset into a prior distribution – which we refer to as a learned prior. We will detail this in an upcoming JMLR submission in early 2022.
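Roughly, and only as a schematic of the standard MEM formulation from the imaging literature (the precise setup in our submission differs in its details), the MEM estimate of an image from a blurred, noisy observation $b \approx C\bar{x} + \eta$ is

$$
\hat{\rho} \;=\; \operatorname*{arg\,min}_{\rho \,\ll\, \mu}\; \frac{\alpha}{2}\,\bigl\| C\,\mathbb{E}_{\rho}[x] - b \bigr\|_2^2 \;+\; \mathrm{KL}(\rho \,\|\, \mu),
\qquad
\hat{x} \;=\; \mathbb{E}_{\hat{\rho}}[x],
$$

where $C$ is the blur operator, $\mu$ is the prior distribution, and $\alpha > 0$ balances data fidelity against relative entropy. A learned prior replaces a hand-chosen $\mu$ with one fitted to a dataset.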

I am also beginning to work with Ricky Chen, Jesse Bettencourt and David Duvenaud on extending their work on Neural ODEs to a class of partial differential equations.
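For context, here is a minimal JAX sketch of the base Neural ODE construction (Chen et al., 2018) that this work starts from: a small network defines a vector field, and a differentiable ODE solver integrates it. This is only the starting point, not the PDE extension, and all names and sizes are illustrative.

    import jax
    import jax.numpy as jnp
    from jax.experimental.ode import odeint

    def init_params(key, dim, hidden):
        k1, k2 = jax.random.split(key)
        return {
            "W1": 0.1 * jax.random.normal(k1, (hidden, dim)),
            "b1": jnp.zeros(hidden),
            "W2": 0.1 * jax.random.normal(k2, (dim, hidden)),
            "b2": jnp.zeros(dim),
        }

    def dynamics(y, t, params):
        # A small MLP plays the role of the vector field dy/dt = f(y, t).
        h = jnp.tanh(params["W1"] @ y + params["b1"])
        return params["W2"] @ h + params["b2"]

    key = jax.random.PRNGKey(0)
    params = init_params(key, dim=2, hidden=32)
    y0 = jnp.array([1.0, 0.0])
    ts = jnp.linspace(0.0, 1.0, 5)

    # odeint integrates the learned dynamics and is differentiable
    # end-to-end, so params can be trained by gradient descent.
    trajectory = odeint(dynamics, y0, ts, params)
    print(trajectory.shape)  # (5, 2)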

Interests

  • Differential Geometry
  • Optimal Transport
  • Differentiable Programming
  • Partial Differential Equations
  • Generative Modelling

Academia

McGill University
2017 - 2021
B.A. Joint Hons. Math & CS
Completed 9 graduate-level courses, including the core graduate mathematics curriculum

Recent Publications

Learned Priors for the Maximum Entropy on the Mean Method (MEMM) for Image Processing, 2022, JMLR (soon to be submitted)
Jack Richter-Powell, Carola-Bibiane Schönlieb, Rustum Choksi
Input Convex Gradient Networks, 2021, NeurIPS OTML Workshop
Jack Richter-Powell, Jonathan Lorraine, Brandon Amos

Miscellaneous

Projects

Some other things I’ve worked on:
Aphelion - a minigame based on 2D orbital mechanics, written in JavaScript