My research applies differentiable programming to problems from differential geometry and optimal transport. Much of this consists of adapting constructions from exterior and tensor calculus to computational problems arising in machine learning.
An example is our recently accepted NeurIPS 2022 paper, Neural Conservation Laws (with Ricky Chen and Yaron Lipman). A brief synopsis: we exploit the classical identity d^2 = 0 to parameterize exact solutions of the continuity equation. This lets us structurally enforce conservation of mass in applications such as variable-density fluid simulation and dynamical optimal transport.
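The core mechanism can be illustrated with a small sketch. This is not the paper's architecture, only the underlying identity: if A(x) is any smooth antisymmetric matrix field (in practice a neural network; here a hypothetical closed-form stand-in called matrix_field), then the vector field u_i = sum_j dA_ij/dx_j is exactly divergence-free, because the symmetry of mixed partials cancels against the antisymmetry of A. Applied in spacetime, a divergence-free field is precisely a solution of the continuity equation.

```python
import jax
import jax.numpy as jnp

def matrix_field(x):
    # Stand-in for a neural network producing an n x n matrix at each point x.
    return jnp.outer(jnp.sin(x), jnp.cos(2.0 * x))

def antisym_field(x):
    A = matrix_field(x)
    return A - A.T  # antisymmetrize so that A_ij = -A_ji

def u(x):
    # u_i(x) = sum_j dA_ij/dx_j, the row-wise divergence of A.
    # jax.jacfwd(antisym_field)(x) has shape (n, n, n): J[i, j, k] = dA_ij/dx_k.
    J = jax.jacfwd(antisym_field)(x)
    return jnp.einsum('ijj->i', J)

def div_u(x):
    # Divergence of u: sum_i du_i/dx_i. Zero by construction, since
    # sum_ij d_i d_j A_ij vanishes when A is antisymmetric (a d^2 = 0 identity).
    return jnp.trace(jax.jacfwd(u)(x))

x = jnp.array([0.3, -0.7, 1.2])
print(div_u(x))  # ≈ 0 up to floating-point error
```

Conservation of mass then holds exactly for any network weights, rather than being imposed approximately through a penalty term in the loss.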
In the past, I collaborated with Rustum Choksi and Carola-Bibiane Schönlieb on a data-driven extension of the Maximum Entropy on the Mean method for image deblurring and denoising.