Hi, I’m Jessie Richter-Powell. As of May 2025, I’m an intern at the NVIDIA Toronto AI lab. Previously, I spent 2023-2024 at MIT in an EECS PhD program, until I got distracted by shiny things elsewhere. Before that, I was a visiting researcher at the Vector Institute from 2022 to 2023, under the supervision of David Duvenaud. During the summer of 2023, I also worked out of the MILA group headed by Yoshua Bengio, and I can still frequently be found wandering that area.
Recently, my interests have pivoted towards sound and machine learning. My preprint Score Distillation Sampling for Audio, which uses score distillation to program synthesizers and perform source separation, marks a first exploration in this space.
Previously, my research was more concerned with developing principled, structured learning algorithms with strong inductive biases derived from differential geometry, PDE theory, and later physics. Put simply: stop learning principles you already know; save the learning for the hard bits.
An example of this is my NeurIPS 2022 paper Neural Conservation Laws (with Ricky Chen and Yaron Lipman). A brief synopsis: we exploit the classical identity d^2 = 0 to parameterize exact solutions of the continuity equation, which lets us structurally enforce conservation of mass for applications like variable-density fluid simulation and dynamical optimal transport.
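To give a flavor of the idea (a rough sketch in my own notation, not the paper's exact construction): over spacetime $\mathbb{R}^{1+n}$, the continuity equation

$$\partial_t \rho + \nabla \cdot (\rho v) = 0$$

says exactly that the spacetime field $u = (\rho, \rho v)$ is divergence-free. Under the Hodge correspondence, $u$ can be identified with an $n$-form $\omega$, and divergence-freeness becomes $d\omega = 0$. Parameterizing $\omega = dA$, with a neural network representing the $(n-1)$-form $A$, then satisfies the equation by construction, since $d\omega = d(dA) = d^2 A = 0$.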
Before that, I collaborated with Rustum Choksi and Carola-Bibiane Schönlieb on a data-driven extension of the Maximum Entropy on the Mean method for image deblurring and denoising.