Beyond scores: proximal diffusion models

Wednesday, March 4, 2026 - 12:30

427 Thackeray

Speaker Information
Jeremias Sulam
Johns Hopkins University

Abstract or Additional Information

Diffusion models have quickly become some of the most popular and powerful generative models for high-dimensional data. The key insight that enabled their development was the realization that access to the score (the gradient of the log-density at different noise levels) allows sampling from data distributions by solving a reverse-time stochastic differential equation (SDE) via forward discretization, and that popular denoisers yield unbiased estimators of this score. In this talk I will demonstrate that an alternative, backward discretization of these SDEs, using proximal maps in place of the score, leads to theoretical and practical benefits. We leverage recent results in proximal matching to learn proximal operators of the log-density and, with them, develop Proximal Diffusion Models (ProxDM). Theoretically, assuming oracle access to distributional quantities, these models converge faster to the target distribution. Empirically, I will show that ProxDM achieves significantly faster convergence within just a few sampling steps than conventional score-matching methods for unconditional, conditional, and latent diffusion models.
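To make the forward-vs-backward discretization distinction concrete, here is a minimal toy sketch (not the ProxDM algorithm itself): it uses overdamped Langevin dynamics targeting a 1D standard Gaussian, rather than the full reverse-time diffusion SDE, because for a Gaussian the score is linear and the proximal map of the negative log-density has a closed form. The step size, chain count, and function names are illustrative choices, not from the talk.

```python
import numpy as np

# Toy illustration: backward (implicit) discretization via a proximal map
# can remain stable where forward (explicit) discretization diverges.
# Target p(x) = N(0, 1), so that
#   grad log p(x)          = -x
#   prox_{h * (-log p)}(y) = y / (1 + h)   (closed form for the Gaussian).

rng = np.random.default_rng(0)

def explicit_step(x, h, noise):
    # forward Euler-Maruyama: score evaluated at the current iterate
    return x + h * (-x) + np.sqrt(2 * h) * noise

def implicit_step(x, h, noise):
    # backward Euler: x_next = x + h * grad log p(x_next) + sqrt(2h) * noise,
    # solved in closed form by the proximal map above
    return (x + np.sqrt(2 * h) * noise) / (1 + h)

h, n_steps, n_chains = 2.5, 200, 100_000  # h > 2 makes explicit Euler unstable here
x_exp = rng.standard_normal(n_chains)
x_prox = x_exp.copy()
for _ in range(n_steps):
    z = rng.standard_normal(n_chains)
    x_exp = explicit_step(x_exp, h, z)
    x_prox = implicit_step(x_prox, h, z)

print("explicit variance:", x_exp.var())   # blows up for h > 2
print("implicit variance:", x_prox.var())  # stays bounded for any h > 0
```

The explicit chain multiplies each iterate by (1 - h), so its variance explodes once |1 - h| > 1; the proximal (implicit) chain contracts by 1/(1 + h) at every step size, which is the kind of discretization-stability benefit the backward scheme is after, though the real ProxDM operates on learned proximal operators of the log-density, not this closed form.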