Starkly Speaking: Generative Modeling via Drifting
Hannes Stärk
Feb 21, 2026, 5:03:07 PM
to stark...@googlegroups.com
Hi everyone,
On Monday we will have Mingyang and Kaiming's paper on Drifting Models!
Speaker: Mingyang Deng, a PhD student at MIT working with Kaiming He.
Paper: Generative Modeling via Drifting (Mingyang Deng, He Li, Tianhong Li, Yilun Du, Kaiming He) https://arxiv.org/abs/2602.04770v1

Abstract: Generative modeling can be formulated as learning a mapping f such that its pushforward distribution matches the data distribution. The pushforward behavior can be carried out iteratively at inference time, for example in diffusion and flow-based models. In this paper, we propose a new paradigm called Drifting Models, which evolve the pushforward distribution during training and naturally admit one-step inference. We introduce a drifting field that governs the sample movement and achieves equilibrium when the distributions match. This leads to a training objective that allows the neural network optimizer to evolve the distribution. In experiments, our one-step generator achieves state-of-the-art results on ImageNet at 256 x 256 resolution, with an FID of 1.54 in latent space and 1.61 in pixel space. We hope that our work opens up new opportunities for high-quality one-step generation.
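To give a feel for the idea before Monday, here is a toy 1-D sketch of "evolving the pushforward during training": a one-step generator is regressed onto drifted copies of its own samples, where the drift attracts samples toward data and repels them from each other, vanishing when the two distributions match. This is NOT the paper's algorithm; the kernel-based (MMD-style) drift, the linear generator, and all hyperparameters below are my own illustrative assumptions.

```python
# Toy sketch of a drifting-style training loop (illustrative only; not the
# paper's method). The drift field attracts generated samples toward data
# and repels them from other generated samples, so it is ~zero in
# expectation once the generated distribution matches the data.
import numpy as np

rng = np.random.default_rng(0)

def kernel(a, b, bw=1.0):
    # Gaussian kernel between sample sets a (n,) and b (m,) -> (n, m)
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / bw) ** 2)

def drift(gen, data, bw=1.0):
    # Kernel-weighted pull toward data minus push away from generated points
    # (an MMD-gradient-flow-style field; my assumption, not the paper's).
    pull = (kernel(gen, data, bw) * (data[None, :] - gen[:, None])).mean(axis=1)
    push = (kernel(gen, gen, bw) * (gen[None, :] - gen[:, None])).mean(axis=1)
    return pull - push

# One-step generator f(z) = w*z + b, trained by regressing f(z) onto
# drifted copies of its own outputs (no iterative sampling at inference).
w, b = 1.0, 0.0
data = rng.normal(2.0, 0.5, size=512)    # target distribution N(2, 0.5^2)
lr, eps = 0.5, 0.5
for step in range(3000):
    z = rng.normal(size=256)
    x = w * z + b                         # current pushforward samples
    target = x + eps * drift(x, data)     # move samples along the drift
    # least-squares regression of f(z) onto the (frozen) drifted targets
    w -= lr * ((x - target) * z).mean()
    b -= lr * (x - target).mean()

# (b, |w|) should drift roughly toward the data's mean and std, (2.0, 0.5)
print(b, abs(w))
```

At equilibrium the drift averages to zero, so the regression targets stop moving: the optimizer itself has transported the pushforward onto the data distribution, and sampling remains a single forward pass f(z).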