to stark...@googlegroups.com
Hi everyone!
Today (in 1h) we will discuss:
Speaker: Ben Murrell, Department of Microbiology, Tumor and Cell Biology, Karolinska Institutet
Paper: Branching Flows: Discrete, Continuous, and Manifold Flow Matching with Splits and Deletions
https://arxiv.org/abs/2511.09465

Authors: Hedwig Nora Nordlinder, Lukas Billera, Jack Collier Ryder, Anton Oresten, Aron Stålmarck, Theodor Mosetti Björk, Ben Murrell

Abstract: Diffusion and flow matching approaches to generative modeling have shown promise in domains where the state space is continuous, such as image generation or protein folding and design, and discrete, exemplified by diffusion large language models. They offer a natural fit when the number of elements in a state is fixed in advance (e.g. images), but require ad hoc solutions when, for example, the length of a response from a large language model, or the number of amino acids in a protein chain, is not known a priori.

Here we propose Branching Flows, a generative modeling framework that, like diffusion and flow matching approaches, transports a simple distribution to the data distribution. But in Branching Flows, the elements in the state evolve over a forest of binary trees, branching and dying stochastically with rates that are learned by the model. This allows the model to control, during generation, the number of elements in the sequence.

We also show that Branching Flows can compose with any flow matching base process on discrete sets, continuous Euclidean spaces, smooth manifolds, and 'multimodal' product spaces that mix these components. We demonstrate this in three domains: small molecule generation (multimodal), antibody sequence generation (discrete), and protein backbone generation (multimodal), and show that Branching Flows is a capable distribution learner with a stable learning objective, and that it enables new capabilities.
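Ahead of the discussion, here is a deliberately toy Python sketch of the core idea, not the paper's method or code: each element's state is transported by a simple flow toward a target, while per-element split and death rates decide at every step whether the sequence grows or shrinks. In the paper these rates are learned by the model; here they are fixed constants standing in for that, and the target value and all parameters are made up for illustration.

```python
import random

def branching_flow_sample(n_init=4, steps=50, dt=0.02,
                          split_rate=0.5, death_rate=0.3, seed=0):
    """Toy variable-length sampler: flow transport plus stochastic
    splits and deletions. All rates/targets are illustrative stand-ins."""
    rng = random.Random(seed)
    # Start from a simple distribution (standard Gaussian noise).
    state = [rng.gauss(0.0, 1.0) for _ in range(n_init)]
    for _ in range(steps):
        new_state = []
        for x in state:
            # Linear "flow" step transporting x toward a toy target of 1.0.
            x = x + dt * (1.0 - x)
            u = rng.random()
            if u < split_rate * dt:
                # Element splits into two children (a branch in the tree).
                new_state.extend([x, x])
            elif u < (split_rate + death_rate) * dt:
                # Element dies and is deleted from the sequence.
                continue
            else:
                new_state.append(x)
        state = new_state if new_state else state  # keep at least one element
    return state
```

The point of the sketch is only that the number of elements at the end is not fixed by the number at the start, which is exactly the ad hoc length problem the paper's branching mechanism addresses.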