DEM-Engine SPH model integration


Sabrina Lanfranco

Aug 21, 2025, 12:38:14 PM
to ProjectChrono

Hello everyone,
I am currently using DEM-Engine to model planetary regolith in scenarios involving interactions with space exploration objects. I would now like to extend this modeling to include the study of regolith–fluid interactions.

In your opinion, what would be the most convenient approach: integrating DEM with a solver such as DualSPHysics, or directly implementing a fluid model within DEM itself? In both cases, this would require modifications to the DEM codebase. That is why I am writing here, hoping to get some feedback from the developers: perhaps there is already something undocumented, or maybe you have already considered an approach in this direction.

Thanks in advance,

Sabrina

Dan Negrut

Aug 21, 2025, 12:46:13 PM
to Sabrina Lanfranco, ProjectChrono

Sabrina,

In theory, you can do this with the SPH solver in Chrono; hopefully my colleague Radu will comment on this. It would require very long simulation times, because the number of SPH particles needed to capture the dynamics of the grains would be very large.

Another way to do it is DEM-LBM. Chrono has no support for this, and no plans to implement it in the immediate future. The sim times would probably be very long, but it would be a nice approach. If Ruochun sees this, he might comment on this idea.

Lastly, you can homogenize this and represent the regolith–fluid interactions through a continuum and then use the CRM solver in Chrono. You’d need to have the right material model, which means that you’ll have to go beyond the hypo-elastoplastic material model that we have there right now (Drucker-Prager plasticity, with no cap).
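
For reference (textbook form, with tension taken positive; not anything specific to Chrono's implementation), the uncapped Drucker-Prager yield surface mentioned above is

    f(I_1, J_2) = \sqrt{J_2} + \alpha\,I_1 - k \le 0, \qquad
    I_1 = \operatorname{tr}(\boldsymbol{\sigma}), \quad
    J_2 = \tfrac{1}{2}\,\mathbf{s} : \mathbf{s}

where s is the deviatoric stress and alpha, k are constants fitted to the friction angle and cohesion. Without a cap the surface is open along the hydrostatic compression axis, so purely hydrostatic loading never yields and plastic compaction of the material cannot be represented, which is presumably part of why the missing cap is pointed out as a limitation for regolith.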

Dan

---------------------------------------------
Bernard A. and Frances M. Weideman Professor
NVIDIA CUDA Fellow
Department of Mechanical Engineering
Department of Computer Science
University of Wisconsin - Madison
4150ME, 1513 University Avenue
Madison, WI 53706-1572
608 772 0914
http://sbel.wisc.edu/
http://projectchrono.org/
---------------------------------------------


Sabrina Lanfranco

Aug 21, 2025, 1:23:00 PM
to ProjectChrono
Thank you for your reply.

I was already aware of the possibilities offered by Chrono, but I have to continue using DEM-Engine, since my entire master’s thesis was developed on it. By customizing the CUDA kernels, I was able to implement a thermal model and modify the electrostatic one. The goal was to build a comprehensive regolith model, not just a mechanical one, and moving to Chrono would mean losing this work. For my PhD I will also need to extend what I have done so far to include interactions with plasma, which makes it essential to keep the electrostatic model.

Thank you again for your response and for the suggestion regarding DEM-LBM. I now look forward to any comments from Ruochun as well.

Best regards,
Sabrina  

Ruochun Zhang

Aug 23, 2025, 8:59:46 AM
to ProjectChrono

Hi Sabrina,

This is very ambitious indeed. I can comment on it based on what I know, and you can decide if it is relevant for your research.

First, I hope the APIs provided in DEM-Engine are enough to allow your inclusion of the thermal and electrostatic systems. It seems they are, but feel free to post threads here if you later discover other needs.

The biggest bottleneck in regolith–fluid simulations is the enormous scale required, which is why Dan suggested using another, unified model for it. But since your focus is on building a comprehensive model, not an engineering solution for one particular problem, that's not an option, and I assume you'd want two-way coupling (i.e., as much coupling as possible) in your simulation. I'd also assume you don't need extreme fluid velocities, say above Mach 0.3, where compressibility starts to matter. Then the biggest question is: since your DEM-side model is already heavy, how much emphasis would you like to put on the fluid part? Or, put another way, I think it's a question of which fluid–solid ping-pong paradigm to use, not which package to use. One thing is for sure: none of the approaches will be “convenient” to make happen.

Using SPH is fine, but I suspect you'll need markers much smaller than the DEM particles, so keeping the overall problem scale down is important. It may also face more challenges if the Reynolds number is high. In addition, it would involve integrating two GPU packages, which is a more serious software engineering task; there may be people on the DualSPHysics forum who have already tried that. I'd say that if you go this route, you are treating the fluid part no less seriously than the DEM part, and consulting the developers there beforehand is certainly needed.

FVM- or FEA-based CFD solvers are fine too, and I can imagine myself building/using a PETSc-based solver for this task. The key would be to update the DEM particle-represented boundary (if using a moving mesh) or to track/mark the nodes influenced by the moving boundary (if using an immersed boundary), which has very little to do with DEM itself: it only needs particle position/velocity information, which DEM-Engine can certainly supply. I'd probably recommend an immersed boundary approach, for reasons I'll give in the LBM-related part. This is also how I imagined DEM-Engine users would do fluid co-simulation.

As you will have a lot of things to do on the host anyway (mesh adjustment, node marking...), you'll use DEM-Engine's trackers to bring the information to the host, update the mesh and the fluid solver, run it, and then feed the fluid force back to DEM-Engine. This positions you more as a user of computing packages than as a solver developer. This approach can be used regardless of whether you consider the fluid an emphasis, since you can always choose to use fewer features of the solver to make the fluid part easier and faster, or do the opposite. But you probably won't modify the fluid solver that much, so there may be more restrictions on coding flexibility.
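
To make that ping-pong loop concrete, here is a minimal sketch of the structure in Python. None of the names below are real DEM-Engine or CFD API; dem.get_particle_states, fluid.update_immersed_boundary, fluid.particle_loads, dem.set_external_loads and dem.advance are hypothetical placeholders for the tracker queries and fluid-solver calls you would actually use. Only the ordering of the steps is the point.

    def cosimulate(dem, fluid, t_end, dt_fluid, dt_dem):
        # Explicit, staggered DEM-fluid ping-pong. 'dem' and 'fluid' are hypothetical
        # wrapper objects; dt_dem << dt_fluid is assumed, since contact dynamics is
        # usually the stiffer of the two problems.
        n_sub = max(1, int(round(dt_fluid / dt_dem)))   # DEM sub-steps per fluid step
        t = 0.0
        while t < t_end:
            # 1. Pull particle states to the host (what a DEME tracker would supply).
            pos, vel, radii = dem.get_particle_states()
            # 2. Mark fluid cells/nodes shadowed by particles (immersed-boundary style)
            #    and advance the fluid by one fluid step.
            fluid.update_immersed_boundary(pos, vel, radii)
            fluid.advance(dt_fluid)
            # 3. Integrate fluid stresses over each particle to get hydrodynamic loads.
            forces, torques = fluid.particle_loads(pos, radii)
            # 4. Hold those loads constant and sub-cycle the DEM system.
            dem.set_external_loads(forces, torques)
            for _ in range(n_sub):
                dem.advance(dt_dem)
            t += dt_fluid

This is the loosest (explicit) form of coupling; it is usually acceptable when the fluid step is small compared to the time scale over which the particles rearrange.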

You can also write your own fluid solver, though most likely that means the fluid is not a main focus of the research you want to present. If you do, then like Dan said, I would say LBM is a good choice. I only recently became interested in using LBM in this kind of co-simulation. Two main benefits:

  1. It's fully Eulerian, therefore easy to use alongside DEM, as the DEM particles are the only moving part. For the LBM part, you just mark the DEM particle-shadowed grid points as solid. It's similar to why I think the immersed boundary is better for your use case. The method is also in general easy to implement. You can literally ask ChatGPT to write one for you, after you read the basics of it.

  2. It's massively parallel, and should go well with DEM-Engine on GPUs.

The downside is that LBM is certainly much less used and appreciated than, say, FVM. While it should be very serviceable for you, convincing the community might be another issue. You could, of course, implement an FVM solver yourself; it's again very doable if you don't aim too high. It really doesn't matter whether it runs fully on the GPU and only exchanges device arrays with DEM (“implementing a fluid model within DEM itself”) or brings data to the host for processing; in the grand scheme of such an ambitious project it's a minor issue, and we can always resolve it later if it matters.
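
As a rough illustration of point 1 above (that a basic LBM kernel really is short), here is a minimal single-phase D2Q9 BGK stream-collide loop in Python/NumPy with full-way bounce-back on cells flagged as solid; in a DEM coupling, that solid mask is exactly what you would refresh from the particle positions each step. It is a teaching sketch (periodic box, no forcing, no turbulence or compressibility treatment), not a production solver.

    import numpy as np

    # D2Q9 lattice: discrete velocities, weights, opposite-direction map for bounce-back
    c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
    w = np.array([4/9] + [1/9]*4 + [1/36]*4)
    opp = [0, 3, 4, 1, 2, 7, 8, 5, 6]

    nx, ny, tau = 200, 80, 0.6                  # grid and BGK relaxation time (nu = (tau - 0.5)/3)
    solid = np.zeros((nx, ny), dtype=bool)      # cells shadowed by DEM particles go here
    solid[90:110, 30:50] = True                 # placeholder obstacle

    def equilibrium(rho, u):
        cu = np.einsum('qd,xyd->qxy', c, u)     # c_i . u at every cell
        uu = np.einsum('xyd,xyd->xy', u, u)
        return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*uu)

    rho = np.ones((nx, ny))
    u = np.zeros((nx, ny, 2))
    u[..., 0] = 0.05                            # small initial flow in +x
    f = equilibrium(rho, u)

    for step in range(2000):
        # streaming: shift each population along its lattice velocity (periodic box)
        for q in range(9):
            f[q] = np.roll(f[q], shift=tuple(c[q]), axis=(0, 1))
        # full-way bounce-back: reverse the populations that just streamed into solid cells
        f[:, solid] = f[opp][:, solid]
        # macroscopic moments and BGK collision on fluid cells only
        rho = f.sum(axis=0)
        u = np.einsum('qd,qxy->xyd', c, f) / rho[..., None]
        feq = equilibrium(rho, u)
        f[:, ~solid] -= (f - feq)[:, ~solid] / tau

The force feedback to DEM would then come from the standard momentum-exchange method: accumulate the momentum reversed at the bounced-back links of each particle's cells and sum it into a per-particle force.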

As for the software we can provide: publishing a GPU-based LBM solver is a possibility in the longer term, but you have a PhD to finish, so it doesn't seem like you can wait for us. You could write it yourself, as making one that is at least usable is not too hard. I do have a plan to provide a performance-centric FEA/FVM-based fluid solver on the GPU relatively soon. If you are going to spend a couple of months looking into the DEM model before having to consider the fluid, then the timing may line up. It should naturally go well in co-simulation with DEM-Engine or Chrono, as it's from the same family and also allows for step-wise advancing of the simulation. However, as it stands, we cannot promise you a ready-to-use, DEM-capable fluid solution right now.

Let me know if you have further questions,
Ruochun

Dan Negrut

Aug 23, 2025, 9:29:28 AM
to Ruochun Zhang, ProjectChrono, Sabrina Lanfranco

Ruochun – I read your email, super thoughtful.

Circling back to Sabrina – the question is also how big the problem is.

The FSI solver in Chrono has made huge progress over the last 12 months – Radu and Luning and Huzaifa can speak to that.

If we are talking about 1000 DEM particles here, then I think the easiest way to go is to simply simulate the entire thing in Chrono: the DEM part using DEME, the CFD side in Chrono::FSI. The solution would be entirely on the GPU. We never optimized the communication for DEM-FSI “on-the-GPU” simulation since we’ve never been faced with such requests. But the big deal here is that the memory for DEME and FSI is GPU resident and can therefore draw on the TB/s bandwidth and low latency of device memory (compared to host-device traffic). I truly think that if there were funding to do this GPU-GPU, FSI-DEME co-simulation, a full-blown Chrono solution would be top notch.

However, if for the problem at hand Sabrina needs, say, 1,000,000 DEM particles, that’s a different story. I think no matter what approach is taken in that case, it’s going to be really, really slow if one fully resolves the dynamics of both the particles and the fluid.

Dan


Ruochun Zhang

Aug 23, 2025, 9:54:21 AM
to ProjectChrono
Hi Dan,

Yes, I agree with that. It's only when the problem's scale is massive that you have to consider which method or tool would allow a reasonable fluid representation at the cost of maybe 2 times the DEM system, not 10 times...

Speaking of using DEME along with another GPU tool like Chrono::FSI, there's one thing I should have done long ago, and that's allowing the user to select which device(s) the solver runs on. Right now it just aggressively takes the first two devices it sees and uses them; perhaps not the friendliest collaborator...
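
For what it's worth, until such an option exists, one standard-CUDA workaround (not a DEME feature) is to run the two solvers in separate processes and limit what each process can see via the CUDA_VISIBLE_DEVICES environment variable, set before any CUDA context is created, e.g.:

    import os
    # Physical GPUs 0 and 1 become the only devices this process can see; the
    # variable must be set before the GPU library initializes CUDA.
    os.environ["CUDA_VISIBLE_DEVICES"] = "0,1"
    # ...import and run the DEM solver here. A second process running the fluid
    # solver would be launched with CUDA_VISIBLE_DEVICES="2,3" instead.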

Ruochun

Sabrina Lanfranco

Aug 25, 2025, 5:19:18 AM
to Ruochun Zhang, ProjectChrono

Thank you very much for your replies and for the suggestions.

To give you an idea of the scale of my work: during my Master’s thesis I carried out simulations with between 500,000 and 1,000,000 elements. For the fluid extension, I expect the order of magnitude to be similar.

Addressing the point raised by Ruochun about fluid velocities: in my case I need to consider both the solar wind impacting the surface and the rocket exhaust gases. These are therefore regimes that can exceed Mach 1. Moreover, the focus of my work will not be so much on defining the fluid model itself, but rather on analyzing the interactions with the regolith. For this reason, and based on your suggestions, it seems to me that the most reasonable options are:

  • to evaluate CFD solvers based on FVM or FEA, if I can find a package suitable for my case;

  • or to consider developing an LBM solver, which from what you say could be easier to integrate.

In the coming months I will be focusing on reviewing the literature, and I will certainly keep in mind the solver you are developing, should it become available in the near future.

I would also like to ask for your advice on hardware. At the moment I am using a workstation with two RTX A4000 GPUs: with this setup, a simulation with about one million particles covering 430 seconds of simulated time takes around 6 days of computation. Looking ahead to future co-simulations, I have the option to upgrade: which GPUs would you recommend, staying within a mid or mid-high range, without considering top-of-the-line models?

Thank you again for your time and support.

Best regards,
Sabrina


Ruochun Zhang

Aug 25, 2025, 1:52:03 PM
to ProjectChrono
Hi Sabrina,

I'd say I basically agree. But if you have plans for flow above Mach 1, I doubt I can comment much on the CFD side. I can say that for LBM you would probably need to implement special modified versions for compressibility and shock capturing, and if the flow is much faster than Mach 1, I doubt any LBM model works. That's not to say it's much easier with FVM either. But I think in such extreme conditions it's still better to select an interaction scheme between the two physics domains that is straightforward, stable, and simple to implement, so that in the end you at least have results; that part of what I said still holds.

About the hardware: as your tests show, the solver runs well on professional cards, but in general, getting the top-tier gaming card that your budget allows gives you the most performance per dollar.

Thank you,
Ruochun

Radu Serban

Aug 25, 2025, 7:25:30 PM
to ProjectChrono

Sabrina – a few comments:

 

  • With an SPH-based fluid solver (whether Chrono::SPH or DualSPHysics), the size of the solid-phase elements dictates the SPH spatial resolution (particle spacing), due to the approach these codes take to enforce the phase coupling. With DEM simulations of the size you mention, that is not a feasible approach.
  • If you are interested in FSI problems with fewer and larger solid (rigid or flexible) objects, then an SPH solver for the fluid phase will work just fine. These are the types of problems we target with our current Chrono::FSI framework and Chrono::SPH fluid solver. By the way, I am in contact with the DualSPHysics folks and will work with them on bringing DualSPHysics into the new Chrono::FSI framework.
  • For FSI problems like yours (with a DEM solid phase), other fluid solvers and different phase-coupling techniques are necessary.  One such option would be OpenFOAM with IBM (immersed boundary method).
  • We will start working soon on an extension of Chrono::FSI to allow coupling Chrono multibody systems to other fluid solvers (SPH, but also OpenFOAM) by providing preCICE adapters. In principle, that approach could be extended to include a DEM solid phase (although I do not plan on doing that in the immediate future).
  • It is also likely that we will look into providing an LBM option for Chrono::FSI.  I strongly oppose reinventing the wheel, so we will first look for an existing third-party alternative (and implement our own only if we do not find a suitable external solver).

 

I believe you underestimate the work required to implement a fluid solver from scratch and couple it with a multibody and/or DEM solver. The work on the preCICE interfaces will start relatively soon (in a couple of months or so). I think you’re better off waiting for us to have an updated architecture of the Chrono::FSI framework and first implementations of preCICE adapters for Chrono MBD and Chrono::SPH, at which point you could look into providing a similar adapter for DEME.
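
For a sense of what such an adapter could look like on the DEME side, here is a rough sketch of an explicit-coupling loop using the preCICE v3 Python bindings. The participant, mesh, and data names must match whatever is declared in precice-config.xml, and dem.get_particle_positions, dem.get_particle_displacements, dem.set_external_forces, and dem.advance are hypothetical stand-ins for the actual DEME tracker calls (no implicit-coupling checkpointing is shown):

    import precice  # preCICE v3 Python bindings (pyprecice)

    def run_dem_participant(dem, config="precice-config.xml"):
        # DEM side of the coupling: sends particle displacements, receives fluid forces.
        p = precice.Participant("DEM", config, 0, 1)            # names come from the config file
        ids = p.set_mesh_vertices("DEM-Mesh", dem.get_particle_positions())
        p.initialize()
        while p.is_coupling_ongoing():
            dt = p.get_max_time_step_size()
            forces = p.read_data("DEM-Mesh", "Force", ids, dt)  # fluid loads at particle centers
            dem.set_external_forces(forces)
            dem.advance(dt)                                     # DEM sub-cycling can happen inside
            p.write_data("DEM-Mesh", "Displacement", ids, dem.get_particle_displacements())
            p.advance(dt)
        p.finalize()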

 

--Radu
