Modeling finite char inventory in Cantera 3.1 for biomass gasification – seeking advice on approach


Omkar Kedge

Dec 1, 2025, 11:39:06 AM
to Cantera Users' Group
Hi everyone,

I apologize in advance for the very long question; I wanted to make sure I cover the full context in detail. I've also uploaded a bare-bones version of my script for reference.

I'm an industry researcher working on biomass gasification, and over the past few months I've transitioned from Chemkin to Cantera. I'm using the CRECK char mechanism and trying to anchor my model against experimental data. It's been a learning curve, and I've run into what feels like a fundamental limitation, so I wanted to reach out and see if I'm missing something or whether others have found good workarounds.

The core challenge is representing a finite solid char inventory that gets consumed by heterogeneous reactions. In my system, I have a batch of char (bulk solid species plus reactive surface sites) with a continuous flow of reactant gas passing over it. The heterogeneous reactions couple gas, surface, and bulk phases, and as the char gasifies, some reactive carbon is consumed while unreactive ash remains behind.

What I really need is for the bulk char moles and mass to be part of the dynamical state — evolving alongside the gas composition, temperature, and surface coverages. I also want to compute the reactive surface area from the remaining char mass using an area law, so that as char is consumed, the available surface decreases.

I've gone through the earlier discussions on this topic, including the Google Groups thread and the enhancement proposal #202. The enhancement issue makes it clear that ReactorNet currently treats bulk phases as infinite reservoirs, and that surface reaction rates aren't limited when an adjacent bulk phase is depleted. That matches what I've observed, but I wanted to confirm my understanding and see what others recommend.

My initial attempt was straightforward: a single gas-phase reactor with a ReactorSurface, where the surface Interface has a BulkPhase registered as an adjacent phase. I expected that, since the surface reactions involve bulk species, the reactor network would evolve the bulk moles as those reactions proceed. That didn't happen: the gas composition and surface coverages evolved as expected, but the bulk composition and moles stayed fixed. The bulk phase acts as an infinite reservoir, providing thermodynamic properties but not participating in the mass balance.

I also tried a dual-reactor setup with separate gas and bulk reactors sharing a surface, thinking I could manually update the bulk mass using setDensity(). That didn't work either — the bulk density isn't part of the reactor ODE state, so any changes I make get ignored or cause errors.

So my takeaway from these attempts is that in the current Reactor + ReactorSurface framework, a BulkPhase is always treated as an infinite reservoir. Its moles and mass aren't part of the state vector that ReactorNet integrates. Is that understanding correct?

Manual bulk evolution (my current workaround):

Given those limitations, I've settled on a workaround where I wrap a manual integration loop around the ReactorNet. The basic idea is to let Cantera handle the gas-phase species and surface coverages as usual, but track the bulk moles myself in a separate ledger and update them using the surface production rates.

For the area law, I compute the reactive surface area from the instantaneous bulk mass as:

    A(t) = reactive_SA * M_solid(t)^alpha

Here, reactive_SA is the BET surface area multiplied by an active fraction (in m^2/g), M_solid is the total bulk mass in grams, and alpha is an exponent (I use 1.0 for a linear relationship, though 2/3 would correspond to a shrinking sphere model). The physical intuition is that as char is consumed and M_solid drops, the available surface area decreases, which in turn reduces the reaction rates — a stabilizing negative feedback.
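For concreteness, here is the area law as a tiny standalone function. The numbers are illustrative only, not from my system; note that the units of reactive_SA implicitly change with alpha (m^2/g only holds for alpha = 1).

```python
def reactive_area(M_solid, reactive_SA, alpha):
    """Reactive surface area (m^2) from the remaining solid mass.

    M_solid     -- remaining char mass, g
    reactive_SA -- BET area times active fraction (m^2/g when alpha = 1)
    alpha       -- 1.0 for a linear law, 2/3 for a shrinking-sphere law
    """
    return reactive_SA * M_solid ** alpha

# Illustrative numbers: 100 m^2/g of active area, 5 g of char remaining
A_linear = reactive_area(5.0, 100.0, 1.0)      # 500.0 m^2
A_sphere = reactive_area(5.0, 100.0, 2.0 / 3)  # ~292.4 (shrinking sphere)
```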

The integration loop looks roughly like this:

    # 1. Advance Cantera (gas + surface)
    sim.advance(t + dt)

    # 2. Renormalize coverages (solver drift can violate sum(theta) = 1)
    theta = rsurf.coverages
    rsurf.coverages = theta / theta.sum()

    # 3. Bulk production rates from the interface kinetics, kmol/m^2/s
    #    (equivalently: nu_bulk @ surf.net_rates_of_progress)
    omega = surf.net_production_rates[bulk_indices]

    # 4. Explicit Euler update of the bulk mole ledger, kmol
    N_bulk += omega * area * dt

    # 5. Non-negativity clamp (the source of the conservation issue below)
    N_bulk = np.maximum(N_bulk, 0.0)

    # 6. Recompute solid mass (g) and area (m^2) from the remaining bulk
    M_solid = np.dot(N_bulk, MW)          # MW in g/kmol here
    area = reactive_SA * M_solid ** alpha

    # 7. Feed the updated area back to the ReactorSurface
    rsurf.area = area

The feedback loop here is that the surface rates depend on area, the area depends on mass, and the mass changes based on the rates. When things work well, consumption reduces mass, which reduces area, which slows down the rates — a nice self-limiting behavior. But when the rates are very fast relative to the timestep, things can go wrong.

Where this approach breaks down:

The main issue I've encountered is with the non-negativity clamp. When a reaction rate is fast enough that rate * dt exceeds the available moles of a species, the explicit Euler update tries to drive that species negative. I clamp it to zero to avoid unphysical values, but this breaks mass conservation in a subtle way.

Here's a concrete example: suppose I have a reaction A(B) -> B(B) running at 1000 mol/s, and I'm using dt = 0.001s. If A has only 0.01 mol left, the update wants to remove 1.0 mol from A (way more than available) and add 1.0 mol to B. After clamping, A goes to zero, but B still gets the full 1.0 mol — so I've effectively created 0.99 mol out of nothing. This is a problem.
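Here is that failure mode as a few lines of standalone arithmetic, using the same numbers:

```python
# A(B) -> B(B) at 1000 mol/s, dt = 1 ms, with only 0.01 mol of A remaining
rate, dt = 1000.0, 0.001   # mol/s, s
N_A, N_B = 0.01, 0.0       # mol

dN = rate * dt             # 1.0 mol -- far more than the available A
N_A -= dN                  # A is driven to -0.99 mol
N_B += dN                  # B receives the full 1.0 mol

N_A = max(N_A, 0.0)        # the clamp discards the negative overshoot
created = (N_A + N_B) - 0.01
print(created)             # 0.99 mol of solid created from nothing
```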

The issue gets worse when internal solid-phase reactions (like phase transformations or polymerization steps in the mechanism) are much faster than the gas-solid gasification reactions. The intermediates accumulate faster than they're consumed, and the clamp keeps injecting artificial mass at every overshoot.

Non-isothermal operation makes this even harder. With energy="on", the temperature can rise during exothermic reactions, which accelerates the Arrhenius rates exponentially. What was a manageable rate at the initial temperature can become impossibly fast at elevated temperatures, and the whole thing runs away.
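To put a number on that sensitivity (using a placeholder activation energy of 200 kJ/mol, not a value from the CRECK mechanism):

```python
import math

R = 8.314    # J/(mol K)
Ea = 200e3   # J/mol -- assumed, purely for illustration

def k_rel(T, Tref=1100.0):
    """Arrhenius rate constant relative to its value at Tref.
    The pre-exponential factor cancels in the ratio."""
    return math.exp(-Ea / R * (1.0 / T - 1.0 / Tref))

print(k_rel(1300.0))   # ~29x faster after a 200 K temperature rise
```

So a timestep tuned at 1100 K can be an order of magnitude too coarse after a modest exotherm, which is exactly the runaway I see.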

I've also had to deal with coverage sum drift. Cantera's ODE solver integrates the coverage derivatives, but it doesn't enforce that the coverages sum to one. I have to manually renormalize after each step, which adds another layer of bookkeeping.


What does work:

For what it's worth, the approach does work reasonably well under certain conditions. Isothermal operation at moderate temperatures (around 1100 K for my system) with a timestep of 0.001 s gives sensible results: the bulk mass decreases over time as expected. Using tighter tolerances (rtol=1e-10, atol=1e-14) helps avoid CVodes errors from the stiff surface kinetics. Non-isothermal cases need much smaller timesteps (around 1e-5 s), and even then they can fail if the temperature rises too much.

Questions for the community:

I'd really appreciate any feedback on the following:

  1. First, is my understanding of the current framework correct? In a Reactor + ReactorSurface setup today, is a BulkPhase always treated as an infinite reservoir, with its moles and mass not part of the ODE state that ReactorNet integrates?
  2. Second, is this manual bulk evolution approach the recommended way to handle TGA-style finite solid consumption in Cantera right now, or is there a better-established pattern that I'm missing?
  3. Third, does the area law approach make physical sense to you? Using A(t) = reactive_SA * M^alpha to couple the reactive area to the remaining mass, and feeding that updated area back into the ReactorSurface each timestep — is that a reasonable way to model a shrinking or evolving char particle? I'm also curious whether the area update should happen before or after advancing the reactor in each step.
  4. And finally, for anyone who has modeled similar systems: how do you handle the feedback loop robustly? Are there tricks for enforcing non-negativity without the artificial mass injection that I'm seeing when the solid is nearly depleted?
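
To make question 4 concrete, here is the kind of per-reaction limiter I have in mind. It's a pure NumPy sketch, not any Cantera API; `nu` stands for the net stoichiometric matrix (products minus reactants) restricted to the bulk species, and `rop` for the net rates of progress:

```python
import numpy as np

def limited_bulk_update(N_bulk, nu, rop, area, dt):
    """One explicit Euler step in which each reaction's extent is scaled so
    that no bulk species is driven negative. Because the whole extent is
    scaled, stoichiometry is preserved and no mass is created.

    N_bulk -- bulk moles, shape (n_species,)
    nu     -- net stoichiometric coefficients, shape (n_species, n_reactions)
    rop    -- net rates of progress, shape (n_reactions,)
    Caveat: reactions are limited independently, so two fast reactions
    sharing a reactant could still overdraw it; sub-stepping would fix that.
    """
    extent = rop * area * dt                 # moles of each reaction event
    for j in range(extent.size):
        consumed = -nu[:, j] * extent[j]     # > 0 where a species is consumed
        mask = consumed > 0.0
        if mask.any():
            # largest fraction of this step the available moles can support
            frac = np.min(N_bulk[mask] / consumed[mask])
            extent[j] *= min(1.0, max(frac, 0.0))
    return N_bulk + nu @ extent

# The A(B) -> B(B) overshoot case from above: the step is scaled down so
# that only the remaining 0.01 mol of A is converted, and total mass holds.
nu = np.array([[-1.0], [1.0]])
N_new = limited_bulk_update(np.array([0.01, 0.0]), nu,
                            np.array([1000.0]), 1.0, 0.001)
```

Is this roughly what people do, or is there a cleaner pattern?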
I'm also exploring ExtensibleReactor to bring the bulk evolution directly into the reactor state vector, which might solve some of these issues more elegantly. I ran into problems there as well; I'll post those as a separate question, since this one is already far too long.

Thanks in advance for any guidance or suggestions. I'm happy to share more detail if that would help.

Best regards,
Omkar

Char_section_PSR_barebone_FORUM.py