multiphases (water-steam) on falcon?


Jean Francois Leon

unread,
Nov 15, 2015, 8:55:16 AM11/15/15
to moose-users
Hi All

I have started spending some time looking at multiphase porous-flow options within MOOSE.
One obvious candidate to start with is the MOOSE app Falcon. (I am also looking at Redback, but this post is about Falcon.)

It is on GitHub, and its documentation states (quote):

"FALCON is highly extensible and can accommodate both multi-species and multi-phase formulations."

I have looked at the source code and the tests, and I fail to see how to include two phases within the framework of the module.
Are there any relevant examples or tests that could be shared that include two phases?

Specific questions:
1 - There is one material, PTGeothermal, that uses the option "wseos" (which I assume stands for water/steam EOS), but I fail to see any connection in the source file with real thermodynamic calculations, or any coupling with the IAPWS MOOSE module.
This option is used in two Falcon tests, PT_TH_Injection and PT_TH_Faust.
I fail to see where the steam/water EOS is used in these tests.
I tried to change the conditions to be in the pure-steam regime for these tests and nothing obvious happened (how do we keep track of saturation or the water/steam ratio?).
I also see nothing in the code regarding the corresponding energy exchange (latent heat, for example) or capillary pressure.

2 - More generally, there is a lot of physics associated with multiphase (or multi-species) calculations.
This includes capillary pressure and relative permeability (see, for example, the very well written documentation in the Richards equation module, which explains them along with some attached issues that are valid well outside the formalism of the Richards equation).
Unfortunately, I don't see anything in the Falcon or MOOSE source code that accounts for these pieces of physics.
Any pointers, please?
Any comments or pointers to move forward efficiently will be much appreciated.
Thanks
JFL


Cody Permann

unread,
Nov 16, 2015, 5:02:02 PM11/16/15
to moose-users
Hi Jean,

Falcon may not contain all of the models it did when it was behind our firewall. Also, there have been a lot of changes to EOS and water steam modules over the years. I'll forward this message to a couple of the main developers who don't normally monitor this list and hopefully they can better answer your questions.

Cody

--
You received this message because you are subscribed to the Google Groups "moose-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to moose-users...@googlegroups.com.
Visit this group at http://groups.google.com/group/moose-users.
To view this discussion on the web visit https://groups.google.com/d/msgid/moose-users/2432c3b9-0dfb-4583-866c-575cd093598f%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Yidong Xia

unread,
Nov 17, 2015, 12:27:51 PM11/17/15
to moose-users
Hi Jean,

The current open-source version of Falcon provides only the single-phase liquid water component, not the water/steam two-phase component, at this moment. We have the full water/steam EOS version in-house, but the plan to migrate it into the open-source version is still to be determined.

o-- In PTGeothermal (pressure and temperature based), the way the liquid water density and viscosity are calculated in the open-source Falcon is part of the in-house WSEOS, updated to use automatic differentiation (AD) generated code that simultaneously computes the properties and their derivatives with respect to the main nonlinear variables. This is expected to yield more accurate Jacobians than the old approach of divided differencing. However, the same strategy would be trickier to implement for the water/steam two-phase regions, because the calculation of the properties there involves Newton iterations, and the convergence of the iterative process would differ between the properties and the derivatives. This is the main challenge in effectively implementing the full WSEOS using automatic differentiation.

o-- Energy-exchange and capillary-pressure calculations will be released in the future PHGeothermal (pressure and enthalpy based) formulation with the full WSEOS, though they are already there in the in-house version.

o-- We will put together a technical document of the Falcon physics soon.
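As a standalone illustration of the AD-versus-divided-differencing point in the first bullet: the `Dual` class and the `density()` correlation below are toy placeholders of ours, NOT Falcon's WSEOS, but they show why AD derivatives are exact while divided differences carry a step-size-dependent error.

```python
# Forward-mode AD via dual numbers, vs. divided differencing.
# NOTE: the Dual class and the density() correlation are toy
# placeholders for illustration -- this is NOT Falcon's WSEOS.

class Dual:
    """A value paired with its derivative w.r.t. one chosen input."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def _wrap(self, o):
        return o if isinstance(o, Dual) else Dual(o)
    def __add__(self, o):
        o = self._wrap(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __sub__(self, o):
        o = self._wrap(o)
        return Dual(self.val - o.val, self.der - o.der)
    def __mul__(self, o):
        o = self._wrap(o)
        return Dual(self.val * o.val,
                    self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def density(p, T):
    # Made-up liquid-water-like correlation, nonlinear in p.
    return 1000.0 + 4.5e-7 * p - 1.0e-17 * p * p - 0.2 * (T - 293.15)

p, T = 1.0e7, 350.0

# AD: seed d/dp = 1 on the pressure argument; the derivative
# propagates exactly through the arithmetic.
drho_dp_ad = density(Dual(p, 1.0), Dual(T, 0.0)).der

# Divided differencing: the answer depends on the step size eps.
eps = 1.0e-2 * p
drho_dp_fd = (density(p + eps, T) - density(p, T)) / eps

print(drho_dp_ad)               # exact: 4.5e-7 - 2e-17 * p
print(drho_dp_fd - drho_dp_ad)  # truncation error of the difference
```

The AD result is exact to machine precision regardless of the operating point, while the divided difference inherits a truncation error proportional to eps. When the property evaluation itself requires Newton iterations (as in the two-phase regions), this simple picture breaks down, which is the challenge noted above.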

Cody Permann

unread,
Nov 17, 2015, 2:20:07 PM11/17/15
to moose-users
I'll expand on one aspect of this as well. We've had a difficult time finding a suitable steam-table implementation that is compatible with MOOSE's LGPL license. There are several good ones, but they all carry the GPL license, which is more restrictive than ours. We are interested in supplying our own, but that is an ongoing process for which I can't give you an end date at this time.

Cody


Jean Francois Leon

unread,
Nov 20, 2015, 9:04:44 AM11/20/15
to moose-users
Hi All

Thanks for the explanations to both of you.

I will keep an interested eye on possible developments of true multiphase capabilities "on the open side of the firewall".
Implementing them is clearly beyond my coding abilities, but I would be willing to help with such an effort.

JF

JasonT404

unread,
Jun 9, 2016, 9:36:55 AM6/9/16
to moose-users
Yidong,

I am working on a rotorcraft icing code. My colleague has built and tested a number of different surface-film solvers and ice crystal growth models in MATLAB. I will be converting some of these to run in MOOSE to create a unified model. Before I do that, though, I was hoping to find multi-phase and multi-species kernels capable of keeping track of elemental variables like mass fraction and surface energy. Does your Falcon app support this? Can it only handle water and steam? To use your app, do I need to download something, or is it all stored under the water/steam EOS module?


Let me know if you have any questions for me,

Jason Turner
Applied Research Lab
Penn State

Daniel Schwen

unread,
Jun 9, 2016, 5:15:26 PM6/9/16
to moose-users
Jason,
the MOOSE phase field module supports building models for multi-phase multi-component systems. You can enable the module by uncommenting https://github.com/idaholab/falcon/blob/devel/Makefile#L23 (if the current version does not include all modules by default).
Cheers,
Daniel


Andrew....@csiro.au

unread,
Jun 9, 2016, 7:10:28 PM6/9/16
to moose...@googlegroups.com

I really encourage you to contribute to MOOSE's new porous_flow module. It is in its infancy at present, but it will grow significantly during 2016. Currently it only does isothermal multi-phase, multi-component porous flow coupled to geomechanics, with no diffusion of individual species. In the next months we will be adding more boundary conditions, Dirac kernels, etc., then heat and diffusion, and later this year chemistry.


Please feel free to use Falcon, but I'm hoping it will become obsolete by the end of 2016. Of course I may just be dreaming, but with your help it will be more likely to occur!


andy





Yidong Xia

unread,
Jun 10, 2016, 2:17:18 AM6/10/16
to moose-users
Hi Jason,

Thanks for your interest in FALCON.

o-- In addition to Daniel's and Andy's comments and suggestions, I would like to know how urgent your project needs are.

o-- Currently FALCON-public only supports a compressible, non-isothermal liquid water EOS, where the input variables are pressure and temperature. The nice feature of the water EOS in FALCON is that I have implemented automatic-differentiation-generated code to compute Jacobians, which makes implicit solving very efficient. Unfortunately, however, there may not be the budget to support me in extending the EOS to both water and steam right now, so I am fully open to a potential funding-based collaboration.

o-- Nevertheless, as indicated by Daniel, you may find something more readily available in the MOOSE phase field module. I have not had a chance to look into it yet. How are Jacobians computed? Maybe Daniel could give a little more of an introduction.

o-- Also, Andy's new, well-funded porous flow module project is a promising and ambitious one that intends to integrate all the basic needs for modeling thermo-hydro-mechanical-chemical processes. I hope this project will mature soon.

JasonT404

unread,
Jun 10, 2016, 10:27:19 AM6/10/16
to moose-users
Daniel, Andy and Yidong

We are looking into MOOSE as an alternative to something like STAR-CCM+. MOOSE is intriguing because we can see the code and have more control over the simulation parameters. Also, we don't have to wait for CD-adapco to add the tools we want now. STAR also uses some numerical correctors and preconditioners to make an otherwise unstable numerical scheme stable, and this worries us a little bit.

Our ultimate goal would be to simulate a fully coupled system: for example, a spinning rotor blade (with circumferential forces and span-wise variation of Reynolds number) undergoing glaze icing, ice shedding, and surface-film flow with integral electric rotor blade heaters. But this is a very far-reaching goal. Right now, I am looking to build a simple 2D ice accretion solver that operates in the rime-ice regime (instant freezing on impact, with no surface water-film flow). This is easily validated and could demonstrate the promise of MOOSE. Key problems moving forward in the near term are adding compressible Eulerian multi-phase flow for liquid water and air, as well as modeling the freezing of liquid water droplets on the 2D airfoil. Daniel, the phase field module could potentially be used here. I was under the impression it was more for metal crystal/grain simulation, but I'll be looking into its capabilities. Can it model freezing phenomena and heat transfer at a boundary? Would Falcon be the place to find these physics kernels?

Andy, the porous flow could be useful to us as well. An alternative to full CFD analysis of ice-roughened airfoils is to model surface roughness in the boundary conditions. The simple 1D model of this is called equivalent sand-grain roughness height: a single length-scale measure of local surface roughness, irrespective of surrounding roughness elements. This could be something you may have already built; if not, a single BC object may be able to represent it. This method loses representative capacity as the ice growth reaches large scales (especially larger than the height between cells at the surface). The next step up in complexity is a 2D version of this. The Discrete Element Roughness Method (DERM) is a way of representing surface roughness as virtual geometric shapes like cones or cylinders and then superimposing their effects on local flow momentum and energy values. This would involve momentum and energy source and sink terms operating on individual quadrature points. This method has been validated on fine- and large-scale ice features. It is a much more serious modeling development than equivalent sand-grain roughness height, and fully developing something like it is on our to-do list, but as of right now it is not something we are actively pursuing.
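As a small illustration of the equivalent sand-grain idea above, here is a standalone Python sketch (the function name is ours, not an existing MOOSE BC) of the fully rough log-law wall function with the standard constants kappa ≈ 0.41 and B ≈ 8.5:

```python
import math

KAPPA = 0.41   # von Karman constant
B_ROUGH = 8.5  # additive constant of the fully rough log law

def u_plus_fully_rough(y, ks):
    """Mean-velocity log law over a fully rough wall,
        u+ = (1/kappa) * ln(y / ks) + B_ROUGH,
    where y is the wall distance and ks the equivalent
    sand-grain roughness height (same units as y)."""
    return math.log(y / ks) / KAPPA + B_ROUGH

# Doubling ks at a fixed wall distance shifts the velocity
# profile down by ln(2)/kappa -- the classic "roughness shift".
du = u_plus_fully_rough(1e-2, 1e-4) - u_plus_fully_rough(1e-2, 2e-4)
print(du)  # ln(2)/0.41, about 1.69
```

A BC object implementing this would only need `ks` as an input parameter, which is what makes the single-length-scale model attractive before moving up to something like DERM.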

With all of this said, we are looking into MOOSE, and hopefully we can add something to the modules you guys have been developing, especially modeling capability for traditional aerospace problems.

-Jason

Yidong Xia

unread,
Jun 10, 2016, 11:24:59 AM6/10/16
to moose...@googlegroups.com
Hi Jason,

Thanks for the detailed explanation of your project and purpose. Regarding a CFD capability based on MOOSE, I will think it over and reply to you in a separate message. Currently I am funded to explore and establish a CFD module based on MOOSE. So far as I can tell, MOOSE's capability is being massively extended to make second-order finite volume CFD schemes possible. I gave a talk on this topic at the MOOSE workshop two days ago, and I will try to have Cody publish some preliminary results of my shock-wave and blast-wave test cases on the MOOSE website soon.

I will keep you posted.

JasonT404

unread,
Jun 10, 2016, 11:27:14 AM6/10/16
to moose-users
Thank you Yidong

Daniel Schwen

unread,
Jun 10, 2016, 12:14:02 PM6/10/16
to moose-users
Jason,
while our group deals mostly in solid matter, we do have an example of a solidification model, and one member of the MOOSE team (Andrew Slaughter) is an expert in snow modelling (and has done work on vapor transport in porous snow packs). That seems related to your work.
You can specify chemical free energies and mechanical properties of the different phases. For the phase field equations we offer dynamic automatic differentiation to get the Jacobians right, even for custom free energies and mobilities.
Daniel


Andrew....@csiro.au

unread,
Jun 11, 2016, 5:13:39 AM6/11/16
to moose...@googlegroups.com

Jason, thanks for the explanation of your research - interesting!


I've a bit of experience in stabilising numerical solvers, and I wouldn't discount anything that a commercial package has implemented. Often the details in the solvers are extremely hard-won: it's much harder to make a robust solver than to just naively solve a PDE. You may find yourselves eventually re-implementing what STAR has done!


I'm not sure how appropriate porous_flow is to your problem. On the other hand, I think we need some sort of common base for material properties like ice-liquid-steam. This seems to have been mentioned quite a lot recently!


a



Yidong Xia

unread,
Jun 13, 2016, 11:49:49 AM6/13/16
to moose-users
Hi Jason,

After reading your description a few more times, I can now add a little more to our discussion, following up on my introduction to our ongoing MOOSE CFD module development. At this moment I am not aiming to address the multiphase EOS, because I assume we have more fundamental issues to consider within the MOOSE framework. Please also forgive me if my words overlap with what you are already very familiar with.
  • First, a little bit of history of commercial CFD codes. So far as I know, CD-adapco was founded by a few of the "founding fathers" of FLUENT. Though the code designs differ, STAR-CCM+ should deliver the same or a similar solution to FLUENT's for simple problems. That said, we should realize both commercial codes are based on the same finite volume methods (with differences in the choice of some advanced turbulence models). There are basically two sources of the oscillations in the solutions that you may worry about.
    • First, if you choose a second-order finite volume method, you may see numerical oscillations where there are discontinuities or strong gradients in the physics being resolved. Slope limiters or selective artificial-viscosity approaches may be used to suppress such oscillations.
    • Second, even if you choose a first-order finite volume method, you may still meet with oscillations due to possible inherent issues associated with the multiphase material models in the commercial codes. In this case, an artificial-viscosity approach may alter the solution you really want to see.
    • However, as you've noted, commercial codes are something of a black box -- even if you know the algorithms from their theory manuals, you may not know how things are actually implemented in the source code. This is the agony people face when trying the advanced features of commercial CFD codes. If you want to track how everything is calculated and executed, your own in-house or open-source code may be the better choice.
  • Now, speaking of the feasibility of using MOOSE as an alternative to STAR-CCM+ for your projects:
    • First, we need to be aware that the basic discretization algorithms and logic employed by mature commercial CFD codes like FLUENT, STAR-CCM+, and CFX have proved to be optimal for many research and industry areas. The finite volume methods implemented in them are all based on "the loop over the faces/edges" (except for the external body-force and source terms). That design, together with some element/node renumbering algorithms, minimizes cache misses and therefore yields the best solver efficiency.
    • However, what remains largely uncertain and open about the commercial CFD codes today is the models they chose for advanced multi-fluid / multi-phase flow modeling. For a preliminary implementation of CFD capabilities in MOOSE, we may well end up implementing the very mature algorithms found in those commercial codes. But at least we know exactly what we are doing, because all the source code is exposed.
    • However, a critical difference between MOOSE and the mature commercial CFD codes is the main loop logic. MOOSE loops over the elements first, and on each element it loops over the surrounding faces/edges for the interfacial flux calculation (this is the case for MOOSE DG). The cell-centered finite volume schemes we are developing now are thus based on this existing loop logic. This fixed design has made implementing many classical slope reconstruction / limiting techniques difficult, or a little awkward, in MOOSE (as I concluded from the work I've done on the compressible Euler equations). It would be possible to start up a new loop logic (i.e., "loop over the faces") for MOOSE, but we would have to go down to libMesh to create it, which is beyond my reach (and out of my funding scope) at this moment. In this respect, we cannot expect the future MOOSE CFD module to run as fast as those commercial CFD codes. But in terms of the features necessary to solve the problems, we do not compromise anything.
  • A final note: in a future separate post, I could summarize why the continuous finite element method is not being immediately pursued for building the MOOSE CFD module. I am running out of time this morning.
Plus, I happen to know CFD a little more than average, because I wrote my own CFD solver codes and also explored different commercial CFD codes during my PhD. It is always my pleasure to dive into any CFD-related discussions.
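To make the slope-limiter point concrete, here is a standalone Python sketch (not MOOSE code; the scheme and names are our own illustration) of 1D linear advection of a square wave with and without a minmod limiter -- the limited scheme keeps the solution inside its initial bounds, while the unlimited central slopes oscillate at the discontinuity:

```python
import numpy as np

def minmod(a, b):
    """Pick the smaller-magnitude slope when signs agree, else zero."""
    return np.where(a * b > 0.0,
                    np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def advect(u0, c, steps, limited=True):
    """1D linear advection u_t + a*u_x = 0 (a > 0), periodic domain,
    MUSCL reconstruction; c is the CFL number (0 < c <= 1)."""
    u = u0.copy()
    for _ in range(steps):
        dl = u - np.roll(u, 1)            # backward difference
        dr = np.roll(u, -1) - u           # forward difference
        s = minmod(dl, dr) if limited else 0.5 * (dl + dr)
        uface = u + 0.5 * (1.0 - c) * s   # value at each cell's right face
        flux = c * uface                  # upwind flux (a > 0)
        u = u - (flux - np.roll(flux, 1))
    return u

u0 = np.zeros(100)
u0[40:60] = 1.0                           # square wave (a discontinuity)
u_lim = advect(u0, 0.5, 80, limited=True)
u_unl = advect(u0, 0.5, 80, limited=False)

print(u_lim.min(), u_lim.max())  # stays inside [0, 1]: no new extrema
print(u_unl.max())               # unlimited central slopes overshoot 1
```

Note that the scheme is written as an element-wise sweep with `np.roll` for neighbor access; in a true face-based loop the flux would be computed once per face and scattered to both adjacent cells, which is exactly the loop-structure difference discussed above.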

JasonT404

unread,
Jun 14, 2016, 7:00:15 PM6/14/16
to moose-users
Yidong,

Thank you. I think I understand the status of MOOSE CFD better now. Can you elaborate on what you meant about ending up implementing the mature algorithms found in the commercial codes, while at least knowing exactly what we are doing because the source code is exposed (second section, second sub-point)?

One last thought: have you or anyone else considered using a lattice Boltzmann method instead of the traditional second-order finite volume discretization? It would be a lengthy thing to implement fully, but my understanding is that it is a finite element method and thus more compatible with MOOSE. I don't know much about the method beyond the elementary concepts, so I could be completely wrong.

-Jason Turner

Yidong Xia

unread,
Jun 15, 2016, 10:49:42 AM6/15/16
to moose-users
Hi Jason,

To elaborate in plain terms, what I meant is:
  • To develop an open-source or in-house CFD toolkit, what people do at the beginning (i.e., the first few years) is nothing but establishing and validating the basic features of a compressible or incompressible CFD solver. During this stage, people usually have very mature reference documents and literature to look at in order to implement and validate the various CFD algorithms. It is more likely that people will want to implement what was chosen by the commercial CFD codes, e.g., FLUENT, CFX, STAR-CCM+, as those algorithms have been widely tested and used. So if people start from scratch, they may be doing nothing special, just implementing and validating what is already in the commercial CFD codes.
  • The real difference comes when you start to look at complex multi-fluid, multi-phase flow problems and want to choose proper two-phase or multi-phase models for your problems. From my own perspective, there is still no consensus on which models are "good" or "bad" in this area. Each model may work better than the others for some specific cases. Commercial CFD code companies may not have spent enough effort comparing and validating the various two-phase / multi-phase models across a variety of scenarios. So I think this is still a wide-open research area, where having your own open-source or in-house CFD code may work more flexibly for your project. P.S. This is like the situation with turbulence models: there is no one ultimate turbulence model that works for all scenarios; instead, users have to be quite familiar with the features of the different turbulence models so they can understand which to choose for their applications.
  • Regarding the lattice Boltzmann method (LBM), I think it is "possible" to implement in MOOSE. People at INL have been trying other particle-based methods in MOOSE, e.g., peridynamics and the discrete element method. But so far as I know, there may be issues with how the implementation details are taken care of, e.g., how to maintain good parallel-computing performance and well-organized data storage. So, in a word, I would say LBM is "possible", but may not be deliverable too soon in an ideal way in terms of computer programming.
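For reference, the core of an LBM update is quite small. Below is a standalone Python sketch of a textbook D1Q2 lattice Boltzmann step (stream + BGK collision) recovering 1D diffusion; it is purely illustrative, not MOOSE or INL code:

```python
import numpy as np

def lbm_diffusion(rho0, tau, steps):
    """Textbook D1Q2 lattice Boltzmann sketch for 1D diffusion
    on a periodic lattice. Two particle populations stream left and
    right at unit lattice speed; a BGK collision relaxes each toward
    the equilibrium rho/2. This recovers a diffusion equation with
    D = (tau - 0.5)/2 in lattice units."""
    f_r = 0.5 * rho0.copy()      # right-moving population
    f_l = 0.5 * rho0.copy()      # left-moving population
    for _ in range(steps):
        rho = f_r + f_l          # density = zeroth moment
        feq = 0.5 * rho
        f_r += (feq - f_r) / tau     # BGK collision
        f_l += (feq - f_l) / tau
        f_r = np.roll(f_r, 1)        # stream one cell right
        f_l = np.roll(f_l, -1)       # stream one cell left
    return f_r + f_l

rho0 = np.zeros(64)
rho0[32] = 1.0                       # point pulse
rho = lbm_diffusion(rho0, tau=1.0, steps=50)
print(rho.sum())                     # mass is conserved (stays at 1.0)
print(rho.max() < rho0.max())        # the pulse spreads out
```

The data layout here is a dense per-direction array with purely local collisions plus a shift for streaming -- the parallel-performance and data-storage questions mentioned above are about doing exactly this efficiently on distributed unstructured infrastructure.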

Yidong 

Derek Gaston

unread,
Jun 20, 2016, 12:40:51 PM6/20/16
to moose-users
I'm really enjoying this discussion!  Yidong, thank you for your insightful comments!

Let me add something on the subject of looping over faces: the issue there is that we don't currently _store_ faces at all... so there are none to loop over. We could certainly (in a preprocessing step) go through, generate the face elements for every element in the mesh, and store them... which would allow us to loop over them... but there would be a memory penalty involved (it might still be worth it... just pointing it out).

Also: not all finite-element basis functions can be evaluated on sides based solely on the DoFs on those sides... meaning that evaluating variable values on sides may need DoFs "non-local" to the side that reside in the element or on other sides of the element. So even if you're looping over sides, you would need to know which element is on each side of the face to be able to get all of the DoFs. Again, not insurmountable... but it's infrastructure that doesn't currently exist (how best to store that side->elem linkage). We _may_ be able to abuse the idea of an "interior parent" ( http://libmesh.github.io/doxygen/classlibMesh_1_1Elem.html#a6b6161aa2c540a7c75822c5962475ebf ), but that only gets us one of the elements... not both.
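As an illustration of the kind of side->elem linkage being discussed, here is a standalone Python sketch (not libMesh code; the names are ours) that builds a face-to-adjacent-elements map from raw element connectivity, where interior faces end up with two elements and boundary faces with one:

```python
from collections import defaultdict

def build_face_map(elements):
    """Build a face -> adjacent-elements map for a 2D tri/quad mesh.
    `elements` is a list of node-id tuples (counter-clockwise).
    Each edge (a "face" in 2D) is keyed by its sorted node pair,
    so the two elements sharing an edge hash to the same key."""
    face_map = defaultdict(list)
    for eid, nodes in enumerate(elements):
        for i in range(len(nodes)):
            face = tuple(sorted((nodes[i], nodes[(i + 1) % len(nodes)])))
            face_map[face].append(eid)
    return dict(face_map)

# Two quads sharing the edge (1, 2):
#   3---2---5
#   | 0 | 1 |
#   0---1---4
faces = build_face_map([(0, 1, 2, 3), (1, 4, 5, 2)])
print(faces[(1, 2)])   # shared interior face: both elements, [0, 1]
print(faces[(0, 1)])   # boundary face: one element, [0]
```

A face loop over such a map would visit each interior flux exactly once and could gather DoFs from both adjacent elements, which is the storage/linkage infrastructure (and memory cost) discussed above.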

Anyway - I just wanted to chime in and say that I'm open to growing a new set of loops to make DG / Finite Volume better in MOOSE... but if we're going to do it it needs to be designed properly and there will probably need to be some low-level data structures built to facilitate it.

Derek


Derek Gaston

unread,
Jun 20, 2016, 12:42:55 PM6/20/16
to moose-users
Oh - since we're talking about DG, I always have to throw in that stateful material properties are still not handled properly when using DG. It's not insurmountable, but it will take quite a bit of work to make them work. Stateful material properties with DG and mesh adaptivity will get _very_ complicated...

Derek

Yidong Xia

unread,
Jun 20, 2016, 1:25:37 PM6/20/16
to moose-users
Hi Derek,

Your comments and suggestions are always informative and appreciated. After several months of collaborative development of the MOOSE CFD module with David, I've come to better understand the "low-level" infrastructure of MOOSE and the concerns you pointed out. While we do not have a plan to touch the things at the bottom at this moment, I envision this will happen sooner or later, as we see a clear need to build a multi-dimensional CFD solver for nuclear applications. We can talk about this more when you are at INL next month.

Yidong 