implicit pseudo time stepper?


Junting Chen

Jul 15, 2019, 5:30:46 PM
to PyFR Mailing List
Hello,

Is there any work ongoing to make the computation in the pseudo time steps implicit?

As Niki mentioned in one of the earlier posts, an implicit pseudo time stepper causes storage issues. Is there any chance an implicit option could be offered in the next release? The explicit scheme forces the pseudo dt to be extremely small, and therefore the physical dt ends up fairly small as well, even though the physical time stepper is implicit. I am working on a bluff body problem and aiming to reduce the number of physical time steps within a vortex shedding cycle to around 20, whereas right now it has to be more than 300 to maintain stability.

Thanks!

Junting Chen
  

Niki Loppi

Jul 16, 2019, 1:38:10 PM
to pyfrmai...@googlegroups.com

Hi Junting,

High memory footprint is not the only issue. Constructing the global linear system for high-order polynomials is very expensive on modern hardware (high memory requirements with low arithmetic intensity), and solving the linear system is also challenging due to the global communications involved.

You can read about implicit time-stepping on GPUs from

http://aero-comlab.stanford.edu/Papers/Dissertation_Jerry_Watkins-augmented.pdf

We are working on local implicit (pseudo) time stepping approaches, but I'm afraid they won't be ready for the next release.

Regarding the dt/pseudo-dt ratio, I suggest keeping it in the range of 20-50. You can send me your case files if you want me to try it.
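
For reference, the dual time-stepping block in the .ini typically looks something like the following. This is only a sketch: the scheme choices, tolerances and step sizes are illustrative and case-dependent.

[solver-time-integrator]
formulation = dual
scheme = bdf2
pseudo-scheme = rk45
controller = none
pseudo-controller = none
tstart = 0.0
tend = 10.0
dt = 1e-3
pseudo-dt = 2.5e-5
pseudo-niters-min = 3
pseudo-niters-max = 30
pseudo-resid-norm = l2
pseudo-resid-tol = 1e-4

With dt = 1e-3 and pseudo-dt = 2.5e-5 the ratio is 40, i.e. inside the 20-50 range.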

Thanks,
Niki

Junting Chen

Jul 16, 2019, 5:21:46 PM
to PyFR Mailing List
Thanks Niki, good to know! I understand.

We are running a simple case where the geometry is just a bluff body. The only criteria we are using for validation in this case are the Strouhal number (shedding frequency), drag and lift. We noticed that increasing dt/pseudo-dt to ~100 can still maintain stability, but it requires a few more niters to let the residual drop to a desirable level (equivalent to having dt/pseudo-dt = 30 and niters = 20). As far as I understand, the consequence of a larger dt/pseudo-dt ratio is that more pseudo-steps are required to converge within a physical step; correct me if I am wrong. We also noticed that a high residual leads to wrong results (the shedding frequency is off).

It would be really nice if you could take a look at my setup. I first found that the highest value of pseudo-dt I can use is 2.5E-5. Then I slightly pushed dt to 3E-3, trying to find the boundary where the simulation either breaks or spits out a wrong result.

In fact, another feature we are really hoping to see in the next few releases is a way for users to impose a specific velocity profile at a boundary. Specifically, another critical test case of ours requires a time-varying velocity profile at the inlet (so essentially we have a series of velocity profiles to be imposed, one at every time step).

Thanks a lot again for the clarification!

Junting Chen






concaveTower.zip

Niki Loppi

Jul 18, 2019, 5:40:19 AM
to pyfrmai...@googlegroups.com

Hi Junting,

I tried running the case and it appears to be very difficult (nearly impossible) to converge properly. I think the main reason is that you have specified your front and back boundaries (top and bottom) as slip walls.

[soln-bcs-top]
type = slp-wall

[soln-bcs-bottom]
type = slp-wall

I'm pretty sure that using a periodic z-direction would help significantly and allow you to drive the pseudo-residuals down several orders of magnitude. Please see Arvind's response in

https://groups.google.com/forum/#!topic/pyfrmailinglist/JLhiy8TV9xo

In summary, you just need to name your top and bottom boundary fields as

"periodic_0_l" and "periodic_0_r"

in your .msh/.cgns file, and PyFR will build the connectivity during the import step. You don't have to specify anything in the .ini file.
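
If you generate the mesh from a Gmsh .geo script, the naming can be done with physical groups, e.g. (the surface tags here are placeholders, not the ones from your file):

Physical Surface("periodic_0_l") = {11};
Physical Surface("periodic_0_r") = {12};

PyFR will then pair the two faces when you run pyfr import.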

Cheers,
Niki

Junting Chen

Jul 18, 2019, 10:45:16 AM
to PyFR Mailing List
In fact, that was the first thing I tried, but I was having difficulties creating conformal periodic surfaces in Gmsh. It is really tedious, as Arvind said. I am going to go back and look at it again; it's a Gmsh issue...

By the way, when running PyFR across multiple GPUs, do you use MPI or something else?

We recently upgraded the virtual machine and are trying to run the same case with 4 GPUs. After partitioning the mesh, I ran:
mpirun -n 4 pyfr run -b cuda -p ****.pyfrm ****.ini
It keeps telling me "3 more processes have sent help message help-mpi-btl-base.txt" and stops running. Have you had this issue before? It seems to be an issue with mpi4py, because when I tried to uninstall and reinstall PyFR and its dependencies (we thought perhaps the old setup was not adequate for the new environment), mpi4py could no longer be installed on this virtual machine with 4 GPUs. We are looking into this issue; hopefully you can give us some hints.
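
For completeness, the sequence we run is roughly the following (a sketch of our workflow; the actual mesh and config file names are elided, as above):

pyfr partition 4 ****.pyfrm .
mpirun -n 4 pyfr run -b cuda -p ****.pyfrm ****.ini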

Thanks Niki

Junting Chen

Junting Chen

Jul 22, 2019, 9:42:37 AM
to PyFR Mailing List
Hello Niki, 

I have implemented the periodic condition on those two walls, but the residuals are still pretty big. The pressure residual dropped from 0.08 at the beginning of a physical time step to 0.02, and the velocity residual dropped from 0.5 to 1E-3, with the multigrid method enabled.

I have seen that you or your colleagues have run a simulation of flow past an infinitely long cylinder. What do the residuals of that case look like?

Junting Chen

Niki Loppi

Jul 22, 2019, 10:28:03 AM
to pyfrmai...@googlegroups.com

Hi Junting,

I have run an infinitely long SD7003 airfoil case at Re = 60,000, where I used 1e-4 for the velocity and 1e-3 for the pressure. I remember that your mesh was quite coarse for the given Reynolds number. You may want to try adding flux anti-aliasing (it may allow you to use larger pseudo-time step sizes). Also, using tets instead of hexes in the boundary layer might impose a stricter pseudo-cfl. Did you curve the surface mesh using Gmsh?
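
In case it helps, flux anti-aliasing is switched on in the [solver] section and needs quadrature rules for every element and interface type in your mesh. A rough sketch (the quadrature degrees here are purely illustrative):

[solver]
anti-alias = flux

[solver-interfaces-quad]
flux-pts = gauss-legendre
quad-deg = 10
quad-pts = gauss-legendre

[solver-elements-hex]
soln-pts = gauss-legendre
quad-deg = 10
quad-pts = gauss-legendre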

Please send your periodic mesh and I can have a go later tonight.

Niki


Niki Loppi

Jul 22, 2019, 10:48:51 AM
to pyfrmai...@googlegroups.com

In addition to my previous email.

Large pressure residuals are normal at the beginning of the simulation. If you do a fresh start (not restarting from a developed solution), strong pseudo-waves occur and you have to keep running the simulation until the waves are dissipated by the boundaries.

Niki

Junting Chen

Jul 22, 2019, 2:16:17 PM
to PyFR Mailing List
Thanks Niki, you can find it in the attachment. 

Yes, I was using Gmsh. One of the issues I am running into in Gmsh is that when I created a "refinement field", the periodic conformal surface pair created by "extrude" was no longer conformal.

I need to think of another way of creating some hex mesh (BL mesh) around the body's surface.

I didn't curve the surface in Gmsh. I created it in CAD software, then took the surface polyline from ParaView, and finally transferred those points to Gmsh.

I will look into the flux anti-aliasing setting later. 

Thanks for your help.

Junting Chen

concaveTower_722.zip

nnunn

Jul 26, 2019, 10:03:00 PM
to PyFR Mailing List
Hi Junting Chen,
Re: periodic boundaries for your bluff body test mesh, can you attach/send a gmsh "geometry" file (*.geo) which defines the profile you'd like to extrude?
I'm about to start looking at {implicit/pseudo/dual} stepping, and also enjoy wrestling with Gmsh meshing issues.
Nigel

Junting Chen

Jul 29, 2019, 8:55:29 AM
to PyFR Mailing List
Hello Nigel, 

I have figured out how to do it in Gmsh. I used to use a boolean operation, which never allowed me to create a pair of conformal surface meshes. Instead of doing that, I used ParaView to read in the STL file, took a section of the geometry, and output a number of points which define the geometry. Then I created a script that translates the coordinates of the output points into a .geo file. By doing that, I created a 2D rectangular surface with my target geometry removed. Extruding this 2D plane then creates conformal surface meshes. Refinement fields are likely to mess up the surface mesh. What needs to be added at the end is something like:

Periodic Surface{1201} = {1} Translate {0,0,0.6};

It will clean up the mess from the refinements. Hope this may help someone.
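
For anyone following this later, a minimal .geo sketch of the extrude-then-periodic approach (the surface tag, extrusion depth and layer count are illustrative, not the ones from my actual file):

// 2D section with the body cut out is Surface {1}
ext[] = Extrude {0, 0, 0.6} { Surface{1}; Layers{30}; };
// ext[0] is the new top surface opposite Surface {1}
Periodic Surface {ext[0]} = {1} Translate {0, 0, 0.6};
// Name the pair so PyFR builds the periodic connectivity on import
Physical Surface("periodic_0_l") = {1};
Physical Surface("periodic_0_r") = {ext[0]};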

The next challenge will be creating boundary layers near the body surface to gradually increase the cell size and better define the geometry in the mesh (recovering the large curvatures). I was away on work this last week and will be back on track very soon. If you are interested in the new .geo, please find it in the attachment.

Junting Chen
concaveTower.geo