Step-22 with more than 20 million DOFs


Alex Jarauta

Jun 24, 2020, 7:39:48 PM
to deal.II User Group
Hi,

I am solving the Stokes flow problem in three dimensions using OpenFCST, an open-source platform for fuel cell simulations based on deal.II v8.4.1. The code I use is mostly based on step-22, and I have noticed that my simulations fail when my mesh has more than 20 million DOFs. To check whether the problem comes from deal.II rather than from OpenFCST, I did the following:

1. Slightly modified the code in step-22 (see file "step-22.cc" attached) to use a 3D domain of size 100x100x50 with 50 divisions per direction.
2. Applied the same boundary conditions as in step-22.
3. Ran the simulation in both deal.II v8.4.1 and v9.0.1 with fewer than 20 million DOFs, and obtained a solution (see file "DealII_test.png" attached).
4. Ran the simulation in deal.II v8.4.1 with more than 20 million DOFs -> it resulted in a "Segmentation fault" error (see file "output_v8_4_1.out").
5. Ran the simulation in deal.II v9.0.1 with more than 20 million DOFs -> it resulted in a "nan residual" error (see file "output_v9_0_1.out").

I was wondering if someone could please tell me why this issue is appearing when I consider a mesh with more than 20 million DOFs, and if there is a solution to this.

Thanks!


Alex
output_v8_4_1.out
output_v9_0_1.out
step-22.cc
DealII_test.png

Wolfgang Bangerth

Jun 26, 2020, 12:24:20 PM
to dea...@googlegroups.com

Alex,

> I am solving the Stokes flow problem in three dimensions using OpenFCST
> <http://www.openfcst.mece.ualberta.ca/>,
20 million unknowns is quite a large number, especially if you are using the
solver used in step-22. My suspicion is that in both cases, the error message
really just indicates that you have run out of memory.

It would be interesting to see how your program's memory use grows as you
go from smaller to larger problems. I suspect that you will see that for 20
million unknowns on a single machine, you will need ~100GB of memory, and that
this exceeds what the operating system is willing to give you.

20 million unknowns is solidly in the region where you need (i) a parallel
machine, and (ii) a better linear solver than the one used in step-22. Both
exist in deal.II: a better linear solver is discussed in the "Possibilities
for extensions" section of step-22, and is implemented in a parallel fashion
in step-32, among others.

Best
W.

--
------------------------------------------------------------------------
Wolfgang Bangerth email: bang...@colostate.edu
www: http://www.math.colostate.edu/~bangerth/

Alex Jarauta

Jun 26, 2020, 5:28:14 PM
to deal.II User Group
Hi Wolfgang,

Thank you for your reply. I will look at the extensions of step-22, as well as the parallelization of the code detailed in step-32.

Cheers,


Alex

On Friday, June 26, 2020 at 10:24:20 UTC-6, Wolfgang Bangerth wrote: