MPI creates several OpenMP-Threads FDS 6.2.0

Christian Kraft

May 18, 2015, 7:51:01 AM
to fds...@googlegroups.com
Hello,

Since I use FDS 6.2.0, every MPI process creates 12 OpenMP threads. This causes a lot of overhead and slows down the calculation (see attachment). To work around this, I set the OMP_NUM_THREADS variable to "1"; now every MPI process creates only one OpenMP thread.

The calculation was started with the following command on Arch Linux with Open MPI 1.8.4.

mpirun -np 8 fds calculation.fds
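
Setting the variable and launching looked roughly like this (a minimal sketch of the full sequence, assuming a bash-like shell; only the variable name and the mpirun line come from the actual setup):

export OMP_NUM_THREADS=1
mpirun -np 8 fds calculation.fds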

Is there a way to run MPI without the use of OpenMP or to disable OpenMP?

Thanks a lot for your help.
Christian
mpiopenmp.png

Lukas A.

May 18, 2015, 10:19:37 AM
to fds...@googlegroups.com
Hey Christian,

By setting OMP_NUM_THREADS to 1 you effectively disable the OpenMP functionality. Does this answer your question?

Best,
Lukas

Christian Kraft

May 18, 2015, 2:42:53 PM
to fds...@googlegroups.com
Hey Lukas,

Thank you for your answer. I set OMP_NUM_THREADS to 1, but there is still one OpenMP process under every MPI process. Is that how it should be, or should there be only MPI processes?

Thank you!
Christian




gsztar

May 18, 2015, 3:03:09 PM
to fds...@googlegroups.com
Christian
Yes, it is.

Grzegorz

SR

May 18, 2015, 3:39:18 PM
to fds...@googlegroups.com

Based on our experience with the current MPI version of FDS, OMP_NUM_THREADS = 1 is the best choice.

Lukas A.

May 19, 2015, 1:41:41 AM
to fds...@googlegroups.com
Dear SR,

this is the best choice for what?

Best,
Lukas

SR

May 19, 2015, 12:11:35 PM
to fds...@googlegroups.com

for the shortest wall clock time.

cuub

Jun 10, 2015, 5:46:03 AM
to fds...@googlegroups.com
Hello,
I set OMP_NUM_THREADS to 1 on both computers (I use two of them, running CentOS). When I start the simulation, OpenMP creates only one thread per MPI process on the first machine, but four on the second. How can I deal with this?
Best wishes.
Jakub

[abc@linux5 FDS_L5]$ mpirun --hostfile hostfile --np 19 -bind-to core:overload-allowed --wdir ./ ./fds_mpi_6.2 ./RBC_01_A.fds
 OpenMP thread   0 of   0 assigned to MPI process   2 of  18 is running on linux5
 OpenMP thread   0 of   0 assigned to MPI process   3 of  18 is running on linux5
 OpenMP thread   0 of   0 assigned to MPI process   1 of  18 is running on linux5
 OpenMP thread   0 of   0 assigned to MPI process   0 of  18 is running on linux5
 OpenMP thread   0 of   0 assigned to MPI process   5 of  18 is running on linux5
 OpenMP thread   0 of   3 assigned to MPI process  17 of  18 is running on linux4
 OpenMP thread   1 of   3 assigned to MPI process  17 of  18 is running on linux4
 OpenMP thread   3 of   3 assigned to MPI process  17 of  18 is running on linux4
 OpenMP thread   2 of   3 assigned to MPI process  17 of  18 is running on linux4
 OpenMP thread   0 of   3 assigned to MPI process  12 of  18 is running on linux4
 OpenMP thread   3 of   3 assigned to MPI process  12 of  18 is running on linux4
 OpenMP thread   2 of   3 assigned to MPI process  12 of  18 is running on linux4
 OpenMP thread   1 of   3 assigned to MPI process  12 of  18 is running on linux4
 OpenMP thread   0 of   3 assigned to MPI process   8 of  18 is running on linux4
 OpenMP thread   3 of   3 assigned to MPI process   8 of  18 is running on linux4
 OpenMP thread   2 of   3 assigned to MPI process   8 of  18 is running on linux4
 OpenMP thread   0 of   3 assigned to MPI process   9 of  18 is running on linux4
 OpenMP thread   3 of   3 assigned to MPI process   9 of  18 is running on linux4
 OpenMP thread   1 of   3 assigned to MPI process   9 of  18 is running on linux4
 OpenMP thread   0 of   3 assigned to MPI process  10 of  18 is running on linux4
 OpenMP thread   3 of   3 assigned to MPI process  10 of  18 is running on linux4
 OpenMP thread   2 of   3 assigned to MPI process  10 of  18 is running on linux4
 OpenMP thread   1 of   3 assigned to MPI process  10 of  18 is running on linux4
 OpenMP thread   1 of   3 assigned to MPI process   8 of  18 is running on linux4
 OpenMP thread   3 of   3 assigned to MPI process  14 of  18 is running on linux4
 OpenMP thread   0 of   3 assigned to MPI process  14 of  18 is running on linux4
 OpenMP thread   0 of   0 assigned to MPI process   7 of  18 is running on linux5
 OpenMP thread   0 of   3 assigned to MPI process  16 of  18 is running on linux4
 OpenMP thread   3 of   3 assigned to MPI process  16 of  18 is running on linux4
 OpenMP thread   2 of   3 assigned to MPI process  16 of  18 is running on linux4
 OpenMP thread   1 of   3 assigned to MPI process  16 of  18 is running on linux4
 OpenMP thread   2 of   3 assigned to MPI process  14 of  18 is running on linux4
 OpenMP thread   1 of   3 assigned to MPI process  14 of  18 is running on linux4
 OpenMP thread   0 of   3 assigned to MPI process  15 of  18 is running on linux4
 OpenMP thread   3 of   3 assigned to MPI process  15 of  18 is running on linux4
 OpenMP thread   1 of   3 assigned to MPI process  15 of  18 is running on linux4
 OpenMP thread   2 of   3 assigned to MPI process  15 of  18 is running on linux4
 OpenMP thread   2 of   3 assigned to MPI process   9 of  18 is running on linux4
 OpenMP thread   0 of   3 assigned to MPI process  18 of  18 is running on linux4
 OpenMP thread   3 of   3 assigned to MPI process  18 of  18 is running on linux4
 OpenMP thread   2 of   3 assigned to MPI process  18 of  18 is running on linux4
 OpenMP thread   1 of   3 assigned to MPI process  18 of  18 is running on linux4
 OpenMP thread   2 of   3 assigned to MPI process  13 of  18 is running on linux4
 OpenMP thread   0 of   3 assigned to MPI process  13 of  18 is running on linux4
 OpenMP thread   0 of   0 assigned to MPI process   4 of  18 is running on linux5
 OpenMP thread   0 of   0 assigned to MPI process   6 of  18 is running on linux5
 Mesh   1 is assigned to MPI Process   0
 Mesh   2 is assigned to MPI Process   1
 Mesh   3 is assigned to MPI Process   2
 Mesh   4 is assigned to MPI Process   3
 Mesh   5 is assigned to MPI Process   4
 Mesh   6 is assigned to MPI Process   5
 Mesh   7 is assigned to MPI Process   6
 Mesh   8 is assigned to MPI Process   7
 Mesh   9 is assigned to MPI Process   8
 Mesh  10 is assigned to MPI Process   9
 Mesh  11 is assigned to MPI Process  10
 Mesh  12 is assigned to MPI Process  11
 Mesh  13 is assigned to MPI Process  12
 Mesh  14 is assigned to MPI Process  13
 Mesh  15 is assigned to MPI Process  14
 Mesh  16 is assigned to MPI Process  15
 Mesh  17 is assigned to MPI Process  16
 Mesh  18 is assigned to MPI Process  17
 Mesh  19 is assigned to MPI Process  18
 OpenMP thread   3 of   3 assigned to MPI process  13 of  18 is running on linux4
 OpenMP thread   1 of   3 assigned to MPI process  13 of  18 is running on linux4
 OpenMP thread   3 of   3 assigned to MPI process  11 of  18 is running on linux4
 OpenMP thread   0 of   3 assigned to MPI process  11 of  18 is running on linux4
 OpenMP thread   2 of   3 assigned to MPI process  11 of  18 is running on linux4
 OpenMP thread   1 of   3 assigned to MPI process  11 of  18 is running on linux4

 Fire Dynamics Simulator

 Compilation Date : Sat, 11 Apr 2015
 Current Date     : June 10, 2015  13:48:13
 Version          : FDS 6.2.0
 SVN Revision No. : 22343

 MPI Enabled; Number of MPI Processes:    19
 OpenMP Enabled; Number of OpenMP Threads:   1
......

Lukas A.

Jun 10, 2015, 6:59:26 AM
to fds...@googlegroups.com
Jakub,

Make sure that you pass the OMP_NUM_THREADS environment variable to all MPI processes. The syntax depends on the MPI implementation, but common options to look for are '-e' and/or '-x', i.e. look in the documentation for something like "environment variables to export to remote nodes".
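
With Open MPI, for example, the '-x' option exports an environment variable to the remote nodes. A rough sketch only, adapted and shortened from your command above; check the mpirun man page of your installation:

mpirun -x OMP_NUM_THREADS=1 --hostfile hostfile -np 19 ./fds_mpi_6.2 ./RBC_01_A.fds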

Best,
Lukas

cuub

Jun 10, 2015, 8:46:38 AM
to fds...@googlegroups.com
Thanks a lot Lukas,
It solved the problem. I use Open MPI 1.8.4, which, as I understand, is the version recommended for FDS 6.2. I'm not much of an expert on MPI/OpenMP. I typed:
mpirun --x OMP_NUM_THREADS=1 --hostfile hostfile --np 19 -bind-to core:overload-allowed --wdir ./ ./fds_mpi_6.2 ./RBC_01_A.fds
and now get:

[abc@linux5 FDS_L5]$ mpirun --x OMP_NUM_THREADS=1 --hostfile hostfile --np 19 -bind-to core:overload-allowed --wdir ./ ./fds_mpi_6.2 ./RBC_01_A.fds
 OpenMP thread   0 of   0 assigned to MPI process   3 of  18 is running on linux5
 OpenMP thread   0 of   0 assigned to MPI process   0 of  18 is running on linux5
 OpenMP thread   0 of   0 assigned to MPI process   1 of  18 is running on linux5
 OpenMP thread   0 of   0 assigned to MPI process   2 of  18 is running on linux5
 OpenMP thread   0 of   0 assigned to MPI process   4 of  18 is running on linux5

 OpenMP thread   0 of   0 assigned to MPI process   6 of  18 is running on linux5
 OpenMP thread   0 of   0 assigned to MPI process   5 of  18 is running on linux5
 OpenMP thread   0 of   0 assigned to MPI process   7 of  18 is running on linux4
 OpenMP thread   0 of   0 assigned to MPI process   9 of  18 is running on linux4
 OpenMP thread   0 of   0 assigned to MPI process  16 of  18 is running on linux4
 OpenMP thread   0 of   0 assigned to MPI process  10 of  18 is running on linux4
 OpenMP thread   0 of   0 assigned to MPI process  11 of  18 is running on linux4
 OpenMP thread   0 of   0 assigned to MPI process  13 of  18 is running on linux4
 OpenMP thread   0 of   0 assigned to MPI process  14 of  18 is running on linux4
 OpenMP thread   0 of   0 assigned to MPI process  15 of  18 is running on linux4
 OpenMP thread   0 of   0 assigned to MPI process  17 of  18 is running on linux4
 OpenMP thread   0 of   0 assigned to MPI process  18 of  18 is running on linux4
 OpenMP thread   0 of   0 assigned to MPI process  12 of  18 is running on linux4
 OpenMP thread   0 of   0 assigned to MPI process   8 of  18 is running on linux4

 Mesh   1 is assigned to MPI Process   0
 Mesh   2 is assigned to MPI Process   1
 Mesh   3 is assigned to MPI Process   2
 Mesh   4 is assigned to MPI Process   3
 Mesh   5 is assigned to MPI Process   4
 Mesh   6 is assigned to MPI Process   5
 Mesh   7 is assigned to MPI Process   6
 Mesh   8 is assigned to MPI Process   7
 Mesh   9 is assigned to MPI Process   8
 Mesh  10 is assigned to MPI Process   9
 Mesh  11 is assigned to MPI Process  10
 Mesh  12 is assigned to MPI Process  11
 Mesh  13 is assigned to MPI Process  12
 Mesh  14 is assigned to MPI Process  13
 Mesh  15 is assigned to MPI Process  14
 Mesh  16 is assigned to MPI Process  15
 Mesh  17 is assigned to MPI Process  16
 Mesh  18 is assigned to MPI Process  17
 Mesh  19 is assigned to MPI Process  18

 Fire Dynamics Simulator

 Compilation Date : Sat, 11 Apr 2015
 Current Date     : June 10, 2015  16:52:41

 Version          : FDS 6.2.0
 SVN Revision No. : 22343

 MPI Enabled; Number of MPI Processes:    19
 OpenMP Enabled; Number of OpenMP Threads:   1
.......

Best wishes,
Jakub

cuub

Jun 10, 2015, 8:52:24 AM
to fds...@googlegroups.com
As a side note, using that parameter did not change the calculation speed in my case.