FDS PARALLEL PROCESSING ON MULTICORE WORKSTATION


Narayan pandit

Jun 24, 2016, 5:34:30 AM
to FDS and Smokeview Discussions
Dear sir/madam,
I have a 32-core, 32 GB RAM workstation. I am simulating a fire in a compartment, which I have divided into 4 meshes. Now I want to assign each mesh to an MPI process, using the following lines:
&MESH ID='mesh1',IJK=25,40,40, XB=0,2.5,0,4,0,4,MPI_PROCESS=0/
&MESH ID='mesh2',IJK=25,40,40, XB=2.5,5,0,4,0,4,MPI_PROCESS=1/
&MESH ID='mesh3',IJK=25,40,40, XB=5,7.5,0,4,0,4,MPI_PROCESS=2/
&MESH ID='mesh4',IJK=25,40,40, XB=7.5,10,0,4,0,4,MPI_PROCESS=3 /

I am getting the error message "mpi_process greater than number of processes."
I have enabled 16 cores by typing "export OMP_NUM_THREADS=16" on the command line. Is it necessary to have multiple processors to run MPI, or can I run MPI on a single multicore processor? If so, I would also like to know how to do it on the CentOS 7 operating system.

jacques Frezabeu

Jun 24, 2016, 5:44:37 AM
to fds...@googlegroups.com
I am not a specialist, but have you tried setting OMP_NUM_THREADS to 4 in order to assign 4 threads to the 4 MPI processes?

What command line did you use to run the calculation?

--
You received this message because you are subscribed to the Google Groups "FDS and Smokeview Discussions" group.
To unsubscribe from this group and stop receiving emails from it, send an email to fds-smv+u...@googlegroups.com.
To post to this group, send email to fds...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/fds-smv/56506073-a0ad-449c-a599-805101aedf2a%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Narayan pandit

Jun 24, 2016, 5:58:21 AM
to FDS and Smokeview Discussions
I am using CentOS 7, which is a Linux-based operating system, so I used the command "export OMP_NUM_THREADS=16", since I have a 32-core processor. I am able to use 16 of the 32 cores, but the problem is that I am not able to assign MPI processes to cores. (I want to assign MPI processes to cores, not to processors.)
Regards,
Narayan Pandit

dr_jfloyd

Jun 24, 2016, 6:16:12 AM
to FDS and Smokeview Discussions
Are you following the instructions in the Guide for starting an MPI job? What are you typing on the command line?

Narayan pandit

Jun 24, 2016, 8:07:17 AM
to FDS and Smokeview Discussions
Yes, I am following the instructions given in the guide, and using these lines:

&MESH ID='mesh1',IJK=25,40,40, XB=0,2.5,0,4,0,4,MPI_PROCESS=0/
&MESH ID='mesh2',IJK=25,40,40, XB=2.5,5,0,4,0,4,MPI_PROCESS=1/
&MESH ID='mesh3',IJK=25,40,40, XB=5,7.5,0,4,0,4,MPI_PROCESS=2/
&MESH ID='mesh4',IJK=25,40,40, XB=7.5,10,0,4,0,4,MPI_PROCESS=3 /

I also mentioned all of this in my first post.

dr_jfloyd

Jun 24, 2016, 8:11:19 AM
to FDS and Smokeview Discussions
Adding MPI_PROCESS to the input file does not start an MPI job. You must instruct the computer on the number of processes to use on the command line, per the instructions in Section 3.2.2 of the User's Guide.

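For a single multicore workstation, the launch looks something like this (a sketch only; job.fds is a placeholder input file name, and the MPI-enabled fds binary is assumed to be on your PATH):

```shell
# Give each of the 4 MPI processes a few OpenMP threads;
# keep processes * threads within the number of physical cores.
export OMP_NUM_THREADS=4

# Start 4 MPI processes on the local machine, one per MESH line.
# No hostfile is needed when everything runs on one workstation.
mpirun -np 4 fds job.fds
```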
What command are you typing on the command line?


Narayan pandit

Jun 24, 2016, 8:38:38 AM
to FDS and Smokeview Discussions
OK, I don't have Section 3.2.2 in my manual. I am using the FDS User's Guide for version 6.3.2.

dr_jfloyd

Jun 24, 2016, 8:49:32 AM
to FDS and Smokeview Discussions
Section 3.1.2 in the 6.3.2 guide.

Narayan pandit

Jun 24, 2016, 9:38:57 AM
to FDS and Smokeview Discussions

I used the following command:

$ mpirun -np 4 fds -hostfile my_hosts.txt TRAIN_FIRE.FDS

which resulted in these messages:

[Broccoli:05781] [[64407,0],0] tcp_peer_recv_connect_ack: received different version from [[64407,1],1]: 1.8.4 instead of 1.10.0
[Broccoli:05781] [[64407,0],0] tcp_peer_recv_connect_ack: received different version from [[64407,1],2]: 1.8.4 instead of 1.10.0
[Broccoli:05781] [[64407,0],0] tcp_peer_recv_connect_ack: received different version from [[64407,1],0]: 1.8.4 instead of 1.10.0
[Broccoli:05781] [[64407,0],0] tcp_peer_recv_connect_ack: received different version from [[64407,1],3]: 1.8.4 instead of 1.10.0

I am new to FDS, please help.

jacques Frezabeu

Jun 24, 2016, 9:44:36 AM
to fds...@googlegroups.com
Have you tried setting OMP_NUM_THREADS to 4 instead of 16?


dr_jfloyd

Jun 24, 2016, 11:40:26 AM
to FDS and Smokeview Discussions
See:

https://groups.google.com/forum/#!topic/fds-smv/Ux3rOC0Txs4

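Those version-mismatch messages typically mean the mpirun that launched the job belongs to a different Open MPI installation than the one the fds binary was built against (here, 1.8.4 vs. 1.10.0). A few commands to see which runtimes are involved (a sketch, assuming Open MPI on a Linux machine with fds on the PATH):

```shell
# Which mpirun is first on the PATH, and what version is it?
which mpirun
mpirun --version

# Which MPI shared library does the fds binary actually load?
ldd "$(which fds)" | grep -i mpi
```

If the versions differ, put the bin and lib directories of the Open MPI that matches your FDS build first in PATH and LD_LIBRARY_PATH.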
Also, you should not have (# MPI processes) * OMP_NUM_THREADS > available cores. So on a 32-core machine with 4 MPI processes you should not have OMP_NUM_THREADS > 8; however, as we note in the guide, OMP_NUM_THREADS > 4 does not seem to have much benefit.
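That upper bound is just the core count divided by the number of MPI processes; for the machine in this thread:

```shell
CORES=32
MPI_PROCESSES=4

# Largest OMP_NUM_THREADS that avoids oversubscribing the cores.
MAX_OMP_THREADS=$((CORES / MPI_PROCESSES))
echo "$MAX_OMP_THREADS"   # prints 8
```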

Narayan pandit

Jun 25, 2016, 10:42:11 PM
to FDS and Smokeview Discussions
Thank you, dr_jfloyd. I think I got my answer.
Regards,
Narayan Pandit