MPI_Process usage in the fds file


RNstu08

Aug 5, 2022, 9:45:28 AM
to FDS and Smokeview Discussions

Hello all,

 
I have some questions about using the MPI_PROCESS option on a Windows machine with an AMD 24-core processor (4 sockets, 16 virtual processors).
 
I created a domain, split it into 6 meshes, and ran the same case twice at once (Job1 and Job2) to see how performance is affected when running multiple jobs simultaneously. I used the same FDS input file for both jobs (attached PVC.fds), without specifying MPI_PROCESS on the MESH lines. As expected, FDS assigns the meshes of both jobs to the same MPI processes, 0 through 5, as shown below. That means that out of 24 processors, FDS uses only 6 to run the two jobs.

Job1
C:\FDS\2> mpiexec -np 6 fds PVC.fds   (command used)

 Starting FDS ...

 MPI Process      0 started on VD
 MPI Process      4 started on VD
 MPI Process      2 started on VD
 MPI Process      3 started on VD
 MPI Process      5 started on VD
 MPI Process      1 started on VD

 Reading FDS input file ...

 Fire Dynamics Simulator

 Current Date     : August  5, 2022  15:07:55
 Revision         : FDS6.7.9-0-gec52dee42-release
 Revision Date    : Sun Jun 26 14:36:40 2022 -0400
 Compiler         :
 Compilation Date : Tue 06/28/2022  11:11 PM

 MPI Enabled;    Number of MPI Processes:       6
 OpenMP Disabled

Job2

C:\FDS\3> mpiexec -np 6 fds PVC.fds   (command used)

 Starting FDS ...

 MPI Process      0 started on VD
 MPI Process      5 started on VD
 MPI Process      3 started on VD
 MPI Process      1 started on VD
 MPI Process      2 started on VD
 MPI Process      4 started on VD

 Reading FDS input file ...

 Fire Dynamics Simulator

 Current Date     : August  5, 2022  15:08:53
 Revision         : FDS6.7.9-0-gec52dee42-release
 Revision Date    : Sun Jun 26 14:36:40 2022 -0400
 Compiler         :
 Compilation Date : Tue 06/28/2022  11:11 PM

 MPI Enabled;    Number of MPI Processes:       6
 OpenMP Disabled

####################

To override this, I assigned MPI_PROCESS values on the MESH lines (attached overwrite_PVC.fds). In this case, even though I assigned a specific MPI_PROCESS (from 6 to 11) to each mesh, FDS allocated the same processes as in the previous case (0 to 5), and I also got the following error:

C:\FDS\1> mpiexec -np 6 fds PVC.fds   (command used)

 Starting FDS ...

 MPI Process      0 started on VD
 MPI Process      3 started on VD
 MPI Process      5 started on VD
 MPI Process      2 started on VD
 MPI Process      1 started on VD
 MPI Process      4 started on VD

 Reading FDS input file ...

ERROR: MPI_PROCESS for MESH 1 greater than total number of processes (CHID: PVC)

ERROR: FDS was improperly set-up - FDS stopped (CHID: PVC)
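(For reference: the error arises because MPI_PROCESS numbering starts at 0 and each value must be less than the process count given to mpiexec -np, so with -np 6 the only valid values are 0 through 5. A corrected assignment might look like the following sketch, where the IJK and XB values are hypothetical placeholders, not taken from the attached file:)

```fortran
&MESH IJK=20,20,20, XB=0.0,1.0, 0.0,1.0, 0.0,1.0, MPI_PROCESS=0 /
&MESH IJK=20,20,20, XB=1.0,2.0, 0.0,1.0, 0.0,1.0, MPI_PROCESS=1 /
! ... meshes 3 through 5 assigned to MPI_PROCESS 2, 3, 4 likewise ...
&MESH IJK=20,20,20, XB=5.0,6.0, 0.0,1.0, 0.0,1.0, MPI_PROCESS=5 /
```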

So, what is the procedure to use all 24 available processors? For example, if I want to run Job1 (which has six meshes) four times simultaneously, I want FDS to use all 24 processors, not the same 6 processors for all four runs.

Looking forward to your opinions.
PVC.fds
overwrite_PVC.fds

Kevin McGrattan

Aug 5, 2022, 9:53:57 AM
to fds...@googlegroups.com
This is not the proper use of MPI_PROCESS. In general, you cannot run multiple jobs on a computer without some kind of job scheduling software; that is, software that assigns multiple jobs to the computer's cores. I suggest that in your case you run single jobs with 6 MPI processes and 4 OpenMP threads each, using the fds_local command:

fds_local -p 6 -o 4 PVC.fds

RNstu08

Aug 5, 2022, 10:52:09 AM
to FDS and Smokeview Discussions

It looks like OpenMP is disabled (i.e., hyperthreading). When I include -o in the command, I get an error saying "unrecognized argument o".

Even if I manage to run with the command you mentioned, it uses only 6 processors out of 24, while the remaining ones stay idle. I am looking to make use of those as well by running parallel jobs.

What are the job scheduling options for Windows?

Kevin McGrattan

Aug 5, 2022, 11:17:18 AM
to fds...@googlegroups.com
Show me the screenshot of the error with the -o option.

RNstu08

Aug 5, 2022, 11:38:28 AM
to FDS and Smokeview Discussions
C:\FDS\3>mpiexec -np 6 -o 4 fds PVC.fds
[mpiexec@VD] match_arg (arg\hydra_arg.c:91): unrecognized argument o
[mpiexec@VD] Similar arguments:
[mpiexec@VD]        outfile-pattern
[mpiexec@VD]        outfile
[mpiexec@VD]        info
[mpiexec@VD]        bind-to
[mpiexec@VD]        ordered-output
[mpiexec@VD] HYD_arg_parse_array (arg\hydra_arg.c:128): argument matching returned error
[mpiexec@VD] mpiexec_get_parameters (mpiexec_params.c:1359): error parsing input array
[mpiexec@VD] wmain (mpiexec.c:1783): error parsing parameters

Kevin McGrattan

Aug 5, 2022, 11:40:55 AM
to fds...@googlegroups.com
Is this the command that I told you to use?

RNstu08

Aug 5, 2022, 11:58:55 AM
to FDS and Smokeview Discussions
It works with the exact same command.

I tried with and without -o and checked the time difference.

fds_local -p 6 -o 4 PVC.fds took 140 sec with 100% CPU utilization
fds_local -p 6 PVC.fds took 258 sec with 40% CPU utilization

But my aim is to run multiple jobs at a time. For example, for a domain split into meshes, if I want to run the case twice in parallel, I might use
fds_local -p 2 -o 6 PVC.fds   (job1)
fds_local -p 2 -o 6 PVC.fds   (job2)
But in the above case, both jobs use the same processors. How do we allocate separate ones? Can we use something like the TORQUE scheduler on Windows?
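(Editorial aside: on plain Windows without a scheduler, one commonly suggested workaround, not an FDS feature, is to launch each job under a CPU-affinity mask using cmd.exe's `start /affinity`, which takes a hexadecimal mask of logical processors. A small helper to compute such masks, written as an illustrative sketch:)

```python
def affinity_mask(cores):
    """Build the hexadecimal CPU-affinity mask that Windows'
    `start /affinity` expects, from logical core indices."""
    mask = 0
    for core in cores:
        mask |= 1 << core  # set one bit per logical processor
    return format(mask, "X")

# Job 1 on logical cores 0-5, job 2 on cores 6-11:
print(affinity_mask(range(0, 6)))    # -> 3F
print(affinity_mask(range(6, 12)))   # -> FC0
```

The masks could then be used as, e.g., `start /affinity 3F mpiexec -np 6 fds PVC.fds` from each job's directory. Note, however, that `start /affinity` only sets the initial affinity of the launched process; whether mpiexec's child processes respect or override it depends on the MPI library's pinning settings, so treat this as a sketch rather than a guaranteed recipe.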

Kevin McGrattan

Aug 5, 2022, 12:09:23 PM
to fds...@googlegroups.com
I don't know. I do not use a Windows computer for FDS simulations; I use a Linux cluster with SLURM for job scheduling. I do not know what job scheduling options exist under Windows. Maybe someone else does.

RNstu08

Aug 5, 2022, 1:13:29 PM
to FDS and Smokeview Discussions
Thanks for your response. Looking forward to other comments.