I'm also running FDS in a Linux shell on an AWS EC2 instance, a c6i.24xlarge with 48 physical cores / 96 vCPUs (I'm not using any HPC cluster). Simple cases work fine with just the command "fds input.fds". However, I need to specify the number of MPI processes and how many CPUs I want per MPI task, and I haven't been able to figure out how. If someone could tell me the lines I should type, that would be awesome.
I'm using this line:

mpiexec -np 5 /home/ec2-user/FDS/FDS6/bin/fds 2_Baixa_cat.fds

The -np flag sets the 5 MPI processes, but I don't know how to assign CPUs to each one. "export OMP_NUM_THREADS=" doesn't work either; OpenMP stays disabled (the FDS header in the log below reports "OpenMP Disabled").
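For the thread question, a sketch of what should work: FDS reads the OpenMP thread count from OMP_NUM_THREADS, but the variable must be exported in the same shell (or script) that invokes mpiexec; with Intel MPI (which your log header shows) you can also forward it to every rank with -genv. The 4-thread count here is just an example, not a recommendation:

```shell
# Sketch: 5 MPI ranks with 4 OpenMP threads each (5 x 4 = 20 vCPUs busy).
# Assumes Intel MPI and the FDS path from the original command.
export OMP_NUM_THREADS=4                      # must be exported, not merely set
mpiexec -np 5 -genv OMP_NUM_THREADS 4 \
    /home/ec2-user/FDS/FDS6/bin/fds 2_Baixa_cat.fds
```

If the FDS header still prints "OpenMP Disabled" after this, the variable is not reaching the FDS processes; the -genv option is Intel MPI's way of pushing an environment variable explicitly to all ranks.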
On the other hand, the simulation starts but then gives me the following error:
[ec2-user@ip-X 2_Baixa]$ bash fds_input.sh
Starting FDS ...
MPI Process 3 started on
ip-X.eu-west-1.compute.internal
MPI Process 0 started on
ip-X.eu-west-1.compute.internal
MPI Process 1 started on
ip-X.eu-west-1.compute.internal
MPI Process 2 started on
ip-X.eu-west-1.compute.internal
MPI Process 4 started on
ip-X.eu-west-1.compute.internal
Reading FDS input file ...
WARNING: SPEC FUEL VAPOR is not in the table of
pre-defined species. Any unassigned SPEC variables in
the input were assigned the properties of nitrogen.
Fire Dynamics Simulator
Current Date : February 2, 2024 12:50:54
Revision : FDS-6.8.0-0-g886e009-release
Revision Date : Tue Apr 18 07:06:40 2023 -0400
Compiler : ifort version 2021.7.1
Compilation Date : Apr 18, 2023 15:20:17
MPI Enabled; Number of MPI Processes: 5
OpenMP Disabled
MPI version: 3.1
MPI library version: Intel(R) MPI Library 2021.6 for
Linux* OS
Job TITLE : (Concatenated) Simulació AET article: continuitat baixa, FCC < 50%, density = 0.2
Job ID string : 2_Baixa_cat
forrtl: severe (174): SIGSEGV, segmentation fault
occurred
Image        PC                Routine            Line     Source
libc.so.6    00007F9B40C54DD0  Unknown            Unknown  Unknown
fds          0000000006D848A6  Unknown            Unknown  Unknown
fds          0000000006D70077  Unknown            Unknown  Unknown
fds          0000000006AB7C99  Unknown            Unknown  Unknown
fds          000000000040A71D  Unknown            Unknown  Unknown
libc.so.6    00007F9B40C3FEB0  Unknown            Unknown  Unknown
libc.so.6    00007F9B40C3FF60  __libc_start_main  Unknown  Unknown
fds          000000000040A636  Unknown            Unknown  Unknown
===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= RANK 0 PID 8749 RUNNING AT
ip-172-31-6-71.eu-west-1.compute.internal
= KILLED BY SIGNAL: 9 (Killed)
===================================================================================
===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= RANK 1 PID 8750 RUNNING AT
ip-172-31-6-71.eu-west-1.compute.internal
= KILLED BY SIGNAL: 9 (Killed)
===================================================================================
===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= RANK 2 PID 8751 RUNNING AT
ip-172-31-6-71.eu-west-1.compute.internal
= KILLED BY SIGNAL: 9 (Killed)
===================================================================================
===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= RANK 4 PID 8753 RUNNING AT
ip-172-31-6-71.eu-west-1.compute.internal
= KILLED BY SIGNAL: 9 (Killed)
===================================================================================
===================================================================================
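One aside on the segmentation fault: it can't be diagnosed from this trace alone, but a common first step for FDS on Linux (and an assumption here, not a confirmed diagnosis for this case) is to remove the shell's stack-size limit before launching mpiexec, since FDS is sensitive to small stack limits:

```shell
# Raise the stack limit in the shell/script that launches mpiexec.
# This is generic FDS-on-Linux first aid, not a guaranteed fix for this segfault.
ulimit -s unlimited
ulimit -s        # confirm: should now print "unlimited"
```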
[ec2-user@ip-X 2_Baixa]$ bash prova.sh
Starting FDS ...
MPI Process 11 started on
ip-X.eu-west-1.compute.internal
MPI Process 26 started on
ip-X.eu-west-1.compute.internal
[... identical "MPI Process N started on ip-X.eu-west-1.compute.internal" lines for the remaining 46 of the 48 ranks ...]
Reading FDS input file ...
ERROR: The number of MPI processes, 48, exceeds the number of meshes, 9 (CHID: 2_Baixa_cat)
ERROR: FDS was improperly set-up - FDS stopped (CHID: 2_Baixa_cat)
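This second error is FDS saying the rank count is capped by the mesh count: each MPI process must own at least one mesh, so with 9 meshes the maximum is -np 9. A quick sanity check is to count the &MESH lines in the input file before choosing -np. A self-contained demo with a throwaway two-mesh file (the filename /tmp/demo.fds is made up for illustration):

```shell
# Write a throwaway two-mesh FDS input purely to demonstrate the count
cat > /tmp/demo.fds <<'EOF'
&MESH IJK=10,10,10, XB=0.0,1.0,0.0,1.0,0.0,1.0 /
&MESH IJK=10,10,10, XB=1.0,2.0,0.0,1.0,0.0,1.0 /
EOF

# Count &MESH records; use this number (or fewer) as the -np value
grep -c '^&MESH' /tmp/demo.fds    # prints 2
```

For the real 9-mesh case that means launching with "mpiexec -np 9 ..." at most, with any OpenMP threads assigned on top of that.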