MPI_FATAL ERROR


Danish

May 10, 2016, 2:08:43 AM
to FDS and Smokeview Discussions
Dear FDS User,

I have a workstation with the following specifications:

Intel(R) Xeon(R) CPU E5-2630 v3 @ 2.40GHz

Sockets: 2    Physical Cores: 16    Logical Processors: 32    Hyper-Threading: Enabled

Problem Description:
FDS version: fds6_mpi_win64
Run command: mpiexec.exe -n (number of meshes) -localonly fds6_mpi_win64.exe (filename).fds
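
For example, a 12-mesh case would be launched like this (the case file name here is just a placeholder):

mpiexec.exe -n 12 -localonly fds6_mpi_win64.exe my_case.fds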

I have installed the MPICH2 libraries (Argonne National Laboratory) for parallel runs. When I run a model in parallel it works for up to 12 meshes, but if the model contains more than 12 meshes, or if I run another program with the same version of FDS at the same time, it shows the following error:

Fatal error in MPI_Init: Other MPI error, error stack:
MPIR_Init_thread(392)...........:
MPID_Init(139)..................: channel initialization failed
MPIDI_CH3_Init(38)..............:
MPID_nem_init(234)..............:
MPID_nem_newtcp_module_init(96).:
MPID_nem_newtcp_module_bind(431):
MPIU_SOCKW_Bind_port_range(178).:  Unable to bind socket to port range [8670, 8690]

(The same error stack is printed by every rank; the interleaved repetitions are omitted here.)

job aborted:
rank: node: exit code[: error message]
0: simulator8: 1: process 0 exited without calling finalize
1: simulator8: 1: process 1 exited without calling finalize
2: simulator8: 1: process 2 exited without calling finalize
3: simulator8: 1: process 3 exited without calling finalize
4: simulator8: 1: process 4 exited without calling finalize
5: simulator8: 1: process 5 exited without calling finalize
6: simulator8: 1: process 6 exited without calling finalize
7: simulator8: 1: process 7 exited without calling finalize
8: simulator8: 1: process 8 exited without calling finalize
9: simulator8: 1: process 9 exited without calling finalize
10: simulator8: 1: process 10 exited without calling finalize
11: simulator8: 1: process 11 exited without calling finalize
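
The bind failure means no free TCP port was left in the range [8670, 8690] that this MPICH2 build uses. As a rough diagnostic, you can check from a Windows command prompt whether ports in that range are already occupied (a sketch using standard Windows tools, not something from the original post):

netstat -ano | findstr ":867 :868 :869"

If several MPICH jobs run at once the range can be exhausted; some MPICH2 builds let you widen it through the MPICH_PORT_RANGE environment variable (please verify this against your MPICH2 documentation), e.g.:

set MPICH_PORT_RANGE=8670:8790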


Kindly help me overcome this situation.

Thanks in advance.

Regards
Danish

Kevin

May 10, 2016, 8:50:48 AM
to FDS and Smokeview Discussions
We build the Windows executable of FDS with the Intel MPI library. The library files needed to run this version are included with the FDS download. Why are you using MPICH?

Danish

May 13, 2016, 2:13:00 AM
to FDS and Smokeview Discussions
Dear Kevin,

Thanks for your suggestion, but I have one more issue. As I mentioned, my workstation has 32 logical processors, yet when I run the program with up to 12 meshes, Task Manager shows 100% CPU usage. Can you please advise me on this? Is this some kind of system issue?

Regards
Danish

Kevin

May 13, 2016, 8:40:16 AM
to FDS and Smokeview Discussions
No, I do not have any suggestions to make other than to use the MPI libraries that we bundle with the FDS download.

Brian L

May 13, 2016, 12:32:13 PM
to FDS and Smokeview Discussions
Hi Danish,

When you run the same case on FDS 5.5, do you still have the same issue?

Best regards,

Danish

May 26, 2016, 2:17:45 AM
to FDS and Smokeview Discussions
Dear Kevin,

I have FDS 6.4.0 on my system, but when I submit multiple MPI jobs using the executable from the FDS package, any job started after the first one shows an error.

Kindly help me overcome this situation.

Regards
Danish



Attachment: hydra error.JPG

Kevin

May 26, 2016, 9:02:24 AM
to FDS and Smokeview Discussions
You can only run one MPI job at a time.
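
If the goal is to get results for several cases, one workaround is to queue them so they run one after another, for example with a simple batch file (case names are placeholders, and the exact mpiexec options depend on the MPI library you use; each mpiexec call finishes before the next starts):

mpiexec.exe -n 4 fds6_mpi_win64.exe case1.fds
mpiexec.exe -n 4 fds6_mpi_win64.exe case2.fds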

rsjh

Jul 4, 2016, 11:14:13 PM
to FDS and Smokeview Discussions
Hi, Danish,

Have you solved your problem? I have the same problem as you.
The new version, FDS 6.4, needs a very long time to run a case and can only run one MPI simulation at a time on a single computer. Therefore I also tried FDS 6.1 with MPICH parallel processing, but the same problem as yours occurs when I run 3 or more cases. The problem is as follows:


Fatal error in MPI_Init: Other MPI error, error stack:
MPIR_Init_thread(392)...........:
MPID_Init(139)..................: channel initialization failed
MPIDI_CH3_Init(38)..............:
MPID_nem_init(234)..............:
MPID_nem_newtcp_module_init(96).:
MPID_nem_newtcp_module_bind(431):
MPIU_SOCKW_Bind_port_range(178).:  Unable to bind socket to port range [8670, 8690]
Fatal error in MPI_Init: Other MPI error, error stack:
MPIR_Init_thread(392)...........:
MPID_Init(139)..................: channel initialization failed
MPIDI_CH3_Init(38)..............:
MPID_nem_init(234)..............:
MPID_nem_newtcp_module_init(96).:
MPID_nem_newtcp_module_bind(431):
MPIU_SOCKW_Bind_port_range(178).:  Unable to bind socket to port range [8670, 8690]

job aborted:
rank: node: exit code[: error message]
0: ustc-PC: 123
1: ustc-PC: 123
2: ustc-PC: 1: process 2 exited without calling finalize
3: ustc-PC: 1: process 3 exited without calling finalize
4: ustc-PC: 123




Kevin

Jul 5, 2016, 11:41:24 AM
to FDS and Smokeview Discussions
We just released FDS 6.5.

Could you tell me if you have successfully run any FDS MPI job?

If you have MPICH installed, this might cause a conflict with the Intel MPI libraries that we distribute with the FDS Windows download.
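
One way to check which mpiexec a command prompt picks up is the standard Windows 'where' command:

where mpiexec

If an MPICH2 directory is listed before the Intel MPI directory that ships with FDS, the MPICH mpiexec is the one being run.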

Wali Hasan

Jul 5, 2016, 2:03:30 PM
to fds...@googlegroups.com
Hi,

Now I am using the latest version of FDS, and with it I can successfully use all the cores of the system for MPI.
But the issue of running multiple MPI programs still remains the same.

Thanks and Regards
Danish


zpe...@gmail.com

Jul 6, 2016, 12:04:25 AM
to fds-smv
Hi Kevin, 
I have successfully run the FDS MPI job with the new version, FDS 6.4, but only one case could be run at a time. In addition, it runs very slowly and needs a very long time to finish my case, so I do not want to use the new version.

Therefore I tried FDS 6.1 with MPICH parallel processing, which meets my calculation accuracy requirements. It also works when I run no more than 2 cases, but the problem occurs when I run 3 or more.

Thank you very much!


