MPI error running TSUNAMI-3D


Jake Smith

Apr 13, 2026, 1:43:13 PM
to SCALE Users Group
Hi, 

I am receiving the following error message while loading CE cross-section libraries in TSUNAMI-3D with the MPI-enabled executable of SCALE 6.3.2. The CSAS and TRITON depletion sequences for the same model run without any issues. Does anyone know what is causing this error in the TSUNAMI-3D simulation? I have tried all combinations of the IFP and CLUTCH methods with both KENO and SHIFT.

Thanks,
Jake Smith

Error message: 
>>> Loading CE library /opt/datalibs/scalexdata80/ce_v8.0_endf.h5
Fatal error in PMPI_Allreduce: Message truncated, error stack:
PMPI_Allreduce(450).....................: MPI_Allreduce(sbuf=0x7ffebadf5c64, rbuf=0x7ffebadf5c60, count=1, datatype=MPI_INT, op=MPI_SUM, comm=MPI_COMM_WORLD) failed
PMPI_Allreduce(436).....................:
MPIR_Allreduce_impl(293)................:
MPIR_Allreduce_intra_auto(178)..........:
MPIR_Allreduce_intra_auto(84)...........:
MPIR_Bcast_impl(310)....................:
MPIR_Bcast_intra_auto(223)..............:
MPIR_Bcast_intra_binomial(112)..........:
MPIDI_CH3_PktHandler_EagerShortSend(363): Message from rank 0 and tag 2 truncated; 8 bytes received but buffer size is 4
Fatal error in PMPI_Allreduce: Message truncated, error stack:
PMPI_Allreduce(450).....................: MPI_Allreduce(sbuf=0x7ffe3d00b214, rbuf=0x7ffe3d00b210, count=1, datatype=MPI_INT, op=MPI_SUM, comm=MPI_COMM_WORLD) failed
PMPI_Allreduce(436).....................:
MPIR_Allreduce_impl(293)................:
MPIR_Allreduce_intra_auto(178)..........:
MPIR_Allreduce_intra_auto(84)...........:
MPIR_Bcast_impl(310)....................:
MPIR_Bcast_intra_auto(223)..............:
MPIR_Bcast_intra_binomial(112)..........:
MPIDI_CH3_PktHandler_EagerShortSend(363): Message from rank 0 and tag 2 truncated; 8 bytes received but buffer size is 4
Fatal error in PMPI_Allreduce: Other MPI error, error stack:
PMPI_Allreduce(450)...........: MPI_Allreduce(sbuf=0x7ffe19726674, rbuf=0x7ffe19726670, count=1, datatype=MPI_INT, op=MPI_SUM, comm=MPI_COMM_WORLD) failed
PMPI_Allreduce(436)...........:
MPIR_Allreduce_impl(293)......:
MPIR_Allreduce_intra_auto(178):
MPIR_Allreduce_intra_auto(84).:
MPIR_Bcast_impl(310)..........:
MPIR_Bcast_intra_auto(223)....:
MPIR_Bcast_intra_binomial(182): Failure during collective
Fatal error in PMPI_Allreduce: Message truncated, error stack:
PMPI_Allreduce(450).....................: MPI_Allreduce(sbuf=0x7ffdb54d2c14, rbuf=0x7ffdb54d2c10, count=1, datatype=MPI_INT, op=MPI_SUM, comm=MPI_COMM_WORLD) failed
PMPI_Allreduce(436).....................:
MPIR_Allreduce_impl(293)................:
MPIR_Allreduce_intra_auto(178)..........:
MPIR_Allreduce_intra_auto(84)...........:
MPIR_Bcast_impl(310)....................:
MPIR_Bcast_intra_auto(223)..............:
MPIR_Bcast_intra_binomial(112)..........:
MPIDI_CH3_PktHandler_EagerShortSend(363): Message from rank 0 and tag 2 truncated; 8 bytes received but buffer size is 4
Fatal error in PMPI_Allreduce: Other MPI error, error stack:
PMPI_Allreduce(450)...........: MPI_Allreduce(sbuf=0x7ffd57372a64, rbuf=0x7ffd57372a60, count=1, datatype=MPI_INT, op=MPI_SUM, comm=MPI_COMM_WORLD) failed
PMPI_Allreduce(436)...........:
MPIR_Allreduce_impl(293)......:
MPIR_Allreduce_intra_auto(178):
MPIR_Allreduce_intra_auto(84).:
MPIR_Bcast_impl(310)..........:
MPIR_Bcast_intra_auto(223)....:
MPIR_Bcast_intra_binomial(182): Failure during collective
Fatal error in PMPI_Allreduce: Other MPI error, error stack:
PMPI_Allreduce(450)...........: MPI_Allreduce(sbuf=0x7ffe6816fb14, rbuf=0x7ffe6816fb10, count=1, datatype=MPI_INT, op=MPI_SUM, comm=MPI_COMM_WORLD) failed
PMPI_Allreduce(436)...........:
MPIR_Allreduce_impl(293)......:
MPIR_Allreduce_intra_auto(178):
MPIR_Allreduce_intra_auto(84).:
MPIR_Bcast_impl(310)..........:
MPIR_Bcast_intra_auto(223)....:
MPIR_Bcast_intra_binomial(182): Failure during collective
Fatal error in PMPI_Allreduce: Other MPI error, error stack:
PMPI_Allreduce(450)...........: MPI_Allreduce(sbuf=0x7ffcf8250a64, rbuf=0x7ffcf8250a60, count=1, datatype=MPI_INT, op=MPI_SUM, comm=MPI_COMM_WORLD) failed
PMPI_Allreduce(436)...........:
MPIR_Allreduce_impl(293)......:
MPIR_Allreduce_intra_auto(178):
MPIR_Allreduce_intra_auto(84).:
MPIR_Bcast_impl(310)..........:
MPIR_Bcast_intra_auto(223)....:
MPIR_Bcast_intra_binomial(182): Failure during collective

Jake Smith

Apr 23, 2026, 11:37:57 AM
to SCALE Users Group
I realized this error message is not in a readable format. Here is the full error printout:

>>> Loading CE library /opt/datalibs/scalexdata80/ce_v8.0_endf.h5
Fatal error in PMPI_Allreduce: Message truncated, error stack:
PMPI_Allreduce(450).....................: MPI_Allreduce(sbuf=0x7ffebadf5c64, rbuf=0x7ffebadf5c60, count=1, datatype=MPI_INT, op=MPI_SUM, comm=MPI_COMM_WORLD) failed
PMPI_Allreduce(436).....................:
MPIR_Allreduce_impl(293)................:
MPIR_Allreduce_intra_auto(178)..........:
MPIR_Allreduce_intra_auto(84)...........:
MPIR_Bcast_impl(310)....................:
MPIR_Bcast_intra_auto(223)..............:
MPIR_Bcast_intra_binomial(112)..........:
MPIDI_CH3_PktHandler_EagerShortSend(363): Message from rank 0 and tag 2 truncated; 8 bytes received but buffer size is 4
Fatal error in PMPI_Allreduce: Message truncated, error stack:
PMPI_Allreduce(450).....................: MPI_Allreduce(sbuf=0x7ffe3d00b214, rbuf=0x7ffe3d00b210, count=1, datatype=MPI_INT, op=MPI_SUM, comm=MPI_COMM_WORLD) failed
PMPI_Allreduce(436).....................:
MPIR_Allreduce_impl(293)................:
MPIR_Allreduce_intra_auto(178)..........:
MPIR_Allreduce_intra_auto(84)...........:
MPIR_Bcast_impl(310)....................:
MPIR_Bcast_intra_auto(223)..............:
MPIR_Bcast_intra_binomial(112)..........:
MPIDI_CH3_PktHandler_EagerShortSend(363): Message from rank 0 and tag 2 truncated; 8 bytes received but buffer size is 4
Fatal error in PMPI_Allreduce: Other MPI error, error stack:
PMPI_Allreduce(450)...........: MPI_Allreduce(sbuf=0x7ffe19726674, rbuf=0x7ffe19726670, count=1, datatype=MPI_INT, op=MPI_SUM, comm=MPI_COMM_WORLD) failed
PMPI_Allreduce(436)...........:
MPIR_Allreduce_impl(293)......:
MPIR_Allreduce_intra_auto(178):
MPIR_Allreduce_intra_auto(84).:
MPIR_Bcast_impl(310)..........:
MPIR_Bcast_intra_auto(223)....:
MPIR_Bcast_intra_binomial(182): Failure during collective
Fatal error in PMPI_Allreduce: Message truncated, error stack:
PMPI_Allreduce(450).....................: MPI_Allreduce(sbuf=0x7ffdb54d2c14, rbuf=0x7ffdb54d2c10, count=1, datatype=MPI_INT, op=MPI_SUM, comm=MPI_COMM_WORLD) failed
PMPI_Allreduce(436).....................:
MPIR_Allreduce_impl(293)................:
MPIR_Allreduce_intra_auto(178)..........:
MPIR_Allreduce_intra_auto(84)...........:
MPIR_Bcast_impl(310)....................:
MPIR_Bcast_intra_auto(223)..............:
MPIR_Bcast_intra_binomial(112)..........:
MPIDI_CH3_PktHandler_EagerShortSend(363): Message from rank 0 and tag 2 truncated; 8 bytes received but buffer size is 4
Fatal error in PMPI_Allreduce: Other MPI error, error stack:
PMPI_Allreduce(450)...........: MPI_Allreduce(sbuf=0x7ffd57372a64, rbuf=0x7ffd57372a60, count=1, datatype=MPI_INT, op=MPI_SUM, comm=MPI_COMM_WORLD) failed
PMPI_Allreduce(436)...........:
MPIR_Allreduce_impl(293)......:
MPIR_Allreduce_intra_auto(178):
MPIR_Allreduce_intra_auto(84).:
MPIR_Bcast_impl(310)..........:
MPIR_Bcast_intra_auto(223)....:
MPIR_Bcast_intra_binomial(182): Failure during collective
Fatal error in PMPI_Allreduce: Other MPI error, error stack:
PMPI_Allreduce(450)...........: MPI_Allreduce(sbuf=0x7ffe6816fb14, rbuf=0x7ffe6816fb10, count=1, datatype=MPI_INT, op=MPI_SUM, comm=MPI_COMM_WORLD) failed
PMPI_Allreduce(436)...........:
MPIR_Allreduce_impl(293)......:
MPIR_Allreduce_intra_auto(178):
MPIR_Allreduce_intra_auto(84).:
MPIR_Bcast_impl(310)..........:
MPIR_Bcast_intra_auto(223)....:
MPIR_Bcast_intra_binomial(182): Failure during collective

Lisa Reed

Apr 27, 2026, 10:22:31 AM
to SCALE Users Group
While this has already been resolved by emailing scal...@ornl.gov, I am noting the response here for others who may encounter this issue:

This is a known issue in SCALE 6.3 versions prior to 6.3.3 for MPI-enabled TSUNAMI runs that use an HDF5 CE library (i.e., ENDF/B-VIII.0 rather than ENDF/B-VII.1). The workaround with SCALE 6.3.2 is to either run TSUNAMI without MPI or use the ENDF/B-VII.1 CE library instead of the HDF5 ENDF/B-VIII.0 library (e.g., specify ce_v7.1_endf rather than ce_v8.0_endf as the library in the input).
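For background on what the error text means: "Message truncated; 8 bytes received but buffer size is 4" is MPICH reporting that the ranks disagreed about the payload size inside a collective operation while the library data were being distributed, i.e., rank 0 contributed an 8-byte payload where the other ranks posted a 4-byte (single MPI_INT) buffer. The following is a minimal illustrative MPI/C sketch of that class of mismatch (not SCALE source code; the buffer names are made up for the example). Built with mpicc and run with, e.g., mpirun -np 2, it aborts with the same MPICH "Message truncated" stack shown above:

#include <mpi.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        /* Rank 0 broadcasts an 8-byte payload (two MPI_INTs)... */
        int payload[2] = {42, 43};
        MPI_Bcast(payload, 2, MPI_INT, 0, MPI_COMM_WORLD);
    } else {
        /* ...but the other ranks post only a 4-byte buffer (one MPI_INT),
           so MPICH aborts with "Message truncated; 8 bytes received but
           buffer size is 4", as in the log above. */
        int payload = 0;
        MPI_Bcast(&payload, 1, MPI_INT, 0, MPI_COMM_WORLD);
    }

    MPI_Finalize();
    return 0;
}

Because the mismatch occurs inside SCALE's library-loading code rather than in the user input, no input change can fix it directly; hence the workarounds above.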

Alternatively, SCALE 6.3 license holders may request the SCALE 6.3.3 maintenance patch by emailing scal...@ornl.gov.

-Lisa 