Re: Parallelization for single grain simulations

Chaitali Patil

unread,
May 30, 2025, 9:45:30 AM
to DANISH KHAN, Prisms-CPFE-users
Hi Danish, 

I tried the prm_UniaxialTensionTabular.prm example (in fcc/periodicBCs) with the rate-dependent law and a single-crystal microstructure, and it ran without any issue, so the single-crystal microstructure itself is probably not the problem. I would suggest running your microstructure with that example file, to rule out the RVE or the parameterization as the cause. My guess is that each node does not have sufficient memory, so you could also first try a coarser microstructure or a higher memory allocation.
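A minimal sketch of the checks above. The executable name (./main), the flags, and the scheduler options are assumptions for a typical PRISMS-Plasticity build on a SLURM cluster; adjust them for your installation:

```shell
# Hypothetical commands; executable name and paths are assumptions.

# 1) Rule out your parameterization: run the known-good example as-is, in parallel.
mpirun -n 4 ./main prm_UniaxialTensionTabular.prm

# 2) Point the example .prm at your own microstructure/orientation files,
#    keeping all other parameters unchanged, and rerun.

# 3) If it still segfaults, give each rank more memory, e.g. fewer ranks
#    per node (illustrative SLURM flags):
#    sbatch --ntasks=4 --ntasks-per-node=1 --mem-per-cpu=8G job.sh
```

If step 1 passes but step 2 fails, the microstructure or RVE setup is the likely culprit rather than the solver itself.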

Regards,
Chaitali

On Tue, May 27, 2025 at 2:55 AM DANISH KHAN <danishkh...@gmail.com> wrote:
Hi Chaitali,

I have observed that whenever I submit a single-grain simulation in parallel (using mpirun -n), the simulation does not proceed beyond one iteration. It normally ends this way:
--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 0 on node TUE028626 exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

However, the same simulation runs perfectly fine when I submit it without parallelization. 

Is this expected behavior? If so, is there a way to parallelize single-grain simulations?

Regards,

Danish
