[nmag-users] Parallel computation error in case two materials are used


Boris Gross

Oct 8, 2015, 11:39:22 AM
to nmag-...@lists.soton.ac.uk
Dear nmag users,

I use nmag in the provided virtual machine with mpich2 for parallel
computing on the 8 cores of my CPU. Up to now I have mainly investigated
hysteresis loops of magnetic nanotubes, which works very well. Now we
would like to determine the stray field in a confined region outside
such a tube, so I followed the suggestions in an earlier post and
introduced a second magnetic material with Ms almost zero for this
region. This seems to work as long as I perform the computation on a
single core. With more than one core I get a PETSc error: "Caught
signal number 11 SEGV: Segmentation Violation, probably memory access
out of range". This happens just after the hysteresis loop has started.
Does anybody have an idea what this means, or a suggestion for how to
solve this issue?
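
Roughly, the setup looks like the following sketch (the material names,
Ms values, mesh file and region names are only placeholders here, not my
actual script):

import nmag
from nmag import SI, at

# Real magnetic material for the nanotube.
tube_mat = nmag.MagMaterial(name="Py",
                            Ms=SI(0.86e6, "A/m"),
                            exchange_coupling=SI(13.0e-12, "J/m"))

# Second material with almost-zero Ms for the confined region outside
# the tube where the stray field is to be read out.
probe_mat = nmag.MagMaterial(name="probe",
                             Ms=SI(1.0, "A/m"),
                             exchange_coupling=SI(13.0e-12, "J/m"))

sim = nmag.Simulation()

# Mesh with two regions, one per material (file and region names are
# placeholders).
sim.load_mesh("tube_with_probe.nmesh.h5",
              [("tube", tube_mat), ("probe", probe_mat)],
              unit_length=SI(1e-9, "m"))

sim.set_m([1, 0, 0])

# Simple hysteresis loop along x (field values are placeholders).
Hs = nmag.vector_set(direction=[1, 0, 0],
                     norm_list=[1.0, 0.9, [], -1.0],
                     units=SI(0.1e6, "A/m"))
sim.hysteresis(Hs, save=[('fields', at('convergence'))])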

Thank you very much for your help,
Boris

Fangohr H.

Oct 9, 2015, 4:50:49 AM
to Boris Gross, Fangohr H., nmag-...@lists.soton.ac.uk
Dear Boris,

I haven’t seen this before, sorry. You say that it works okay on a single core?

Best wishes,

Hans

Boris Gross

Oct 9, 2015, 6:03:41 AM
to Fangohr H., nmag-...@lists.soton.ac.uk
Hi Hans,

Thank you for your answer. Yes, it works fine on a single core, also
when run with "mpiexec -n 1 nsim file.py". And simulations with only one
material work fine on any number of cores.

I also tried the two-material example, and the same thing happens, so
the problem is not specific to my simulations.

Best,
Boris


Fangohr H.

Oct 9, 2015, 6:39:38 AM
to Boris Gross, Fangohr H., nmag-...@lists.soton.ac.uk
Hi Boris,

> Thank you for your answer. Yes, it works fine on a single core, also when run with "mpiexec -n 1 nsim file.py". And simulations with only one material work fine on any number of cores.

This suggests a bug that occurs only in parallel execution mode, possibly within PETSc. However, if I recall correctly, PETSc catches all sorts of errors (including segmentation faults) even when they do not originate in PETSc's own code, so the fault may well lie elsewhere.

I am afraid your best bet is to do these particular runs on one core only; sorry.

Best wishes,

Hans