PETSc error when running with MPI


John

Dec 2, 2016, 10:48:48 AM12/2/16
to moose-users
I am trying to run in parallel and saw that I need to use the mpirun command. I can run perfectly well (if slowly) without MPI; however, when I run with this command, the first nonlinear iteration fails with the attached error message.

I don't understand the error message; do I need to use a different solver?
MPI_Error.txt

Daniel Schwen

Dec 2, 2016, 10:53:24 AM12/2/16
to moose-users
Well, what _are_ your PETSc options? -pc_type lu? That only works in serial. Just remove _all_ of your PETSc options; the run should then default to additive Schwarz (-pc_type asm -sub_pc_type lu), which works in parallel.
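For example, the serial-only setting usually lives in the [Executioner] block of your input file and looks something like this (a hypothetical sketch, not your actual file); deleting the two petsc_options lines lets the parallel-safe default take over:

    [Executioner]
      type = Steady                   # or Transient -- whatever you already use
      solve_type = 'PJFNK'
      # serial-only direct solve -- remove these two lines to run in parallel:
      petsc_options_iname = '-pc_type'
      petsc_options_value = 'lu'
    []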

On Fri, Dec 2, 2016 at 8:48 AM John <haas...@gmail.com> wrote:
I am trying to run in parallel and saw that I need to use the mpirun command. I can run perfectly well (if slowly) without MPI however, when I run with this command I fail in my first non-linear iteration with the attached error message.

I'm not understanding the error message, do I need to use a different solver?


Peterson, JW

Dec 2, 2016, 10:57:41 AM12/2/16
to moose-users
On Fri, Dec 2, 2016 at 8:48 AM, John <haas...@gmail.com> wrote:
I am trying to run in parallel and saw that I need to use the mpirun command. I can run perfectly well (if slowly) without MPI however, when I run with this command I fail in my first non-linear iteration with the attached error message.

I'm not understanding the error message, do I need to use a different solver?

Yes, you can't use "-pc_type lu" in parallel. Instead, you can try the PETSc defaults or "-pc_type asm -sub_pc_type ilu" and see whether they work for your application.
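For instance, on the command line (using the zapdos-opt binary from your error output and a hypothetical input file name) that would look roughly like:

    mpirun -np 4 ./zapdos-opt -i input.i -pc_type asm -sub_pc_type ilu

or you can set the same pair via petsc_options_iname / petsc_options_value in your input file.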

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for possible LU and Cholesky solvers
[0]PETSC ERROR: MatSolverPackage petsc does not support matrix type mpiaij
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.6.4, Apr, 12, 2016
[0]PETSC ERROR: ~/zapdos/zapdos-opt on a arch-linux2-c-opt named crcfe01.crc.nd.edu by jhaase1 Fri Dec  2 10:40:20 2016
[0]PETSC ERROR: Configure options --prefix=~/Moose-dir/petsc/petsc-3.6.4/gcc-opt --download-hypre=1 --with-ssl=0 --with-debugging=no --with-pic=1 --with-shared-libraries=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --download-fblaslapack=1 --download-metis=1 --download-parmetis=1 --download-superlu_dist=1 --download-scalapack=1 --download-mumps=1 CC=mpicc CXX=mpicxx FC=mpif90 F77=mpif77 F90=mpif90 CFLAGS="-fPIC -fopenmp" CXXFLAGS="-fPIC -fopenmp" FFLAGS="-fPIC -fopenmp" FCFLAGS="-fPIC -fopenmp" F90FLAGS="-fPIC -fopenmp" F77FLAGS="-fPIC -fopenmp" PETSC_DIR=~/Moose-dir/petsc/temp/petsc-3.6.4
[0]PETSC ERROR: #1 MatGetFactor() line 4170 in ~/Moose-dir/petsc/temp/petsc-3.6.4/src/mat/interface/matrix.c
[0]PETSC ERROR: #2 PCSetUp_LU() line 125 in ~/Moose-dir/petsc/temp/petsc-3.6.4/src/ksp/pc/impls/factor/lu/lu.c
[0]PETSC ERROR: #3 PCSetUp() line 983 in ~/Moose-dir/petsc/temp/petsc-3.6.4/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #4 KSPSetUp() line 332 in ~/Moose-dir/petsc/temp/petsc-3.6.4/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #5 KSPSolve() line 547 in ~/Moose-dir/petsc/temp/petsc-3.6.4/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #6 SNESSolve_NEWTONLS() line 233 in ~/Moose-dir/petsc/temp/petsc-3.6.4/src/snes/impls/ls/ls.c
[0]PETSC ERROR: #7 SNESSolve() line 3906 in ~/Moose-dir/petsc/temp/petsc-3.6.4/src/snes/interface/snes.c

-- 
John

Kong, Fande

Dec 2, 2016, 11:18:47 AM12/2/16
to moose...@googlegroups.com
On Fri, Dec 2, 2016 at 8:57 AM, Peterson, JW <jw.pe...@inl.gov> wrote:


On Fri, Dec 2, 2016 at 8:48 AM, John <haas...@gmail.com> wrote:
I am trying to run in parallel and saw that I need to use the mpirun command. I can run perfectly well (if slowly) without MPI however, when I run with this command I fail in my first non-linear iteration with the attached error message.

I'm not understanding the error message, do I need to use a different solver?

Yes, you can't use "-pc_type lu" in parallel.  Instead, you can try the PETSc defaults or "-pc_type asm -sub_pc_type ilu" and see if they will work for your application.

I see that you installed SuperLU_DIST ("--download-superlu_dist=1") on your machine, so you can use LU in parallel with the options "-pc_type lu -pc_factor_mat_solver_package superlu_dist".
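In a MOOSE input file that pair goes into the [Executioner] block, roughly like this (a sketch; keep whatever executioner type and settings you already have):

    [Executioner]
      type = Transient                # or Steady
      solve_type = 'PJFNK'
      petsc_options_iname = '-pc_type -pc_factor_mat_solver_package'
      petsc_options_value = 'lu       superlu_dist'
    []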


Fande,
 


Derek Gaston

Dec 2, 2016, 11:20:04 AM12/2/16
to moose...@googlegroups.com
Yes, but the best advice is to get away from LU (or any direct solver) altogether. If you want to go parallel, start trying preconditioned Krylov solvers.
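For example, one common starting point for MOOSE problems (a sketch, assuming your PETSc was built with hypre, which your configure line above shows) is Newton-Krylov preconditioned with BoomerAMG:

    [Executioner]
      type = Transient                # or Steady -- keep whatever you use now
      solve_type = 'PJFNK'            # preconditioned Jacobian-free Newton-Krylov (GMRES underneath)
      petsc_options_iname = '-pc_type -pc_hypre_type'
      petsc_options_value = 'hypre    boomeramg'
    []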

Derek


Kong, Fande

Dec 2, 2016, 11:30:14 AM12/2/16
to moose...@googlegroups.com
On Fri, Dec 2, 2016 at 9:19 AM, Derek Gaston <frie...@gmail.com> wrote:

Yes: but the best advice is to get away from using LU (or any direct solver).  If you want to go parallel... start trying to use preconditioned Krylov solvers.

Yes, exactly right. But if he just wants to experiment with a small-scale problem on a hundred cores at most, a direct solver should work just fine.



 


Cody Permann

Dec 2, 2016, 11:36:18 AM12/2/16
to moose...@googlegroups.com
True, but we'd rather guide users and developers toward the preconditioners we'd like them to use. Many people don't want to invest a lot of time learning about several different types of preconditioners when using MOOSE; they're going to copy and paste whatever they start with into subsequent simulations. We see this all the time in input files throughout the project.

It's best to start people on the approach we'd like to promote, and show them other options when they need them for specific reasons.
Cody
 


 



Alexander Lindsay

Dec 2, 2016, 1:43:14 PM12/2/16
to moose...@googlegroups.com
I'll take the "blame" for John using LU :-)

But John, I've found that for our problems,

-pc_type      asm
-sub_pc_type  lu

works very well.
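In input-file form (a sketch; keep your other [Executioner] settings as they are) that pairing is:

    [Executioner]
      # ... your existing settings ...
      petsc_options_iname = '-pc_type -sub_pc_type'
      petsc_options_value = 'asm      lu'
    []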

Alex