PETSc Hybrid MPI-OpenMP Parallelization with deal.II Installed via Spack


Sukhminder Singh

unread,
Jan 24, 2018, 3:13:06 PM1/24/18
to deal.II User Group
I installed the Spack package of deal.II on a machine with 32 cores (48 virtual cores) and ran a PETSc-parallel code. If I run with only one MPI process, only one core is busy during the solver part. Shouldn't it use all the available cores within a node? I think it is installed with MPI-only support, right? Can I use hybrid MPI-OpenMP parallelization with PETSc installed via Spack?

Denis Davydov

unread,
Jan 24, 2018, 5:07:44 PM1/24/18
to deal.II User Group
Hi,

On Wednesday, January 24, 2018 at 9:13:06 PM UTC+1, Sukhminder Singh wrote:
I installed the Spack package of deal.II on a machine with 32 cores (48 virtual cores) and ran a PETSc-parallel code. If I run with only one MPI process, only one core is busy during the solver part. Shouldn't it use all the available cores within a node? I think it is installed with MPI-only support, right? Can I use hybrid MPI-OpenMP parallelization with PETSc installed via Spack?

If you use PETSc, that's how it works; AFAIK PETSc does not support hybrid parallelism.
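With MPI-only parallelism, the usual way to keep all cores of a node busy is to launch one rank per physical core. A minimal sketch (the program name my_petsc_program is a placeholder, not from this thread):

```shell
# Launch one MPI rank per physical core (32 on the machine described above);
# the PETSc solver then runs distributed across the ranks rather than threaded.
mpirun -np 32 ./my_petsc_program
```

Whether mpirun, mpiexec, or srun is the right launcher depends on the MPI implementation and cluster setup.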

Elsewhere, deal.II uses TBB (for example in assembly; see the related tutorials). The number of threads is controlled via the third argument of Utilities::MPI::MPI_InitFinalize, or via MultithreadInfo::set_thread_limit().
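As a sketch of the above (assuming a reasonably recent deal.II; the thread count of 4 is just an example), limiting the TBB worker threads per MPI rank looks like:

```cpp
#include <deal.II/base/mpi.h>
#include <deal.II/base/multithread_info.h>

int main(int argc, char *argv[])
{
  // Third argument: maximum number of TBB worker threads per MPI rank.
  // Passing dealii::numbers::invalid_unsigned_int instead lets TBB decide.
  dealii::Utilities::MPI::MPI_InitFinalize mpi_init(argc, argv, 4);

  // Alternatively, the limit can be set (or changed) explicitly:
  dealii::MultithreadInfo::set_thread_limit(4);

  // ... TBB-parallel parts of deal.II (e.g. WorkStream assembly) now use
  // up to 4 threads per rank; PETSc solvers remain MPI-only ...
  return 0;
}
```

This requires linking against deal.II, so it is a sketch rather than a standalone program.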

P.S. This has nothing to do with Spack, which is a package manager that helps install things consistently.

Regards,
Denis. 