step-40: PETSc with OpenMP


Xiaohan Zhang

Mar 14, 2014, 9:56:21 AM
to dea...@googlegroups.com
Hi,

I am trying to use OpenMP in the element assembly procedure of step-40. Has anyone done this before? Any advice is appreciated.

My goal is to reduce the assembly time by making use of multiple threads on each core. Is this feasible?

-Xiaohan

Timo Heister

Mar 14, 2014, 10:24:18 AM
to dea...@googlegroups.com
> I am trying to use OpenMP in the element assembly procedure of step-40.
> Has anyone done this before? Any advice is appreciated.

Our implementation of PETScWrappers is not thread-safe, so you cannot
write into matrices/vectors concurrently. That means using TBB (which is
what we use inside deal.II) or OpenMP won't help you much, because every
write would require locking.
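
To make the locking point concrete, here is a toy sketch in plain C++
(no deal.II or PETSc involved; the vector and the per-cell contribution
are just placeholders):

#include <mutex>
#include <thread>
#include <vector>

int main()
{
  // Ten entries standing in for a shared global matrix: both threads
  // write into the same entries, so an unsynchronized "+=" would race.
  std::vector<double> global_matrix(10, 0.0);
  std::mutex          write_mutex;

  auto worker = [&](unsigned int first_cell, unsigned int last_cell) {
    for (unsigned int cell = first_cell; cell < last_cell; ++cell)
      {
        const double local_contribution = 1.0; // stand-in for cell assembly
        std::lock_guard<std::mutex> lock(write_mutex); // serializes the write
        global_matrix[cell % global_matrix.size()] += local_contribution;
      }
  };

  std::thread t0(worker, 0, 500);
  std::thread t1(worker, 500, 1000);
  t0.join();
  t1.join();
}

Unless the per-cell work dominates the time spent inside the lock, the
threads mostly wait on each other and the speedup evaporates.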

Our Trilinos wrappers are thread-safe, though.

Even if you change the implementation of PETScWrappers to allow this
(it wouldn't be too difficult, ask me if you want to know more), you
still have the problem that everything in the linear solver
(matrix-vector products, preconditioners, ...) is likely not running
multi-threaded.
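
For reference, the pattern deal.II itself uses for multi-threaded
assembly is WorkStream: the cell-local work runs on several threads,
while the copy into the global matrix runs serialized, so user code
needs no locks. Below is a self-contained sketch; it assembles a
Laplace-like matrix into deal.II's own SparseMatrix just to keep it
compilable on its own. With the Trilinos wrappers, system_matrix would
be a TrilinosWrappers::SparseMatrix instead; the mesh size, element
degree, and cell integral are arbitrary choices for the example.

#include <deal.II/base/quadrature_lib.h>
#include <deal.II/base/work_stream.h>
#include <deal.II/dofs/dof_handler.h>
#include <deal.II/dofs/dof_tools.h>
#include <deal.II/fe/fe_q.h>
#include <deal.II/fe/fe_values.h>
#include <deal.II/grid/grid_generator.h>
#include <deal.II/grid/tria.h>
#include <deal.II/lac/dynamic_sparsity_pattern.h>
#include <deal.II/lac/full_matrix.h>
#include <deal.II/lac/sparse_matrix.h>
#include <deal.II/lac/sparsity_pattern.h>

using namespace dealii;

// Per-thread scratch objects: every thread gets its own FEValues.
struct ScratchData
{
  FEValues<2> fe_values;

  ScratchData(const FiniteElement<2> &fe, const Quadrature<2> &quadrature)
    : fe_values(fe, quadrature, update_gradients | update_JxW_values)
  {}

  // FEValues is not copyable, so rebuild it from the original's state.
  ScratchData(const ScratchData &scratch)
    : fe_values(scratch.fe_values.get_fe(),
                scratch.fe_values.get_quadrature(),
                scratch.fe_values.get_update_flags())
  {}
};

// Data handed from the (parallel) worker to the (serialized) copier.
struct CopyData
{
  FullMatrix<double>                   cell_matrix;
  std::vector<types::global_dof_index> local_dof_indices;
};

int main()
{
  Triangulation<2> triangulation;
  GridGenerator::hyper_cube(triangulation);
  triangulation.refine_global(5);

  const FE_Q<2> fe(1);
  DoFHandler<2> dof_handler(triangulation);
  dof_handler.distribute_dofs(fe);

  DynamicSparsityPattern dsp(dof_handler.n_dofs());
  DoFTools::make_sparsity_pattern(dof_handler, dsp);
  SparsityPattern sparsity_pattern;
  sparsity_pattern.copy_from(dsp);
  SparseMatrix<double> system_matrix(sparsity_pattern);

  const QGauss<2> quadrature(2);

  // Worker: runs on several threads at once, touches only local data.
  auto worker = [&](const DoFHandler<2>::active_cell_iterator &cell,
                    ScratchData &scratch,
                    CopyData    &copy) {
    const unsigned int dofs_per_cell = fe.dofs_per_cell;
    copy.cell_matrix.reinit(dofs_per_cell, dofs_per_cell);
    copy.local_dof_indices.resize(dofs_per_cell);
    cell->get_dof_indices(copy.local_dof_indices);

    scratch.fe_values.reinit(cell);
    for (unsigned int q = 0; q < scratch.fe_values.n_quadrature_points; ++q)
      for (unsigned int i = 0; i < dofs_per_cell; ++i)
        for (unsigned int j = 0; j < dofs_per_cell; ++j)
          copy.cell_matrix(i, j) += scratch.fe_values.shape_grad(i, q) *
                                    scratch.fe_values.shape_grad(j, q) *
                                    scratch.fe_values.JxW(q);
  };

  // Copier: WorkStream runs this serialized, so no locking is needed.
  auto copier = [&](const CopyData &copy) {
    system_matrix.add(copy.local_dof_indices, copy.cell_matrix);
  };

  WorkStream::run(dof_handler.begin_active(),
                  dof_handler.end(),
                  worker,
                  copier,
                  ScratchData(fe, quadrature),
                  CopyData());
}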

> My goal is to reduce the assembly time by making use of multiple
> threads on each core. Is this feasible?

You know you can run one MPI task per core, right?
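
For example, on an eight-core machine (executable name as in the
tutorial):

  mpirun -np 8 ./step-40

step-40 uses parallel::distributed::Triangulation, so each rank
assembles and solves only its own share of the cells; you get full use
of the machine without any threading.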

--
Timo Heister
http://www.math.clemson.edu/~heister/

Xiaohan Zhang

Mar 14, 2014, 10:50:59 AM
to dea...@googlegroups.com
Hi Timo,

I see your point. I just read the PETSc documentation; PETSc itself is not thread-safe.

I will probably stick to pure MPI for now. Thanks.

-Xiaohan 

Kartik Jujare

Jul 11, 2017, 10:33:34 AM
to deal.II User Group
Hi Timo,

Does it still hold true that the PETSc wrappers are not thread-safe?

Regards,
Kartik Jujare

Wolfgang Bangerth

Jul 11, 2017, 11:53:48 AM
to dea...@googlegroups.com
On 07/11/2017 08:33 AM, Kartik Jujare wrote:
>
> Does it still hold true that the PETSc wrappers are not thread-safe?

Yes. But it's not the wrappers that are the problem; it's that PETSc itself
is not thread-safe.

Best
W.


--
------------------------------------------------------------------------
Wolfgang Bangerth email: bang...@colostate.edu
www: http://www.math.colostate.edu/~bangerth/

Kartik Jujare

Jul 12, 2017, 7:22:13 AM
to deal.II User Group, bang...@colostate.edu
Thank you for the answer.

Regards,
Kartik