PETSc with large systems (anaconda)


Franco Milicchio

Jun 19, 2017, 8:07:57 AM
to fenics-support
Dear all,

is it possible to use PETSc for large systems? Right now I am getting an out-of-memory error with the following code:

  dolfin::PETScLUSolver algorithm;
  algorithm.set_operator(A);
  algorithm.solve(*u.vector(), b);

While the complete error log is this:

UMFPACK V5.7.6 (May 4, 2016): ERROR: out of memory

libc++abi.dylib: terminating with uncaught exception of type std::runtime_error: 

*** -------------------------------------------------------------------------
*** DOLFIN encountered an error. If you are not able to resolve this issue
*** using the information listed below, you can ask for help at
***
***
*** Remember to include the error message listed below and, if possible,
*** include a *minimal* running example to reproduce the error.
***
*** -------------------------------------------------------------------------
*** Error:   Unable to successfully call PETSc function 'KSPSolve'.
*** Reason:  PETSc error code is: 76 (Error in external library).
*** Where:   This error was encountered inside /Users/travis/miniconda3/conda-bld/fenics_1494837782211/work/dolfin-2017.1.0/dolfin/la/PETScKrylovSolver.cpp.
*** Process: 0
*** 
*** DOLFIN version: 2017.1.0
*** Git changeset:  
*** -------------------------------------------------------------------------


The number of DOFs is not really that high (2,253,001), so I expected it to work; right now I am limited to roughly 1M DOFs.

I am looking into possible options with the default Anaconda binary (2017.1 running on macOS 10.12).

Thanks for any help!
    Franco




Jan Blechta

Jun 19, 2017, 8:26:24 AM
to Franco Milicchio, fenics-support
The number of DOFs does not say how large the fill-in in the LU
factors will be; fill-in is not, in general, controlled by the number
of DOFs.

I would recommend trying MUMPS, SuperLU, or SuperLU_dist instead of
UMFPACK. One can then tune library-specific parameters to reduce the
fill-in, for example

PETScOptions.set("mat_mumps_cntl_1", 0.05)

see
http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MATSOLVERMUMPS.html
for an overview of the parameters, and the MUMPS manual for a thorough
description of the method and its parameters.

Jan

Franco Milicchio

Jun 19, 2017, 9:04:19 AM
to fenics-support





Thanks, Jan.

Nothing seems to let me solve a large system with an LU solver, not even setting the memory bounds to 10 GB:

  PETScOptions::set("mat_mumps_icntl_14", 80);
  PETScOptions::set("mat_mumps_icntl_23", 10240);

It seems I have to switch to PETScKrylovSolver for large systems.

Is there any other option with the stock Anaconda FEniCS package?

Thank you!
    Franco

Jan Blechta

Jun 19, 2017, 9:09:26 AM
to Franco Milicchio, fenics-support
You can try adjusting the pivoting behaviour (or employ static
pivoting with iterative refinement). But most importantly, you can tell
MUMPS whether the system is symmetric or SPD, which allows it to use
algorithms with much better control of fill-in. If this does not help,
you're probably right that a sparse direct method is not the method of
choice for a problem this large.
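The gain from declaring symmetry can be illustrated outside DOLFIN (a SciPy sketch of mine, dense for simplicity): for an SPD system, a Cholesky factorization stores a single triangular factor and needs no pivoting, roughly halving the work and storage of a general LU while producing the same solution.

```python
# Sketch: general LU vs. Cholesky on the same SPD system.
import numpy as np
from scipy.linalg import lu_factor, lu_solve, cho_factor, cho_solve

rng = np.random.default_rng(0)
n = 300
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # symmetric positive definite
b = rng.standard_normal(n)

lu, piv = lu_factor(A)               # general LU: L, U, and pivot array
c, low = cho_factor(A)               # Cholesky: one triangular factor

x_lu = lu_solve((lu, piv), b)
x_ch = cho_solve((c, low), b)
print(np.allclose(x_lu, x_ch))       # both give the same solution
```

In the sparse case the advantage is the same in spirit: a symmetric factorization restricted to one triangle controls fill-in much better than an unsymmetric LU.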

Jan


Franco Milicchio

Jun 19, 2017, 9:16:53 AM
to fenics-support



I am not sure I am using MUMPS right now. How can I make sure I am pushing PETSc to the limit?

Thanks!
 

Jan Blechta

Jun 19, 2017, 9:32:00 AM
to Franco Milicchio, fenics-support
Examples of using MUMPS in DOLFIN:

solver = PETScLUSolver("mumps")
solver.set_operator(A)
solver.solve(x, b)

solver = LinearVariationalSolver(problem)
solver.parameters["linear_solver"] = "mumps"

solve(A, x, b, 'mumps')

solve(a == L, solver_parameters={'linear_solver': 'mumps'})

etc.


With

PETScOptions.set("mat_mumps_icntl_4", 3)

you should see some MUMPS output on stdout.

> pushing PETSc to the limit?

Sorry, I don't know how to answer this question.

Jan


Garth N. Wells

Jun 19, 2017, 9:50:19 AM
to Jan Blechta, Franco Milicchio, fenics-support
The issue here is not DOLFIN but the Anaconda PETSc package.
conda-forge doesn't allow PETSc to be built with --download-foo [1],
which means that to enable PETSc with packages like MUMPS, the package
must be in conda-forge. Even if a package is on conda-forge, there are
often issues with MPI versions.

Garth

[1] https://github.com/conda-forge/petsc-feedstock/issues/12

Franco Milicchio

Jun 19, 2017, 10:25:56 AM
to fenics-support



Thanks, Garth. This means I just have to wait for a solution.

In the meantime, I will use the Krylov solvers for large enough systems.

Thank you!
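The iterative route the thread settles on can be sketched with SciPy as a stand-in (my own illustration, not DOLFIN code): a Krylov method such as CG only needs matrix-vector products, so memory stays proportional to nnz(A) instead of growing with the fill-in of a factorization.

```python
# Sketch: solving a 2-D Poisson system iteratively with conjugate
# gradients; no factorization, hence no fill-in.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 100                                   # 100 x 100 grid, 10,000 unknowns
T = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n))
I = sp.eye(n)
A = (sp.kron(I, T) + sp.kron(T, I)).tocsr()
b = np.ones(A.shape[0])

x, info = spla.cg(A, b)                   # info == 0 means converged
residual = np.linalg.norm(A @ x - b) / np.linalg.norm(b)
print(info, residual)
```

In DOLFIN the analogous setup would be a PETScKrylovSolver with a suitable preconditioner (e.g. AMG for elliptic problems), which is what makes multi-million-DOF systems tractable where direct factorization runs out of memory.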
 