step-29 & step-40


Hermes Sampedro

Sep 9, 2021, 5:29:56 AM
to deal.II User Group
Dear all, 

I adapted step-29 to run in parallel, similarly to step-40. With 1 MPI rank it works; however, when using more than 1 rank I get a runtime error (attached is the full output):

An error occurred in line <74> of file </var/folders/8z/hlb6vc015qjggytkxn84m6_c0000gn/T/heltai/spack-stage/spack-stage-dealii-9.3.0-zy7k3uwnakcqjvrajvacy5l4jrl7eaex/spack-src/source/dofs/dof_tools_sparsity.cc> in function

    void dealii::DoFTools::make_sparsity_pattern(const DoFHandler<dim, spacedim> &, SparsityPatternType &, const AffineConstraints<number> &, const bool, const types::subdomain_id) [dim = 2, spacedim = 2, SparsityPatternType = dealii::DynamicSparsityPattern, number = double]

The violated condition was: 

    sparsity.n_rows() == n_dofs

Additional information: 

    Dimension 26752 not equal to 51842


I found out that the problem is in the setup_system() function. The problematic line is marked with // THIS below. Could you please help me figure out the issue?


template <int dim>
void UltrasoundProblem<dim>::setup_system()
{
  deallog << "Setting up system... ";
  deallog << "OK1... ";

  dof_handler.distribute_dofs(fe);

  locally_owned_dofs = dof_handler.locally_owned_dofs();
  DoFTools::extract_locally_relevant_dofs(dof_handler, locally_relevant_dofs);

  locally_relevant_solution.reinit(locally_owned_dofs,
                                   locally_relevant_dofs,
                                   mpi_communicator);
  system_rhs.reinit(locally_owned_dofs, mpi_communicator);

  constraints.clear();
  constraints.reinit(locally_relevant_dofs);
  DoFTools::make_hanging_node_constraints(dof_handler, constraints);
  VectorTools::interpolate_boundary_values(dof_handler,
                                           1,
                                           DirichletBoundaryValues<dim>(),
                                           constraints);
  constraints.close();

  DynamicSparsityPattern dsp(locally_relevant_dofs.n_elements(),
                             locally_relevant_dofs.n_elements());

  DoFTools::make_sparsity_pattern(dof_handler, dsp, constraints, false); // THIS
  SparsityTools::distribute_sparsity_pattern(dsp,
                                             dof_handler.locally_owned_dofs(),
                                             mpi_communicator,
                                             locally_relevant_dofs);

  system_matrix.reinit(locally_owned_dofs, locally_owned_dofs, dsp, mpi_communicator);
}


Thank you very much

H.



log.rtf

Bruno Turcksin

Sep 9, 2021, 8:36:33 AM
to deal.II User Group
Hermes,

The following line is wrong:
DynamicSparsityPattern dsp(locally_relevant_dofs.n_elements(), locally_relevant_dofs.n_elements());

This constructor says that you want a sparsity pattern with locally_relevant_dofs.n_elements() rows and columns. That is not what you want: the sparsity pattern must have as many rows as the total number of DoFs. Also, since you want to use MPI, you need to provide a third argument, the locally owned IndexSet.

You can also simply use:
DynamicSparsityPattern dsp(locally_relevant_dofs);
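In the setup_system() you posted, the surrounding block would then read roughly as follows (a sketch following step-40, using the variable names from your code):

```cpp
// Sketch following step-40; variable names as in the posted setup_system().
// The DynamicSparsityPattern is sized from the IndexSet, i.e. by the *global*
// number of DoFs, and restricted to the locally relevant rows:
DynamicSparsityPattern dsp(locally_relevant_dofs);

DoFTools::make_sparsity_pattern(dof_handler, dsp, constraints, false);
SparsityTools::distribute_sparsity_pattern(dsp,
                                           dof_handler.locally_owned_dofs(),
                                           mpi_communicator,
                                           locally_relevant_dofs);
system_matrix.reinit(locally_owned_dofs, locally_owned_dofs, dsp, mpi_communicator);
```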

Best,

Bruno

Hermes Sampedro

Sep 10, 2021, 5:58:45 AM
to dea...@googlegroups.com
Dear Bruno,

Thank you very much for your help.

I would kindly like to ask what the recommended way in deal.II is to write the results at a single point to a plain .txt file when using MPI. I solve for different frequencies and would like to write the solution after each iteration. In the non-parallel version it works as follows:

std::ofstream myfile;
myfile.open("solution.txt");

for (int freq = 0; freq < 10; ++freq)
  {
    setup_system();
    assemble_system(freq);
    solve();
    output_results(myfile);
  }

myfile.close();



The function output_results(myfile) takes care of writing the solution at a single point to a plain .txt file for each frequency.

Step-40 shows a way to write output in .vtu format when using MPI, but I did not find any function to write the plain solution values. I would appreciate any suggestions on how to do that using deal.II.


Thank you again.

Regards, 

H.




--
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see https://groups.google.com/d/forum/dealii?hl=en
---
You received this message because you are subscribed to a topic in the Google Groups "deal.II User Group" group.
To unsubscribe from this topic, visit https://groups.google.com/d/topic/dealii/r3NGr6TnxXs/unsubscribe.
To unsubscribe from this group and all its topics, send an email to dealii+un...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/dealii/f621e3ee-3bce-49b3-9151-1b5a3e4edc55n%40googlegroups.com.

Bruno Turcksin

Sep 10, 2021, 8:56:48 AM
to deal.II User Group
Hermes,

There is no recommended way to write a plain .txt file; it is left to the user to write their own function. The reason deal.II provides a function to write .vtu files is that they must follow a very specific format, whereas there is no format you need to follow when writing a .txt file.
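A common pattern (just a sketch, not an official deal.II function) is to make the point value available on one rank and let only that rank write. The names frequency and point_value below are placeholders for your own variables:

```cpp
// Sketch: only rank 0 appends to the text file. This assumes point_value has
// already been communicated to rank 0 (e.g. via Utilities::MPI::sum(), since
// only the rank owning the cell containing the point can evaluate it there).
if (Utilities::MPI::this_mpi_process(mpi_communicator) == 0)
  {
    std::ofstream myfile("solution.txt", std::ios::app);
    myfile << frequency << ' ' << point_value << '\n';
  }
```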

Best,

Bruno

Hermes Sampedro

Sep 14, 2021, 5:19:39 AM
to dea...@googlegroups.com
Dear Bruno, thank you for your answer; it helped me solve the problem.

I had some issues with the Trilinos library, so I decided to implement step-29 in parallel (similar to step-40) using dealii::PETScWrappers. I get an error in the solve() function at the marked line:

template <int dim>
void UltrasoundProblem<dim>::solve()
{
  PETScWrappers::MPI::Vector completely_distributed_solution(locally_owned_dofs,
                                                             mpi_communicator);

  SolverControl cn;
  PETScWrappers::SparseDirectMUMPS solver(cn, mpi_communicator);

  solver.solve(system_matrix, completely_distributed_solution, system_rhs); // <-- fails here

  constraints.distribute(completely_distributed_solution);
  locally_relevant_solution = completely_distributed_solution;
}


where the members were declared as:

dealii::PETScWrappers::SparseMatrix system_matrix;
dealii::PETScWrappers::MPI::Vector  locally_relevant_solution;
dealii::PETScWrappers::MPI::Vector  system_rhs;

The error output is:


[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Nonconforming object sizes
[0]PETSC ERROR: Preconditioner number of local rows 51842 does not equal resulting vector number of rows 26114
[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.13.1, May 02, 2020
[0]PETSC ERROR: ./step-29 on a  named gbarlogin1 by hsllo Tue Sep 14 11:11:04 2021
[0]PETSC ERROR: Configure options --prefix=/zhome/32/9/115503/dealii-candi/petsc-3.13.1 --with-debugging=0 --with-shared-libraries=1 --with-mpi=1 --with-x=0 --with-64-bit-indices=0 --download-hypre=1 CC=mpicc CXX=mpicxx FC=mpif90 --with-blaslapack-dir=/appl/OpenBLAS/0.3.17/XeonE5-2660v3/gcc-11.2.0/lib --with-parmetis-dir=/zhome/32/9/115503/dealii-candi/parmetis-4.0.3 --with-metis-dir=/zhome/32/9/115503/dealii-candi/parmetis-4.0.3 --download-scalapack=1 --download-mumps=1
[0]PETSC ERROR: #1 PCApply() line 436 in /zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #2 KSP_PCApply() line 281 in /zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/include/petsc/private/kspimpl.h
[0]PETSC ERROR: #3 KSPSolve_PREONLY() line 22 in /zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/ksp/ksp/impls/preonly/preonly.c
[0]PETSC ERROR: #4 KSPSolve_Private() line 694 in /zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #5 KSPSolve() line 853 in /zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/ksp/ksp/interface/itfunc.c

----------------------------------------------------
Exception on processing:

--------------------------------------------------------
An error occurred in line <807> of file </zhome/32/9/115503/dealii-candi/tmp/unpack/deal.II-v9.3.1/source/lac/petsc_solver.cc> in function
    void dealii::PETScWrappers::SparseDirectMUMPS::solve(const dealii::PETScWrappers::MatrixBase&, dealii::PETScWrappers::VectorBase&, const dealii::PETScWrappers::VectorBase&)
The violated condition was:
    ierr == 0
Additional information:
    deal.II encountered an error while calling a PETSc function.
    The description of the error provided by PETSc is "Nonconforming object sizes".
    The numerical value of the original error code is 60.
--------------------------------------------------------

Aborting!
---------------------




Should I use dealii::PETScWrappers::MPI::SparseMatrix system_matrix instead? If so, could you please help me with the reinit() function? I do not fully understand how to call it.


Thank you

Regards, 

H


Bruno Turcksin

Sep 14, 2021, 8:10:59 AM
to dea...@googlegroups.com
Hermes,

On Tue, Sep 14, 2021 at 05:19, Hermes Sampedro
<hermes...@gmail.com> wrote:
>
> Should I use dealii::PETScWrappers::MPI::SparseMatrix system_matrix instead? If so, could you please help me with the reinit() function? I do not fully understand how to call it.

That's right, you need the matrix to be distributed too. Take a look
at step-17 to see how to use PETScWrappers.
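For reference, a sketch of what this could look like (following step-17/step-40; dsp is the DynamicSparsityPattern after distribute_sparsity_pattern() in setup_system()):

```cpp
// Sketch: declare the matrix as a *distributed* PETSc matrix and reinit() it
// with the locally owned index sets and the distributed sparsity pattern.
dealii::PETScWrappers::MPI::SparseMatrix system_matrix;

system_matrix.reinit(locally_owned_dofs,  // locally owned rows
                     locally_owned_dofs,  // locally owned columns
                     dsp,                 // distributed DynamicSparsityPattern
                     mpi_communicator);
```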

Best,

Bruno

Hermes Sampedro

Sep 14, 2021, 10:18:19 AM
to dea...@googlegroups.com
Dear Bruno, thank you very much for pointing to step-17; I was able to solve the problem.

I would like to ask one last question. I am running step-29 in parallel for different frequencies, with a loop over the frequencies as follows:


make_grid();
setup_system();
assemble_system(sI[0]);

for (int i = 0; i < Ns; ++i)
  {
    update_system(sI[i]);
    solve();
    output_results();
  }


To avoid setting up and assembling the system in every iteration, I created update_system() to update the system matrix, which changes with the frequency. Before the update I call system_matrix.reinit() to clear the matrix, which I have realized is time-consuming. Is there a more efficient way to update the matrix?

Thank you
Regards, 
H

Bruno Turcksin

Sep 14, 2021, 10:28:36 AM
to deal.II User Group
Hermes,

You probably don't need to use reinit() in your case. You only need reinit() if the structure of the matrix changes with the frequency. For example, if you change the degree of the finite elements, you need to call reinit(). If you only want to change the values but the structure of the matrix is unchanged, you don't need to call reinit(). Instead, you can simply do matrix = 0; to zero out the matrix.
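So the start of your update_system() could be sketched as (assuming the usual add-mode assembly as in step-40):

```cpp
// Sketch: reuse the sparsity pattern across frequencies; only zero the entries.
system_matrix = 0;  // keeps the sparsity pattern, zeroes the stored values
system_rhs    = 0;
// ... re-assemble the frequency-dependent entries, then:
system_matrix.compress(VectorOperation::add);
system_rhs.compress(VectorOperation::add);
```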

Best,

Bruno

Hermes Sampedro

Sep 14, 2021, 10:53:05 AM
to dea...@googlegroups.com
Thank you very much.

Regards,
Hermes
