fixing one component of solution to the same value


RAJAT ARORA

Mar 14, 2017, 6:22:19 PM
to deal.II User Group
Hello all,

I am using deal.II to solve a 3D solid mechanics problem that uses a p::d::Triangulation.

I am solving the equilibrium equations for the displacement in the domain.
The domain of the problem is a cylinder with the z-axis aligned along the axis of the cylinder. 

To avoid warping, I want constraints such that the displacement in the z direction is the same for all nodes on the +z surface.

How can I apply such a constraint?

Thanks.

Wolfgang Bangerth

Mar 15, 2017, 3:35:42 PM
to dea...@googlegroups.com

> I am using deal.II to solve a 3D solid mechanics problem that uses a
> p::d::Triangulation.
>
> I am solving the equilibrium equations for the displacement in the domain.
> The domain of the problem is a cylinder with the z-axis aligned along the axis
> of the cylinder.
>
> To avoid warping, I want constraints such that the displacement in the z
> direction is the same for all nodes on the +z surface.

Choose one degree of freedom for the vertical displacement on one processor
(say, it is u_42), and then on all other processors you want to go through all
vertical displacements at the boundary in question and set
u_i = u_42
This is a straightforward constraint to add to the ConstraintMatrix class.
Take a look at step-11 to see how constraints are added.
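
In code, this might look roughly as follows (a minimal sketch; boundary_z_dofs
and reference_dof are hypothetical placeholders for the set of vertical
displacement DoFs on that face and the chosen DoF u_42):

  // Minimal sketch: constrain every vertical displacement DoF on the
  // +z face to the chosen reference DoF, i.e. u_i = u_42.
  for (const types::global_dof_index dof : boundary_z_dofs)
    if (dof != reference_dof)
      {
        constraints.add_line(dof);                      // start the line u_dof = ...
        constraints.add_entry(dof, reference_dof, 1.0); // ... + 1.0 * u_reference
      }
  constraints.close();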

Best
W.


--
------------------------------------------------------------------------
Wolfgang Bangerth email: bang...@colostate.edu
www: http://www.math.colostate.edu/~bangerth/

RAJAT ARORA

Mar 23, 2017, 2:20:58 PM
to deal.II User Group, bang...@colostate.edu
Thanks Professor.

RAJAT ARORA

Apr 12, 2017, 7:03:17 PM
to deal.II User Group, bang...@colostate.edu
Hello Professor,

I did what you suggested above. However, I am facing a problem.

Just to repeat: I want to constrain all dofs on the +z surface to a single dof (call it the_dof = 2249). I apply the constraints in the following way.


    ComponentMask cm(dim, false);
    cm.set(2, true); // select the z component

    std::set<types::boundary_id> boundary_ids;
    boundary_ids.insert(plus_z_face);

    IndexSet selected_dofs;
    DoFTools::extract_boundary_dofs(dof_handler, cm, selected_dofs, boundary_ids);

    std::vector<types::global_dof_index> index_vector;
    selected_dofs.fill_index_vector(index_vector); // I am using deal.II 8.3

    for (unsigned int i = 0; i < index_vector.size(); ++i)
      {
        if (index_vector[i] == the_dof) // don't constrain the_dof to itself
          continue;

        constraints.add_line(index_vector[i]);
        constraints.add_entry(index_vector[i], the_dof, 1.0);
      }
    constraints.close();

    DynamicSparsityPattern dsp(locally_relevant_dofs);

    DoFTools::make_sparsity_pattern(dof_handler, dsp, constraints, false);

    SparsityTools::distribute_sparsity_pattern(dsp,
                                               dof_handler.n_locally_owned_dofs_per_processor(),
                                               fdm->mpi_communicator,
                                               locally_relevant_dofs);

    system_matrix.reinit(locally_owned_dofs,
                         locally_owned_dofs,
                         dsp,
                         fdm->mpi_communicator);

    // Then go to assemble the system.
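
For context, the assembly alluded to in the last comment would typically route
the local contributions through the constraints, along these lines (a sketch;
cell_matrix, cell_rhs, local_dof_indices, and system_rhs are the usual
per-cell and global objects):

    // Copy the local contributions into the global objects, resolving the
    // constraints on the fly so constrained rows/columns are eliminated:
    constraints.distribute_local_to_global(cell_matrix, cell_rhs,
                                           local_dof_indices,
                                           system_matrix, system_rhs);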



[3]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[3]PETSC ERROR: Argument out of range
[3]PETSC ERROR: Inserting a new nonzero at global row/column (2249, 6) into matrix
[3]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[3]PETSC ERROR: Petsc Release Version 3.6.4, Apr, 12, 2016 
[3]PETSC ERROR: Configure options --with-shared-libraries=1 --with-x=0 --with-mpi=1 --download-hypre=1 --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack --download-f2cblaslapack --download-mpich --download-mumps --download-scalapack --download-parmetis --download-metis
[3]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 613 in /home/rajat/Documents/Code-Libraries/petsc/petsc-3.6.4/src/mat/impls/aij/mpi/mpiaij.c
[3]PETSC ERROR: #2 MatAssemblyEnd_MPIAIJ() line 731 in /home/rajat/Documents/Code-Libraries/petsc/petsc-3.6.4/src/mat/impls/aij/mpi/mpiaij.c
[3]PETSC ERROR: #3 MatAssemblyEnd() line 5110 in /home/rajat/Documents/Code-Libraries/petsc/petsc-3.6.4/src/mat/interface/matrix.c


I am not sure why this error occurs. It suggests that the sparsity pattern was generated under the assumption that position (2249, 6) would never be filled, so no space was allocated for it when the PETSc sparse matrix was declared. However, constraints.distribute() is trying to write data into that position.

Moreover, dof 6 and dof 2249 do not belong to the same element, and dof 6 corresponds to the x component while dof 2249 corresponds to the z component.

Can you please guide me as to what is wrong and how I should proceed?

Thanks.

Wolfgang Bangerth

Apr 12, 2017, 7:09:17 PM
to dea...@googlegroups.com
On 04/12/2017 05:03 PM, RAJAT ARORA wrote:
>
> I am not sure why this error occurs. It suggests that the sparsity pattern
> was generated under the assumption that position (2249, 6) would never be
> filled, so no space was allocated for it when the PETSc sparse matrix was
> declared. However, constraints.distribute() is trying to write data into
> that position.
>
> Moreover, dof 6 and dof 2249 do not belong to the same element, and dof 6
> corresponds to the x component while dof 2249 corresponds to the z component.
>
> Can you please guide me as to what is wrong and how I should proceed?

Is this something that happens already with one processor? Either way, you
will need to debug why constraints.distribute() tries to write into this
entry. For this, you ought to look at what DoF indices you have on the cell
where this happens, whether one of those has index 6, what constraints exist
on DoF 6, etc.

Also relevant is: do you have *additional* constraints? Hanging nodes? Did you
take all of those constraints into account when you built the sparsity
pattern? When you build the local matrix, are all DoF indices you use only
from the current cell? Etc.
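
For instance, a couple of calls that may help with this kind of inspection (a
sketch; 6 is the DoF from the error message):

  // Check whether DoF 6 is constrained at all, and dump all constraint
  // lines known on this process:
  if (constraints.is_constrained(6))
    std::cout << "DoF 6 is constrained" << std::endl;
  constraints.print(std::cout);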

RAJAT ARORA

Apr 17, 2017, 4:43:35 PM
to deal.II User Group, bang...@colostate.edu

Hello Professor,

I have created a test case that shows this weird behaviour (a bug?). Running with 3 or more processors gives this error.

I have used some techniques that I learned from you and the other group members to come up with a small working example that demonstrates this error.

I created a mesh of 64 elements, with 4 elements in each of the X, Y, and Z directions.

Then I apply constraints so that every dof on the +z face is constrained to a single chosen dof on the +z face. No other constraints are applied.

I then try to assemble the system using a random cell_matrix and cell_rhs.

The code gives the error below. 

The dof numbers below correspond to running the code with 4 processors.

I think this is a bug because, for this mesh, dof number 225 is coupled to a dof on the +z face, which in turn is constrained to the chosen dof (374). This means that there is effectively a coupling between dofs 225 and 374, but the sparsity pattern object does not allocate space for this induced coupling (374, 225).

I am attaching the small working example showing this behavior. Kindly let me know if I am doing something wrong.

[2]PETSC ERROR: Argument out of range
[2]PETSC ERROR: Inserting a new nonzero at global row/column (374, 225) into matrix
[2]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[2]PETSC ERROR: Petsc Release Version 3.6.4, Apr, 12, 2016 
[2]PETSC ERROR: ./fdm on a arch-linux2-c-debug named rajat by rajat Mon Apr 17 16:26:23 2017
[2]PETSC ERROR: Configure options --with-shared-libraries=1 --with-x=0 --with-mpi=1 --download-hypre=1 --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack --download-f2cblaslapack --download-mpich --download-mumps --download-scalapack --download-parmetis --download-metis
[2]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 613 in /home/rajat/Documents/Code-Libraries/petsc/petsc-3.6.4/src/mat/impls/aij/mpi/mpiaij.c
[2]PETSC ERROR: #2 MatAssemblyEnd_MPIAIJ() line 731 in /home/rajat/Documents/Code-Libraries/petsc/petsc-3.6.4/src/mat/impls/aij/mpi/mpiaij.c
[2]PETSC ERROR: #3 MatAssemblyEnd() line 5110 in /home/rajat/Documents/Code-Libraries/petsc/petsc-3.6.4/src/mat/interface/matrix.c
terminate called after throwing an instance of 'dealii::PETScWrappers::MatrixBase::ExcPETScError'
  what():  
--------------------------------------------------------
An error occurred in line <263> of file </home/rajat/Documents/Code-Libraries/deal/dealii8.3/dealii-8.3.0/source/lac/petsc_matrix_base.cc> in function
    void dealii::PETScWrappers::MatrixBase::compress(dealii::VectorOperation::values)
The violated condition was: 
    ierr == 0
The name and call sequence of the exception was:
    ExcPETScError(ierr)
Additional Information: 
An error with error number 63 occurred while calling a PETSc function
--------------------------------------------------------

Attachments: common.h, main.cc, CMakeLists.txt

RAJAT ARORA

Apr 18, 2017, 1:00:50 AM
to deal.II User Group, bang...@colostate.edu

Hello Professor,

I am now using deal.II 8.5 and am attaching the updated code.

The error message remains the same.
Attachments: common.h, main.cc, CMakeLists.txt

Daniel Arndt

Apr 20, 2017, 5:37:46 AM
to deal.II User Group, bang...@colostate.edu
Rajat,

What is happening here is that you are trying to create constraints that include DoFs that are not elements of the IndexSet you provided to the DynamicSparsityPattern object. Therefore, all entries for your "fixed" DoF are ignored if it is not already part of the locally relevant DoFs.
Have a look at the discussion in "The LaplaceProblem class; LaplaceProblem::setup_system" in step-40 [1].

You can solve your issue by changing
  IndexSet locally_relevant_dofs2 = locally_relevant_dofs; 
  constraint_temp.reinit(locally_relevant_dofs2);
to
  IndexSet locally_relevant_dofs2 = locally_relevant_dofs;
  locally_relevant_dofs2.add_index(the_dof);
  constraint_temp.reinit(locally_relevant_dofs2);

Best,
Daniel

[1] http://www.dealii.org/8.4.1/doxygen/deal.II/step_40.html

RAJAT ARORA

Apr 20, 2017, 7:52:29 AM
to deal.II User Group, bang...@colostate.edu
Hello Daniel,

Thanks for the reply. 

I am actually already doing what you suggested. Please look at the updated code that I posted earlier, which works with deal.II 8.5; these lines are already there in the code. Surprisingly, the error still shows up, and I am not sure what is causing this weird behavior (a bug?).

  IndexSet locally_relevant_dofs2 = locally_relevant_dofs;
  locally_relevant_dofs2.add_index(the_dof);
  constraint_temp.reinit(locally_relevant_dofs2);



I am attaching the code again for your perusal.

Attachments: common.h, main.cc, CMakeLists.txt

Daniel Arndt

Apr 20, 2017, 9:16:57 AM
to deal.II User Group, bang...@colostate.edu
Rajat,

It seems that I confused lines and did not express in the code what I wrote above: you also have to add that additional index when initializing the sparsity pattern, i.e.
  std::cout << " constraints printed " <<std::endl;
  DynamicSparsityPattern dsp (locally_relevant_dofs);
should become
  std::cout << " constraints printed " <<std::endl;
  locally_relevant_dofs.add_index(the_dof);
  DynamicSparsityPattern dsp (locally_relevant_dofs);
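
Putting both pieces together, the relevant part of the setup would then look
roughly like this (a sketch following the variable names used in this thread,
not verbatim from your attached code):

  // Make the chosen DoF known to the ConstraintMatrix ...
  IndexSet locally_relevant_dofs2 = locally_relevant_dofs;
  locally_relevant_dofs2.add_index(the_dof);
  constraint_temp.reinit(locally_relevant_dofs2);
  // ... add the constraint lines u_i = u_the_dof here, then close() ...

  // ... and also to the IndexSet the sparsity pattern is built from:
  locally_relevant_dofs.add_index(the_dof);
  DynamicSparsityPattern dsp(locally_relevant_dofs);
  DoFTools::make_sparsity_pattern(dof_handler, dsp, constraints, false);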

Best,
Daniel

RAJAT ARORA

Apr 20, 2017, 4:21:06 PM
to deal.II User Group, bang...@colostate.edu
Hello Daniel,

Thank you so much for pointing this out. I have been stuck here for a long time and could not figure it out.

I understood that the sparsity pattern class was somehow failing to learn that space needs to be allocated for the coupling with the_dof, even though it is not a locally relevant dof. But I was under the impression that, since the ConstraintMatrix object is passed to it, it should know which dofs are coupled and which are not.

I tried to understand whether they work on the same index set by looking at the source code of ConstraintMatrix::add_entries_local_to_global, which is used to generate the sparsity pattern.

Here is a suggestion that I think might be helpful in debugging such errors.

When the ConstraintMatrix object was initialized only with the locally relevant dofs and had no knowledge of the_dof, an assert was triggered that helped me understand the error. By looking at the source code where the assert was triggered, I was able to add the_dof to the locally relevant IndexSet used to initialize the ConstraintMatrix.

I think it would be much clearer if such an Assert / AssertThrow were placed somewhere in the sparsity pattern class. This way one can make sure the error is not coming from assembly (doing cell->get_dof_indices()) or from any other source. Moreover, if such an issue arises, the error will be reported right away while generating the sparsity pattern.

For example, such an assert is triggered if two ConstraintMatrix objects that were initialized with different index sets are merged. So, should the index sets of the two objects -- the constraints and the sparsity pattern -- not be compared when making the sparsity pattern? A sketch of the kind of check I have in mind follows below.
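
For illustration, something along these lines, using deal.II's Assert
machinery (a hypothetical check, not actual library code; index_set stands
for the IndexSet the sparsity pattern was built from):

  // Hypothetical sketch: verify that a DoF referenced by a constraint is
  // actually contained in the IndexSet the sparsity pattern is built from.
  Assert(index_set.is_element(constraining_dof),
         ExcMessage("A constraint references a DoF that is not part of the "
                    "IndexSet the sparsity pattern was built from."));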

I again would like to thank you for pointing out the error. I really appreciate your help.