A query on SparseDirectUMFPACK initialization with TrilinosWrappers::BlockSparseMatrix !


Shiraz Farouq

unread,
May 17, 2015, 12:19:10 PM5/17/15
to dea...@googlegroups.com
Hi !

I am following step-36 to solve an eigenvalue problem:

I have a block matrix

    A = [ M   -aK ]
        [ K    M  ]

and a preconditioner

    P = [ M   -a^2 K ]
        [ K   M+aK   ]

I solve the eigenvalue problem

    A x = \lambda P x

I am using the ARPACK solver, with the following code adapted from step-36:

unsigned int solve_for_eigenvalues ()
{
  SolverControl solver_control (system_dof_handler.n_dofs(), 1e-9);

  SparseDirectUMFPACK inverse;
  inverse.initialize (system_matrix);

  const unsigned int num_arnoldi_vectors = 2*eigenvalues.size() + 2;
  ArpackSolver::AdditionalData additional_data (num_arnoldi_vectors);
  ArpackSolver eigensolver (solver_control, additional_data);
  eigensolver.solve (system_matrix,
                     system_matrix,
                     inverse,
                     eigenvalues,
                     eigenfunctions,
                     eigenfunctions.size());

  for (unsigned int i=0; i<eigenfunctions.size(); ++i)
    eigenfunctions[i] /= eigenfunctions[i].linfty_norm ();

  return solver_control.last_step ();
}

However, it doesn't build; the linker reports:

undefined reference to `void dealii::SparseDirectUMFPACK::initialize<dealii::TrilinosWrappers::BlockSparseMatrix>(dealii::TrilinosWrappers::BlockSparseMatrix const&, dealii::SparseDirectUMFPACK::AdditionalData)'

Why is the reference undefined?

Thanks

/S

Shiraz Farouq

unread,
May 17, 2015, 3:47:43 PM5/17/15
to dea...@googlegroups.com
Just a little added note!

I solved (just for testing)

    M x = M \lambda x

With the PETSc implementation, the eigenvalues are all 1. With ARPACK, however, it complains:

Spurious eigenvalues are all in the interval [1,1]

*** glibc detected *** /home/eigen_spectrum/eigen_spectrum: double free or corruption (out): 0x0000000000982c40 ***

*** glibc detected *** /home/eigen_spectrum/eigen_spectrum: malloc(): memory corruption: 0x0000000000982cc0 ***


I think there is some error here !

Wolfgang Bangerth

unread,
May 30, 2015, 4:46:58 PM5/30/15
to dea...@googlegroups.com
On 05/17/2015 02:47 PM, Shiraz Farouq wrote:
> Just a little added note !
>
> I solved for (just for testing)
>
> *M* x =*M* \lambda x
>
> With Petsc implementation, the eigenvalues are all 1. However with Arpack, it
> complains:
>
> Spurious eigenvalues are all in the interval [1,1]
>
> *** glibc detected *** /home/eigen_spectrum/eigen_spectrum: double free or
> corruption (out): 0x0000000000982c40 ***
>
> *** glibc detected *** /home/eigen_spectrum/eigen_spectrum: malloc(): memory
> corruption: 0x0000000000982cc0 ***
>
>
> I think there is some error here !

Clearly. Someone is accessing memory incorrectly. You should run your program
under a memory debugging tool, e.g., using valgrind. This will help you find
where the problem is.

You could also just get a backtrace when running in a debugger.
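For concreteness, the invocations might look like the following (a sketch only; the binary path is the one from the error messages above, and the valgrind options can be tuned):

```shell
# Run under valgrind to locate the invalid free/write:
valgrind --leak-check=full ./eigen_spectrum

# Or get a backtrace at the point of the crash in gdb:
gdb ./eigen_spectrum
# (gdb) run
# (gdb) backtrace
```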

Best
W.

--
------------------------------------------------------------------------
Wolfgang Bangerth email: bang...@math.tamu.edu
www: http://www.math.tamu.edu/~bangerth/

Shiraz Farouq

unread,
Jun 3, 2015, 8:35:01 AM6/3/15
to dea...@googlegroups.com
Thank you Wolfgang !

Well, my curiosity is that it only happens when the matrices on both sides are the same; otherwise it runs fine. I checked it and it looks very strange!
My second question is related to finding the eigenvalues of a block system (I have not been working on it for now). The tutorial has examples for general matrices, but they did not work for block systems when I tried them; or have I missed something here?

Thanks again !




Uwe Köcher

unread,
Jun 3, 2015, 11:19:16 AM6/3/15
to dea...@googlegroups.com
Indeed, the double free or corruption should not occur, but the case that you have the *same* matrix
on both sides does not make sense to me...

The generalized eigenvalue problem reads as

    A x = \lambda B x,  with a symmetric positive definite matrix B,

and the standard eigenvalue problem as

    A x = \lambda x,

so the right-hand-side matrix B is the identity.

So, if you want to solve your problem, why not copy the matrix M to another matrix M2 and plug M and M2 into your solver?

The deal.II user documentation of the ArpackSolver class says that it calls the dneupd and dnaupd functions of ARPACK.
Potentially the memory error occurs there when the matrix objects are the same, but to verify this you would have to
look into the implementation and file a bug report if you find an implementation error.

Best 
  Uwe

Shiraz Farouq

unread,
Jun 8, 2015, 10:51:50 PM6/8/15
to dea...@googlegroups.com
Thanks Uwe,

indeed, the calculation was only for testing, and it turned out that it does not work! I am not sure whether the issue needs a fix; maybe a comment in the documentation would do. Interesting that there is no such problem in the case of PETSc.

/S

Meng Fan

unread,
Jul 14, 2019, 4:52:16 AM7/14/19
to deal.II User Group
Hi Shiraz,

Today, when I tried to compile code that is meant to compute the condition number of the system matrix, I got the same link error:

"error: undefined reference to 'void dealii::SparseDirectUMFPACK::initialize<dealii::TrilinosWrappers::BlockSparseMatrix>(dealii::TrilinosWrappers::BlockSparseMatrix const&, dealii::SparseDirectUMFPACK::AdditionalData)'"

I use "LA::MPI::BlockSparseMatrix system_pde_matrix;" to declare my system matrix, instead of what step-36 uses, "PETScWrappers::SparseMatrix stiffness_matrix, mass_matrix;".

Is this where the problem lies?

Did you ever solve this link error?

Best regards,
Fan Meng

On Sunday, May 17, 2015 at 6:19:10 PM UTC+2, Shiraz Farouq wrote: