Spurious eigenvalues


Bruno Turcksin

Nov 30, 2017, 10:24:05 AM
to deal.II User Group

Hi all,

In step-36, there is an explanation of how Dirichlet boundary conditions introduce spurious eigenvalues because some DoFs are constrained. However, there is no mention of hanging nodes. So I am wondering if I can treat them as shown for the Dirichlet boundary, i.e., is the only difference between a hanging node and a Dirichlet constraint what happens in ConstraintMatrix::distribute()? I also wonder if there is a way to avoid having these spurious eigenvalues computed, or if the only way to deal with them is to redo the calculation after changing the entries in the matrix.

Best,

Bruno

Denis Davydov

Nov 30, 2017, 5:31:07 PM
to deal.II User Group
Hi Bruno,

AFAIK, there is a simple solution: make the initial vector (or subspace) perpendicular to those constrained entries.
That is, if you do Lanczos, set a random initial vector and then zero out the constrained DoFs.
Then, being a Krylov-based method, it forms the subspace {x, Ax, A^2x, ...} orthogonal to those constrained DoFs, so
you should not run into any issues.
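A toy illustration of the idea (plain Python, not deal.II; the 3x3 matrix and the constrained index are made up for the example). After condensation a constrained DoF is decoupled from the rest of the system, so a power/Krylov iteration started from a vector that is zero at that DoF never picks up the spurious eigenvalue sitting on its diagonal:

```python
# 2x2 "physical" block with eigenvalues 1 and 3, plus one constrained
# DoF (index 2) whose diagonal entry 10.0 is the spurious eigenvalue.
A = [[2.0, 1.0, 0.0],
     [1.0, 2.0, 0.0],
     [0.0, 0.0, 10.0]]

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def power_iteration(A, x, iters=200):
    # Simple power iteration; returns the Rayleigh quotient of the
    # converged (normalized) iterate.
    for _ in range(iters):
        y = matvec(A, x)
        norm = sum(v * v for v in y) ** 0.5
        x = [v / norm for v in y]
    y = matvec(A, x)
    return sum(xi * yi for xi, yi in zip(x, y))

# Start vector zeroed at the constrained DoF: the iterates stay zero
# there, so we converge to the physical eigenvalue 3, not 10.
lam = power_iteration(A, [1.0, 0.5, 0.0])

# Start vector with a component on the constrained DoF: the spurious
# eigenvalue 10 dominates and the iteration converges to it instead.
lam_bad = power_iteration(A, [1.0, 0.5, 0.1])

print(lam, lam_bad)  # → 3.0 10.0 (up to rounding)
```

In exact arithmetic the orthogonality is preserved because the constrained row/column carry only a diagonal entry; in floating point a long Lanczos run may need re-orthogonalization, but the principle is the same.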

Cheers,
Denis.

Bruno Turcksin

Nov 30, 2017, 7:47:28 PM
to dea...@googlegroups.com
Thanks Denis!


Wolfgang Bangerth

Dec 1, 2017, 8:19:22 AM
to dea...@googlegroups.com, Bruno Turcksin
On 11/30/2017 05:47 PM, Bruno Turcksin wrote:
>
> In step-36, there is an explanation on how Dirichlet boundary conditions
> introduce spurious eigenvalues because some dofs are constrained. However,
> there is no mention of hanging nodes. So I am wondering if I can treat them as
> shown for the Dirichlet boundary, i.e, the only difference between a hanging
> node and a Dirichlet is what happens in ConstraintMatrix::distribute().

Yes, I think this is correct -- you're going to have a row in the eigenvalue
equation where both matrices have an entry on the diagonal. You can set these
values to whatever you want and that's what's going to determine the size of
the spurious eigenvalue.
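To spell that out with toy numbers (plain Python, not deal.II; the diagonal values 42 and 2 are arbitrary): in the generalized problem A x = lambda M x, a constrained row i that carries only diagonal entries a_ii and m_ii makes the unit vector e_i an eigenvector with lambda = a_ii / m_ii, so whatever values you write into the two diagonals fix the spurious eigenvalue:

```python
# One constrained DoF (index 2); both matrices keep only a diagonal
# entry in that row/column.
a_ii, m_ii = 42.0, 2.0  # arbitrary values written into the diagonals

A = [[2.0, 1.0, 0.0],
     [1.0, 2.0, 0.0],
     [0.0, 0.0, a_ii]]
M = [[1.0, 0.0, 0.0],
     [0.0, 1.0, 0.0],
     [0.0, 0.0, m_ii]]

e2 = [0.0, 0.0, 1.0]
lam_spurious = a_ii / m_ii  # = 21.0

# Verify A e2 == lambda * M e2 component by component:
Ae2 = [sum(a * x for a, x in zip(row, e2)) for row in A]
Me2 = [sum(m * x for m, x in zip(row, e2)) for row in M]
assert all(abs(av - lam_spurious * mv) < 1e-12
           for av, mv in zip(Ae2, Me2))
print(lam_spurious)  # → 21.0
```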


> I also
> wonder if there is a way to avoid having these spurious eigenvalues computed

They're always going to be there because we keep constrained nodes in the
linear system.

Best
W.


--
------------------------------------------------------------------------
Wolfgang Bangerth email: bang...@colostate.edu
www: http://www.math.colostate.edu/~bangerth/

Timo Heister

Dec 1, 2017, 8:50:38 AM
to dea...@googlegroups.com, Bruno Turcksin
> They're always going to be there because we keep constrained nodes in the
> linear system.

If we modify the way ConstraintMatrix operates, we could work around
this though:
1. Without rescaling those equations (as we do inside ConstraintMatrix
right now) the spurious EV would all be equal to 1 and so ignoring
them is easier.
2. One could also zero out the constrained rows after the fact.
3. Did we rip out the support for removing the constrained entries
from the matrix completely?

--
Timo Heister
http://www.math.clemson.edu/~heister/

Denis Davydov

Dec 1, 2017, 8:55:53 AM
to deal.II User Group


On Friday, December 1, 2017 at 2:50:38 PM UTC+1, Timo Heister wrote:
> They're always going to be there because we keep constrained nodes in the
> linear system.

> If we modify the way ConstraintMatrix operates, we could work around
> this though:

But there is really no need, is there?
If you start with an initial vector (or subspace, depending on the method) orthogonal to the constraints, you won't converge to those eigenvalues anyway.

Cheers,
Denis 

Bruno Turcksin

Dec 1, 2017, 8:58:51 AM
to Timo Heister, dea...@googlegroups.com
2017-12-01 8:50 GMT-05:00 Timo Heister <hei...@clemson.edu>:
> 3. Did we rip out the support for removing the constrained entries
> from the matrix completely?
That was my plan at first, using ConstraintMatrix::condense, but according to the documentation the constrained entries are not removed.

Best,

Bruno

Wolfgang Bangerth

Dec 1, 2017, 2:56:45 PM
to dea...@googlegroups.com
On 12/01/2017 06:50 AM, Timo Heister wrote:
>> They're always going to be there because we keep constrained nodes in the
>> linear system.
>
> If we modify the way ConstraintMatrix operates, we could work around
> this though:
> 1. Without rescaling those equations (as we do inside ConstraintMatrix
> right now) the spurious EV would all be equal to 1 and so ignoring
> them is easier.

Correct. But we re-scale for a good reason, namely so that all entries
in the matrix are of comparable size.


> 2. One could also zero out the constrained rows after the fact.

But only in one of the two matrices of a generalized eigenvalue problem.
At least one of the two matrices needs to remain invertible.
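A quick numeric check of why (toy 3x3 matrices, not deal.II): zeroing the constrained row in A only moves the spurious eigenvalue to 0 while M stays invertible; zeroing the corresponding diagonal of M as well makes M singular, so the generalized problem is no longer well posed for that DoF:

```python
def det3(m):
    # Determinant of a 3x3 matrix by cofactor expansion.
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

# Constrained DoF is index 2. Zero its row in A only: e2 is still an
# eigenvector, now with lambda = 0 / 1 = 0, and M remains invertible.
A = [[2.0, 1.0, 0.0], [1.0, 2.0, 0.0], [0.0, 0.0, 0.0]]
M = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
ok = det3(M) != 0.0  # True: M still invertible

# Zeroing the constrained diagonal in *both* matrices makes M singular:
M_bad = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 0.0]]
bad = det3(M_bad) == 0.0  # True: pencil degenerate at that DoF

print(ok, bad)  # → True True
```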

> 3. Did we rip out the support for removing the constrained entries
> from the matrix completely?

Yes.