Doubt regarding Lagrange element gradient value in step-3


het patel

Oct 17, 2019, 12:21:57 AM
to dea...@googlegroups.com
Hello Deal.II users

I recently started learning to use deal.II and was going through the step-3 tutorial. I was checking the shape_value() and shape_grad() function values to see which Lagrange shape function is being calculated, and I found the values of the gradient to be different from what I expected.
For example, at the Gauss point (-0.577, -0.577) it gave the gradient as [-0.788675, -0.788675], whereas it should be [-0.394, -0.394] according to my calculation. Am I making a mistake somewhere?

Sincerely
Het Patel

Praveen C

Oct 17, 2019, 12:52:39 AM
to deal.II User Group
You may be comparing the gradient on the reference and real cells. They will differ by some factor due to the mapping.

FiniteElement::shape_grad gives the gradient on the reference cell.

FEValues::shape_grad gives the gradient on the real cell.

--
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see https://groups.google.com/d/forum/dealii?hl=en
---
You received this message because you are subscribed to the Google Groups "deal.II User Group" group.
To unsubscribe from this group and stop receiving emails from it, send an email to dealii+un...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/dealii/CAFYG0Dajn4umL%3DoRx57-PSVpvRswNqHVCinpG-8xZ%3Da5t3RwAA%40mail.gmail.com.

het patel

Oct 17, 2019, 1:49:43 PM
to deal.II User Group
Hi Praveen

That is a nice piece of information. Yes, it is doing the integration on the real cell. I didn't know that there was something like FiniteElement::shape_grad for the reference cell. I thought it did the isoparametric mapping by default through FEValues, and that JxW was for the mapping.

Just one more thing. I have only a very basic understanding of C++. In the code (step-3), it takes the vector dot product of the gradients without writing any sort of function to do so. This might sound a bit lame to you, but can you tell me: if we can do a dot product by simply writing a*b, then how do we do a vector cross product, matrix multiplication, and other such mathematical operations? I have worked mostly in MATLAB for my assignments, where these things are very easy.

Sincerely
Het Patel



Wolfgang Bangerth

Oct 17, 2019, 11:10:18 PM
to dea...@googlegroups.com
On 10/17/19 11:49 AM, het patel wrote:
> [...] can you tell me if we can do a dot product by simply doing a*b,
> then how to do a vector cross product or matrix multiplication and
> other such mathematical operations?

Quantities such as the gradient are represented by the class Tensor<1,dim>,
i.e., a rank-1 tensor (= a vector) with dim components. All of the usual
operations +, -, * are defined for such objects and correspond to their usual
mathematical meaning. In particular, gradient*gradient results in a scalar,
whereas 2*gradient results in a vector of twice the length of the gradient
itself.

The same is true if you have matrices of size dim x dim: There is, for
example, operator* for such matrices and correspondingly sized vectors.

For vector products, you will want to look at the following function and the
ones below it:

https://www.dealii.org/developer/doxygen/deal.II/classTensor.html#a024cb35dcb0c9c453dfbeaab6bc9f078

Best
W.

--
------------------------------------------------------------------------
Wolfgang Bangerth email: bang...@colostate.edu
www: http://www.math.colostate.edu/~bangerth/

het patel

Oct 17, 2019, 11:57:16 PM
to deal.II User Group
Hello Dr. W. Bangerth

Thanks for the wonderful insight. This will really help me continue moving forward with the tutorials and video lectures.

Sincerely
Het Patel

Praveen C

Oct 18, 2019, 12:18:45 AM
to deal.II User Group


On 17-Oct-2019, at 11:19 PM, het patel <het...@gmail.com> wrote:

> I thought it did the isoparametric mapping by default through FEValues, and that JxW was for the mapping.

FEValues takes care of converting the gradient from the reference cell to the real cell. Mostly, this is what you will need when you implement a finite element method.

See this for some explanation


and also this

[Attachment: PastedGraphic-1.pdf]

het patel

Oct 18, 2019, 11:43:20 AM
to deal.II User Group
Thanks Praveen. There were a lot of doubts as I started to learn, but with the information that you and Dr. Bangerth provided, all such trivial things impeding me are gone and learning ahead will be swift.

-Het
