Truncation errors when using PETSc in deal.II

Yiliang Wang

Dec 8, 2025, 11:36:29 AM
to deal.II User Group
Hi Everyone,

I have been using deal.II to solve a nonlinear elasticity problem similar to step-44. The problem is geometrically nonlinear only, without any material nonlinearity. I have implemented two versions of the code to solve it: one is implemented like step-44, namely in an SMP way; the other is implemented like step-17 and step-18, namely in a DMP way. I am using PETSc to solve the linear systems. I found something interesting and I want to share it with you.

1. When I use the SI unit system (N-m-s), the material property values are very large, as expected; for example, Young's modulus is 1e11 Pa. Somehow, the DMP code behaves very strangely in this case: CG finishes in 0 iterations and the solution is empty. If I change the unit system to N-mm-s, Young's modulus becomes 1e5 MPa and CG starts to behave normally. The SMP code seems to be less sensitive to the choice of unit system.

2. Although the final results of the SMP and DMP codes are the same, the computational times differ. Surprisingly, the DMP code is slower than the SMP code. It is not that CG is slower in the DMP code; rather, the DMP code somehow needs more N-R iterations than the SMP code.

Based on the above observations, I have a feeling that there is some loss of accuracy when using PETSc, most likely when vectors or matrices are transferred between PETSc and deal.II. I am not sure whether any of you have encountered such situations before. I would appreciate advice on how to ensure accuracy when using PETSc in deal.II.

Thanks.

Wolfgang Bangerth

Dec 9, 2025, 2:57:07 PM
to dea...@googlegroups.com
Yiliang:

> 1. When I use the SI unit system (N-m-s), the material property values
> are very large, as expected; for example, Young's modulus is 1e11 Pa.
> Somehow, the DMP code behaves very strangely in this case: CG finishes
> in 0 iterations and the solution is empty. If I change the unit system
> to N-mm-s, Young's modulus becomes 1e5 MPa and CG starts to behave
> normally. The SMP code seems to be less sensitive to the choice of
> unit system.

How do you set the stopping criterion for your solver? If you are
solving a single equation, the choice of physical units should not
matter because any choice scales all equations equally. In that case,
using a *relative* tolerance as a stopping criterion should lead to a
number of iterations that is the same for all choices of units.
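
In step-17, for example, this is done by handing SolverControl a
tolerance proportional to the norm of the right-hand side. A minimal
sketch of such a relative criterion (assuming the usual PETSc wrapper
objects system_matrix, system_rhs, solution; the exact constructor
signatures depend on your deal.II version):

  #include <deal.II/lac/petsc_precondition.h>
  #include <deal.II/lac/petsc_solver.h>
  #include <deal.II/lac/solver_control.h>

  // Relative stopping criterion: iterate until the residual has
  // dropped to 1e-8 of the right-hand side norm. A uniform rescaling
  // of the equations (N-m-s vs. N-mm-s) then cancels out of the test.
  SolverControl solver_control(solution.size(),
                               1e-8 * system_rhs.l2_norm());

  PETScWrappers::SolverCG cg(solver_control);
  PETScWrappers::PreconditionBlockJacobi preconditioner(system_matrix);
  cg.solve(system_matrix, solution, system_rhs, preconditioner);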

The situation is different if you have a system of equations (say, the
Stokes equations), in which case you need to scale the equations to a
common unit first.


> 2. Although the final results of the SMP and DMP codes are the same,
> the computational times differ. Surprisingly, the DMP code is slower
> than the SMP code. It is not that CG is slower in the DMP code;
> rather, the DMP code somehow needs more N-R iterations than the SMP
> code.

I don't actually know what SMP and DMP mean. As a consequence, I can't
suggest why the two formulations may lead to different numbers of
nonlinear iterations.


> Based on the above observations, I have a feeling that there is some
> loss of accuracy when using PETSc, most likely when vectors or
> matrices are transferred between PETSc and deal.II.

I don't think this is a likely reason for the discrepancy. We have been
using the PETSc interfaces for more than 20 years by now, and I don't
think that PETSc is more or less accurate than our own linear algebra
classes, or that accuracy is lost in the transfer. The issue is almost
certainly somewhere else.

Best
W.

Yiliang Wang

Dec 9, 2025, 3:19:19 PM
to dea...@googlegroups.com
Thanks for your reply, Dr. Bangerth.

Best regards,
Yiliang Wang


Yiliang Wang

Dec 16, 2025, 11:42:51 PM
to deal.II User Group
Hi Dr. Bangerth,

I think I have found where the issue is, but I am not sure how to fix it.

Here is the code where I set up the preconditioner and the solver's convergence tolerance and maximum number of iterations.
[Screenshot: preconditioner and solver setup code]
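
In essence it amounts to the following (a sketch with illustrative
names; the preconditioner type and variable names in my actual code may
differ):

  // Convergence tolerance and iteration limit go into SolverControl,
  // which is then handed to the PETSc CG wrapper.
  SolverControl solver_control(max_iterations, tolerance);

  PETScWrappers::SolverCG solver(solver_control);
  PETScWrappers::PreconditionBlockJacobi preconditioner(system_matrix);

  solver.solve(system_matrix, newton_update, system_rhs, preconditioner);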

However, if I run it with -ksp_view -pc_view, the printed info indicates that PETSc still uses the default settings. In particular, it uses the PRECONDITIONED norm for the convergence test.
[Screenshot: -ksp_view output showing PETSc's default settings]
I am not sure why the settings from solver_control are not passed on to PETSc. Or maybe they are overwritten somewhere? Could it be that the versions of PETSc and deal.II I am using are incompatible? I am using PETSc 3.16.6 and deal.II 9.7.0.

I would appreciate it if you could help.

Best regards,
Yiliang Wang

Yiliang Wang

Dec 17, 2025, 8:55:21 PM
to deal.II User Group
Hi Dr. Bangerth,

I think I have found the solution: I add the -ksp_norm_type unpreconditioned flag when running the simulation.
My suggestion is that it would be better if this flag were turned on automatically when using the PETSc CG solver; otherwise, the convergence test is not consistent between deal.II's built-in CG and PETSc's CG.
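
For reference, the same thing can presumably also be done from inside
the program rather than on the command line, by putting the option into
PETSc's options database before the first solve (an untested sketch; it
relies on the solver reading its options, which the command-line flag
working seems to confirm):

  #include <petscsys.h>

  // Equivalent of passing "-ksp_norm_type unpreconditioned" on the
  // command line: the KSP picks this up from the global options
  // database when its options are processed.
  PetscErrorCode ierr =
    PetscOptionsSetValue(NULL, "-ksp_norm_type", "unpreconditioned");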

Thanks,
Yiliang Wang

Wolfgang Bangerth

Dec 17, 2025, 10:54:24 PM
to dea...@googlegroups.com
On 12/17/25 18:55, Yiliang Wang wrote:
>
> I think I have found the solution: I add the -ksp_norm_type
> unpreconditioned flag when running the simulation.
> My suggestion is that it would be better if this flag were turned on
> automatically when using the PETSc CG solver; otherwise, the
> convergence test is not consistent between deal.II's built-in CG and
> PETSc's CG.

Ah, that's an interesting observation -- nice job figuring this out! I'm not
sure any of us ever realized that. Is there a way to set this flag in the
program, rather than on the command line? Would you like to see if you could
write a patch that does that? Presumably, one would make this modification here:
https://github.com/dealii/dealii/blob/master/source/lac/petsc_solver.cc#L327-L335
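
Something along these lines, perhaps (an untested sketch of the idea,
presumably in SolverCG::set_solver_type(); the surrounding code may
need adjusting):

  // After selecting the CG method, ask PETSc to base its convergence
  // test on the true (unpreconditioned) residual norm, matching
  // deal.II's own SolverCG:
  PetscErrorCode ierr = KSPSetType(ksp, KSPCG);
  AssertThrow(ierr == 0, ExcPETScError(ierr));

  ierr = KSPSetNormType(ksp, KSP_NORM_UNPRECONDITIONED);
  AssertThrow(ierr == 0, ExcPETScError(ierr));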

I don't know what we should do about the other PETSc solvers.
Presumably, we would want to make those behave like the deal.II solvers
as well, but I'd be happy to fix them one at a time.

Best
W.