Structured vs Unstructured Mesh


Corey DeChant

May 13, 2020, 10:57:31 AM
to zapdos-users
Good morning,

I was re-running the full-length 2D GEC case files and noticed that the 1 Torr case fails. I was using the newest PR for Zapdos (secondary electron BCs), but even when I use an older branch, it still fails. This is odd, since this should be the same file that gave the results shown at the 2019 GEC. There is a test for this case, but it uses a very coarse mesh and the end time is between 1-2 rf cycles. I tried different runs with PJFNK, FDP, 'line_search = none', and auto scaling on with 'compute_scaling_once = false'. Changing the line search and recomputing the scaling each step seemed to help a little, but the runs still failed. The failure seems to be caused by high gradients at the dielectric boundary condition (I am using the EconomouDielectricBC).

Once I tried using a structured mesh, the simulation ran to completion. My main question is: why? Has anyone else tried to use a 2D structured mesh with Zapdos before? I documented some of the results in a PDF, along with the input file, here ( https://github.com/csdechant/zapdos/tree/structured/debug ). After some reading, it seems the rule of thumb is to use a structured mesh when possible, because structured meshes use less memory and have better element alignment, which correlates with better convergence. If anyone has experience with structured vs. unstructured meshes, please let me know.
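For context, a structured 2D mesh in a MOOSE-based code like Zapdos can be generated internally rather than imported. The block below is a minimal sketch with hypothetical element counts and domain dimensions, not the actual GEC geometry:

```
[Mesh]
  [structured]
    # GeneratedMeshGenerator produces a regular (structured) quad grid
    type = GeneratedMeshGenerator
    dim = 2
    nx = 50          # hypothetical element counts
    ny = 50
    xmax = 0.0254    # hypothetical domain extents in meters
    ymax = 0.0254
  []
[]
```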

Thank you,
Corey DeChant

Casey Icenhour

May 13, 2020, 12:10:52 PM
to zapdos...@googlegroups.com
Did you take a look at how this behaves with different time step sizes? I only see dtmin and dtmax in your Executioner settings. 

--
You received this message because you are subscribed to the Google Groups "zapdos-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to zapdos-users...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/zapdos-users/56d387eb-60a2-4089-abbe-a3b3605b23f3%40googlegroups.com.

Alexander Lindsay

May 13, 2020, 12:13:00 PM
to zapdos...@googlegroups.com
What preconditioner are you using?

I suspect that the structured case (even when accounting for automatic scaling) has a better condition number than the unstructured case, but that's mostly a hypothesis. If your problem has fewer than 1000 dofs, try running each with `-pc_type svd -pc_svd_monitor` and see what the condition number looks like. Although if you're using LU this probably isn't going to matter much (unless you're using PJFNK, in which case it will definitely matter no matter what the -pc_type is!)
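In a MOOSE input file, those PETSc options can be passed through the Executioner block. A minimal sketch (the option names are standard PETSc; the placement here is just one way to wire them in):

```
[Executioner]
  type = Transient
  solve_type = NEWTON
  # Replace the usual preconditioner with a full SVD so PETSc
  # reports the singular values (and hence the condition number)
  petsc_options = '-pc_svd_monitor'
  petsc_options_iname = '-pc_type'
  petsc_options_value = 'svd'
[]
```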

Corey DeChant

May 13, 2020, 12:35:57 PM
to zapdos-users
Casey - Yes. I tried dt = 1e-10 s early on, but the simulation still failed. I didn't try it with 'line_search = none' and auto scaling on with 'compute_scaling_once = false', so I will retry the smaller time step with those settings.

Alex - I am currently using SMP, but I did try FDP early on. Below are the settings for the Executioner block I used for the structured mesh. When I use these settings with a similar unstructured mesh, the simulation fails. The number of nodes was about the same for both meshes, ~3000 nodes, so the problem has more than 1000 dofs.

[Executioner]
  type = Transient
  #end_time = 7.4e-3
  end_time = 3.6874e-5
  automatic_scaling = true
  compute_scaling_once = false
  solve_type = NEWTON
  scheme = bdf2
  dtmax = 1e-9
  dtmin = 1e-14
  line_search = none
  petsc_options = '-snes_converged_reason -snes_linesearch_monitor'
  petsc_options_iname = '-pc_type -pc_factor_shift_type -pc_factor_shift_amount -ksp_type -snes_linesearch_minlambda'
  petsc_options_value = 'lu NONZERO 1.e-10 fgmres 1e-3'
[]
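Regarding Casey's time-step question: with only dtmax/dtmin set, the block above relies on default time stepping. An explicit adaptive stepper can be added as a sub-block; the values below are purely hypothetical, not from the actual input:

```
[Executioner]
  type = Transient
  [TimeStepper]
    # Grow or cut dt based on how many nonlinear iterations each solve takes
    type = IterationAdaptiveDT
    dt = 1e-12               # hypothetical initial step
    optimal_iterations = 8
    growth_factor = 1.2
    cutback_factor = 0.5
  []
[]
```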

Alexander Lindsay

May 13, 2020, 12:52:47 PM
to zapdos...@googlegroups.com
I'm assuming with LU and NEWTON that your linear solve converges in one linear iteration every time? If so, then the problem is not the condition number.

I'm curious why you're using fgmres? FGMRES is useful when your preconditioner is changing from iteration to iteration, but that shouldn't be the case here.

In general, I find that solves perform better with `line_search = none`, but there are always exceptions.




Corey DeChant

May 13, 2020, 1:20:54 PM
to zapdos-users
Yes, with LU and NEWTON the linear solve converges in one linear iteration most of the time. The exception is when the unstructured simulation starts to fail with "DIVERGED_FNORM_NAN"; at that point the linear solve hits the default limit of 10000 iterations.

To be honest, I don't remember why I have "FGMRES". I believe I used it a while back on something else. I usually use the same PETSc options for different simulations unless I need to change them, and I never saw the need to change that one. I can change it to the default setting if you think it will make a difference.

I have only tried the default and none/basic line searches. The none/basic option seems to help a little, but the simulation still fails.

All the unstructured runs failed with "DIVERGED_FNORM_NAN", which I assumed was due to the high gradients at the dielectric boundary condition, but the structured mesh shows similar results at the same time step. The structured mesh seems to handle the gradients better, and they get resolved later in the run.
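One possible avenue for the unstructured case (an idea, not something tried in this thread) is to let MOOSE refine elements where the gradients are steep instead of switching meshes. A minimal adaptivity sketch, with the variable name and fractions purely hypothetical:

```
[Adaptivity]
  marker = error_marker
  max_h_level = 2                # cap refinement depth (hypothetical)
  [Indicators]
    [jump]
      # Flags elements with large inter-element gradient jumps
      type = GradientJumpIndicator
      variable = em              # hypothetical variable name
    []
  []
  [Markers]
    [error_marker]
      type = ErrorFractionMarker
      indicator = jump
      refine = 0.5               # hypothetical error fractions
      coarsen = 0.1
    []
  []
[]
```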

Corey DeChant

May 21, 2020, 2:44:14 PM
to zapdos-users
I did several reruns with the unstructured mesh, some with smaller time steps and one without FGMRES. Without FGMRES, the simulation still failed at the same failure time as before, and the results looked the same with or without FGMRES. For the time stepper, I went as small as dt = 1e-11 with a constant time step. The simulation ran a little longer, but it still failed, and the results looked similar to those with the larger time steps.
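For reference, a fixed step like that can be pinned with a ConstantDT stepper rather than relying on dtmin/dtmax alone; a minimal sketch (the dt value echoes the one mentioned above, everything else is illustrative):

```
[Executioner]
  type = Transient
  [TimeStepper]
    # Hold the time step fixed instead of letting it adapt
    type = ConstantDT
    dt = 1e-11
  []
[]
```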