Paras,
> I am working on homogenization of particulate nano-composites and thus have to
> solve the finite deformation quasistatics problem on different
> microstructures, such as one shown in the attachment pic-1.png.
>
> In order to automate the generation of (2D & 3D) meshes for microstructures
> comprising randomly distributed filler particles of arbitrary radii, we
> employ a self-written python code using GMSH for mesh generation. The setup
> works well for 2D but for 3D, the resulting mesh (left) culminates in
> irregularities in the stress contour as compared to that obtained with a
> structured mesh (right) as can be seen from the attachment pic-2.png. Due to
> the random spatial distribution and arbitrary sizes of the particles, an
> unstructured mesh appeared to be the feasible option.
>
> In order to verify that the cause of such irregularities is the mesh and not
> our solver (based on deal.ii), we tested the mesh in Abaqus and similar
> irregularities were observed, cf. pic-3.png. Refining the mesh does not help
> either.
That suggests that you are doing something wrong. I don't know what exactly
the problem is, and in particular how you assign material properties to cells,
but we know from theory that the finite element solution must converge to the
exact solution if you just refine the mesh often enough. If it doesn't for
you, then something is amiss.
This does not say anything about the *absolute level of accuracy*. I would not
be surprised if for a mesh like the one you show (in which pretty much every
hexahedron is poorly shaped), the solution is less accurate than for the
corresponding tetrahedral mesh. But if you refine it a couple of times, you
should get a more accurate solution. Is this not the case?
> Has anyone else working with unstructured hex meshes observed similar issues?
> Any clues on how to deal with this issue or on other tools (preferably open
> source) which one could use for generating better meshes in 3D would be of
> great help.
The quality of meshes matters for the absolute level of error. The hex meshes
you get by subdividing tets are generally quite poor, though I know of people
who are using these routinely and report that they nevertheless get good
accuracy. A better approach is certainly to see if you can find ways to
*directly* create hex meshes. gmsh can do this to some degree. For the sphere
you show, deal.II can also generate a high-quality mesh itself.
From the look of it, the solution is at least smoother. That might suggest
that the issue is not a bug in the code, but that the solution really does
converge, though maybe slowly. Do you have the resources to do one more
refinement, just to see whether the solution continues to get better? That's
not going to be a production run, but a one-time computation that might take a
day or two if necessary.
That won't work if you have many inclusions. We generally deal with many
inclusions by just using a regular mesh for the domain and at each quadrature
point asking whether it is inside an inclusion or in the background material.
Here is an example of how this could be done:
https://github.com/geodynamics/aspect/blob/master/benchmarks/nsinker/nsinker.cc
The value() function returns the coefficient at a particular point. The mesh
does not actually need to resolve these inclusions; we just leave that to each
quadrature point. The benchmark this file implements is like this:
https://geodynamics.org/cig/news/research-highlights/november-2019/
You can find more information about this "nsinker" benchmark here:
https://tigerprints.clemson.edu/cgi/viewcontent.cgi?article=3528&context=all_dissertations
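To make the idea concrete, here is a minimal sketch of such a coefficient lookup in plain Python (the inclusion list and coefficient values are made up for illustration; the actual nsinker implementation linked above is C++/deal.II):

```python
import math

# Hypothetical spherical inclusions, given as (center, radius) pairs.
INCLUSIONS = [((0.3, 0.3, 0.3), 0.10),
              ((0.7, 0.6, 0.5), 0.15)]

MU_MATRIX = 1.0       # background material coefficient (made-up value)
MU_INCLUSION = 100.0  # filler particle coefficient (made-up value)

def value(p):
    """Material coefficient at point p.

    The mesh never needs to resolve the particles: during assembly,
    each quadrature point simply asks which material it sits in.
    """
    for center, radius in INCLUSIONS:
        if math.dist(p, center) <= radius:
            return MU_INCLUSION
    return MU_MATRIX
```

During assembly one would then call value() at every quadrature point instead of reading a material id off the cell, so the mesh can stay a regular grid regardless of how the particles are placed.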
1. Generally, visual rendering software (at least ParaView) renders data on quad/hex meshes using triangular polygons. When you look at the color map within a square, ParaView is actually rendering the square using two triangles. The color map therefore often appears wrong, because visually you are not seeing the bi-linear interpolation but some sort of "cut-in-half quad" where each half is rendered as a linear simplex element. Do you think there is another metric you can look at other than the visualization? What you are seeing right now might in fact be an artifact of the GPU rendering of your solution.
Stresses and strains (both based on derivatives of the solution field and thus suffering from the irregularities) are the commonly chosen metrics for visualization. To verify whether this is just a software-induced artefact, I also tried VisIt, but got the same problem there as well. As an alternative to nodal stress values (computed by L2 projection of the quadrature-point stress values), I also tried plotting the cell-wise average stresses, but in that case the contour was even more irregular. Is there any visualization tool which can render data directly using quads/hexes? Further, it is noteworthy that the solution itself, i.e. the displacement, does not show such irregularities.
2. One issue with hexes obtained by subdividing tets is that the added node on the triangular face does not match the topology of your geometry. In CFD this can introduce very large errors, so I guess it does the same thing for solid mechanics. What you can do is generate that mesh in GMSH, then "inflate" it to match the manifold of the real geometry you want to simulate. This means displacing a few nodes. You can achieve this robustly by solving a linear elasticity equation with Dirichlet boundary conditions to ensure that your displacement deforms your mesh everywhere. I have found this to work very well when the objects are spherical.
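For spherical particles, the snapping step could be sketched like this (a simplified radial projection of the flagged surface nodes onto the exact sphere; function and variable names are made up, and the elasticity smoothing of the interior nodes is left out):

```python
import math

def snap_to_sphere(nodes, surface_ids, center, radius):
    """Return a copy of nodes where each flagged node is projected
    radially from the sphere's center onto the exact spherical surface."""
    snapped = [list(p) for p in nodes]
    for i in surface_ids:
        d = [x - c for x, c in zip(snapped[i], center)]
        norm = math.sqrt(sum(x * x for x in d))
        snapped[i] = [c + radius * x / norm for c, x in zip(center, d)]
    return snapped
```

The differences between the snapped and original surface positions would then provide the Dirichlet data for the linear-elasticity problem that propagates the deformation into the rest of the mesh.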
3. If the two solutions above don't work, you can try alternative meshing software (like Cubit). However, they are harder to automate in a GMSH-like fashion.
Trelis [https://coreform.com/products/trelis/] seems to be the next thing to try out. Could you please comment on in what sense it would be "harder to automate"? The approach now would be to generate the RVE geometry within our python code in .step format, and then call Trelis from within our code for mesh generation.
Since we plan to make the RVE generation tool open source at some point in the future, an open source meshing backend would have been great. Are you aware of any open source alternatives which could solve this problem?