Dear DREAM.3D team,
I am running into a persistent issue when trying to compute feature-averaged values of scalar fields that I appended to my DREAM.3D file.
Context:
I start from a DAMASK simulation (.hdf5) and export orientations into a .dream3d file using the damask Python library, namely export_DREAM3D(q='O').
After that, I append custom scalar arrays into /DataContainers/SyntheticVolumeDataContainer/CellData (namely: nuc_flag, tot_density, rho_mob_total, and rho_dip_total).
Each array has the same TupleDimensions as Eulers, i.e. (478, 13, 108) with ComponentDimensions = [1].
Example:
tot_density: shape (671112,), dtype float32
Metadata: TupleDimensions = [478, 13, 108], ComponentDimensions = [1]
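For reference, a minimal sketch of how I append such an array (filename is a placeholder, and the ObjectType/DataArrayVersion attributes are my assumption of what a 6.x reader expects, mirrored from the existing Eulers array):

```python
import h5py
import numpy as np

# Hypothetical sketch: append a scalar CellData array with the metadata that
# the existing Eulers array carries, so a DREAM.3D 6.x reader can pick it up.
dims = (478, 13, 108)          # TupleDimensions reported in the file
n_cells = int(np.prod(dims))   # 478 * 13 * 108 = 671112, the flat array length

with h5py.File('appended.dream3d', 'w', driver='core', backing_store=False) as f:
    cell = f.require_group('DataContainers/SyntheticVolumeDataContainer/CellData')
    dset = cell.create_dataset('tot_density',
                               data=np.zeros(n_cells, dtype=np.float32))
    dset.attrs['TupleDimensions'] = np.asarray(dims, dtype=np.uint64)
    dset.attrs['ComponentDimensions'] = np.asarray([1], dtype=np.uint64)
    # Assumption: 6.x also expects these two attributes on every DataArray,
    # copied here from the Eulers dataset.
    dset.attrs['ObjectType'] = 'DataArray<float>'
    dset.attrs['DataArrayVersion'] = np.int32(2)
    shape_written = dset.shape
```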
When I open the file in DREAM.3D or in HDFView/myHDF5, the arrays are clearly present under CellData:
I also confirmed that FeatureIds exists after segmentation and that I selected tot_density directly from the UI.
The problem:
When I add the Find Average Value of Scalars For Feature filter and point it to tot_density, the pipeline fails with:
Pipeline output:
Yet, tot_density (and the other arrays) are still visible in the Data Structure pane after the Reader step.
Troubleshooting I tried:
Verified in HDF5 browser → arrays are written correctly, with consistent tuple dimensions and types.
Verified in DREAM.3D → arrays show up in the Data Structure pane after the Reader, and I selected them in the proxy.
Confirmed that FeatureIds is created by segmentation.
Tried disabling/re-enabling the filter, reordering the pipeline, and saving/reloading the file.
Observation:
The error only appears after segmentation. It looks as if the segmentation step may overwrite or recreate the CellData AttributeMatrix, causing DREAM.3D to “drop” arrays it doesn’t recognize. The appended scalar arrays should persist through segmentation and be available to subsequent filters (in particular, Find Average Value of Scalars For Feature).
Question:
Is this a limitation/bug in DREAM.3D where segmentation replaces CellData and discards non-standard arrays?
If so, is there a recommended workaround for keeping custom arrays (tot_density, etc.) alive through the segmentation pipeline?
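To pin down exactly which arrays the segmentation step drops, one can diff the CellData keys of the input file against the file the pipeline writes out. A minimal, self-contained sketch (the two in-memory files here are stand-ins for the real before/after .dream3d files):

```python
import h5py

PATH = 'DataContainers/SyntheticVolumeDataContainer/CellData'

def cell_arrays(f):
    """Return the sorted names of all arrays under CellData."""
    return sorted(f[PATH].keys())

# Stand-ins for the real files; in practice, open the .dream3d files written
# before and after the segmentation pipeline instead.
with h5py.File('before.dream3d', 'w', driver='core', backing_store=False) as before, \
     h5py.File('after.dream3d', 'w', driver='core', backing_store=False) as after:
    for name in ('Eulers', 'nuc_flag', 'tot_density'):
        before.require_group(PATH).create_dataset(name, data=[0.0])
    for name in ('Eulers', 'FeatureIds'):
        after.require_group(PATH).create_dataset(name, data=[0.0])
    # Arrays present before segmentation but missing afterwards:
    dropped = sorted(set(cell_arrays(before)) - set(cell_arrays(after)))
```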
Fotis Tsiolis
--
Doctoral Candidate
Theory and Simulation Group
Department of Microstructure Physics and Alloy Design
Max-Planck-Institut für Nachhaltige Materialien
--
Dear Mr. Jackson,
Thank you for your detailed reply and for testing the pipeline in DREAM3D-NX. I was able to replicate the results with NX as well. However, I noticed that the data structure in NX differs from the 6.x versions, which are the ones compatible with the recrystallization code I am currently using.
Following your suggestion, I also tested the Compute Array Statistics filter in 6.x, but unfortunately, I still received the same -90002 error. I even tried running the workflow with previous 6.x versions, but the issue persisted. At this point, I suspect the problem may be related to the Python script I wrote to export DAMASK HDF5 data into a 6.x-compatible HDF5 format for segmentation of my RVE.
Before I start developing a more robust conversion script, I would like to ask if DREAM.3D provides any filter or functionality to write datasets back into the legacy HDF5 format (6.x compatible). This would save some time compared to building a custom exporter.
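While checking compatibility, it may also help to confirm which on-disk format a given .dream3d file actually uses. To my understanding, files carry a root FileVersion attribute (I assume the value '7.0' for the 6.x line; worth verifying against a known-good 6.x file). A small sketch of such a check, demonstrated on an in-memory file:

```python
import h5py

def file_version(f):
    """Root 'FileVersion' attribute of an open .dream3d file, as a string."""
    v = f.attrs.get('FileVersion')
    return v.decode() if isinstance(v, bytes) else v

# In-memory demonstration; files written by DREAM.3D already carry the
# attribute, so only file_version() itself would be needed in practice.
with h5py.File('demo.dream3d', 'w', driver='core', backing_store=False) as f:
    f.attrs['FileVersion'] = '7.0'   # assumed value written by the 6.x line
    version = file_version(f)
```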
Thanks again for your support.
Best regards,
Fotis Tsiolis
--
Doctoral Candidate
Theory and Simulation Group
Department of Microstructure Physics and Alloy Design
Max-Planck-Institut für Nachhaltige Materialien
On Thursday, 18 September 2025 at 1:25:01 AM UTC+2, Michael Jackson wrote: