Global RTOFS: Post-processing workflow for archv.[a|b] format

Connor Dibble

Sep 29, 2025, 2:28:21 PM
to HYCOM.org Forum
Hi All,

I am accessing full-volume RTOFS data via this AWS S3 bucket. I'd like to work with more variables than are currently published in netCDF format, so I need to be able to process the native .[a|b] HYCOM outputs into netCDF myself.

I have attempted to use various tools found in HYCOM-tools, RTOFS_GLO, and other GitHub repositories. I primarily work in Python, but I have built a Docker container with the compiled tooling from those repositories. I've seen some related forum posts on post-processing and the archv -> netCDF conversion, but I cannot find complete information on the process: my attempts either fail to extract data correctly due to an invalid grid file, or produce data with invalid coordinates. I need to perform extractions above 47N, so I do need to handle the curvilinear grid.

I cannot properly extract the data with the existing tools because I cannot find or produce the correct regional.grid.[a|b] and regional.depth.[a|b] files required to map latitude, longitude, and depth values correctly.
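For reference, the archv and regional.grid/depth .a files share one raw layout: big-endian IEEE float32, with each idm*jdm field padded out to a multiple of 4096 words, and land/void points set to 2.0**100. A minimal Python sketch, assuming that layout; the file name and the tiny round-trip demo grid below are invented for illustration:

```python
import numpy as np

HUGE = 2.0 ** 100  # HYCOM's land/void fill value in .a files

def read_a_field(path, idm, jdm, k=0):
    """Read 2D field number k (0-based) from a HYCOM .a file.

    .a files are raw big-endian IEEE float32; each idm*jdm field is
    padded out to a multiple of 4096 words, land points hold 2.0**100.
    """
    npad = (4096 - (idm * jdm) % 4096) % 4096
    reclen = idm * jdm + npad  # words per stored field, incl. padding
    with open(path, "rb") as f:
        f.seek(4 * reclen * k)
        field = np.fromfile(f, dtype=">f4", count=idm * jdm)
    field = field.reshape(jdm, idm)
    return np.where(field > HUGE / 2, np.nan, field)

# Round-trip demo on a small synthetic grid (idm=5, jdm=3)
idm, jdm = 5, 3
demo = np.arange(idm * jdm, dtype=">f4").reshape(jdm, idm)
npad = (4096 - (idm * jdm) % 4096) % 4096
with open("demo.a", "wb") as f:
    demo.tofile(f)
    np.full(npad, HUGE, dtype=">f4").tofile(f)
print(read_a_field("demo.a", idm, jdm)[0, :3])  # → [0. 1. 2.]
```

For the real global grid the same reader would be called with idm=4500, jdm=3298; the matching .b file is plain text and lists the field order.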

Ideally, I would be able to produce a global or regionally subsetted netcdf file from an arbitrary bounding box from the global archv files.
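One way to turn an arbitrary bounding box into the iorign/jorign/idmp/jdmp values the subregion tools want is to search the 2D plon/plat arrays (from regional.grid.a) directly. A sketch using an invented regular demo grid in place of the real GLBb0.08 arrays; on the real curvilinear grid above 47N the smallest covering i/j box is larger than the lat/lon box, and longitude wrapping across the grid seam needs extra care:

```python
import numpy as np

def bbox_to_subregion(plon, plat, lon_min, lon_max, lat_min, lat_max):
    """Map a lon/lat bounding box to 1-based iorign, jorign, idmp, jdmp.

    Scans the full 2D plon/plat arrays, so it also works where the grid
    is curvilinear; returns the smallest i/j box covering the selection.
    """
    inside = ((plon >= lon_min) & (plon <= lon_max) &
              (plat >= lat_min) & (plat <= lat_max))
    jj, ii = np.nonzero(inside)            # row index = j, column = i
    iorign, jorign = ii.min() + 1, jj.min() + 1  # HYCOM indices are 1-based
    idmp = ii.max() - ii.min() + 1
    jdmp = jj.max() - jj.min() + 1
    return int(iorign), int(jorign), int(idmp), int(jdmp)

# Demo on a synthetic regular 0.5-degree grid (not the real RTOFS grid)
lon = np.arange(0.0, 360.0, 0.5)
lat = np.arange(-80.0, 80.0, 0.5)
plon, plat = np.meshgrid(lon, lat)
print(bbox_to_subregion(plon, plat, 262.0, 278.0, 18.0, 31.0))
# → (525, 197, 33, 27)
```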

Can anyone point me to documentation for this process? For example, it would be fantastic to have documentation explaining how the regional netCDF files that we do see published are produced, along with any associated input files required for their production.

If anyone has any hints or directions that could help me achieve this goal, that would be most welcome.

Thank you,

Connor Dibble

Alan Wallcraft

Sep 29, 2025, 2:52:27 PM
to HYCOM.org Forum, Connor Dibble
I'm not sure which bathymetry RTOFS is using.

You can use hycom_archive_sea_ok to check that a bathymetry is the right one.

If you are OK with using the GLBb0.08 native grid (curvilinear above 47N), then archv2ncdf3z can already extract a rectangular sub-region. The GOMb0.08 example (ncdf3z_archm.csh) is for the entire region, but to extract the GoM from GLBb0.08 you might use:

4500    'idm   ' = longitudinal array size
3298    'jdm   ' = latitudinal  array size
  41    'kdm   ' = number of layers
  34.0  'thbase' = reference density (sigma units)
   0    'smooth' = smooth the layered fields (0=F,1=T)
3249    'iorign' = i-origin of plotted subregion
1734    'jorign' = j-origin of plotted subregion
 263    'idmp  ' = i-extent of plotted subregion (<=idm; 0 implies idm)
 195    'jdmp  ' = j-extent of plotted subregion (<=jdm; 0 implies jdm)
   3    'itype ' = interpolation type (0=sample,1=linear,2=parabolic, 3=pchip)
  40    'kz    ' = number of depths to sample

Which would write out a 263x195 netCDF for the GoM region.
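The surrounding run-script pattern, following the ncdf3z_*.csh examples in HYCOM-tools, looks roughly like the csh fragment below. The file names are placeholders, not real paths, and (if I read the scripts right) the output netCDF name comes from a CDF0xx environment variable matching the I/O unit in the input, e.g. temio=30 reads CDF030:

```shell
#!/bin/csh
# Sketch only -- link the grid/bathymetry under the fixed names the
# tools expect in the run directory:
ln -sf depth_GLBb0.08_09m11.a     regional.depth.a
ln -sf depth_GLBb0.08_09m11.b     regional.depth.b
ln -sf regional.grid.GLBb0.08.a   regional.grid.a
ln -sf regional.grid.GLBb0.08.b   regional.grid.b

# Output netCDF file name, keyed to the temio=30 unit in the input:
setenv CDF030 archv_gom_temp.nc

# Feed the input fragment above on stdin:
archv2ncdf3z < archv2ncdf3z_gom.in
```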

Alan.

Connor Dibble

Sep 29, 2025, 5:08:19 PM
to HYCOM.org Forum, Alan Wallcraft, Connor Dibble
Hello Alan,

Thank you for the reply. I contacted the NOAA group and was told that they are using https://data.hycom.org/datasets/GLBb0.08/expt_53.X/topo/
(regional.depth.[a,b] are depth_GLBb0.08_09m11.[a,b] from that site).

I have downloaded those files to test this out. At the moment I cannot get any output at all from the hycom_archive_sea_ok tool, so I need to investigate whether I have a compile issue there.

However, running the conversion with the following input file still yields a wrong-bathymetry error. I will test it with the other grid you mentioned next, just in case. Please let me know if you see any red flags in this configuration (which aims to convert the full domain).

rtofs_glo.t00z.f06.archv.a
netCDF
 000    'iexpt ' = experiment number x10 (000=from archive file)
   0    'yrflag' = days in year flag (0=360J16,1=366J16,2=366J01,3=actual)
4500    'idm   ' = longitudinal array size
3298    'jdm   ' = latitudinal  array size
  41    'kdm   ' = number of layers
  34.0  'thbase' = reference density (sigma units)
   0    'smooth' = smooth the layered fields (0=F,1=T)
   1    'iorign' = i-origin of plotted subregion
   1    'jorign' = j-origin of plotted subregion
   0    'idmp  ' = i-extent of plotted subregion (<=idm; 0 implies idm)
   0    'jdmp  ' = j-extent of plotted subregion (<=jdm; 0 implies jdm)
   3    'itype ' = interpolation type (0=sample,1=linear,2=parabolic,3=pchip)
  33    'kz    ' = number of depths to sample
   0.0  'z     ' = sample depth 1
  10.0  'z     ' = sample depth 2
  20.0  'z     ' = sample depth 3
  30.0  'z     ' = sample depth 4
  50.0  'z     ' = sample depth 5
  75.0  'z     ' = sample depth 6
 100.0  'z     ' = sample depth 7
 125.0  'z     ' = sample depth 8
 150.0  'z     ' = sample depth 9
 200.0  'z     ' = sample depth 10
 250.0  'z     ' = sample depth 11
 300.0  'z     ' = sample depth 12
 400.0  'z     ' = sample depth 13
 500.0  'z     ' = sample depth 14
 600.0  'z     ' = sample depth 15
 700.0  'z     ' = sample depth 16
 800.0  'z     ' = sample depth 17
 900.0  'z     ' = sample depth 18
1000.0  'z     ' = sample depth 19
1100.0  'z     ' = sample depth 20
1200.0  'z     ' = sample depth 21
1300.0  'z     ' = sample depth 22
1400.0  'z     ' = sample depth 23
1500.0  'z     ' = sample depth 24
1750.0  'z     ' = sample depth 25
2000.0  'z     ' = sample depth 26
2500.0  'z     ' = sample depth 27
3000.0  'z     ' = sample depth 28
3500.0  'z     ' = sample depth 29
4000.0  'z     ' = sample depth 30
4500.0  'z     ' = sample depth 31
5000.0  'z     ' = sample depth 32
5500.0  'z     ' = sample depth 33
   0    'botio ' = bathymetry I/O unit (0 no I/O)
   0    'mltio ' = mix.l.thk. I/O unit (0 no I/O)
   1.0  'tempml' = temperature jump across mixed-layer (degC, 0 no I/O)
   0.05 'densml' = density jump across mixed-layer (kg/m3, 0 no I/O)
   0    'infio ' = intf. depth I/O unit (0 no I/O, <0 label with layer #)
   0    'wviio ' = intf. veloc I/O unit (0 no I/O)
   0    'wvlio ' = w-velocity I/O unit (0 no I/O)
   0    'uvlio ' = u-velocity I/O unit (0 no I/O)
   0    'vvlio ' = v-velocity I/O unit (0 no I/O)
   0    'splio ' = speed I/O unit (0 no I/O)
  30    'temio ' = temperature I/O unit (0 no I/O)
   0    'salio ' = salinity I/O unit (0 no I/O)
   0    'tthio ' = density I/O unit (0 no I/O)
   0    'keio  ' = kinetic egy I/O unit (0 no I/O)

I'll update again when I can test out another grid (after I work on re-compiling).

Thank you again,

Connor

Alan Wallcraft

Sep 29, 2025, 8:19:24 PM
to HYCOM.org Forum, Connor Dibble, Alan Wallcraft
Here is an example of using hycom_archive_sea_ok:

narwhal05 776> hycom_archive_sea_ok 201_archv.2017_002_06.a /app/projects/hycom/GLBb0.08/topo/depth_GLBb0.08_09m11.a anom_09m11.a
 
ARCHIVE LAND/SEA is OK
 
Depth anomaly: min,max =   -0.00123918    0.00114035
narwhal05 777> ll anom*
-rw-r--r-- 1 wallcraf 0375G018 59375616 Sep 30 00:17 anom_09m11.a

If there is a mismatch, you will get more output.
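The underlying check is simple enough to reproduce in Python while recompiling is sorted out: land points in both the archive fields and the bathymetry hold 2.0**100 in the .a files, so the two land/sea masks can be compared directly. A self-contained sketch on synthetic arrays; with real files, the fields would be read following the .a layout, and the variable names here are invented:

```python
import numpy as np

HUGE = 2.0 ** 100  # HYCOM land/void value in .a files

def sea_masks_match(archive_field, bathy_field):
    """True if archive and bathymetry agree on which points are sea."""
    return bool(np.array_equal(archive_field < HUGE / 2,
                               bathy_field < HUGE / 2))

# Synthetic 3x4 demo: identical land pattern -> masks match
bathy = np.array([[HUGE, 5000.0, 4500.0, HUGE],
                  [HUGE, 5200.0, 4800.0, 100.0],
                  [HUGE, HUGE,   4900.0, 200.0]])
ssh = np.where(bathy < HUGE / 2, 0.1, HUGE)  # archive field, same mask
print(sea_masks_match(ssh, bathy))           # → True

bad = bathy.copy()
bad[0, 1] = HUGE                             # one extra land point
print(sea_masks_match(ssh, bad))             # → False
```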

Alan.

Connor Dibble

Oct 10, 2025, 4:29:03 PM
to HYCOM.org Forum, Alan Wallcraft, Connor Dibble
Hi Alan,

Sorry for the slow reply. I did get this working and I appreciate your help.

I've opened a pull request on the HYCOM-tools repository offering my containerization setup and an example script covering this particular use case, in case that's of interest.


Best,

Connor
