GIRO update and status for HamSCI [from Ivan Galkin / UM Lowell]

Phil Erickson

May 21, 2024, 6:35:17 PM
to Unknown
Hi all,

  At Ivan Galkin's request, I'm passing along this brief text to the HamSCI community on the status of the GIRO ionosonde system (and other services) at UMass Lowell, along with some information on ionosonde data availability.

73
Phil W1PJE

From Ivan:

Greetings to HamSCI,

I would like to take this opportunity to brief you on the status of the GIRO repositories of ionogram data, https://giro.uml.edu, and our path forward.

As you may be aware, GIRO has been in uninterrupted operation since 2003, acquiring data from contributing ionosonde observatories (in ~30 countries) for immediate release to the public. The Digital Ionogram Database (DIDBase) in Lowell holds more than 42 million Digisonde ionogram-plus-scaling measurements and another 14 million autoscaled records from non-Digisonde observatories. Our real-time coverage peaked about 3 years ago, with 63 ionosondes reporting prompt weather data to us.

For years, we immediately forwarded most of the acquired SAO records to Boulder, to what was originally known as "WDC-A for STP", then NOAA NGDC, then NCEI. We are currently migrating this real-time dissemination to AFRL's Unified Data Library (UDL), while the NCEI ionosonde service is reportedly being decommissioned.

Regretfully, operating the Lowell GIRO Data Center (LGDC) remains largely a pro bono effort at UML. We received seed money from NRL in 2003, and over the years we accepted a number of computer hardware donations. The most significant donations and funding came from LDI (Lowell Digisonde International). A most memorable donation came from Terry Bullett, who arrived bearing a tall stack of high-rpm SCSI drives, a cosmic 32 GB each, for the very first version of DIDBase. We are also very grateful to AFRL for sending us boxes of CD-ROMs holding Digisonde data copied from the original QIC tapes, which was a great data rescue. We had a full-time data engineer working to ingest those CDs, one by one, into DIDBase. (That ingestion is still not quite complete, and the effort has long since stalled.)

However, for over two decades and after many tries, we have not succeeded in drawing enough interest from federal agencies to secure a steady flow of [even symbolic] maintenance funding to sustain us.

The lack of funds has not stopped us before, but it certainly prevents us from consistently expanding our services, especially those for the Ham Radio community. The important task of presenting our data optimally to you has often been taken over by other contributors. In our impression, the existing online GIRO data products remain a not-so-visible service to the HamSCI world. We would certainly like to work on this further.

Our web portal at giro.uml.edu includes some potentially useful "ionospheric weather" resources: MUF computation for various distances, local 24-hour radio weather charts, and HF signal raytracers (including "RayTRIX CQP", a GPU-driven oblique ionogram synthesizer). These services use IRTAM, a real-time assimilative IRI model driven by GIRO ionosondes. In our immediate plans: assimilation of radio occultation profile data from the UCAR/COSMIC projects, and D-region absorption estimates obtained by comparing O- and X-echo strengths in ionograms (based on GPSII and MSIS). More about our science [this week]: the annual INAG meeting takes place during the URSI AT-RASC conference in Gran Canaria, Spain, on Wednesday the 22nd, and we will try to arrange a video feed.
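
[Editor's note: for a feel of what a MUF tool computes, here is a minimal back-of-envelope sketch in Python using the classical secant law for a single-hop path over a flat Earth. The reflection height hm_km is an assumed free parameter; the actual GIRO service uses far more careful geometry and assimilated IRTAM profiles, and nothing below reflects its real implementation.]

    import math

    def muf_single_hop(fof2_mhz, distance_km, hm_km=300.0):
        """Rough single-hop MUF via the secant law: MUF = foF2 * sec(theta).

        theta is the angle of incidence at the midpoint reflection,
        assuming a flat Earth and a mirror-like reflection at hm_km.
        """
        theta = math.atan((distance_km / 2.0) / hm_km)
        return fof2_mhz / math.cos(theta)

    # Example: foF2 = 7 MHz over a 1500 km hop reflecting near 300 km
    print(f"MUF ~ {muf_single_hop(7.0, 1500.0):.1f} MHz")  # roughly 19 MHz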

Our grand plan is to move LGDC to cloud storage to avoid the technical nuisances of running the center "on-premises". This will carry a significant recurring cost that we cannot accommodate at this point, so LGDC will retain its existing, rather aged and somewhat overlooked setup until a reliable arrangement can be secured. As a test, we placed one of our 8 science databases, the smallest of all, on a managed MariaDB instance at AWS and observed an expense of over $3k/yr. This does not scale well to the 11 TB DIDBase alone, and we hold close to 25 TB of data (and growing). Our attempts to set up a managed service at UML using a "science data preservation" argument have not come to fruition, at least not yet.
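
[Editor's note: to make the scaling concern concrete, here is a hedged back-of-envelope sketch in Python. The size of the test database is not stated in the letter, so the 0.5 TB figure below is purely a hypothetical placeholder; real managed-database pricing also depends on instance class, IOPS, and egress, not just stored volume.]

    # Back-of-envelope cloud cost scaling, assuming cost grows roughly
    # linearly with stored volume. The observed cost is from the letter;
    # the test database size is a HYPOTHETICAL placeholder.
    observed_cost_per_yr = 3000.0  # USD/yr, smallest of the 8 databases (from the letter)
    assumed_test_size_tb = 0.5     # TB, hypothetical; not stated in the letter

    cost_per_tb_yr = observed_cost_per_yr / assumed_test_size_tb
    for size_tb, label in [(11.0, "DIDBase alone"), (25.0, "full LGDC holdings")]:
        print(f"{label}: ~${cost_per_tb_yr * size_tb:,.0f}/yr")

[Even with generous error bars on the assumed size, the linear extrapolation lands in the tens of thousands of dollars per year, consistent with Ivan's point that the AWS test does not scale.]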

There are good Samaritans who are considering taking LGDC under their wing; Phil Erickson of MIT Haystack is one of our dear friends and sympathizers. We do hope for a better future within the next 2-4 years.

--- Ivan Galkin
--
Phil Erickson
phil.e...@gmail.com