I'll take a step back and describe the challenge in more detail:
I have a skin (Bootstrap) that shows live data in charts. One design goal is to show the most accurate and most up-to-date values in the front end. The charts are updated with every loop value (received by the MQTT-JS client in the browser). One challenge is to keep the charts in sync with the data in the backend: they run out of sync quickly for various reasons. A very common one is inactivity: the device running the browser is inactive, the browser tab showing the page is inactive, and so on. A mechanism in the web page therefore checks regularly for updated backend data and asynchronously reloads it and refreshes the charts. Such a check is done in two steps. The first step is to fetch a small JS file containing the timestamp of the latest backend data, and it is only done when new data can be expected (after the next archive_interval). If this timestamp is newer than the last known timestamp, the second step follows: fetching the new backend data and reloading the charts.
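The two-step check could be sketched roughly like this. The URLs, the shape of the timestamp file, and the `reloadCharts` callback are my assumptions for illustration, not the actual skin code:

```javascript
// Sketch of the two-step refresh. It assumes the backend publishes a tiny
// JSON file with the epoch timestamp of the latest archive record.
// 'ts.json', 'data.json' and the field name 'latestTs' are hypothetical.

// Pure helper: decide whether the backend has newer data than we know of.
function isNewer(latestTs, lastKnownTs) {
  return Number.isFinite(latestTs) && latestTs > lastKnownTs;
}

let lastKnownTs = 0;

async function checkForNewData(reloadCharts) {
  // Step 1: fetch only the small timestamp file.
  const resp = await fetch('ts.json', { cache: 'no-store' });
  const { latestTs } = await resp.json();

  if (!isNewer(latestTs, lastKnownTs)) return false;

  // Step 2: the timestamp is newer, so fetch the full backend data
  // and refresh the charts.
  const data = await (await fetch('data.json', { cache: 'no-store' })).json();
  lastKnownTs = latestTs;
  reloadCharts(data);
  return true;
}
```

The point of the split is that the step-1 file is cheap to fetch on every check, while the expensive step-2 download only happens when the timestamp actually changed.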
The idea was to inform the front end when new backend data has been uploaded, so it can fetch the latest backend data immediately.
But thinking a little bit further, this doesn't really fix or improve the issue I have with this mechanism, which is:
The new backend data is uploaded quite a while after the preceding archive_interval has finished. Depending on the number and complexity of the reports and the backend machine's capabilities, this can take a minute or two. In most cases this gap isn't an issue, but in some cases you miss or lose events and readings until the next archive_interval's backend data is available, for instance:
Imagine rain is pouring down like crazy and your gauge goes up every couple of seconds. At the top of the archive_interval, the live data is constantly updated by the incoming MQTT messages. After a minute or so, the backend has finished storing the new archive value, generated the reports, and uploaded the fresh backend data. The front end now fetches the backend data and reloads the charts, "losing" all loop data received in between. "Losing" in quotes, because this data will show up again, but only after the next archive_interval and its reports and uploads are done. Or imagine a very strong gust happening in this gap: it will disappear from the chart and reappear after the next archive_interval.
So all in all, this is not a big deal. But judged against the design goals above, it is an issue.
To resolve the issue, informing the front end isn't a good approach: the gap is still there. To really solve it, I need to keep all loop data with timestamps after the top of the newest archive interval in the front end, and re-apply it after syncing with the newest backend data.
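One way to sketch that merge: buffer incoming loop records in the browser, and when fresh backend data arrives, drop only the loop records the new archive already covers and re-append the rest to the chart series. The record shape and all names here are my assumptions, not the skin's actual data model:

```javascript
// Sketch: keep loop records newer than the top of the newest archive
// interval and re-apply them after a backend sync.
// The record shape ({ ts, value }) and function names are hypothetical.

const loopBuffer = [];

// Called for every incoming MQTT loop message.
function onLoopRecord(record) {
  loopBuffer.push(record);
}

// Called after fresh backend data has been loaded.
// series:    chart series as [ts, value] pairs from the backend data.
// archiveTs: timestamp of the newest archive record in that data.
function mergeLoopData(series, archiveTs) {
  // Drop buffered loop records already covered by the new archive data...
  const remaining = loopBuffer.filter((r) => r.ts > archiveTs);
  loopBuffer.length = 0;
  loopBuffer.push(...remaining);
  // ...and re-append the survivors, so the gap between the newest
  // archive record and "now" stays filled in the chart.
  return series.concat(remaining.map((r) => [r.ts, r.value]));
}
```

With this, the cloudburst or gust example above would survive the reload: the loop records after `archiveTs` stay visible instead of vanishing until the next archive_interval.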