As we create more and more flows for a diverse set of sensors, a common pattern we see is that data from different sensors arrives at very different rates. Because of this, we end up having to "cache" the values of the incoming JSON sensor data into a context (flow or global), and then create other flows, triggered by inject nodes, that gather the "latest" data values from the various sensors before performing their computation or other processing on the array of sensor data.
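To illustrate the pattern we use today, here is roughly what the two sides look like in function nodes. The "sensorData" context key and the use of msg.topic as the sensor identifier are just illustrative choices on our part, not anything built in:

```javascript
// Function node in each sensor flow: cache the latest JSON payload
// under an identifier taken from msg.topic.
// "sensorData" and msg.topic-as-identifier are illustrative names.
const cache = global.get("sensorData") || {};
cache[msg.topic] = msg.payload;        // store the whole JSON object as-is
global.set("sensorData", cache);
return msg;
```

```javascript
// Function node triggered by an inject node: gather the latest cached
// values from all sensors and pass them on as an array for processing.
const cache = global.get("sensorData") || {};
msg.payload = Object.values(cache);
return msg;
```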
I've been wondering whether there might be a good way to create a node, or some other Node-RED feature, that would let me flow the entire JSON object, along with an identifier, into a new "context node" that stores the whole object in either flow or global context, so that its properties can be referenced from anywhere else in my functions and flows. The reference would use a context reference, the identifier, and then the path to the property we want to read.
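To make the idea concrete, the retrieval side might look something like this inside a function node. The "sensorData" key and the "kitchen" identifier are hypothetical, but as far as I know dotted property paths already work with global.get()/flow.get():

```javascript
// Hypothetical read: context + identifier + path to the property we want.
// "sensorData" and "kitchen" are illustrative names, not an existing API.
const kitchenTemp = global.get("sensorData.kitchen.temperature");
const kitchen     = global.get("sensorData.kitchen");  // the whole cached JSON object
node.warn(`kitchen temperature: ${kitchenTemp}`);
return msg;
```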
This would allow us to automatically "cache" the entire JSON object from each sensor at whatever rate it arrives, and give our "processing" flows access to the latest values whenever they need them.
Before we start developing something like this ourselves, I wanted to open a discussion and see whether anyone else has a different pattern that works better, or whether this is something that might already be on the roadmap for Node-RED itself.
Thoughts?