That's a very useful explanation.
But, not surprisingly, I have another question. While working with my Nord2 5G and GPSTest, the app reported that the receiver used 34 GNSS satellites and achieved an estimated accuracy of 1.2 meters, which I considered excellent. This, I understand, means that 68% of the time the displayed value is within 1.2 m of being correct.
What I wonder is this: suppose I use this device with averaging software, for example Locus GIS. Suppose I let it average for one minute, or 5 minutes, or 15 minutes, or 30 minutes, or an hour. How accurate will the resulting value be?
I understand this comes down to some statistics that I would at least be able to appreciate, even if not immediately understand. Intuitively, I expect that the longer the device averages, the more accurate the resulting measurement will be.
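To make my intuition concrete: if the individual fixes were statistically independent, each with a standard deviation of about σ = 1.2 m, I would expect the averaged position to have a standard error of roughly σ/√N, where N is the number of fixes collected. At one fix per second, one minute of averaging would give N = 60 and about 1.2/√60 ≈ 0.15 m, and an hour would give about 0.02 m. That seems too good to be true, so I suspect the fix-to-fix errors are correlated and the real improvement is slower. (These numbers are just my own back-of-the-envelope guess, not something GPSTest or Locus GIS reports.)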
I also realize that this is not especially useful for someone relying on a dynamic location, say a car driving around. But once that car has stopped, if the averaging is stopped and reset, the averaged value ought to get more accurate over time.
In fact, I expect that a graph can be drawn, with averaging time on the x-axis and estimated inaccuracy on the y-axis, showing how much the averaged value improves. If this sounds familiar to you and you can point me to such a graph, I would very much appreciate it.
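Just to show the kind of graph I mean, here is a rough simulation sketch. Everything in it is assumed: the one-dimensional AR(1) (Gauss-Markov) error model, the 60-second correlation time, the one-fix-per-second rate, and the 1.2 m sigma are my own guesses, not anything the phone or the apps actually report.

```python
# A toy simulation of the graph I have in mind: how the error of the
# averaged position shrinks with averaging time. Everything here is an
# assumption of mine (one horizontal coordinate only, an AR(1)/Gauss-Markov
# error model, a 60 s correlation time, one fix per second, 1.2 m sigma);
# none of it comes from the phone, GPSTest, or Locus GIS.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
fix_rate_hz = 1.0     # assumed: one fix per second
duration_s = 3600     # simulate one hour of averaging
sigma_m = 1.2         # assumed 1-sigma error of a single fix
tau_s = 60.0          # assumed correlation time of the fix errors
n_runs = 200          # number of simulated sessions to average over

n = int(duration_s * fix_rate_hz)
phi = np.exp(-1.0 / (tau_s * fix_rate_hz))        # AR(1) coefficient
t_min = np.arange(1, n + 1) / fix_rate_hz / 60.0  # elapsed time in minutes

sq_err = np.zeros(n)
for _ in range(n_runs):
    # correlated per-fix errors: err[k] = phi * err[k-1] + white noise
    noise = rng.normal(0.0, sigma_m * np.sqrt(1.0 - phi**2), size=n)
    err = np.empty(n)
    err[0] = rng.normal(0.0, sigma_m)
    for k in range(1, n):
        err[k] = phi * err[k - 1] + noise[k]
    # error of the running average after k fixes
    running_mean = np.cumsum(err) / np.arange(1, n + 1)
    sq_err += running_mean**2

rms_m = np.sqrt(sq_err / n_runs)   # RMS error of the averaged position

plt.plot(t_min, rms_m)
plt.xlabel("averaging time (minutes)")
plt.ylabel("RMS error of averaged position (m)")
plt.title("Toy model: accuracy of an averaged GNSS position vs. time")
plt.show()
```

Under those assumptions the averaged value does keep improving with time, but much more slowly than the independent-fixes 1/√N rule would suggest, which is part of what I would like to have confirmed or corrected.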