In my app, I calculate the speed of the tablet (!) as follows -
1. It reads the latitude, longitude and system time and stores them
2. When the location changes, then
2a. It copies the previous reading into a second set of lat/long/time variables
2b. It stores the new reading in the original variables
3. It calculates the speed by
3a. It calculates the distance between the two readings as the hypotenuse of a right triangle whose sides are the difference in longitude and the difference in latitude, converting from degrees to nautical miles
3b. It calculates the time elapsed between the readings by subtracting the times
3c. It divides the distance by the elapsed time
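To make steps 3a-3c concrete, here is a minimal Python sketch of the calculation (the function name is illustrative, not my actual code; I have also shown the cos(latitude) scaling of the longitude difference, since a degree of longitude shrinks away from the equator):

```python
import math

def speed_knots(lat1, lon1, t1, lat2, lon2, t2):
    """Speed between two fixes, following steps 3a-3c.

    lat/lon are in degrees, t in seconds. One degree of latitude is
    about 60 nautical miles; the longitude difference is scaled by
    cos(latitude) so east-west degrees are not over-counted.
    """
    dlat = lat2 - lat1
    dlon = (lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    distance_nm = math.hypot(dlat, dlon) * 60.0   # step 3a: hypotenuse, degrees -> nm
    elapsed_h = (t2 - t1) / 3600.0                # step 3b: seconds -> hours
    return distance_nm / elapsed_h                # step 3c: nm / h = knots
```

For example, moving 1/60 of a degree due north (about 1 nm) in 60 seconds gives 60 knots.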
I calculate the heading as follows -
4a. It calculates the angle from atan2 of the differences in lat and long readings
4b. It uses the modulo function on the angle
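Steps 4a-4b can be sketched like this (again an illustrative Python sketch, not my exact code; atan2 is given the east component first so that 0 degrees is north, and the modulo folds the result into 0-360):

```python
import math

def heading_degrees(lat1, lon1, lat2, lon2):
    """Course over ground from two fixes, following steps 4a-4b.

    atan2(east, north) gives 0 deg = north, 90 deg = east;
    modulo 360 maps negative angles into the 0..360 range.
    """
    dlat = lat2 - lat1
    dlon = (lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    angle = math.degrees(math.atan2(dlon, dlat))  # step 4a
    return angle % 360.0                          # step 4b
```

So a fix due east of the previous one gives 90, due south gives 180, and due west gives 270.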
These seem to give generally sensible results, but the values jump around a lot. When I am stationary, the speed never falls to zero; it jumps around by up to say 50 knots, and the track wanders all around the compass, as if the data from the sensor were noisy. My commercial GPS apps give smooth results yet still respond quickly to changes in speed, so it's not as if they simply wait a long time between calculations to improve the result.
The sensor parameters I have tried are distanceinterval = 2 and 5, and timeinterval = 500 and 2000.
Please can anyone suggest why this happens, and how to fix it?
Thanks.