As I already mentioned, I'm using a Visual Basic program to process my GPX tracks.
Every now and then, a GPX file contains bad positions: all of a sudden one point is completely wrong, e.g. 2 km from where we actually were. This can also happen inside a building; there is nothing GPS E can do about that. I can then use GPS Track Editor to remove such bad points.
But I still have tracks that are not cleaned of such bad points. My VB program helps me here: such a bad point is usually visible because the calculated speed between two points becomes far too high to be true (e.g. I never drive faster than 135 km/h, so 200 km/h is impossible).
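To give an idea of the check (a minimal sketch, not my actual program; the coordinates, timestamps and module name are made-up examples): take two consecutive trackpoints, compute the distance and the elapsed time, and flag the pair when the implied speed is impossible.

Imports System

Module SpeedCheck
    ' Great-circle distance in metres between two lat/lon points (haversine).
    Function DistanceMetres(lat1 As Double, lon1 As Double,
                            lat2 As Double, lon2 As Double) As Double
        Const R As Double = 6371000.0 ' mean Earth radius in metres
        Dim dLat = (lat2 - lat1) * Math.PI / 180.0
        Dim dLon = (lon2 - lon1) * Math.PI / 180.0
        Dim a = Math.Sin(dLat / 2) ^ 2 +
                Math.Cos(lat1 * Math.PI / 180.0) * Math.Cos(lat2 * Math.PI / 180.0) *
                Math.Sin(dLon / 2) ^ 2
        Return 2.0 * R * Math.Asin(Math.Sqrt(a))
    End Function

    Sub Main()
        ' Two consecutive <TrkPt> samples, 2 seconds apart (example values).
        Dim t1 = DateTime.Parse("2021-06-01T10:00:00Z").ToUniversalTime()
        Dim t2 = DateTime.Parse("2021-06-01T10:00:02Z").ToUniversalTime()
        Dim d = DistanceMetres(50.85, 4.35, 50.8512, 4.35) ' about 133 m
        Dim kmh = d / (t2 - t1).TotalSeconds * 3.6
        ' I never drive faster than 135 km/h, so anything clearly above
        ' that marks a suspect point.
        If kmh > 135.0 Then
            Console.WriteLine($"Suspect point: {kmh:F1} km/h") ' about 240 km/h here
        Else
            Console.WriteLine($"Speed OK: {kmh:F1} km/h")
        End If
    End Sub
End Module
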
Below is an analysis of GPX data from a friend who never drives faster than 100 km/h with his motorhome. Here he is supposedly driving at 131.4 km/h.

Yet when I look at the points themselves, they are perfectly normal, all right on the highway.

After a long discussion with him, we concluded that the reason is this:
If one has trackpoints every 2 to 3 seconds, the fact that the timestamps contain no useful decimals (they are all .000) makes the calculated speed imprecise (e.g. a real 90 km/h might be displayed as 60 km/h or 130 km/h); see the worked example below.
If the time in <TrkPt> contained meaningful decimals instead of .000, that problem would go away.
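
Here is a small worked example of that rounding effect (the numbers are illustrative, not taken from his track): at a true 90 km/h with a fix every 1.5 s, the device logs 37.5 m between points, but the whole-second timestamps can end up either 1 s or 2 s apart depending on where the fixes fall within a second.

Imports System

Module TimestampError
    Sub Main()
        Dim trueSpeedKmh = 90.0
        Dim trueIntervalS = 1.5
        Dim distanceM = trueSpeedKmh / 3.6 * trueIntervalS ' 37.5 m really travelled
        ' With .000 timestamps, the logged interval is a whole number of
        ' seconds: either 1 s or 2 s for a true 1.5 s gap.
        For Each loggedIntervalS In {1.0, 2.0}
            Dim computedKmh = distanceM / loggedIntervalS * 3.6
            Console.WriteLine($"logged dt = {loggedIntervalS} s -> {computedKmh:F1} km/h")
        Next
        ' Prints 135.0 km/h and 67.5 km/h for a true 90 km/h: the same kind
        ' of spread as described above.
    End Sub
End Module

(For what it's worth, the GPX schema defines <time> as an xsd:dateTime, so fractional seconds like 2018-07-14T10:15:30.250Z are already allowed by the format; the device just has to write them.)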
(P.S. My recent tracks have 5-second intervals, where the problem is smaller. So this is surely not a bug report; GPS E would simply become even better.)
Kris Buelens,
--- VM/VSE consultant, Belgium ---