Re: DC Rainmaker: Garmin’s Biggest Competitor Is Their Own Software Instability - Ouch! [trail]
trail wrote:
helo guy wrote:
GPS coordinates are just lat/long and timestamps, yet somehow they could not get the basic "speed = distance/time" calculation right. Even a high-school-level CS student should be able to manage that.


That's the high school way of doing GPS velocity. :) Doppler and/or carrier phase shift calculations are generally much more accurate for measuring instantaneous velocity. And that's how most GPS receivers do it. There's one set of algorithms to do the least-squares stuff on pseudoranges to calculate positions, and another set doing carrier phase stuff to calculate velocity.

But I agree that Garmin (who know a thing or two about GPS) should be able to detect poor velocity measurement conditions and use some simple modelling techniques to estimate what velocity might be. And then use post-processing once the satellite signals are regained to improve those estimates given the new information.
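For anyone following along, the Doppler velocity solve trail is describing boils down to a small least-squares problem. This is only a rough sketch (the variable names and layout are my own, not anything from real receiver firmware, and it skips plenty of real-world detail), but it shows the idea: each satellite's Doppler-derived range rate constrains the receiver velocity along that line of sight, and four or more satellites give you an overdetermined system.

import numpy as np

def velocity_from_doppler(unit_vectors, range_rates, sat_velocities):
    # unit_vectors:   (N, 3) line-of-sight unit vectors, receiver -> satellite
    # range_rates:    (N,)   range rates in m/s derived from the Doppler shifts
    # sat_velocities: (N, 3) satellite velocities in the same frame
    # Model per satellite: rdot_i = (v_sat_i - v_rx) . u_i + clock_drift
    u = np.asarray(unit_vectors, dtype=float)
    # Move the known satellite-velocity contribution to the measurement side.
    y = np.asarray(range_rates, dtype=float) - np.sum(np.asarray(sat_velocities) * u, axis=1)
    # Unknowns: the 3 receiver velocity components plus the receiver clock drift.
    A = np.hstack([-u, np.ones((u.shape[0], 1))])
    sol, *_ = np.linalg.lstsq(A, y, rcond=None)
    return sol[:3], sol[3]   # receiver velocity vector, clock drift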


I am not expecting any advanced signal processing, nor do they need any modeling techniques in the situation I described. Garmin has no data to work with while the watch is in a tunnel or under a bridge, BUT they do know the last good GPS fix and the first fix after exiting the tunnel. They don't know exactly how my speed varied while I was in the tunnel, but they COULD calculate my average speed as soon as I exit and get another fix. I would be perfectly content with that.
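Something like this is all I'm asking for. A rough sketch (the fix format and names are made up, and a straight-line distance will under-read if the tunnel curves, but you get the idea):

from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters between two lat/long fixes.
    r = 6371000.0  # mean Earth radius in meters
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(a))

def average_speed_through_gap(last_good_fix, first_new_fix):
    # Each fix is (lat, lon, unix_time_s). Returns the average speed in m/s
    # across the outage, or None if the timestamps are unusable.
    lat1, lon1, t1 = last_good_fix
    lat2, lon2, t2 = first_new_fix
    dt = t2 - t1
    if dt <= 0:
        return None
    return haversine_m(lat1, lon1, lat2, lon2) / dt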

But that is not how my watch worked. I have no idea what calculations they use, as I have never seen the source code, but it seems to be some sort of moving average that assumes a fixed time between points. I always got a massive jump in speed as soon as I got GPS contact back, which then drifted back down to my actual speed. I'm talking about going from 10 min miles to 5 min miles on exiting an underpass, then ramping back up to my actual pace. No way would any of the techniques you described consistently behave like that if they were properly implemented. And GPS fixes include uncertainty values, so they should always be able to detect this situation.
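Purely as a guess at what could produce that shape (again, I have not seen their code, so this is a hypothetical reconstruction), a moving average fed by per-sample speeds that assume a fixed interval between fixes behaves exactly like what I see:

from collections import deque

def naive_speed_trace(fixes, assumed_dt=1.0, window=5):
    # Hypothetical reconstruction of the failure mode, not Garmin's code.
    # fixes = [(dist_from_prev_fix_m, timestamp_s), ...]
    buf = deque(maxlen=window)
    trace = []
    for dist_m, _t in fixes[1:]:
        # Bug being illustrated: divide by an assumed sample interval instead
        # of the real time since the previous fix, so a 60 s outage shows up
        # as one huge "1 s" segment...
        buf.append(dist_m / assumed_dt)
        # ...and the moving average spikes, then ramps back down over
        # `window` samples, which is exactly the shape I see on the watch.
        trace.append(sum(buf) / len(buf))
    return trace

Dividing by the real timestamp difference, or checking the fix's reported uncertainty and the gap length, would catch this immediately.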

I seriously doubt they are doing any advanced signal processing even when they do have a solid GPS connection. Speed readings tend to be much too erratic, and this is an issue that I've heard repeated by many Garmin users. Even a very simple smoothing algorithm would fix that. I see somewhat similar behavior with gradient calculations, but I cut Garmin more slack as altitude measurements are much less precise.
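And by "very simple smoothing" I really do mean simple. An exponential moving average is one line of state (the 0.3 here is an arbitrary example value, not a tuned constant):

def smooth_speeds(raw_speeds, alpha=0.3):
    # Exponential moving average: each output blends the new reading with
    # the previous smoothed value, which knocks out sample-to-sample jitter
    # with very little lag and essentially no memory or CPU cost.
    smoothed = []
    s = raw_speeds[0]
    for v in raw_speeds:
        s = alpha * v + (1 - alpha) * s
        smoothed.append(s)
    return smoothed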
Last edited by: helo guy: Jun 11, 19 20:12