March 25th 08, 04:18 AM, posted to rec.radio.amateur.moderated
From: jimp@specsol.spam.sux.com
Subject: WPM to BPS calculation

Mark Kramer wrote:
In article ,
Phil Kane wrote:
Something must have changed (or been fixed) then - we made
measurements about three years ago and there was about six seconds
offset - an eternity for accurate time measurements. 340 nanoseconds
we can tolerate. Six seconds we can't.


It's changed. GPS and UTC now differ by 14 seconds, according to
http://tycho.usno.navy.mil/gpstt.html. This is because GPS time does
not include leap seconds.


If you read the whole thing, you find there are several differences
between the raw time and UTC.

This 14 second difference is part of the GPS broadcast, so can easily
be backed out of the GPS time data to produce UTC. Once corrected,
the UTC values have the stated accuracy.


All the offsets from UTC and their values are in the NAV message. Most
receivers do that adjustment automatically, as UTC is what most end users
want.
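The correction itself is just subtraction: GPS time runs ahead of UTC by
the integer leap-second count (delta-t-LS) carried in the NAV message.
A minimal sketch, assuming you already have the GPS week number,
time-of-week, and broadcast leap-second value (the function name and
sample values here are mine, not from any particular receiver):

```python
from datetime import datetime, timedelta

# GPS time began aligned with UTC at this epoch and does not
# insert leap seconds, so it drifts ahead of UTC over time.
GPS_EPOCH = datetime(1980, 1, 6)

def gps_to_utc(week, tow_seconds, delta_t_ls):
    """Convert GPS week/time-of-week to UTC by backing out the
    leap-second offset broadcast in the navigation message."""
    gps_time = GPS_EPOCH + timedelta(weeks=week, seconds=tow_seconds)
    return gps_time - timedelta(seconds=delta_t_ls)

# With the 14-second offset in effect at the time of this thread:
utc = gps_to_utc(week=1472, tow_seconds=100000, delta_t_ls=14)
print(utc)
```

Note that delta-t-LS changes whenever a leap second is added, which is
why it has to come from the broadcast data rather than being hard-coded.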

Now, if you have some receiver that outputs the raw uncorrected stuff
or a home brew receiver without the corrections...

That would be a case of RTFM.

Don't be confused by the latency of some GPS units in producing time/fix
products. I've seen them produce fixes several seconds late. That's why
the time is included in position data, so you know when you were there.
If you want time from your GPS, you need either the 1PPS pulse output or
a unit with a known and predictable period from real time to character
output. For many uses, simply assuming that the first character of the
output string (NMEA) occurs at the time in the message is adequate,
but that's not going to get you your 340ns accuracy.
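For the "first character of the NMEA string" approach, all you do is
pull the UTC time field out of the sentence and treat it as the moment
the sentence began arriving. A minimal sketch (the sample sentence and
its checksum are illustrative, not captured from a real receiver):

```python
def nmea_rmc_time(sentence):
    """Extract (hh, mm, ss) from a $GPRMC sentence. Only as accurate
    as the assumption that the sentence's first character arrives at
    the instant the time field describes -- fine for second-level
    timekeeping, nowhere near 340 ns."""
    fields = sentence.split(',')
    if not fields[0].endswith('RMC'):
        raise ValueError('not an RMC sentence')
    t = fields[1]  # UTC time as hhmmss, e.g. '041800'
    return t[0:2], t[2:4], t[4:6]

sample = '$GPRMC,041800,A,4916.45,N,12311.12,W,000.5,054.7,250308,,*1C'
print(nmea_rmc_time(sample))
```

Serial latency and the receiver's own processing delay both land on top
of that assumption, which is why serious timing work uses the 1PPS pulse
instead.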


Most cheap receivers are either optimized for position or time, not
both, so it pays to read the spec sheet carefully.

For example, I am using a Trimble Acutime to feed a home-brew time
daemon. Tests comparing system time from this daemon to NTP stratum 1
servers showed a difference of a few milliseconds. Good enough for me.


That's one that has been optimized for time, so a good choice for your
application. A bit of attention to detail could get you into the
microsecond range, though for most people that's not necessary.

--
Jim Pennino

Remove .spam.sux to reply.