December 30th 03, 05:42 PM
Dale Parfitt


"Richard Hosking" wrote in message
u...
Wolfgang,
Thanks for your reply. I am not very sophisticated mathematically and I have not studied communication theory. Your article is most helpful. However, it looks as though it might be somewhat hard to implement in the small, slow code space of a microcontroller. (It may well be impossible to do this job in such a device.) Still, I thought I would try.
Off the top of my head, I thought I would start with a hard-limited audio signal, with transitions at digital high/low levels, rather than a sampled signal. This would obviously limit the signal-to-noise ratio, but I was not planning to use this for weak-signal work. The clock/demodulation algorithm would then deal with the timing between transitions of the signal, rather than a spectral analysis. This would be relatively simple to implement with a timer to count the interval between transitions. The controller could track the carrier frequency with a frequency-locked-loop algorithm over a limited range (say 600-900 Hz). If there are no transitions, or only randomly timed transitions (noise), the controller assumes a space. Similarly, the controller tracks sending speed by timing the length of dots and dashes and adjusting the algorithm to follow it. This length could be acquired initially by timing a series of dots/dashes. The lengths would presumably fall into two groups, with the shorter group being dots. The average of these shorter "on" periods would be the bit rate. The controller could keep a moving average of the signal, discarding any "on" periods longer than, say, 2.0 times the current average. Finally, I would use the bit rate to time the gaps, treating a gap longer than (say) 2.5 times the bit rate as a character boundary, and decode the characters with a simple lookup table. (A rough sketch of this scheme appears below.)
No doubt there are lots of problems with this approach! I would be interested in your comments.

Richard
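
[A minimal C sketch of the scheme Richard describes, assuming a hard-limited audio signal on an edge-triggered input and a free-running 1 MHz timer. The 2.0x outlier cut, the 2.5x character gap, and the 600-900 Hz window come from his post; every name, the timer rate, and the moving-average weighting are invented for illustration.]

#include <stdint.h>
#include <stdio.h>

/* With a 1 MHz tick, a hard-limited 600-900 Hz tone crosses zero twice
 * per cycle, so successive transitions arrive 556-833 us apart. Edges
 * outside that window are treated as noise, i.e. a space. */
static int tone_present(uint16_t edge_ticks)
{
    return edge_ticks >= 556 && edge_ticks <= 833;
}

static uint16_t avg_dot;     /* moving average of the short "on" periods */
static uint16_t pattern = 1; /* element history under a leading 1-bit sentinel */

/* Tiny lookup: elements shifted in under the sentinel, dot = 0, dash = 1;
 * only a few letters shown here. */
static char lookup(uint16_t p)
{
    switch (p) {
    case 0x2: return 'E';  /* .   */
    case 0x3: return 'T';  /* -   */
    case 0x5: return 'A';  /* .-  */
    case 0x6: return 'N';  /* -.  */
    case 0x8: return 'S';  /* ... */
    default:  return '?';
    }
}

/* Called once per keying element with the key-down time and the key-up
 * gap that follows it, both in timer ticks. */
static void element(uint32_t on_ticks, uint32_t off_ticks)
{
    if (avg_dot == 0)                     /* crude stand-in for the initial */
        avg_dot = (uint16_t)on_ticks;     /* acquisition Richard mentions   */

    if (on_ticks <= 2u * avg_dot) {
        /* within 2.0x of the running average: a dot, folded into the
         * average so the decoder tracks the sender's speed */
        avg_dot = (uint16_t)((avg_dot * 3u + on_ticks) / 4u);
        pattern = (uint16_t)(pattern << 1);            /* dot = 0 */
    } else {
        pattern = (uint16_t)((pattern << 1) | 1u);     /* dash = 1, not averaged */
    }

    if (2u * off_ticks >= 5u * (uint32_t)avg_dot) {
        putchar(lookup(pattern));   /* gap > 2.5 dot lengths ends the character */
        pattern = 1;
    }
}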

Many years back, I built a TTL decoder from a QST article, by Petway?? Anyway, he used a number of storage registers: one for average dot length, one for average character-space length, and one for average word-space length. After initialization, a clock would run at the receipt of each element. The count was then compared to the previous average; if the clock count was larger, the average was incremented by 1, and if shorter, it was decremented by 1.
The machine did a surprisingly good job on hand-sent Morse. Although the hardware technology is antiquated by today's standards, the algorithm was elegant and easily understood. I would estimate this article (a 2- or 3-parter) appeared in QST in the early 70s.
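
[A sketch of that update rule in C; the three storage registers and the plus/minus-one step are as Dale describes, while the names are invented.]

#include <stdint.h>

static uint16_t avg_dot, avg_char_space, avg_word_space; /* storage registers */

/* Nudge a stored average one count toward the newest clocked element. */
static void track(uint16_t *avg, uint16_t count)
{
    if (count > *avg)
        (*avg)++;
    else if (count < *avg)
        (*avg)--;
}

[After each received element one would call, say, track(&avg_dot, clock_count). Because each average can move only one count per element, it follows the slow drift of hand-sent speed while a single erratic element barely disturbs it, which may be why the machine coped so well with hand-sent Morse.]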

GL,

Dale W4OP