In article ,
Michael Coslo wrote:
| Odd, the old definition of Morse Code didn't use the terms "0" and "1",
| nor "mark" and "space." All the timing was in terms of the length of a
| single "dit."
|
| As far as I know, Morse code did not "become digital" until some people
| wanted to make it look as if it was more advanced than it is. Until then
| it was as you describe.
You're confusing your terms now. Digital and binary are related, but
not the same.
This definition of `digital data'
http://www.answers.com/main/ntquery?...87736&method=6
definitely fits Morse code, both at the `on'/`off' level and at
the `dit'/`dah'/`short space'/`medium space'/`long space' level.
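To make that concrete, here is a rough sketch in Python (my own
illustration, nothing official), using the conventional timing ratios:
1 unit of key-down for a dit, 3 for a dah, 1 unit of silence between
elements, 3 between characters and 7 between words. The MORSE table
and the keying() function are just a small made-up sample for the
example:

# Symbol level: each element is (key_down, duration in dit units).
DIT        = (True, 1)    # key down for 1 unit
DAH        = (True, 3)    # key down for 3 units
EL_SPACE   = 1            # units of key-up between elements
CHAR_SPACE = 3            # units of key-up between characters
WORD_SPACE = 7            # units of key-up between words

# Only a tiny sample table, not the whole alphabet.
MORSE = {'A': [DIT, DAH], 'D': [DAH, DIT, DIT], '5': [DIT] * 5}

def keying(text):
    """Flatten text down to the on/off level: one boolean per dit of time."""
    stream = []
    for w, word in enumerate(text.upper().split()):
        if w:
            stream += [False] * WORD_SPACE
        for c, ch in enumerate(word):
            if c:
                stream += [False] * CHAR_SPACE
            for i, (down, units) in enumerate(MORSE[ch]):
                if i:
                    stream += [False] * EL_SPACE
                stream += [down] * units
    return stream

# 'A' comes out as 10111: dit, element space, dah.
print(''.join('1' if on else '0' for on in keying("A D5")))

Same information either way, just viewed at the symbol level or at the
raw keying level.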
Digital also means `having to do with fingers', and Morse code is
usually sent using one's fingers ...
Of course, the terms digital and binary didn't really become commonly
used until computers did. Considering that Morse code was born in the
1830s, it's not surprising that people weren't thinking of it in
computer terms for a while.
Ultimately, terms are getting confused all over the place. The terms
CW and Morse code are used interchangeably, when it's really more
accurate to say that CW is to Morse code as RTTY is to Baudot: CW
names the transmission mode (an on/off keyed carrier), while Morse
code names the code being keyed, just as RTTY names the mode and
Baudot the character code. But in the end it doesn't matter, because
however you define it, people use it to talk to each other and it
works.
--
Doug McLaren,
, AD5RH
This is a test of the emergency .signature program. This is only a test.