Posted March 23rd 08, 04:03 PM to rec.radio.amateur.moderated
From: Paul W. Schleck, K3FU
Subject: WPM to BPS calculation

Klystron writes:

>  wrote:
>
>> [...]
>> So if the word PARIS is sent 50 times in 1 minute, that minute is
>> divided into 2500 dit times. Which is 41.66 bps.
>> [...]
>
> It still seems like an awfully slow data rate. I have seen people
> throw 14400 Baud modems in the garbage because they considered them
> to be so slow as to be worthless. A data rate of 42 bps is about 3
> orders of magnitude slower than that.
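
Just to check the arithmetic being quoted: with the standard Morse
timing weights (dit = 1 unit, dah = 3 units, 1 unit between elements,
3 units between letters, 7 units between words), the word PARIS plus
its trailing word space comes to exactly 50 dit units, so 50 WPM is
50 x 50 = 2500 dit times per minute, or about 41.7 "bits" per second
if you count each dit time as one bit. A minimal Python sketch of
that calculation, using the conventional letter table and weights:

# Dit-unit accounting for the standard word "PARIS".
# Timing: dit = 1, dah = 3, 1 unit between elements,
# 3 units between letters, 7 units between words.

MORSE = {"P": ".--.", "A": ".-", "R": ".-.", "I": "..", "S": "..."}

def word_units(word):
    """Dit units for one word, including the trailing word gap."""
    total = 0
    for i, letter in enumerate(word):
        code = MORSE[letter]
        total += sum(1 if el == "." else 3 for el in code)  # elements
        total += len(code) - 1                 # gaps within a letter
        total += 3 if i < len(word) - 1 else 7 # letter gap or word gap
    return total

units = word_units("PARIS")            # 50 dit units
wpm = 50
dits_per_minute = wpm * units          # 2500
bps = dits_per_minute / 60.0           # ~41.67
print(units, dits_per_minute, round(bps, 2))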


Communications systems vary over many orders of magnitude in
information rate, yet are still considered useful and up-to-date.

For example, the Casio WaveCeptor on my wrist:

http://www.eham.net/reviews/detail/2497

receives a ~1 Baud Pulse Position Modulated (PPM) signal from radio
station WWVB in Fort Collins, Colorado, which transmits on 60 kHz. It
takes about a minute to send the complete time code to synchronize my
watch. Slow? Yes. Useful? Yes, very much so, especially when
considering the coverage and reliability that can be obtained from such
a low-bandwidth, groundwave-propagated, Very Low Frequency (VLF) signal.
The watch only needs to receive the time code at most once per day,
which it does automatically in the early hours of the morning while
sitting on my desk or dresser. A faster data rate would require something other
than a VLF signal, and would not improve much on the quality or
usability of the communications. It would definitely increase the
price. Witness the much greater success in the marketplace of
WWVB-based watches versus more advanced, higher bandwidth, but much more
expensive, "Smart Personal Object Technology" (SPOT) watches:

http://www.spotstop.com
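
To put that watch's data rate in perspective, here is a rough
back-of-the-envelope sketch; the 60-bit frame is my round-number
assumption for a one-bit-per-second time code that takes about a
minute to send, and the comparison rates are just the ones that come
up in this thread:

# How long a small, once-a-day payload takes at various rates.
# The 60-bit frame is an illustrative assumption, not the actual
# WWVB time-code specification.

frame_bits = 60
rates_bps = {
    "~1 baud time code": 1,
    "1200 baud AFSK packet": 1200,
    "14400 bps modem": 14400,
}

for name, rate in rates_bps.items():
    print(f"{name:22s} {frame_bits / rate:9.3f} s for {frame_bits} bits")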

One of the most current and widely used communications technologies
among young people is cellular telephone text messaging:

http://en.wikipedia.org/wiki/Text_messaging

(sometimes also called "Short Messaging System" or SMS)

According to this recent demonstration on the Tonight Show with Jay
Leno:

http://www.youtube.com/watch?v=AhsSgcsTMd4

the realizable data rates are comparable, in order of magnitude, to
those of fast Morse code as sent and received by human operators.
Just try telling a teenager with an SMS-capable cellular telephone that
it should be thrown in the trash because it isn't fast enough, or isn't
of sufficiently novel technology, and see his or her reaction.
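
If you want to put rough numbers on "comparable in order of
magnitude," here is a hypothetical sketch; the 200 characters per
minute for a fast thumb-typist and the 40 WPM for a fast CW operator
are assumptions chosen for illustration, not measurements from the
Leno demonstration:

# Order-of-magnitude comparison of human text-entry rates.
# Both input figures are illustrative assumptions.

CHARS_PER_WORD = 5     # the conventional "standard word"
BITS_PER_CHAR = 8      # plain 8-bit characters as a common yardstick

def cpm_to_bps(chars_per_minute):
    return chars_per_minute * BITS_PER_CHAR / 60.0

sms_cpm = 200                        # assumed fast texter
morse_cpm = 40 * CHARS_PER_WORD      # assumed 40 WPM CW operator

print(f"SMS:   ~{sms_cpm} chars/min -> ~{cpm_to_bps(sms_cpm):.0f} bps")
print(f"Morse: ~{morse_cpm} chars/min -> ~{cpm_to_bps(morse_cpm):.0f} bps")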

To give you an amateur radio example, the Automatic Position Reporting
System (APRS):

http://www.aprs.org

uses 1200 Baud AFSK packet. Faster, but still an order of magnitude
slower than technologies you imply should be thrown out. Since APRS
reports important, but compact, telemetry at periodic intervals, the
technology meets the requirements of many users utilizing VHF radios and
Terminal Node Controllers (TNCs). Again, substituting much higher data
rates would not really improve the technology or better meet the
requirements of the users it serves.
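
To make "compact telemetry at periodic intervals" concrete, here is a
rough sketch; the 100-byte frame is an assumed size for a typical
position report, and keyup delay, flags, and bit-stuffing are
ignored:

# Approximate air time of a small APRS-style report at 1200 baud.
# The 100-byte frame size is an illustrative assumption.

frame_bytes = 100
bps = 1200                            # 1200 baud AFSK, one bit per symbol
air_time_s = frame_bytes * 8 / bps
duty_cycle = air_time_s * 60 / 3600   # at one report per minute

print(f"~{air_time_s:.2f} s per report, "
      f"~{duty_cycle:.1%} channel time at one report per minute")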

To even give you a Morse code example, consider the simplicity and
effectiveness of the NCDXF beacons running on the HF bands:

http://www.ncdxf.org/beacons.html

A relatively low data rate On-Off Keyed (OOK) Morse Code signal is able
to quickly convey to the listener the quality of the communications
link, and the required link budget, to various points around the globe.
All that needs to be transmitted is a station identification and the
same symbol (in this case the letter "V") sent at 10 dB power steps from
100 Watts down to 0.1 Watt. Complex modulation/demodulation equipment
chasing "orders of magnitude" faster data rates would produce signals
too wide to fit on the HF bands, and would not seem to offer much
improvement in the quality of the service anyway.
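
The "10 dB power steps" are easy to see with a line of arithmetic,
taking the 100 Watt level as the reference:

# The beacon power steps, expressed in dB relative to 100 W.
import math

for watts in (100, 10, 1, 0.1):
    db = 10 * math.log10(watts / 100.0)
    print(f"{watts:6.1f} W -> {db:+4.0f} dB relative to 100 W")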

I suppose one could implement a beacon network using something like
PSK31:

http://www.psk31.com/

which might even be able to demonstrate realizable communications link
budgets below 0.1 Watt. But even that advanced digital mode would only
have data rates comparable to Morse code. Though the NCDXF beacon
network is a Morse code service, note that Morse code knowledge is
really not necessary to utilize it effectively. A synchronized time
base and a chart of which station transmits at which time would enable
very fast determination of the link budget to the beacon locations. If
you can't remember what a "V" sounds like in Morse code ("...-", like
the intro to Beethoven's Fifth Symphony), I suppose you could put that
on the chart as well. After all, similar charts are how pilots usually
decode the Morse code identifications of aeronautical beacons.
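
To show how little the "synchronized time base plus a chart" approach
actually demands, here is a hypothetical sketch of the lookup. The
10-second slots, the 3-minute rotation, and the placeholder station
names are assumptions for illustration only; the real schedule is on
the NCDXF site linked above:

# Hypothetical beacon-slot lookup from the current UTC time.
# Slot length, cycle length, and station names are illustrative
# placeholders, not the published NCDXF schedule.
from datetime import datetime, timezone

SLOT_SECONDS = 10
STATIONS = [f"Beacon {i + 1}" for i in range(18)]   # placeholder names
CYCLE_SECONDS = SLOT_SECONDS * len(STATIONS)        # 180 s cycle

def active_station(now=None):
    now = now or datetime.now(timezone.utc)
    seconds = (now.minute * 60 + now.second) % CYCLE_SECONDS
    return STATIONS[seconds // SLOT_SECONDS]

print("Transmitting now (on a given band):", active_station())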

There are even a number of excellent software packages linked from the
NCDXF site above that could automatically monitor the signals, decode
the Morse, and record the quality of the communications paths over time.
One such package is Faros:

http://www.dxatlas.com/Faros/

one of many advanced signal-processing software packages for amateur
radio that exploit the ubiquity of inexpensive personal computers
with sound cards in most home ham "shacks."

Focusing simply on information rate disregards other aspects of the
communications and the channel over which it is transmitted. These
important aspects include the bandwidth and propagation characteristics
of the available channel, the complexity of the required transmitting
and receiving equipment, the amount of data that needs to be
transmitted, and how quickly and often it needs to be conveyed.

Single-attribute measuring contests may be fun, even ego-boosting to
some, but are really not very useful or impressive to those who actually
design and use practical communications systems.

> It just seems inconsistent with the way that so many hams have
> fought tooth and nail to hold onto Morse and to hinder the move
> toward digital modes.


I'm not sure that I understand your line of reasoning here. You are
implying cause-and-effect. In other words, use and advocacy of Morse
code somehow directly contributed to the obstruction of other
technologies. Can you give direct evidence of specific examples? If
you are implying that licensing requirements obstructed the development
of advanced digital modes, that really doesn't appear to be the case.
Witness the success of Tucson Amateur Packet Radio (TAPR):

http://www.tapr.org

and the Radio Amateur Satellite Corporation:

http://www.amsat.org

which have developed or championed many promising digital
technologies, created by amateurs with widely varying degrees of
Morse code operating skill.

Furthermore, if the only technologies that you believe should be saved
from being thrown away are those at 14.4 kBaud and up, those
technologies are only practically realizable on amateur radio bands at
high VHF and up. Such bands have been open to licensees without need of
a Morse code test for going on 17 years now. Even before then, these
bands had been accessible to Technician-class amateurs since shortly
after World War II, with a license that required only a minimal 5 WPM
(essentially individual character-recognition) Morse code test.

If you are saying that someone *else* should have developed these
technologies (other than you, of course), and that since they haven't,
then someone *must* be to blame, well, you can't really dictate how the
world should turn out without taking an active role to help make it that
way.

> --
> Klystron


--
73, Paul W. Schleck, K3FU

http://www.novia.net/~pschleck/
Finger for PGP Public Key
