"George, W5YR" wrote in message ...
....
I think that a great deal of confusion over this whole issue comes from two
sources:
1. vague efforts to apply the infamous "Maximum Power Transfer Theorem" from
the early days in undergrad EE school; and
2. confusing an r-f transmitter output stage with the classical "signal
generator" that has a dissipative 50-ohm internal resistance.
Forget both of those irritants and concentrate on the required load for the
transmitter, which the designer will provide and insist upon, and then
adjust the antenna system to provide that load and all will be well.
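Just to pin down what that theorem actually claims (and no more), here is a
quick Python sketch. The 100-volt source and the 50-ohm internal resistance
are assumed numbers for illustration, not anything from George's post or from
a real rig:

# What the classical "maximum power transfer theorem" says for a
# Thevenin source with a truly dissipative internal resistance.
V_OC = 100.0   # open-circuit source voltage, volts (assumed)
R_S = 50.0     # dissipative internal resistance, ohms (assumed)

def load_power(r_load):
    """Power delivered to a resistive load by the Thevenin source."""
    i = V_OC / (R_S + r_load)      # series loop current
    return i * i * r_load          # P = I^2 * R_load

for r_load in (12.5, 25.0, 50.0, 100.0, 200.0):
    print(f"R_load = {r_load:6.1f} ohm  ->  P_load = {load_power(r_load):6.2f} W")

# Delivered power peaks at R_load = R_S, and that is all the theorem
# says.  It says nothing about a transmitter PA, whose source impedance
# is not a 50-ohm resistor quietly burning half the power.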
Those of us who _do_ have to worry, in intimate detail, about
generator source impedances are most thankful that we do NOT when we
put loads on our ham rigs. Thanks for a great posting that nicely
summarizes what a lot of us have been saying for a long time. Perhaps
Reg is right. Perhaps we SHOULD quit calling it an SWR meter and
instead call it a "Transmitter Load Indicator" (or perhaps transmitter
load error indicator).
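Here is a tiny sketch of what that "load error" reading boils down to. The
50-ohm reference and the sample load impedances below are assumed for
illustration, not taken from any particular meter or rig:

# SWR of a complex load relative to the 50-ohm load the designer asks for.
Z0 = 50.0   # design load / reference impedance, ohms (assumed)

def swr(z_load):
    """SWR of a complex load with respect to the 50-ohm design load."""
    gamma = abs((z_load - Z0) / (z_load + Z0))   # |reflection coefficient|
    return (1 + gamma) / (1 - gamma)

for z in (50 + 0j, 75 + 0j, 50 + 50j, 25 - 25j):
    print(f"Z_load = {z!s:>12}  ->  SWR = {swr(z):5.2f}")

# The reading is 1.0 only when the load is exactly the one the
# transmitter was designed for; anything higher flags a load error,
# no matter what the transmitter's own source impedance happens to be.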
When you plug an appliance into the mains, do you worry about what the
mains source impedance is, so long as it's low enough to maintain the
proper voltage? When you connect speakers to an amplifier, do you
worry about what the source impedance is, so long as it's low enough
to not materially affect damping? If not, why would you worry about
transmitter source impedance? Why would you not worry instead about
providing the proper load so the amplifier can do its job right?
Cheers,
Tom