December 22nd, 2005, 06:38 AM, posted to rec.radio.amateur.antenna
Roy Lewallen
Standing Waves (and Impedance)

W. Watson wrote:
. . .
Not a bad explanation from Wikipedia:

SWR has a number of implications that are directly applicable to radio use.

1. SWR is an indicator of reflected waves bouncing back and forth
within the transmission line, and as such, an increase in SWR
corresponds to an increase in power in the line beyond the actual
transmitted power. This increased power will increase RF losses, as
increased voltage increases dielectric losses, and increased current
increases resistive losses.


I go along with that.
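
To put a rough number on the loss increase, here's a short Python sketch
using the total-loss formula given in standard transmission line
references (the 0.5 dB matched-line loss is just an assumed example
figure):

import math

def total_loss_db(matched_loss_db, swr):
    """Total line loss for a given SWR at the load.
    matched_loss_db is the loss the same line would have
    with a perfectly matched load."""
    a = 10 ** (matched_loss_db / 10.0)            # matched loss as a power ratio
    rho = (swr - 1.0) / (swr + 1.0)               # reflection coefficient magnitude
    return 10 * math.log10((a * a - rho * rho) / (a * (1.0 - rho * rho)))

matched = 0.5                                     # assumed matched loss, dB
for swr in (1.0, 2.0, 3.0, 6.0):
    total = total_loss_db(matched, swr)
    print(f"SWR {swr:.0f}:1  total loss {total:.2f} dB"
          f"  (extra {total - matched:.2f} dB)")

With those assumed numbers the extra loss at 6:1 is less than a dB,
real but not dramatic on a low-loss line.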

2. Matched impedances give ideal power transfer; mismatched
impedances give high SWR and reduced power transfer.


That's oversimplified and a misapplication of the rule of maximum power
transfer. Suppose I have a 50 ohm source connected to a 50 ohm load and
adjust the source so it puts 100 watts into the load. Then I put a half
wavelength of 300 ohm line between the source and the load. The
transmission line will have a 6:1 SWR. There will be a 6:1 impedance
mismatch at the transmission line-load junction. Yet
-- The load power will be 100 watts as before.
-- The power produced by the source will be 100 watts as before.
-- The system efficiency will be the same as it was before.
-- 100 watts will be transferred from the source to the line.
-- 100 watts will be transferred from the line to the load.

So in no way did the high SWR result in reduced power transfer.
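
Here's a quick numerical check of that example, a sketch in Python that
treats the line as ideal and lossless and uses the standard lossless-line
impedance transformation:

import math

Z0 = 300.0            # line characteristic impedance, ohms
ZL = 50.0             # load impedance, ohms
length_wl = 0.5       # electrical length in wavelengths (a half wave)

t = math.tan(2 * math.pi * length_wl)    # tangent of the electrical length
# Input impedance of a lossless line: Zin = Z0*(ZL + jZ0*tan)/(Z0 + jZL*tan)
Zin = Z0 * (ZL + 1j * Z0 * t) / (Z0 + 1j * ZL * t)

rho = abs((ZL - Z0) / (ZL + Z0))         # reflection coefficient at the load
swr = (1 + rho) / (1 - rho)

print(f"SWR on the line:              {swr:.1f}:1")          # 6.0:1
print(f"Impedance seen by the source: {Zin.real:.1f} ohms")  # 50.0 ohms

A lossless half-wave line repeats its load impedance, so the source still
sees 50 ohms and still delivers its 100 watts, 6:1 SWR and all.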

Now change the load impedance to 300 ohms.

-- There is now a 6:1 mismatch between the source and the line.
-- The line SWR is now 1:1.
-- The load power will be reduced.

The mismatch between source and line didn't cause a high SWR on the
line. In fact, changing the load impedance to match the line degraded the
match at the source end at the same time it improved the line SWR.
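
Running the same numbers as before, again as a lossless half-wave-line
sketch:

import math

Z0_line = 300.0       # line characteristic impedance, ohms
ZL = 300.0            # new load impedance, ohms
Z_source = 50.0       # impedance the source was designed to work into

# On the line: the load now equals Z0, so there is no reflection.
rho_line = abs((ZL - Z0_line) / (ZL + Z0_line))
swr_line = (1 + rho_line) / (1 - rho_line)

# At the source: a lossless half-wave line repeats its load,
# so the source now sees 300 ohms instead of 50.
Z_seen = ZL
mismatch = max(Z_seen, Z_source) / min(Z_seen, Z_source)

print(f"SWR on the line:           {swr_line:.1f}:1")   # 1.0:1
print(f"Impedance seen by source:  {Z_seen:.0f} ohms")
print(f"Source-to-line mismatch:   {mismatch:.0f}:1")   # 6:1

Exactly how much the load power drops depends on how the particular
transmitter behaves into 300 ohms, so no single number applies.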

3. Higher power in the transmission line also leaks back into the
radio, which causes it to heat up.


That's demonstrably false. For some examples and explanations, see
http://www.eznec.com/misc/food_for_t...se%20Power.txt.
(You might have to splice this URL back together if your browser splits it.)

4. The higher voltages associated with a sufficiently high SWR could
damage the transmitter. Solid state radios which have a lower tolerance
for high voltages may automatically reduce output power to prevent
damage. Tube radios may arc. The high voltages may also cause
transmission line dielectric to break down and/or burn.


That's true. Some transmitters can be damaged by a number of causes
when the load impedance isn't approximately what the transmitter was
designed for. Only one of those possible causes is increased voltage.

Of course, a high SWR can also cause the voltage at the transmitter to
be lower than it otherwise would have been.

Abnormally high
voltages in the antenna system increase the chance of accidental
radiation burn if someone touches the antenna during transmission.


But the antenna doesn't have an SWR, the transmission line does. If you
do have an open wire transmission line, it's best not to touch the line
regardless of the SWR. But if you have a high line SWR, there's just as
good a chance that the voltage at the point you touch is lower because of
the high SWR as there is that it's higher.
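
Here's a Python sketch of the RMS voltage along the 6:1 SWR line from the
earlier example (lossless 300 ohm line, 50 ohm load, 100 watts delivered
to the load, distance measured in wavelengths from the load):

import cmath
import math

Z0 = 300.0            # line characteristic impedance, ohms
ZL = 50.0             # load impedance, ohms
P_load = 100.0        # watts delivered to the load

rho = (ZL - Z0) / (ZL + Z0)                     # reflection coefficient at the load
# Forward-wave RMS voltage needed to deliver P_load through the mismatch:
V_fwd = math.sqrt(P_load * Z0 / (1 - abs(rho) ** 2))
V_matched = math.sqrt(P_load * Z0)              # voltage if the line were matched

for d in (0.0, 0.125, 0.25, 0.375, 0.5):        # wavelengths from the load
    v = V_fwd * abs(1 + rho * cmath.exp(-1j * 4 * math.pi * d))
    note = "lower than matched" if v < V_matched else "higher than matched"
    print(f"{d:5.3f} wl from load: {v:6.1f} V rms  ({note})")
print(f"matched-line voltage for comparison: {V_matched:.1f} V rms")

At the load end and at the half-wave (transmitter) end the voltage is
about 71 V, well below the 173 V a matched line carrying the same power
would have, while near the quarter-wave point it peaks at about 424 V.
Which one you get depends entirely on where along the line you touch.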

I'll bet if you search the web you can find just about any kind of
possible misinformation about SWR, just as you can about any other topic.

Roy Lewallen, W7EL