VSWR doesn't matter?
In .com, billcalley said:
We are all told that VSWR doesn't matter when using low loss
transmission lines, since the RF energy will travel from the
transmitter up to the mismatched antenna, where a certain amount of
this RF energy will reflect back towards the transmitter; after which
the RF will then reflect back up to the antenna -- where the energy is
eventually radiated after bouncing back and forth between the
transmitter and antenna. I understand the concept, but what I don't
quite understand is why the reflected RF energy isn't simply absorbed
by the 50 ohm output of the transmitter after the first reflection?
Two problems:
1) The transmitter may well have an output impedance matching the
characteristic impedance of the transmission line. In that case, RF
power reflected back is absorbed and converted to heat in the output
stage of the transmitter, on top of whatever heat the output stage
already has to dissipate.
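To put rough numbers on problem 1, here is a minimal Python sketch of
my own (not from the original post) converting a VSWR figure into the
fraction of forward power that comes back and, for a source matched to
the line, ends up as heat in the output stage:

# Sketch: fraction of forward power reflected for a given VSWR.
# Assumes a lossless line and a source matched to the line's Z0, so
# the reflected power is absorbed as heat in the output stage.

def reflected_fraction(vswr: float) -> float:
    """Return |Gamma|^2, the fraction of forward power reflected."""
    gamma = (vswr - 1.0) / (vswr + 1.0)  # reflection coefficient magnitude
    return gamma ** 2

for vswr in (1.5, 2.0, 3.0, 5.0):
    print(f"VSWR {vswr}: {100 * reflected_fraction(vswr):.1f}% reflected")

A 3:1 VSWR, for example, sends 25% of the forward power back into the
output stage.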
1a) The reflection may require the output tubes/transistors to
simultaneously drop more voltage and dissipate more power. This can be
a problem for many transistors, especially bipolar ones. It is not
necessarily sufficient to stay within power, current, voltage and
thermal ratings: many bipolar transistors have reduced capability to
safely dissipate power at voltages that are higher but still within
their ratings - sometimes even at voltages as low as 35-50 volts. This
problem tends to be worse with bipolar transistors that are faster
and/or better suited to higher frequencies. The key phrase here is
"forward-biased second breakdown", a problem of uneven current
distribution within the die at higher voltage drop.
2) It appears to me that transmitters can have an output-stage output
impedance that differs from the intended load impedance.
An analogy is common practice with audio amplifiers, where the output
impedance is ideally as close to zero as possible rather than matched
to the load impedance.
If zero output impedance is achieved in an RF output stage, I see a
possible benefit: reflections do not increase output stage heating but
get re-reflected back towards the antenna. Then again, the impedance at
the input end of the transmission line could be low or significantly
reactive depending on how the load is mismatched and how many
wavelengths long the transmission line is, and that can increase
heating of the output stage; see the sketch below. In a few cases
transmitted power can also increase.
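As a rough illustration of that last point, here is a sketch of my own
(not from the post) using the standard lossless-line transformation
Zin = Z0*(ZL + j*Z0*tan(bl)) / (Z0 + j*ZL*tan(bl)); Z0, ZL and the
line lengths are illustrative values I picked:

# Sketch: impedance seen at the transmitter end of a lossless line
# with a mismatched load, versus electrical line length.
import math

def input_impedance(z0: float, zl: complex, wavelengths: float) -> complex:
    """Lossless-line transformation of load impedance ZL through a line."""
    t = complex(0.0, math.tan(2 * math.pi * wavelengths))  # j*tan(beta*l)
    return z0 * (zl + z0 * t) / (z0 + zl * t)

z0, zl = 50.0, 150.0  # purely resistive 3:1 mismatch at the antenna
for frac in (0.0, 0.125, 0.25, 0.375):
    zin = input_impedance(z0, zl, frac)
    print(f"{frac:5.3f} wavelengths: Zin = {zin.real:6.1f} {zin.imag:+6.1f}j ohms")

With that 150 ohm load, an eighth-wave line already presents about
30 - j40 ohms and a quarter-wave line about 16.7 ohms resistive: low
and/or reactive, exactly the kind of load an output stage may not
tolerate well.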
Not only is increased output stage heating possible and fairly likely;
high VSWR also gives a high chance of the output stage seeing a
partially reactive load. RF bipolar transistors often do not like
those, due to the increased need to dissipate power at higher voltage
drop. As I said above, RF bipolar transistors are likely to really
dislike simultaneously higher voltage drop and higher power
dissipation.
- Don Klipstein