Tim Gorman wrote:
. . .
Unless you are using RG-174, the 7 feet of extra cable should not make this
much difference unless the input impedance of the amplifier is not 50 ohms
resistive. If it is not purely resistive, then changing the cable length can
significantly affect the SWR seen at the transmitter end. . .
Changing the cable length won't change the SWR on the cable, regardless
of the load impedance, and if the SWR meter is designed for the cable's
Z0, it won't change the SWR meter reading either. The one exception, of
course, is that cable loss will always lower the SWR measured at the
input -- but that shouldn't be a significant factor with such short cables.
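To put a rough number on that, here's a minimal Python sketch of how
matched loss shrinks the reflected wave and hence the SWR seen at the
input. The 0.2 dB figure is just an assumed one-way matched loss for a
few feet of small coax at HF, not a measured value.

def input_swr(load_swr, matched_loss_db):
    gamma_load = (load_swr - 1) / (load_swr + 1)
    # The reflected wave is attenuated over a round trip: 2L dB in
    # power, i.e. a factor of 10**(-L/10) in voltage magnitude.
    gamma_in = gamma_load * 10 ** (-matched_loss_db / 10)
    return (1 + gamma_in) / (1 - gamma_in)

print(input_swr(1.3, 0.2))  # about 1.28:1 -- barely changed

So even a few tenths of a dB of loss only nudges a 1.3:1 SWR down to
about 1.28:1 at the input.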
Changing the cable length *will* change the impedance looking into the
cable, whether or not the load is purely resistive. The only exception
is a load that is resistive *and equal to the line's characteristic
impedance*, in which case the impedance looking in is Z0 for any length
of cable.
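That's just the standard lossless-line transformation,
Zin = Z0*(ZL + j*Z0*tan(bl)) / (Z0 + j*ZL*tan(bl)). A short Python
sketch; the 14 MHz frequency and 0.66 velocity factor are assumed for
illustration, and 65 ohms is chosen because a 65-ohm resistive load on
50-ohm line gives exactly the 1.3:1 SWR mentioned below.

import math

def z_in(z_load, z0, length_m, freq_hz, vf=0.66):
    # Lossless-line impedance transformation:
    # Zin = Z0*(ZL + j*Z0*tan(b*l)) / (Z0 + j*ZL*tan(b*l))
    beta = 2 * math.pi * freq_hz / (3e8 * vf)  # phase constant, rad/m
    t = math.tan(beta * length_m)
    return z0 * (z_load + 1j * z0 * t) / (z0 + 1j * z_load * t)

ft = 0.3048  # meters per foot
print(z_in(50, 50, 7 * ft, 14e6))   # matched load: 50 ohms at any length
print(z_in(65, 50, 7 * ft, 14e6))   # 1.3:1 load: complex, not 65 ohms
print(z_in(65, 50, 14 * ft, 14e6))  # same load, longer cable: different Zin

Note that the SWR is the same in all three mismatched cases; only the
impedance presented to the transmitter moves around the SWR circle.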
A transmitter will often put up with some mismatched impedances better
than others, even if the SWR is the same, and sometimes changing the
cable length between it and a mismatched load will cause it to see a
more or less favorable impedance. But if the SWR really is 1.3:1, I
doubt that's the cause of this problem.
I agree with the suggestion that the OP measure the SWR and, if
possible, the power at both ends with both cables. Something else is
going on, like maybe a bad cable or connector.
I don't think the OP said what frequency this is happening at. That
might give some additional clues.
Roy Lewallen, W7EL