Thread: Why?
Old October 18th 04, 04:15 PM
Dave

Three possible reasons:

The meter is not as accurate at low power levels. This is common on meters
driven by line power: below some level the simple diode rectifier just
doesn't drive the meter movement. Since this usually happens on the
reflected side first, due to the lower voltage there, the readings can't be
trusted at low SWRs or low power levels.
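That first effect can be sketched numerically. The snippet below is a minimal illustration, not a model of any particular meter: it assumes a constant real mismatch (SWR 1.5, so reflected power is 4% of forward) and a hypothetical reflected-side detector that simply drops the first 0.15 W of its reading, standing in for the diode's conduction knee.

```python
import math

def swr(p_fwd, p_ref):
    """True SWR computed from forward and reflected power (watts)."""
    rho = math.sqrt(p_ref / p_fwd)      # reflection coefficient magnitude
    return (1 + rho) / (1 - rho)

def indicated_swr(p_fwd, p_ref, dead_zone=0.15):
    """SWR the meter shows if the reflected-side diode loses the first
    `dead_zone` watts of the reading (hypothetical detector loss)."""
    return swr(p_fwd, max(0.0, p_ref - dead_zone))

# Same real mismatch (SWR 1.5) at two power levels.
for p_fwd in (5.0, 100.0):
    p_ref = 0.04 * p_fwd
    print(f"{p_fwd:6.1f} W  true {swr(p_fwd, p_ref):.2f}  "
          f"meter reads {indicated_swr(p_fwd, p_ref):.2f}")
```

With these assumed numbers the meter reads about 1.22 at 5 W but about 1.49 at 100 W, even though the antenna never changed: the low-power reading was falsely good, which is exactly the symptom in the question.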

The second possibility is that something is breaking down and causing a
real change in SWR. While this is unlikely at only 100 W, it is possible.
It would normally show up as a sudden jump in SWR as power is increased,
and it often produces erratic readings.

And lastly, the transmitter may be poorly adjusted. As power is increased
you may be generating more harmonic content, and if the antenna reflects
the harmonics back they add to the reflected reading and raise the apparent
SWR. This is often seen with tube finals or high-power amps: while tuning,
the SWR reading after the amp changes as the tune/load controls are
adjusted, even though the indicated power stays constant.
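The harmonic effect can be sketched the same way. The figures below are purely illustrative assumptions: an antenna that is a good match at the fundamental (SWR 1.2) but reflects the harmonics completely, and a transmitter putting an assumed 3% of its output into harmonics at full power.

```python
import math

def apparent_swr(p_fwd, p_ref_fundamental, harmonic_fraction):
    """Apparent SWR when harmonic power (assumed fully reflected by the
    antenna) adds to the reflected-power reading."""
    p_ref = p_ref_fundamental + harmonic_fraction * p_fwd
    rho = math.sqrt(p_ref / p_fwd)
    return (1 + rho) / (1 - rho)

# Reflected fraction for a true SWR of 1.2 at the fundamental (~0.83%).
p_ref_frac = ((1.2 - 1) / (1.2 + 1)) ** 2

clean = apparent_swr(100.0, 100.0 * p_ref_frac, 0.00)  # no harmonics
dirty = apparent_swr(100.0, 100.0 * p_ref_frac, 0.03)  # 3% harmonic power
print(f"clean signal {clean:.2f}, with harmonics {dirty:.2f}")
```

Under these assumptions the apparent SWR climbs from 1.2 to roughly 1.49, so a meter after a dirty amplifier can show a worsening "SWR" that is really just growing harmonic content.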


"Ken Bessler" wrote in message
...
Can anyone tell me why I get a worse SWR at
100w than at 5w? I know the reflected power
goes up, but I'm using a cross-needle meter here,
so I'm referring to the actual ratio.

Which SWR should I trust?

Ken