August 11th 07, 12:17 AM, posted to rec.radio.amateur.antenna
From: K7ITM
Subject: measuring cable loss

On Aug 9, 10:17 pm, "Jimmie D" wrote:
"Owen Duffy" wrote in message ...

Jim Lux wrote in news:f9gg4i$7q9$1@nntp1.jpl.nasa.gov:

...


Jim, good points, and all noted.


Jimmie hasn't given a lot of detail about the specification he is
apparently trying to meet. Reading between the lines, it might be an
EIRP limit, and assuming a given antenna gain, he is trying to calculate
the permitted transmitter power output.


Not only is the uncertainty of practical service equipment an issue at
tenth-dB accuracy, but no mention has been made of transmission line
loss under mismatch conditions, or of mismatch loss.
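
To put rough numbers on the mismatch loss part, here is a quick Python
sketch of the standard formula (the VSWR values are made-up examples,
not figures from this thread):

import math

def mismatch_loss_db(vswr):
    # Standard relations: |gamma| = (S-1)/(S+1),
    # mismatch loss = -10*log10(1 - |gamma|^2)
    gamma = (vswr - 1.0) / (vswr + 1.0)
    return -10.0 * math.log10(1.0 - gamma ** 2)

for s in (1.2, 1.5, 2.0):
    print("VSWR %.1f -> %.2f dB" % (s, mismatch_loss_db(s)))

Even a 1.5:1 VSWR costs about 0.18 dB, already a noticeable slice of a
1 dB budget.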


Jimmie, if you have a plausible story to tell the regulator, then that
might suffice.


If you have assessed the Return Loss of a rho=1 termination, then you
could use that, plus the Forward and Reverse power measured with, say, a
Bird 43 at the transmitter end of that known line loss (the loss being
half the return loss), to calculate the power absorbed by the load. The
calculator at http://www.vk1od.net/tl/vswrc.php does just that. The
calculator at http://www.vk1od.net/tl/tllc.php could be used to
calculate the expected RL of the o/c or s/c line section; just specify a
load impedance of 1e6 or 1e-6 ohms for each case. For example, at 1 GHz,
the RL of 200' of LDF4-50A with a 1e-6 load is 8.9 dB, and if you got
much higher than that, you might suspect the cable to be faulty.
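
Not Owen's calculators, but a plain-Python sketch of the same
arithmetic, using the matched-loss approximation (his vswrc/tllc pages
handle loss under mismatch properly; the example wattages below are
invented):

import math

def line_loss_from_rl(return_loss_db):
    # Into a short or open, the wave traverses the line twice, so the
    # one-way matched loss is half the measured return loss.
    return return_loss_db / 2.0

def power_absorbed_by_load(p_fwd_tx, p_rev_tx, loss_db):
    # Attenuate the forward wave going out; the reverse wave seen at
    # the TX end was attenuated coming back, so scale it up at the load.
    k = 10.0 ** (loss_db / 10.0)
    return p_fwd_tx / k - p_rev_tx * k

loss = line_loss_from_rl(8.9)   # Owen's LDF4-50A example -> 4.45 dB
print("one-way loss %.2f dB" % loss)
print("load power %.1f W" % power_absorbed_by_load(100.0, 1.0, loss))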


As for tenths of a dB: remember that most service-type power meters are
probably good for 6% to 10% of FSD, so I will go with Jim's 1 dB accuracy.
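
A one-liner to see what those percentages mean in dB (this treats the
error as a fraction of the reading; percent of FSD is worse below full
scale):

import math
for pct in (6, 10):
    print("%d%% -> +/- %.2f dB" % (pct, 10.0 * math.log10(1.0 + pct / 100.0)))

That gives roughly 0.25 dB and 0.41 dB, which is why tenth-dB claims
from service-grade meters are optimistic.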


BTW, directional wattmeters for the ham market are often not capable of
reasonable accuracy on loads other than the nominal 50 ohm load. There
is a range of tests that such an instrument should satisfy, but for
hams, it is usually considered sufficient if the "reflected" reading is
approximately zero on a 50 ohm load.


Owen


I think I have given enough info, but I will try to express it another
way. Power delivered to the antenna must be maintained within +/- 1 dB;
in this case that power is 100 watts. Power is normally checked at the
TX and, after allowing for line loss, recorded as "power at the
antenna". Power checks are done on a weekly basis. Once a year the line
loss is measured, and that value is subtracted from the power at the
transmitter for the rest of the year. With this in mind, it would be
most prudent to measure the cable loss accurately to establish the
annual benchmark.
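
For what it's worth, that bookkeeping is simple enough to script; the
loss figure below is an invented example, not Jimmie's measured value:

import math

def watts_to_dbm(w):
    return 10.0 * math.log10(w * 1000.0)

def dbm_to_watts(dbm):
    return 10.0 ** (dbm / 10.0) / 1000.0

line_loss_db = 1.5    # example only: the annual measured line loss
target_w = 100.0      # spec: 100 W at the antenna, +/- 1 dB

tx_dbm = watts_to_dbm(target_w) + line_loss_db
print("set TX to %.1f dBm (%.0f W)" % (tx_dbm, dbm_to_watts(tx_dbm)))
print("antenna window: %.1f to %.1f W" %
      (dbm_to_watts(watts_to_dbm(target_w) - 1.0),
       dbm_to_watts(watts_to_dbm(target_w) + 1.0)))

With a 1.5 dB line that says set the TX to about 141 W, and the +/- 1 dB
window at the antenna runs from about 79 W to 126 W.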

Considering the test equipment I have available, an Agilent network
analyzer to use in a temperature-stabilized building, or an old HP power
meter at the top of the tower, I am thinking that measuring rho of the
cable while it is terminated in a short may be the more accurate way to go.

Jimmie


As I mentioned before, be sure the cable is really 50 ohms (assuming
your instruments are calibrated to 50 ohms), or at least determine
what it is. Make your rho measurement; at that length of line, you
can adjust the frequency of measurement over a small range and get
values for rho at angles of 0 degrees and at 180 degrees. I will
assume that the cable is 50 ohms and that the cable attenuation changes
practically not at all between the two readings, so the readings will be
the same. Now, without changing anything, measure an attenuator with
nearly the same attenuation you think the cable has, also open-
circuited or shorted at its output. If the attenuator has the same
attenuation as the line, you should get the same value. You can then
have that attenuator calibrated at 1 GHz to make sure it's correct.
Because your measured attenuation is twice the line attenuation, you
will get the line within 1 dB if the measurement is within 2 dB. It
shouldn't be very expensive to get a couple of attenuators that bracket
the line loss and have them calibrated, and you can expect them to hold
their calibration for a relatively long time if they aren't mistreated.
We seem never to see much variation from one cal to the next on decent
attenuators.
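
In code, that conversion from the measured reflection magnitude to
one-way loss looks like this (the rho value is an invented example
reading, not a measurement from this thread):

import math

def one_way_loss_db(rho):
    # Into a short or open, the return loss (-20*log10(rho)) covers
    # the round trip, so halve it for the one-way cable loss.
    return -20.0 * math.log10(rho) / 2.0

rho_cable = 0.36   # example VNA reading into the shorted cable
print("cable: %.2f dB" % one_way_loss_db(rho_cable))   # ~4.4 dB
# Apply the same function to the reading on the shorted reference
# attenuator; if the two loss values agree, the analyzer is reading
# rho honestly at this level.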

As Jim noted, beware of environmental changes. I don't think that
dimensional changes will matter much, but the copper resistance will,
somewhat. The effect, though, is not nearly as large as Jim suggested,
because of skin effect: a 1 degree C change causes the DC resistance
to change by 0.4%, but the AC resistance changes by only 0.2%. Since
the dB attenuation due to copper resistance is linear with resistance,
if the line attenuation is about 3.5 dB, you'd need a 10% change in AC
resistance to see a 0.35 dB change in attenuation. That's a 50 degree
C change, perhaps worth worrying about if you're in an extreme
climate. Looking at it another way, it's about 0.007 dB/degree C.
It's probably worth making a point of measuring the line loss at or near
the temperature extremes it experiences, though that would mean
climbing the tower at a couple of times you might least like to. Be sure
moisture doesn't get into the line!
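
Those temperature figures fall out of a two-line model (0.2% per degree
C for the AC resistance, loss linear in resistance; the base loss is
the 3.5 dB example above):

base_loss_db = 3.5    # example line attenuation at the reference temp
ac_tempco = 0.002     # ~0.2% per degC after the skin-effect halving

for dt_degc in (10, 25, 50):
    print("+%d degC -> +%.3f dB" % (dt_degc, base_loss_db * ac_tempco * dt_degc))

The 50 degree line prints the 0.35 dB figure, i.e. about 0.007 dB per
degree C.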

Cheers,
Tom