August 13th 07, 07:50 PM, posted to rec.radio.amateur.antenna
Jim Lux
Subject: measuring cable loss

John Ferrell wrote:
> On Thu, 9 Aug 2007 08:13:45 -0400, "Jimmie D" wrote:
>> I need to measure the loss of approximately 200 ft of coax at a
>> frequency of 1 GHz. The normal procedure for doing this is to inject
>> a signal at one end and measure the power out at the other. Using
>> available test equipment this is a real pain to do. I propose to
>> disconnect the cable at the top of the tower, terminating it in
>> either a short or an open, and measure the return loss at the source
>> end. I have done this and measured 6.75 dB, and I am assuming that
>> half of this would be the actual loss of the cable. These numbers do
>> fall within the established norms for this cable. Can you think of a
>> reason this method would not be valid?
>>
>> Jimmie


> This is way too complicated for me!
> My solution would be to build/buy an RF probe and permanently mount
> it at the top of the tower. Bring a pair of wires (coax if you want
> it to look really professional) down to the bottom and measure it
> whenever you like, or even all the time.


Considering he needs sub-1 dB accuracy, this is challenging. It would
work if you assume your RF probe never needs calibration and is stable
over the environmental range of interest. Not a trivial thing to do.
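
For what it's worth, the arithmetic behind the reflection measurement
in the original post, as a minimal sketch in Python (the 6.75 dB figure
is the one Jimmie measured; this assumes the short/open reflects
essentially all of the incident power):

    # With the far end shorted or open, the signal makes a round trip
    # through the cable, so the measured return loss is (ideally) twice
    # the one-way loss.
    def one_way_loss_db(return_loss_db):
        return return_loss_db / 2.0

    print(one_way_loss_db(6.75))  # -> 3.375 dB one-way loss at 1 GHz

One attraction of the method: a 1 dB error in the return loss reading
maps to only 0.5 dB of error in the one-way loss.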

A diode and a voltmeter certainly won't do it. (A typical diode
detector might vary 1 dB over a 20 degree C range, judging from the
Krytar 700 series data sheet I have sitting here.) Granted, that's a
microwave detector (100 MHz to 40 GHz), but I'd expect similar from
most other diodes. I've given a link below to an Agilent app note that
describes various detectors in excruciating detail.

A diode, voltmeter, and temperature sensor might work, though.
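
Something along those lines, as a rough sketch; the tempco value and
calibration temperature below are illustrative assumptions, not from
any data sheet:

    # Hypothetical first-order correction for diode detector drift.
    # TEMPCO_DB_PER_C would come from characterizing your own detector
    # over temperature; -0.05 dB/degC here is purely illustrative.
    CAL_TEMP_C = 25.0
    TEMPCO_DB_PER_C = -0.05

    def corrected_reading_db(indicated_db, probe_temp_c):
        # Subtract the estimated drift from the indicated level.
        drift_db = TEMPCO_DB_PER_C * (probe_temp_c - CAL_TEMP_C)
        return indicated_db - drift_db

    print(corrected_reading_db(-10.0, 45.0))  # -> -9.0 dB after
                                              # removing -1.0 dB drift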

useful stuff at
http://rfdesign.com/mag/608RFDF2.pdf
http://cp.literature.agilent.com/lit...5966-0784E.pdf