#31, August 12th 07, 03:09 AM, posted to rec.radio.amateur.antenna
Subject: measuring cable loss

On Sat, 11 Aug 2007 21:44:08 -0400, Walter Maxwell wrote:

I haven't read all the posts in this thread, so I'm not sure that the method I'm advancing for measuring line
loss has not already been discussed.

The method I'm suggesting measures the input impedance (R and jX) of the line in question with the opposite
end of the line terminated first with an open circuit and then with a short, at a frequency where the line
will be close to lambda/8. With low-loss lines the R component will be very small, requiring an RF impedance
bridge that can produce accurate values of R in the 0.2 to 2-ohm range. (The General Radio GR-1606A is a
typical example, which, while using this procedure, will yield answers of greater accuracy than the methods
described in the current posts.) With the 1/8wl line the + and - reactances appearing in the measurements will
be approximately the value of the line Zo.

The open and short circuit values are then plugged into a BASIC program that I wrote years ago, which appears
in Chapter 15 of both Reflections 1 and 2. It also appears on my web page at www.w2du.com. The program outputs
the line attenuation, the complex Zo, and the electrical length of the line. The program solves the equations
appearing in Chipman's "Theory and Problems of Transmission Lines," Page 135.
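For anyone who wants to see the arithmetic before obtaining TRANSCON, the core open/short relations from Chipman can be sketched in a few lines of Python (an illustration only, not Walt's listing; the function and variable names are invented here):

```python
import cmath

def line_constants(z_oc, z_sc, length_m):
    """Estimate transmission-line parameters from the measured
    open-circuit and short-circuit input impedances.

    z_oc, z_sc : complex input impedance with the far end open / shorted
    length_m   : physical line length in metres
    Returns (Zo, gamma), where gamma = alpha + j*beta per metre.
    """
    # Characteristic impedance: Zo = sqrt(Zoc * Zsc)
    z0 = cmath.sqrt(z_oc * z_sc)
    # Propagation constant from tanh(gamma * l) = sqrt(Zsc / Zoc)
    gamma = cmath.atanh(cmath.sqrt(z_sc / z_oc)) / length_m
    return z0, gamma

# Example: a nearly lossless 50-ohm line about lambda/8 long (10 m),
# where the open and short readings are close to -/+ j*Zo as Walt notes
z0, gamma = line_constants(complex(0.5, -50.0), complex(0.5, 50.0), 10.0)
alpha_db_per_m = 8.686 * gamma.real   # attenuation, nepers to dB
```

With R components in the 0.2 to 2 ohm range, as measured on a bridge like the GR-1606A, the recovered attenuation is correspondingly small; TRANSCON does the full job, including the complex Zo and electrical length.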

On my web page go to 'View Chapters of Reflections 2' and click on Chapter 15 to see the detailed explanation
of the procedure.

The BASIC program TRANSCON (Transmission-line Constants) is listed there, but to save your having to load the
program from the typed list I will email a copy of the actual operable program to anyone who requests it,
addressing your request to .

I hope this suggestion will prove to be of value.

Walt, W2DU

Some additional information that may make the procedure easier to follow: the section of Chapter 15
describing the line-attenuation measurement procedure is Sec 15.3.1, Calibration of the Feedline, Page 15-3,
and the table showing the results from the TRANSCON program is Fig 15-1, Page 15-4. The TRANSCON
program listing is on Page 15-22.

Walt, W2DU
#32, August 13th 07, 07:35 PM, posted to rec.radio.amateur.antenna


Down in the lab here at work we have a whole rack of precision
misterminations (1.1:1, 1.2:1, 1.5:1, etc.) that some talented engineer
built and calibrated some decades ago. They're built on the Maury
bluedot N terminations.



I have always thought that a budget-priced set of mismatches would be real
handy, and have wondered why MFJ (or someone else, for that matter) doesn't
offer a set for checking / calibrating the MFJ259B etc.

Owen


I've looked into this, and it's non-trivial to do in small quantities at
a (ham-)reasonable price. My late father-in-law owned a small company
making video and audio equipment for the broadcast industry, and one of
their larger-selling products was a high-quality 75 ohm BNC termination.
However, it took a fair amount of research on his part to find
appropriate components from which to assemble them, and even so, I doubt
his manufacturing cost was low enough to get to a reasonable retail
price for the ham market.


FWIW, 75 ohm terminations are readily available as a mismatch standard
that might be useful. In F connectors they are available for less than
a dollar, but I'd be a bit leery of their precision (1% DC resistance is
easy to get, but holding it over DC-100 MHz is a bit trickier). Good-quality
ones (mechanical and electrical) seem to run about $4-5 each.

In small quantities, it would be hard to make and sell such things for
less than about $5 each (by the time you factor in manufacturing time and
material costs), and then there would be shipping. You might be able to do
better selling a set (say, open, short, 50, 25, 75, 100, etc.)
because there would be economies of scale, both in manufacturing and in
shipping/handling.

Maybe $30/set plus shipping?

I can see all the folks going.. $30, for half a dozen connectors and
resistors? I have a box of connectors out in the shack, and resistors,
and I'll just fire up the soldering iron...
#33, August 13th 07, 07:50 PM, posted to rec.radio.amateur.antenna

John Ferrell wrote:
On Thu, 9 Aug 2007 08:13:45 -0400, "Jimmie D"
wrote:


I need to measure the loss of approximately 200 ft of coax at a frequency of 1 GHz.
The normal procedure for doing this is to inject a signal at one end and
measure the power out at the other. Using available test equipment this is a
real pain to do. I propose to disconnect the cable at the top of the tower,
terminate it in either a short or an open, and measure the return loss at the
source end. I have done this and measured 6.75 dB, and I am assuming that 1/2
of this would be the actual loss of the cable. These numbers do fall within
the established norms for this cable. Can you think of a reason this method
would not be valid?


Jimmie


This is way too complicated for me!
My solution would be to build/buy an RF probe and permanently mount it
at the top of the tower. Bring a pair of wires (Coax if you want it to
look really professional) down to the bottom and measure it whenever
or even all the time.


Considering he needs sub-1 dB accuracy, this is challenging; it would
work if you assume your RF probe never needs calibration and is stable
over the environmental range of interest. Not a trivial thing to do.

A diode and a voltmeter certainly won't do it. (A typical diode detector
might vary 1 dB over a 20 degree C range, judging from the Krytar 700
series data sheet I have sitting here. Granted, that's a microwave
detector (100 MHz to 40 GHz), but I'd expect similar from most other
diodes.) I've given the link to an Agilent app note that describes various
detectors in excruciating detail.

A diode, voltmeter, and temperature sensor might work, though.

useful stuff at
http://rfdesign.com/mag/608RFDF2.pdf
http://cp.literature.agilent.com/lit...5966-0784E.pdf
#34, August 13th 07, 08:26 PM, posted to rec.radio.amateur.antenna

On Aug 13, 11:50 am, Jim Lux wrote:
A diode and a voltmeter certainly won't do it. (A typical diode detector
might vary 1 dB over a 20 degree C range.)

A diode, voltmeter, and temperature sensor might work, though.


Seems like modern RF detector ICs offer much better stability than
diodes. An AD8302, for example, has a typical +/- 0.25dB variation
from -40C to +85C, with a -30dBm signal level. The temperature
variation could be calibrated before installation; if necessary, an
especially temperature-stable part could be selected from a batch.
Then knowing the ambient to within 20 C would be sufficient. You'd need
to arrange sampling at a low level, which could be done with a well-constructed
90-degree hybrid. With two channels in the AD8302, you could even
monitor the antenna reflection coefficient (including angle) and be aware
of changes there. Analog Devices and Linear Technology both seem to
be strong in the RF power-monitor IC area.

Cheers,
Tom

#35, August 13th 07, 09:09 PM, posted to rec.radio.amateur.antenna

K7ITM wrote:
Seems like modern RF detector ICs offer much better stability than
diodes. An AD8302, for example, has a typical +/- 0.25dB variation
from -40C to +85C, with a -30dBm signal level.

indeed...
The temperature
variation could be calibrated before installation; if necessary, an
especially temperature-stable part could be selected from a batch.
Then knowing the ambient within 20C would be sufficient. You'd need
to arrange sampling at a low level, which could be a well-constructed
90 degree hybrid.


or, even simpler, what about a resistive tap (or a pair of resistive
taps separated by a short length of transmission line)? If you're
sending, say, 100 W (+50 dBm) up the wire, and you want, say, -30 dBm out,
you need an 80 dB coupler. Alternatively, something like a 50k resistor into a 50
ohm load will be about 60 dB down, and you could put a 10-20 dB pad in
before the detector. Calibration would take care of the coupling ratio,
although you might want to be careful about the tempco of the resistor.
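A quick sanity check on those numbers (my arithmetic, assuming an ideal resistive divider with no parasitics):

```python
import math

def tap_coupling_db(r_series, r_load=50.0):
    """Voltage coupling of a series tap resistor into a matched load,
    ignoring parasitics (so valid only at low frequency)."""
    return 20 * math.log10(r_load / (r_series + r_load))

coupling = tap_coupling_db(50e3)   # 50k into 50 ohms: about -60 dB

# 100 W is +50 dBm; after the tap and a 20 dB pad, the detector sees
p_out_dbm = 50 + coupling - 20     # about -30 dBm
```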
With two channels in the AD8302, you could even
monitor antenna reflection coefficient (including angle), and be aware
of changes there. Analog Devices and Linear Technology both seem to
be strong in the RF power monitor IC area.


Those are truly nifty parts, and they form the basis of some very interesting
ham products over the past couple of years (like the LP-100 vector wattmeter
and various ham-oriented VNAs). What would be very cool is if AD would
combine something like the 8302 and the A/D so it would have a serial
digital output. Pretty close to a power meter on a chip.

Functionally, this would be close to what you get with a Rohde & Schwarz
NRP series, a Boonton 52000, or an Agilent U2000.




#36, August 14th 07, 06:56 AM, posted to rec.radio.amateur.antenna

On Mon, 13 Aug 2007 13:09:09 -0700, Jim Lux
wrote:

Or, something like a 50k resistor into a 50
ohm load will be about 60 dB down,


Hi Jim,

Unlikely.

With parasitic capacitance at a meager 1 pF across the 50K, its Z at
10 MHz would compromise the attenuation, leaving it closer to 50 dB
down. At 1 GHz it would plunge like a rock. This, of course, presumes
a 1/4 watt resistor.

A better solution is to use surface-mount resistors, where the
parasitics are down at 100aF - but then you will have a frequency
dependent divider unless you can guarantee that the parasitic
capacitance of the 50 Ohm resistor is 100pF (which sort of casts us back
into using a 1/4 watt resistor with a padding cap). At 1 GHz, it is
not going to look like a trivial 50K load anymore.
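Richard's numbers are easy to verify; here is a rough model of the same divider with a 1 pF parasitic across the 50K (my own idealized sketch, lumped elements only):

```python
import cmath, math

def divider_db(freq_hz, r_series=50e3, c_par=1e-12, r_load=50.0):
    """Attenuation of a 50k-into-50-ohm tap when a parasitic
    capacitance shunts the series resistor."""
    zc = 1 / (2j * math.pi * freq_hz * c_par)     # parasitic reactance
    z_series = (r_series * zc) / (r_series + zc)  # 50k in parallel with C
    return 20 * math.log10(abs(r_load / (z_series + r_load)))

at_10mhz = divider_db(10e6)   # roughly -50 dB, not -60
at_1ghz = divider_db(1e9)     # only about -10 dB: plunges like a rock
```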

A Pi attenuator will do it better.

73's
Richard Clark, KB7QHC
#37, August 14th 07, 07:51 AM, posted to rec.radio.amateur.antenna

On Aug 13, 1:09 pm, Jim Lux wrote:
or, even simpler, what about a resistive tap (or a pair of resistive
taps separated by a short length of transmission line). If you're
sending, say, 100W (+50dBm) up the wire, and you want, say, -30dBm out,
you need a 80 dB coupler. Or, something like a 50k resistor into a 50
ohm load will be about 60 dB down, and you could put a 10-20dB pad in
before the detector. Calibration would take care of the coupling ratio,
although, you might want to be careful about the tempco of the resistor.

....

The OP said this is at 1 GHz. It's really tough to get a reliable
resistive divider at 1 GHz with that sort of ratio. Actually, a
capacitive divider probably stands a better chance of working, though
getting that really right isn't trivial. (We used to worry about
variation in humidity and atmospheric pressure affecting the
dielectric constant of air when using a capacitive sampler... though
admittedly that was for work to a level well beyond 1 dB accuracy.)

I am rather fond of the coupled-line hybrid idea: it can be built in
a way that everything stays ratiometric, so the coupling ratio is very
nearly constant over temperature, and of course the directionality
lets you observe things you can't just from monitoring voltage at a
point. It's possible to build one with low coupling without too much
trouble; -60dB coupling isn't out of the question, for sure. I'm
imagining a design I could make reliably with simple machine tools
that would work well for the OP's application: 100 watts at about
1GHz as I recall in the through line, and coupling on the order of
-60dB to get to about -10dBm coupled power and have negligible effect
on the through line. There's a free field-solver software package
that will accurately predict the coupling, and with the right design
and normal machine-shop tolerances the coupling and impedance should
be accurate to a fraction of a dB and better than a percent,
respectively. Perhaps I can run some examples to see if I'm off base
on that, but that's what my mental calculations tell me at the moment.
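The power budget described above is simple to check (trivial arithmetic, my numbers):

```python
import math

def watts_to_dbm(p_watts):
    """Convert power in watts to dBm (dB relative to 1 mW)."""
    return 10 * math.log10(p_watts / 1e-3)

p_main = watts_to_dbm(100.0)   # +50 dBm in the through line
p_coupled = p_main - 60        # -60 dB coupling gives -10 dBm
```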

Cheers,
Tom


#38, August 14th 07, 08:14 AM, posted to rec.radio.amateur.antenna

On Aug 13, 10:56 pm, Richard Clark wrote:
A better solution is to use surface mount resistors where the
parasitics are down at 100aF - but then you will have a frequency
dependant divider unless you can guarantee that the parasitic
capacitance of the 50 Ohm resistor is 100pF (sort of casts us back
into using a 1/4 watt resistor with a padding cap). At 1GHz, it is
not going to look like a trivial 50K load anymore.


100aF??? :-) X(100aF)/X(100pF) = 50k/50 ??? ;-) ;-)

#39, August 14th 07, 12:23 PM, posted to rec.radio.amateur.antenna

In article EgEui.4923$MT3.3995@trnddc05, "Jerry Martes"
wrote:


I consider "return loss" to be a ratio related to the mismatch of the load
to the line. A short on the end of a low-loss line will have a high return
loss. You probably did some math that isn't apparent in the statement "I am
assuming that 1/2 (of 6.75 dB) is the actual loss".


Hello, and you don't have to "consider" what return loss is. At an
interface/boundary it is the ratio of incident power to reflected power.
Mismatch loss is the ratio of incident power to the power dissipated in the
load at the interface/boundary. In terms of VSWR these losses are given
by

RL (dB) = 20*log[(S + 1)/(S - 1)]

ML (dB) = 10*log[(S + 1)^2/(4*S)]

where S is the VSWR and logarithms are to base 10.

A lossless transmission line fed at one end and ideally short-circuited at
the other end would display a feedpoint impedance that is purely reactive
(no resistive component). If a resistive component is present, it must be
due to dissipative loss in the line, and since power has to travel to the
load (short) and return to the feedpoint, this resistance accounts for twice
the one-way dissipative loss of the line.
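Both formulas, and the round-trip argument behind halving the measured return loss, can be put into a short script (a sketch of my own, not from the posts above):

```python
import math

def return_loss_db(swr):
    """Return loss from VSWR: RL = 20*log10((S + 1)/(S - 1))."""
    return 20 * math.log10((swr + 1) / (swr - 1))

def mismatch_loss_db(swr):
    """Mismatch loss from VSWR: ML = 10*log10((S + 1)^2 / (4*S))."""
    return 10 * math.log10((swr + 1) ** 2 / (4 * swr))

# The OP's method: with the far end shorted (or open) the wave is fully
# reflected and travels the line twice, so one-way loss is half the RL.
measured_rl = 6.75               # dB, from the original post
one_way_loss = measured_rl / 2   # 3.375 dB
```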

The challenge here is, given a transmission line of a certain physical
length, to find a measurable value at the operating frequency(ies). An RF
signal source with a surplus (but properly working) General Radio
(GenRad) impedance bridge is good for this type of measurement. Keep in
mind that any coupling from the line to nearby structures will affect the
measurement. Sincerely, and 73s from N4GGO,

John Wood (Code 5550) e-mail:
Naval Research Laboratory
4555 Overlook Avenue, SW
Washington, DC 20375-5337
#40, August 14th 07, 04:44 PM, posted to rec.radio.amateur.antenna

Richard Clark wrote:

A Pi attenuator will do it better.


A resistive 30 dB tap into a 30 dB pi attenuator, or something like that?
That would get the resistor in the tap down to a reasonable value,
and, as you point out, at 1 GHz layout and component selection would be
critical.

I suppose if you're building a circuit board, a small parallel-line
coupler would work just as well, and probably be easier.
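For the pi-attenuator half of that scheme, the standard symmetric-pi design equations are easy to evaluate (textbook formulas; the function here is my own sketch):

```python
import math

def pi_attenuator(atten_db, z0=50.0):
    """Resistor values for a symmetric pi attenuator in a z0 system."""
    k = 10 ** (atten_db / 20)               # voltage ratio
    r_shunt = z0 * (k + 1) / (k - 1)        # each shunt leg
    r_series = z0 * (k * k - 1) / (2 * k)   # series element
    return r_shunt, r_series

# A 30 dB pad in a 50-ohm system: ~53.3 ohm shunt, ~790 ohm series
r_sh, r_se = pi_attenuator(30.0)
```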

In any case, the use of those nifty parts from AD does open up a lot of
interesting applications.



