RadioBanter

RadioBanter (https://www.radiobanter.com/)
-   Antenna (https://www.radiobanter.com/antenna/)
-   -   measuring cable loss (https://www.radiobanter.com/antenna/123220-measuring-cable-loss.html)

Jimmie D August 9th 07 01:13 PM

measuring cable loss
 
I need to measure the loss of approximately 200 ft of coax at a frequency of 1 GHz.
The normal procedure for doing this is to inject a signal at one end and
measure the power out at the other. Using available test equipment this is a
real pain to do. I propose to disconnect the cable at the top of the tower,
terminating it in either a short or an open, and measure the return loss at the
source end. I have done this and measured 6.75 dB, and I am assuming that 1/2
of this would be the actual loss of the cable. These numbers do fall within
the established norms for this cable. Can you think of a reason this method
would not be valid?


Jimmie



Jerry Martes August 9th 07 02:03 PM

measuring cable loss
 

"Jimmie D" wrote in message
...
I need to measure the loss of approximately 200 ft of coax at a frequency of 1 GHz.
The normal procedure for doing this is to inject a signal at one end and
measure the power out at the other. Using available test equipment this is a
real pain to do. I propose to disconnect the cable at the top of the tower,
terminating it in either a short or an open, and measure the return loss at the
source end. I have done this and measured 6.75 dB, and I am assuming that
1/2 of this would be the actual loss of the cable. These numbers do fall
within the established norms for this cable. Can you think of a reason
this method would not be valid?


Jimmie


Hi Jimmie

I consider "return loss" to be a ratio related to the mismatch of the load
to the line. A short on the end of a low-loss line will have a high return
loss. You probably did some math that isn't apparent in the statement "I am
assuming that 1/2 (of 6.75 dB) is the actual loss".

How difficult would it be to take a length of some decent RG-6 up the
tower to send the signal down to the *lower end*?

Jerry





Frank's August 9th 07 06:19 PM

measuring cable loss
 

"Jimmie D" wrote in message
...
I need to measure the loss of approximately 200 ft of coax at a frequency of 1 GHz.
The normal procedure for doing this is to inject a signal at one end and
measure the power out at the other. Using available test equipment this is a
real pain to do. I propose to disconnect the cable at the top of the tower,
terminating it in either a short or an open, and measure the return loss at the
source end. I have done this and measured 6.75 dB, and I am assuming that
1/2 of this would be the actual loss of the cable. These numbers do fall
within the established norms for this cable. Can you think of a reason
this method would not be valid?


Jimmie


Half the return loss is a valid method of determining the transmission line
loss.

Frank



K7ITM August 9th 07 06:52 PM

measuring cable loss
 
On Aug 9, 5:13 am, "Jimmie D" wrote:
I need to measure the loss of approximately 200 ft of coax at a frequency of 1 GHz.
The normal procedure for doing this is to inject a signal at one end and
measure the power out at the other. Using available test equipment this is a
real pain to do. I propose to disconnect the cable at the top of the tower,
terminating it in either a short or an open, and measure the return loss at the
source end. I have done this and measured 6.75 dB, and I am assuming that 1/2
of this would be the actual loss of the cable. These numbers do fall within
the established norms for this cable. Can you think of a reason this method
would not be valid?

Jimmie


It will be valid if the Z0 of the line is uniform, and matches the
calibration of the instrument you use to measure it. If the Z0 is
uniform but different than the impedance to which the instrument is
calibrated, you can easily see that effect by measuring the return
loss with the far end open and with it shorted. You can get the same
info, again assuming a uniform line, and assuming essentially
unchanged attenuation over a 2.5MHz span around your measurement
frequency, by measuring at multiple frequencies (doing a sweep). If
the line is the same impedance the instrument is calibrated to, the
return loss will trace out a circle centered on the middle of a Smith
display (assuming that display is referenced to the instrument's
impedance); in any event, the circle will be centered on the line's
Z0. If the line Z0 is non-uniform, expect the attenuation to vary
with frequency; the Smith display of a sweep likely will be quite non-
circular.
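Tom's sweep picture can be sketched numerically. Assumptions not in the thread: Python, a velocity factor of 0.88, a 61 m (about 200 ft) run, and attenuation constant across the small sweep. For a uniform line matched to the instrument, with a short at the far end, the input reflection coefficient keeps a constant magnitude of 10^(-RL/20) as frequency sweeps; only its phase rotates, tracing Tom's circle.

```python
import cmath, math

def input_rho(f_hz, length_m, vf=0.88, loss_db_oneway=3.375):
    """Input reflection coefficient of a uniform, shorted, lossy line."""
    beta = 2 * math.pi * f_hz / (3e8 * vf)          # phase constant, rad/m
    mag = 10 ** (-2 * loss_db_oneway / 20)          # round-trip amplitude factor
    return -mag * cmath.exp(-2j * beta * length_m)  # short: rho_load = -1

# Sweep a little around 1 GHz: magnitude stays fixed, phase rotates.
mags = [abs(input_rho(1e9 + df, 61.0)) for df in (0, 1e6, 2e6)]
print([round(m, 3) for m in mags])
```

A non-uniform line would break this constancy, which is why the sweep is a useful sanity check.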

Cheers,
Tom


Chuck August 9th 07 08:40 PM

measuring cable loss
 
K7ITM wrote:


It will be valid if the Z0 of the line is uniform, and matches the
calibration of the instrument you use to measure it.


SNIP

It may be worth adding that even when the line is neither uniform nor
matched to the impedance of the RLB, the measured return loss will
correctly indicate the sum of losses due to the mismatch and to the
line losses.

When the line impedance is uniform, the mismatch loss can be simply
calculated and the cable loss can then be found.

73,

Chuck


Jim Lux August 10th 07 12:18 AM

measuring cable loss
 
Jimmie D wrote:
I need to measure the loss of approximately 200 ft of coax at a frequency of 1 GHz.
The normal procedure for doing this is to inject a signal at one end and
measure the power out at the other. Using available test equipment this is a
real pain to do. I propose to disconnect the cable at the top of the tower,
terminating it in either a short or an open, and measure the return loss at the
source end. I have done this and measured 6.75 dB, and I am assuming that 1/2
of this would be the actual loss of the cable. These numbers do fall within
the established norms for this cable. Can you think of a reason this method
would not be valid?


Sounds right...

Send the signal up, have a loss of 3.375 dB, all of it reflects back
from either short or open, another 3.375 dB loss, so the reflected
signal is down 6.75dB.
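Jim's round-trip arithmetic can be sketched in a few lines (Python here purely for illustration; the 6.75 dB figure is Jimmie's measurement):

```python
# Half-return-loss method: with a short or open at the far end, |rho| = 1,
# so everything reaching the top reflects; the echo seen at the source has
# crossed the cable twice, so measured return loss is twice the one-way loss.
measured_return_loss_db = 6.75            # Jimmie's measured value
one_way_loss_db = measured_return_loss_db / 2
print(one_way_loss_db)                    # 3.375 dB
```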


If you wanted to get real fancy, you could terminate in a known mismatch
too..

But that's getting up towards doing port cals on a VNA.

I assume you're not looking for tenth of a dB precision?

Jim

Jimmie D August 10th 07 01:49 AM

measuring cable loss
 


I assume you're not looking for tenth of a dB precision?

Jim


Actually, yes I am. Power must be maintained +/- 1 dB at the antenna.

Jimmie



Jerry Martes August 10th 07 02:36 AM

measuring cable loss
 

"Jimmie D" wrote in message
...


I assume you're not looking for tenth of a dB precision?

Jim


Actually, yes I am. Power must be maintained +/- 1 dB at the antenna.

Jimmie


Hi Jimmie

What test equipment are you using to record the 6.75 dB?

Jerry



Jim Lux August 10th 07 02:48 AM

measuring cable loss
 
Jimmie D wrote:
I assume you're not looking for tenth of a dB precision?

Jim



Actually, yes I am. Power must be maintained +/- 1 dB at the antenna.


You've got a bit of a challenge, then... although +/- 1 dB (is that a 1
sigma, a 3 sigma, or an absolute max/min spec?) might not require a
tenth of a dB precision.


1 dB is about 26%
1% is about 0.04 dB
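Those conversions follow directly from the dB definition; a minimal sketch (Python, purely illustrative):

```python
import math

def db_to_fraction(db):
    """Fractional power change corresponding to a dB change."""
    return 10 ** (db / 10) - 1

def fraction_to_db(frac):
    """dB change corresponding to a fractional power change."""
    return 10 * math.log10(1 + frac)

print(round(db_to_fraction(1.0) * 100, 1))   # 1 dB is ~25.9 %
print(round(fraction_to_db(0.01), 3))        # 1 % is ~0.043 dB
```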

(measuring power at 1 GHz to 0.1dB absolute is moderately challenging,
especially outdoors) For reference, an Agilent E4418 is specified at
+/-0.6% (25C +/- 10 degrees).. plus you have a linearity spec which can
range from 1% to 4% depending on the relative levels of the reference
and unknown.

A good return loss measurement with a decent PNA (like an E8363) should
get you down in the sub 0.1dB transmission measurement with overall loss
in the 0 to 20dB range, so the measurement is clearly feasible at some
level.

The same piece of gear, measuring reflection coefficient (i.e. put a
short or open at the other end, measure mag(rho), and work back to
loss)... you said you have about 6dB return loss, so that's a reflection
coefficient (at the analyzer) of about 0.5, and for 2GHz, the
uncertainty would be about 0.01 (out of the 0.5), or, call it 2%...
again, about a 0.1 dB uncertainty.

OTOH, that's a $50K piece of test gear, sitting in a lab at 25C +/- 1C

There's also the temperature coefficient of the coax to worry about.
Copper has a temperature coefficient of 0.4%/degree C. A 10 degree
change in temperature is a 4% change in resistance (0.2dB), and the
resistance is a big part of the loss (dielectric loss changes
differently, and you'd have to worry about the dimensional changes too).

In any case, measuring the loss by terminating it in a reflection is
probably the easiest way, and potentially the most precise, because you
can have the source and the measurement at the same location. If you
tried to measure it by transmission loss (put the source at one end and
the detector at the other) you have the problem of the stability of the
source. In a bridge type scheme (which the reflection technique is) you
can essentially compare between the unknown (your cable) and a standard,
and adjust the standard until they match, so the variations in the power
level of the source cancel out (or use something that inherently
measures the ratio of the powers).

Something like the LP-100 wattmeter can probably make the measurement.
It's good to 5% typical, and can do ratioed/match measurements to much
better. I don't know if it can go to 1 GHz, though.


Something like the Anritsu SiteMaster (like the S311D) can do this for
sure (after all, it's what it was designed to do: measure coax on towers)
http://www.us.anritsu.com/downloads/...1410-00419.pdf

If you need to measure loss on the fly, it's a bit trickier, but one way
is to put a deliberate small mismatch at the end (i.e. you put a 10 dB
directional coupler in the line at the antenna end, with the coupled
port terminated into a short). This reflects a known -20dB back down
the line. You look for changes in the amount of reflected power.
Obviously, if the antenna changes its reflection, you have to separate
that out. There are clever techniques for this too (like having the
coupler terminate in a switch that is either a load or a short). This
kind of thing is pretty common on antenna measurement ranges, where you
need to remove the effects of the feed cable from the measurement.
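The pilot-reflection levels Jim describes work out like this (Python, illustrative numbers; the 3.375 dB one-way loss is the figure from earlier in the thread):

```python
# A 10 dB coupler with its coupled port shorted: the signal couples off
# (-10 dB), reflects from the short, and couples back (-10 dB), so the
# deliberate echo sits 2x the coupling factor below the incident wave.
coupling_db = 10.0
cable_one_way_db = 3.375

echo_at_coupler_db = -2 * coupling_db                    # -20 dB at the antenna end
echo_at_tx_db = echo_at_coupler_db - 2 * cable_one_way_db  # echo also crosses the line twice
print(echo_at_tx_db)                                     # -26.75 dB below incident, seen at tx
```

Watching this fixed echo over time reveals changes in line loss without climbing the tower, which is Jim's point.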




Jimmie



Owen Duffy August 10th 07 05:04 AM

measuring cable loss
 
Jim Lux wrote in news:f9gg4i$7q9$1
@nntp1.jpl.nasa.gov:

....

Jim good points and all noted.

Jimmie hasn't given a lot of detail about the specification he is
apparently trying to meet. Reading between the lines, it might be an
EIRP, and assuming a given antenna gain, he is trying to calculate the
permitted transmitter power output.

Not only is the uncertainty of practical service equipment an issue in
tenth dB accuracy, but no mention has been made of transmission line loss
under mismatch conditions, and mismatch loss.

Jimmie, if you have a plausible story to tell the regulator, then that
might suffice.

If you have assessed the Return Loss of a rho=1 termination, then you
could use that and the measured Forward and Reverse power using say a
Bird 43 at the transmitter end of that known line loss (being half the
return loss) to calculate the power absorbed by the load. The calculator
at http://www.vk1od.net/tl/vswrc.php does just that. The calculator at
http://www.vk1od.net/tl/tllc.php could be used to calculate the expected
RL of the o/c or s/c line section, just specify a load impedance of 1e6
or 1e-6 for each case. For example, at 1GHz, the RL of 200' LDF4-50A with
a 1e-6 load is 8.9dB, and if you got much higher than that, you might
suspect the cable to be faulty.
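Owen's expected-RL figure can be reproduced from the cable's matched-loss spec. A sketch, assuming roughly 2.2 dB/100 ft for LDF4-50A at 1 GHz (a datasheet figure to verify, not taken from the thread):

```python
# Expected return loss at the input of a run terminated in |rho| = 1
# (open or short): the echo is attenuated by twice the one-way matched loss.
def expected_rl_db(atten_db_per_100ft, length_ft):
    one_way_db = atten_db_per_100ft * length_ft / 100.0
    return 2 * one_way_db

rl = expected_rl_db(2.2, 200)     # assumed LDF4-50A figure at 1 GHz
print(round(rl, 1))               # close to Owen's 8.9 dB for this cable
```

A measured RL much higher than this (weaker echo) suggests excess loss, i.e. a faulty cable, as Owen notes.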

Tenths of a dB, remember that most service type power meters are probably
good for 6% to 10% of FSD, so I will go with Jim's 1dB accuracy.

BTW, directional wattmeters for the ham market are often not capable of
reasonable accuracy on loads other than the nominal 50 ohm load. There
are a range of tests that such an instrument should satisfy, but for
hams, it is usually considered sufficient if the "reflected" reading is
approximately zero on a 50 ohm load.

Owen

Jimmie D August 10th 07 05:51 AM

measuring cable loss
 

"Jim Lux" wrote in message
...
Jimmie D wrote:
I assume you're not looking for tenth of a dB precision?

Jim



Actually, yes I am. Power must be maintained +/- 1 dB at the antenna.


You've got a bit of a challenge, then... although +/- 1 dB (is that a 1
sigma, a 3 sigma, or an absolute max/min spec?) might not require a tenth
of a dB precision.


1 dB is about 26%
1% is about 0.04 dB

(measuring power at 1 GHz to 0.1dB absolute is moderately challenging,
especially outdoors) For reference, an Agilent E4418 is specified at
+/-0.6% (25C +/- 10 degrees).. plus you have a linearity spec which can
range from 1% to 4% depending on the relative levels of the reference and
unknown.

A good return loss measurement with a decent PNA (like an E8363) should
get you down in the sub 0.1dB transmission measurement with overall loss
in the 0 to 20dB range, so the measurement is clearly feasible at some
level.

The same piece of gear, measuring reflection coefficient (i.e. put a
short or open at the other end, measure mag(rho), and work back to
loss)... you said you have about 6dB return loss, so that's a reflection
coefficient (at the analyzer) of about 0.5, and for 2GHz, the uncertainty
would be about 0.01 (out of the 0.5), or, call it 2%... again, about a 0.1
dB uncertainty.

OTOH, that's a $50K piece of test gear, sitting in a lab at 25C +/- 1C

There's also the temperature coefficient of the coax to worry about.
Copper has a temperature coefficient of 0.4%/degree C. A 10 degree change
in temperature is a 4% change in resistance (0.2dB), and the resistance is
a big part of the loss (dielectric loss changes differently, and you'd
have to worry about the dimensional changes too).

In any case, measuring the loss by terminating it in a reflection is
probably the easiest way, and potentially the most precise, because you
can have the source and the measurement at the same location. If you
tried to measure it by transmission loss (put the source at one end and
the detector at the other) you have the problem of the stability of the
source. In a bridge type scheme (which the reflection technique is) you
can essentially compare between the unknown (your cable) and a standard,
and adjust the standard until they match, so the variations in the power
level of the source cancel out (or use something that inherently measures
the ratio of the powers).

Something like the LP-100 wattmeter can probably make the measurement.
It's good to 5% typical, and can do ratioed/match measurements to much
better. I don't know if it can go to 1 GHz, though.


Something like the Anritsu SiteMaster (like the S311D) can do this for
sure (after all, it's what it was designed to do: measure coax on towers)
http://www.us.anritsu.com/downloads/...1410-00419.pdf

If you need to measure loss on the fly, it's a bit trickier, but one way
is to put a deliberate small mismatch at the end (i.e. you put a 10 dB
directional coupler in the line at the antenna end, with the coupled port
terminated into a short). This reflects a known -20dB back down the line.
You look for changes in the amount of reflected power. Obviously, if the
antenna changes its reflection, you have to separate that out. There are
clever techniques for this too (like having the coupler terminate in a
switch that is either a load or a short). This kind of thing is pretty
common on antenna measurement ranges, where you need to remove the effects
of the feed cable from the measurement.




Jimmie



Sounds like using my network analyser to measure return loss at the TX in an
environmentally stabilized building is going to be a lot better than taking
my HP power meter up the tower in the middle of the night to measure
the power level at the end of the cable.


Jimmie



Jimmie D August 10th 07 06:17 AM

measuring cable loss
 

"Owen Duffy" wrote in message
...
Jim Lux wrote in news:f9gg4i$7q9$1
@nntp1.jpl.nasa.gov:

...

Jim good points and all noted.

Jimmie hasn't given a lot of detail about the specification he is
apparently trying to meet. Reading between the lines, it might be an
EIRP, and assuming a given antenna gain, he is trying to calculate the
permitted transmitter power output.

Not only is the uncertainty of practical service equipment an issue in
tenth dB accuracy, but no mention has been made of transmission line loss
under mismatch conditions, and mismatch loss.

Jimmie, if you have a plausible story to tell the regulator, then that
might suffice.

If you have assessed the Return Loss of a rho=1 termination, then you
could use that and the measured Forward and Reverse power using say a
Bird 43 at the transmitter end of that known line loss (being half the
return loss) to calculate the power absorbed by the load. The calculator
at http://www.vk1od.net/tl/vswrc.php does just that. The calculator at
http://www.vk1od.net/tl/tllc.php could be used to calculate the expected
RL of the o/c or s/c line section, just specify a load impedance of 1e6
or 1e-6 for each case. For example, at 1GHz, the RL of 200' LDF4-50A with
a 1e-6 load is 8.9dB, and if you got much higher than that, you might
suspect the cable to be faulty.

Tenths of a dB, remember that most service type power meters are probably
good for 6% to 10% of FSD, so I will go with Jim's 1dB accuracy.

BTW, directional wattmeters for the ham market are often not capable of
reasonable accuracy on loads other than the nominal 50 ohm load. There
are a range of tests that such an instrument should satisfy, but for
hams, it is usually considered sufficient if the "reflected" reading is
approximately zero on a 50 ohm load.

Owen


I think I have given enough info, but I will try to express it another way.
Power delivered to the antenna must be maintained within +/- 1 dB; in this
case that power is 100 watts. Power is normally checked at the TX and
recorded, after allowing for line loss, as "power at the antenna". Power
checks are done on a weekly basis. Once a year the line loss is measured and
this value is subtracted from the power at the transmitter for the rest of
the year. With this in mind it would be most prudent to measure the cable
loss accurately to establish the annual benchmark.

Considering the test equipment I have available (an Agilent network analyzer
to use in a temperature-stabilized building, or an old HP power meter at the
top of the tower), I am thinking that measuring rho of the cable while
terminated in a short may be the more accurate way to go.


Jimmie



Owen Duffy August 10th 07 06:51 AM

measuring cable loss
 
"Jimmie D" wrote in
:

I think I have given enough info, but I will try to express it another
way. Power delivered to the antenna must be maintained within +/- 1 dB;
in this case that power is 100 watts. Power is normally checked at the
TX and recorded, after allowing for line loss, as "power at the
antenna". Power checks are done on a weekly basis. Once a year the line
loss is measured and this value is subtracted from the power at the
transmitter for the rest of the year. With this in mind it would be most
prudent to measure the cable loss accurately to establish the annual
benchmark.


Ok.

You haven't mentioned how you intend to deal with the likely case where
VSWR > 1.

Considering the test equipment I have available (an Agilent network
analyzer to use in a temperature-stabilized building, or an old HP power
meter at the top of the tower), I am thinking that measuring rho of the
cable while terminated in a short may be the more accurate way to go.


Yes, especially if the NA is calibrated against o/c, s/c and Zo locally.

If for example, you discover that the one way matched line loss is
6.75/2dB (3.375dB), and you measure the fwd and reflected power at the tx
end to be say 200W and 15W, you could use the calculator I mentioned to
determine that the VSWR at the antenna was 1.36. Using that, and setting
forward power at the antenna to 102W for a net power to the antenna of
99.6W, forward and reflected at the tx end of the line should be 222W and
1.1W.

Of course, if the line was perfectly matched, you could just set the tx
end forward power to 100*10^(3.375/10) or 217W and reflected would be
zero... but that is unlikely. You could take the easier, simpler and
conservative way out and just set it for 217W forward irrespective of
mismatch.
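The correction Owen describes reduces to attenuating the forward reading one way and amplifying the reflected reading the other way. A minimal sketch (Python; the matched 217 W case is from the post, the function name is ours):

```python
# Net power delivered to the antenna, from forward/reflected power read at
# the transmitter end and the one-way matched line loss: the forward wave
# is attenuated going up, while the reflected wave was larger at the
# antenna than the tx-end meter shows.
def power_at_antenna_w(fwd_tx_w, refl_tx_w, one_way_loss_db):
    k = 10 ** (one_way_loss_db / 10)       # one-way power loss factor
    return fwd_tx_w / k - refl_tx_w * k

# Matched antenna: 217 W forward at the tx end, 3.375 dB one-way loss.
p = power_at_antenna_w(217.0, 0.0, 3.375)
print(round(p, 1))                         # ~99.8 W at the antenna
```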

It is splitting hairs, but sometimes precision in the method distracts
attention from accuracy!

Owen




Jim Lux August 10th 07 04:51 PM

measuring cable loss
 

BTW, directional wattmeters for the ham market are often not capable of
reasonable accuracy on loads other than the nominal 50 ohm load. There
are a range of tests that such an instrument should satisfy, but for
hams, it is usually considered sufficient if the "reflected" reading is
approximately zero on a 50 ohm load.


I should think, though, that one could calibrate such a
reflectometer/directional wattmeter. That is, you could test it with a
suitable variety of source and load impedances and develop a fairly
simple arithmetic correction that would be accurate.

The interesting question might be whether you could unambiguously take a
particular fwd and rev reading and turn that into a true fwd and true
rev, essentially solving for the mismatch.

Down in the lab here at work we have a whole rack of precision
misterminations (1.1:1, 1.2:1, 1.5:1, etc.) that some talented engineer
built and calibrated some decades ago. They're built on the Maury
bluedot N terminations.



Owen


Jim Lux August 10th 07 04:52 PM

measuring cable loss
 

Sounds like using my network analyser to measure return loss at the TX in an
environmentally stabilized building is going to be a lot better than taking
my HP power meter up the tower in the middle of the night to measure
the power level at the end of the cable.


you betcha..

But you still have the tempco of the cable to agonize about.

Owen Duffy August 10th 07 10:28 PM

measuring cable loss
 
Jim Lux wrote in news:f9i1i3$8v5$1
@nntp1.jpl.nasa.gov:


BTW, directional wattmeters for the ham market are often not capable of
reasonable accuracy on loads other than the nominal 50 ohm load. There
are a range of tests that such an instrument should satisfy, but for
hams, it is usually considered sufficient if the "reflected" reading is
approximately zero on a 50 ohm load.


I should think, though, that one could calibrate such a
reflectometer/directional wattmeter. That is, you could test it with a
suitable variety of source and load impedances and develop a fairly
simple arithmetic correction that would be accurate.


Yes Jim, some of the deficiencies of the instrument fall to things like
an equal response from the separate forward and reverse couplers. Scale
shape is an issue (especially where the sensitivity is continuously
adjustable using a pot). Phase and amplitude response of the coupler over
the frequency range is another issue not so readily calibrated out. A
coupler that is long will underestimate rho, and some couplers insert
more mismatch than they pretend to measure.

In my experience, many of the instruments that are claimed to work up to
144MHz band might well indicate close to 1:1 on a dummy load, but they do
not indicate rho=1 on a s/c or o/c. Whilst they may serve their purpose
as a null indicator on a 50 ohm load, they are not suited to the loss
measurement such as Jimmie is performing.


The interesting question might be whether you could unambiguously take a
particular fwd and rev reading and turn that into a true fwd and true
rev, essentially solving for the mismatch.


I don't think you can compensate for lack of f/b ratio in the coupler,
for example because the coupled lines are too long.


Down in the lab here at work we have a whole rack of precision
misterminations (1.1:1, 1.2:1, 1.5:1, etc.) that some talented engineer
built and calibrated some decades ago. They're built on the Maury
bluedot N terminations.


I have always thought that a budget-priced set of mismatches would be real
handy, and have wondered why MFJ (or someone else for that matter) doesn't
offer a set for checking / calibration of the MFJ259B etc.

Owen

K7ITM August 11th 07 12:17 AM

measuring cable loss
 
On Aug 9, 10:17 pm, "Jimmie D" wrote:
"Owen Duffy" wrote in message

...



Jim Lux wrote in news:f9gg4i$7q9$1
@nntp1.jpl.nasa.gov:


...


Jim good points and all noted.


Jimmie hasn't given a lot of detail about the specification he is
apparently trying to meet. Reading between the lines, it might be an
EIRP, and assuming a given antenna gain, he is trying to calculate the
permitted transmitter power output.


Not only is the uncertainty of practical service equipment an issue in
tenth dB accuracy, but no mention has been made of transmission line loss
under mismatch conditions, and mismatch loss.


Jimmie, if you have a plausible story to tell the regulator, then that
might suffice.


If you have assessed the Return Loss of a rho=1 termination, then you
could use that and the measured Forward and Reverse power using say a
Bird 43 at the transmitter end of that known line loss (being half the
return loss) to calculate the power absorbed by the load. The calculator
at http://www.vk1od.net/tl/vswrc.php does just that. The calculator at
http://www.vk1od.net/tl/tllc.php could be used to calculate the expected
RL of the o/c or s/c line section, just specify a load impedance of 1e6
or 1e-6 for each case. For example, at 1GHz, the RL of 200' LDF4-50A with
a 1e-6 load is 8.9dB, and if you got much higher than that, you might
suspect the cable to be faulty.


Tenths of a dB, remember that most service type power meters are probably
good for 6% to 10% of FSD, so I will go with Jim's 1dB accuracy.


BTW, directional wattmeters for the ham market are often not capable of
reasonable accuracy on loads other than the nominal 50 ohm load. There
are a range of tests that such an instrument should satisfy, but for
hams, it is usually considered sufficient if the "reflected" reading is
approximately zero on a 50 ohm load.


Owen


I think I have given enough info, but I will try to express it another way.
Power delivered to the antenna must be maintained within +/- 1 dB; in this
case that power is 100 watts. Power is normally checked at the TX and
recorded, after allowing for line loss, as "power at the antenna". Power
checks are done on a weekly basis. Once a year the line loss is measured and
this value is subtracted from the power at the transmitter for the rest of
the year. With this in mind it would be most prudent to measure the cable
loss accurately to establish the annual benchmark.

Considering the test equipment I have available (an Agilent network analyzer
to use in a temperature-stabilized building, or an old HP power meter at the
top of the tower), I am thinking that measuring rho of the cable while
terminated in a short may be the more accurate way to go.

Jimmie


As I mentioned before, be sure the cable is really 50 ohm (assuming
your instruments are calibrated to 50 ohms), or at least determine
what it is. Make your rho measurement; at that length of line, you
can adjust the frequency of measurement over a small range and get
values for rho at angles of 0 degrees and at 180 degrees. I will
assume that the cable is 50 ohms and the cable attenuation changes
practically none between the two readings, so the readings will be the
same. Now without changing anything, measure an attenuator with
nearly the same attenuation you think the cable has, also
open-circuited/shorted at the output. If the attenuator has the same
attenuation as the line, you should get the same value. You can then
have that attenuator calibrated at 1GHz to make sure it's correct.
Because your measured attenuation is twice the line attenuation, you
will get the line within 1dB if the measurement is within 2dB. It
shouldn't be very expensive to get a couple attenuators that would
bracket the line loss, and have them calibrated, and expect that they
would hold the calibration for a relatively long time if they aren't
mistreated. Seems like we never see much variation from one cal to
the next of decent attenuators.

As Jim noted, beware of environmental changes. I don't think that
dimensional changes will much matter, but the copper resistance will,
some. The effect, though, is not nearly as much as Jim suggested,
because of skin effect: a 1 degree C change causes the DC resistance
to change by 0.4%, but the AC resistance changes by only 0.2%. Since
the dB attenuation due to copper resistance is linear with resistance,
if the line attenuation is about 3.5dB, you'd need a 10% change in AC
resistance to see an 0.35dB change in attenuation. That's a 50 degree
C change, perhaps worth worrying about if you're in an extreme
climate. Looking at it another way, it's about 0.007dB/degree C.
It's probably worth making a point to measure the line loss at or near
the temperature extremes it experiences, though that would mean
climbing the tower at a couple times you might least like to. Be sure
moisture doesn't get into the line!
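Tom's sensitivity estimate can be checked directly (Python, illustrative):

```python
# Skin effect halves copper's ~0.4 %/degC DC tempco in a coax's attenuation,
# and dB of copper loss scale linearly with AC resistance.
one_way_copper_loss_db = 3.5
ac_tempco_per_degC = 0.002                 # 0.2 % per degree C

sens_db_per_degC = one_way_copper_loss_db * ac_tempco_per_degC
print(round(sens_db_per_degC, 4))          # about 0.007 dB per degree C

delta_50C_db = sens_db_per_degC * 50
print(round(delta_50C_db, 3))              # about 0.35 dB over a 50 degC swing
```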

Cheers,
Tom


K7ITM August 11th 07 12:27 AM

measuring cable loss
 
On Aug 10, 2:28 pm, Owen Duffy wrote:
....
I don't think you can compensate for lack of f/b ratio in the coupler,
for example because the coupled lines are too long.

....
I'm curious what you mean by that, Owen...

Cheers,
Tom


Owen Duffy August 11th 07 01:29 AM

measuring cable loss
 
K7ITM wrote in news:1186788470.852002.260460
@b79g2000hse.googlegroups.com:

On Aug 10, 2:28 pm, Owen Duffy wrote:
...
I don't think you can compensate for lack of f/b ratio in the coupler,
for example because the coupled lines are too long.

...
I'm curious what you mean by that, Owen...


Tom, I was thinking of several instruments, all of the coupled lines type
of construction, that on a s/c and o/c failed to indicate rho=1, and showed
similar readings when physically reversed, suggesting it was not just a fwd
/ rev matching issue, there was something about the coupler that was too
dependent on the location of the SWR pattern relative to the coupler. Since
they worked better at lower frequencies, the length of the coupler was
likely to be a contribution.

Owen


Jimmie D August 11th 07 05:18 AM

measuring cable loss
 

"Owen Duffy" wrote in message
...
K7ITM wrote in news:1186788470.852002.260460
@b79g2000hse.googlegroups.com:

On Aug 10, 2:28 pm, Owen Duffy wrote:
...
I don't think you can compensate for lack of f/b ratio in the coupler,
for example because the coupled lines are too long.

...
I'm curious what you mean by that, Owen...


Tom, I was thinking of several instruments, all of the coupled-lines type
of construction, that on a s/c and o/c failed to indicate rho=1, and
showed similar readings when physically reversed, suggesting it was not
just a fwd/rev matching issue; there was something about the coupler that
was too dependent on the location of the SWR pattern relative to the
coupler. Since they worked better at lower frequencies, the length of the
coupler was likely to be a contribution.

Owen


Would this be a problem for a directional coupler designed for a specific
frequency?


Jimmie



Owen Duffy August 11th 07 05:55 AM

measuring cable loss
 
"Jimmie D" wrote in
:


"Owen Duffy" wrote in message
...
K7ITM wrote in news:1186788470.852002.260460
@b79g2000hse.googlegroups.com:

On Aug 10, 2:28 pm, Owen Duffy wrote:
...
I don't think you can compensate for lack of f/b ratio in the
coupler, for example because the coupled lines are too long.
...
I'm curious what you mean by that, Owen...


Tom, I was thinking of several instruments, all of the coupled-lines
type of construction, that on a s/c and o/c failed to indicate rho=1,
and showed similar readings when physically reversed, suggesting it was
not just a fwd/rev matching issue; there was something about the coupler
that was too dependent on the location of the SWR pattern relative to
the coupler. Since they worked better at lower frequencies, the length
of the coupler was likely to be a contribution.

Owen


Would this be a problem for a directional coupler designed for a
specific frequency?


Jimmie


Jimmie, I am talking about the el-cheapo inline SWR / power meter that is
often sold to hams with unrealistic specs.

You can / should always test the performance of the kit you are using to
determine if you should have confidence in it.

There are a bunch of notes on testing a directional wattmeter in the
article at http://www.vk1od.net/VSWR/VSWRMeter.htm .

BTW, for your purposes, if you had a Bird 43 with an element that read
upscale on fwd power (250W element for your application), it is all you
should need to form a reasonable estimate of line loss and set the
transmitter to deliver 100W to the antenna. You might need a smaller slug
to make a measurement of RL on a s/c or o/c termination.

Owen
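For anyone following the arithmetic here: with a short or open on the far end, the wave crosses the cable twice, so the one-way matched-line loss is roughly half the measured return loss. A sketch in Python, using the 6.75 dB figure from the thread; the `return_loss_db` helper just shows how the same number falls out of forward/reverse power readings (e.g. from a Bird 43):

```python
import math

def one_way_loss_db(return_loss_db):
    """Matched-line loss from return loss measured into a shorted or open
    far end. The signal traverses the cable twice, so halve the figure."""
    return return_loss_db / 2.0

def return_loss_db(p_fwd_w, p_rev_w):
    """Return loss from forward and reverse power readings."""
    return 10 * math.log10(p_fwd_w / p_rev_w)

print(one_way_loss_db(6.75))                 # 3.375 dB one-way
print(round(return_loss_db(100, 21.1), 2))   # 6.76 dB
```

This is only an approximation; as later posts note, it assumes the line is well matched to the instrument and ignores the mismatch-error terms discussed further down the thread.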

K7ITM August 11th 07 06:56 AM

measuring cable loss
 
On Aug 10, 5:29 pm, Owen Duffy wrote:
K7ITM wrote in news:1186788470.852002.260460
@b79g2000hse.googlegroups.com:

On Aug 10, 2:28 pm, Owen Duffy wrote:
...
I don't think you can compensate for lack of f/b ratio in the coupler,
for example because the coupled lines are too long.

...
I'm curious what you mean by that, Owen...


Tom, I was thinking of several instruments, all of the coupled lines type
of construction, that on a s/c and o/c failed to indicate rho=1, and showed
similar readings when physically reversed, suggesting it was not just a fwd
/ rev matching issue, there was something about the coupler that was too
dependent on the location of the SWR pattern relative to the coupler. Since
they worked better at lower frequencies, the length of the coupler was
likely to be a contribution.

Owen


Hi Owen,

I've recently done at least a cursory study of the coupled-line
hybrid, and I found nothing to indicate that directionality is
affected by the line length. In fact, the usual length where it's
practical is 1/4 wave, since that's the length that provides maximum
coupling, and the coupling near that frequency changes only gently
with changes in frequency (falling off on either side). I was
particularly interested in finding that the directionality is
independent of the length, assuming uniform cross-section at least.
If this is in error, I'd really like to know about it, because it
affects something I'm working on.

I'm not sure exactly what sort of bridge is used in microwave network
analyzers; I do know that the ones we build out to a few hundred MHz
use resistive bridges, which are relatively frequency insensitive. (A
key trick is how to read the bridge imbalance without introducing
errors...)

Cheers,
Tom
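Tom's point can be made concrete with the standard TEM coupled-line relations; this is a sketch, with the even/odd-mode impedances (70.7 and 35.35 ohms) chosen purely as an example that yields a matched 50-ohm coupler:

```python
import math

def coupled_line_params(z0e, z0o):
    """System impedance, voltage coupling factor, and midband coupling (dB)
    for a uniform TEM coupled-line section. Coupling peaks when the
    section is a quarter-wave long."""
    z0 = math.sqrt(z0e * z0o)           # termination that matches the ports
    k = (z0e - z0o) / (z0e + z0o)       # max voltage coupling at l = lambda/4
    return z0, k, 20 * math.log10(k)

z0, k, c_db = coupled_line_params(70.7, 35.35)
print(round(z0, 1), round(k, 3), round(c_db, 1))  # 50.0 0.333 -9.5
```

Consistent with Tom's observation, nothing in these relations depends on line length except the coupling magnitude; directionality follows from maintaining a uniform cross-section and correct terminations.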


Owen Duffy August 11th 07 07:13 AM

measuring cable loss
 
K7ITM wrote:

....
If this is in error, I'd really like to know about it, because it
affects something I'm working on.


Interesting findings Tom.

The way I think of these couplers is that you are trying to sample V and
I at a point on the main line, and a longish coupler of that type departs
from that ideal.

The effect I observed, and in several instruments, was obvious and
repeatable. I wonder, if the length of the lines is not the cause,
whether it was the untidiness of the way in which the detector circuit
was implemented at each end of the coupler section. Also of relevance is
that insertion of the instruments caused significant SWR (1.2 in the
case of one of them) at the extreme upper end of their specified range.
IIRC, two of the instruments had no equalisation / compensation; they
had a resistor at one end of the coupled line and a cap/diode at the
other end.

I still have one of the things that did this, and I have since nulled it
for 75 ohms, but I will have a play with it when I get home next week.

Owen

Owen Duffy August 11th 07 09:37 AM

measuring cable loss
 
K7ITM wrote:

....

Tom, for avoidance of doubt, I am not talking about the type of directional
coupler that uses a coupled line and that you would terminate with a matching
load. I am talking about the cheap VSWR meters that have about a 100 mm long
coupled line that is quite tightly coupled, where the resistor at one end of
the line is adjusted to balance the electric field sample with the magnetic
field sample for a null reading with V/I = Zn.

Owen

Jimmie D August 11th 07 12:58 PM

measuring cable loss
 

"Owen Duffy" wrote in message
...
"Jimmie D" wrote in
:


"Owen Duffy" wrote in message
...
K7ITM wrote in news:1186788470.852002.260460
@b79g2000hse.googlegroups.com:

On Aug 10, 2:28 pm, Owen Duffy wrote:
...
I don't think you can compensate for lack of f/b ratio in the
coupler, for example because the coupled lines are too long.
...
I'm curious what you mean by that, Owen...

Tom, I was thinking of several instruments, all of the coupled lines
type of construction, that on a s/c and o/c failed to indicate rho=1,
and showed similar readings when physically reversed, suggesting it was
not just a fwd / rev matching issue; there was something about the
coupler that was too dependent on the location of the SWR pattern
relative to the coupler. Since they worked better at lower frequencies,
the length of the coupler was likely to be a contribution.

Owen


Would this be a problem for a directional coupler designed for a
specific frequency?


Jimmie


Jimmie, I am talking about the el-cheapo inline SWR / power meter that is
often sold to hams with unrealistic specs.

You can / should always test the performance of the kit you are using to
determine if you should have confidence in it.

There are a bunch of notes on testing a directional wattmeter in the
article at http://www.vk1od.net/VSWR/VSWRMeter.htm .

BTW, for your purposes, if you had a Bird 43 with an element that read
upscale on fwd power (250W element for your application), it is all you
should need to form a reasonable estimate of line loss and set the
transmitter to deliver 100W to the antenna. You might need a smaller slug
to make a measurement of RL on a s/c or o/c termination.

Owen


Well, it's a done deal.
Engineering support came out last night and ran the checks for us while I'm
on vacation and recovering from minor surgery, yaaay. They did it the normal
way and by measuring the return loss, and they decided the "return loss
method" worked better. Not sure what "better" means at this point; accurate
enough, easier, and faster would constitute better.


Jimmie



John Ferrell August 11th 07 02:34 PM

measuring cable loss
 
On Thu, 9 Aug 2007 08:13:45 -0400, "Jimmie D"
wrote:

I need to measure the loss of approximately 200 ft of coax at a frequency of
1 GHz. The normal procedure for doing this is to inject a signal at one end
and measure the power out at the other. Using available test equipment this
is a real pain to do. I propose to disconnect the cable at the top of the
tower, terminating it in either a short or open, and measure the return loss
at the source end. I have done this and measured 6.75 dB, and I am assuming
that 1/2 of this would be the actual loss of the cable. These numbers do fall
within the established norms for this cable. Can you think of a reason this
method would not be valid?


Jimmie

This is way too complicated for me!
My solution would be to build/buy an RF probe and permanently mount it
at the top of the tower. Bring a pair of wires (Coax if you want it to
look really professional) down to the bottom and measure it whenever
or even all the time.

Climb tower once.

John Ferrell W8CCW
"Life is easier if you learn to
plow around the stumps"

Inquiry? August 11th 07 03:08 PM

measuring cable loss
 

If the speed of the "drop" is measured in "measurable electricity", then
what is the cycle time of the measuring device in question?

So, in other words: if the measuring device is working at the same speed as
the thing that it is measuring (or faster), then how can it measure what it
is measuring?

Supposedly the cycle time for low voltage is about 1000 times per second,
versus AC, which is 60 times per second ... so is the device you are using
capable of doing the job at all? Just a question.

Here is my Web Site URL ... it might have some interesting solutions - or
Questions?
http://www3.telus.net/public/quark5/home.html






"Jimmie D" wrote in message
...
I need to measure the loss of approximately 200 ft of coax at a frequency of
1 GHz. The normal procedure for doing this is to inject a signal at one end
and measure the power out at the other. Using available test equipment this
is a real pain to do. I propose to disconnect the cable at the top of the
tower, terminating it in either a short or open, and measure the return loss
at the source end. I have done this and measured 6.75 dB, and I am assuming
that 1/2 of this would be the actual loss of the cable. These numbers do fall
within the established norms for this cable. Can you think of a reason this
method would not be valid?


Jimmie





Richard Clark August 11th 07 06:47 PM

measuring cable loss
 
On Sat, 11 Aug 2007 07:58:36 -0400, "Jimmie D"
wrote:

They did it the normal way


Hi Jimmie,

Given the long and winding road to this point, it would give me pause
that suddenly something became "normal." The remainder of your post
is in contradiction to your earlier statement:
On Thu, 9 Aug 2007 08:13:45 -0400, "Jimmie D" wrote:
The normal procedure for doing this is to inject a signal at one end and
measure the power out at the other.

For the sake of clarity (normality aside), what you originally
described Thursday is called "insertion loss."

On Fri, 10 Aug 2007 01:17:31 -0400, "Jimmie D" wrote:
Power delivered to the antenna must be maintained within +/- 1 dB; in this
case that power is 100 watts. Power is normally
checked at the TX and recorded, after allowing for line loss, as "power at
the antenna".

This again defines "insertion loss."

and by measuring the return loss and they decided the "return loss
method" worked better.


A description of the classic self-fulfilling prophecy.

I presume you mean this to be "the normal way," but it doesn't really
describe a method or procedure (a "way"); instead, it describes an
outcome. There are many "ways" to measure a characteristic called
"return loss." Some "ways" are more accurate than others.

Having introduced this term, "insertion loss," there remains one more
term to consider: "reflection loss." This and "return loss" can be
found scaled on the common form of the Smith Chart.

The distinction between these terms is that "return loss" and "reflection
loss" are single-port characteristics (that port being the "load"
which, of course, is NOT the antenna, but rather the line and the
antenna). "Insertion loss" is a two-port characteristic that properly
conforms to your original question.

ALL such losses are defined by the system within which they reside.
This means you have to also characterize the impedances of BOTH the
load and the source. This last requirement is often dismissed in this
forum where the determination of the source's Z is frequently rejected
as being an impossibility (even when it is specified by the equipment
designer).

When Zsource = Zline = Zload, then many complexities are removed. I
have seen others ask you the characteristic Z of the load with no
response by you; and I am certain you have no comfortable assurance
about the Zsource of your transmitter. However, to this last, it
would be immaterial if Zline = Zload.

Not sure what better means at this point. Accurate
enough and easier and faster would constitute better.


This, too, simplifies what is an exceedingly difficult determination
(of "return loss," "reflection loss," or "insertion loss") for the
accuracy you originally suggested. Accurate, easy, and fast are not
normally words used in conjunction except in advertising promotions.

The accuracy of any power determination is related to the known Z of
1. The load;
2. The source;
3. The detector.

At 1 GHz, these determinations are not so easily dismissed as trivial,
nor confirmed by dragging a $20,000 analyzer into the shop. The
analyzer answers the problem of knowing its own source Z, but it does
not answer what that source Z is of the transmitter (again, only a
necessity in the face of returned power).

Now, given no one has actually correlated accuracy to any metric here,
and given that accuracy is determined in large part by the three Zs
above; then a little more discussion is in order. Using only two (the
detector and the load could be interchanged for the simpler analysis):
Zsource = 100 Ohms
Zload = 33.3 Ohms

view in fixed font:

                  1 - Gammaload²
Error = ------------------------------
        (1 ± Gammasource · Gammaload)²

Error = +0.42 dB to -0.78 dB
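The error bounds above can be reproduced numerically. This is a sketch assuming a 50-ohm reference and purely resistive source and load, with the ± standing in for the unknown phase of the reflections; with these rounded Gammas the bounds come out near +0.42 dB and -0.74 dB, close to the figures quoted:

```python
import math

def gamma(z, z0=50.0):
    """Reflection coefficient magnitude of a resistive impedance vs z0."""
    return abs((z - z0) / (z + z0))

def mismatch_error_db(z_source, z_load, z0=50.0):
    """Upper and lower mismatch-error bounds in dB when the phase of the
    reflections (i.e. the line length) is unknown."""
    gs, gl = gamma(z_source, z0), gamma(z_load, z0)
    num = 1.0 - gl**2
    hi = 10 * math.log10(num / (1 - gs * gl) ** 2)  # reflections in phase
    lo = 10 * math.log10(num / (1 + gs * gl) ** 2)  # reflections opposed
    return hi, lo

hi, lo = mismatch_error_db(100.0, 33.3)
print(round(hi, 2), round(lo, 2))  # 0.42 -0.74
```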

These errors are independent of other errors such as instrumentation
error (meter linearity, conversion problems, ...) or operator errors
(reading the meter - a mirrored scale is required to keep this below
5%). Modern instrumentation (if you have the $$$$) solves some of
this; others dismiss it as a trivial concern and rely on name brand
(Bird is frequently uttered to achieve perfection).

Now, as to the variability in the error wholly associated with just
the Zs (providing you can accurately determine them - yes, a game of
infinite regress). The allowable error of 1dB is nearly wiped out
with some very possible characteristics and you haven't even begun
balancing the error budget. With luck (a fictional village where
every armchair technician resides) the error induced by mismatches
could be 0. That luck demands you know the length of the line (again,
with some accuracy - I enjoy the irony here too). The variation built
into the Error computation is from not knowing that length (as is
common, few know this with enough precision in wavelengths). At 1
GHz, the characteristic of
approximately 200 ft of coax

is apocryphal.

73's
Richard Clark, KB7QHC

K7ITM August 11th 07 11:50 PM

measuring cable loss
 
On Aug 11, 1:37 am, Owen Duffy wrote:
K7ITM wrote:

...

Tom, for avoidance of doubt, I am not talking about the type of directional
coupler that uses a coupled line and that you would terminate with a matching
load. I am talking about the cheap VSWR meters that have about a 100 mm long
coupled line that is quite tightly coupled, where the resistor at one end of
the line is adjusted to balance the electric field sample with the magnetic
field sample for a null reading with V/I = Zn.

Owen


Hi Owen,

I'm not sure I see the difference. The load on the cheapie you
describe is just the load required to terminate that line. I have a
freely redistributable field solver program that will calculate the
even and odd mode impedances for you from the geometry and the
dielectric's permittivity, and from those impedances and the length
you can predict the proper termination impedance of both the "through"
and the "coupled" lines, and the coupling at any particular
frequency. It IS a problem if you try to do it in microstrip because
the propagation velocity for the even and odd modes is different, but
in true TEM configurations, I believe the directionality is fine if
you maintain uniform cross-section. Actually, a way that they make
broadband coupled lines is to have a central section tightly coupled,
and another section on each end of that which is less tightly
coupled. You can extend it to 5 sections or more, to get even broader
bandwidth.

Info about them is out there, but it wasn't as easy for me to find as
I figured it would be. ;-)

Cheers,
Tom


Walter Maxwell August 12th 07 02:44 AM

measuring cable loss
 
I haven't read all the posts in this thread, so I'm not sure that the method I'm advancing for measuring line
loss has not already been discussed.

The method I'm suggesting measures the input impedance (R and jX) of the line in question with the opposite
end of the line terminated first with an open circuit and then with a short, at a frequency where the line
will be close to lambda/8. With low-loss lines the R component will be very small, requiring an RF impedance
bridge that can produce accurate values of R in the 0.2 to 2-ohm range. (The General Radio GR-1606A is a
typical example, which, while using this procedure, will yield answers of greater accuracy than the methods
described in the current posts.) With the 1/8wl line the + and - reactances appearing in the measurements will
be approximately the value of the line Zo.

The open and short circuit values are then plugged into a BASIC program that I wrote years ago, which appears
in Chapter 15 of both Reflections 1 and 2. It also appears on my web page at www.w2du.com. The program outputs
the line attenuation, the complex Zo, and the electrical length of the line. The program solves the equations
appearing in Chipman's "Theory and Problems of Transmission Lines," Page 135.

On my web page go to 'View Chapters of Reflections 2' and click on Chapter 15 to see the detailed explanation
of the procedure.

The BASIC program TRANSCON (Transmission-line Constants) is listed there, but to save your having to load the
program from the typed list I will email a copy of the actual operable program to anyone who requests it,
addressing your request to .

I hope this suggestion will prove to be of value.

Walt, W2DU
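The core of Walt's open/short procedure (the Chipman relations, not the TRANSCON listing itself) can be sketched as follows; the synthetic test values are mine, not Walt's. Note that the arctanh branch is only unambiguous for short electrical lengths, which is one reason the measurement is specified near lambda/8:

```python
import cmath

def line_constants(z_oc, z_sc):
    """Complex Zo and gamma*l (propagation constant times length) from the
    input impedances of the line measured with the far end open (z_oc)
    and shorted (z_sc)."""
    z0 = cmath.sqrt(z_oc * z_sc)
    gamma_l = cmath.atanh(cmath.sqrt(z_sc / z_oc))
    atten_db = gamma_l.real * 8.686  # nepers -> dB over the full length
    return z0, gamma_l, atten_db

# Round-trip check with a synthetic lossy 50-ohm line, gamma*l = 0.1 + 1j:
g = 0.1 + 1.0j
z_sc = 50 * cmath.tanh(g)
z_oc = 50 / cmath.tanh(g)
z0, gl, loss = line_constants(z_oc, z_sc)
print(abs(z0), loss)  # recovers Zo = 50 ohms and roughly 0.87 dB of loss
```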



Walter Maxwell August 12th 07 03:09 AM

measuring cable loss
 
On Sat, 11 Aug 2007 21:44:08 -0400, Walter Maxwell wrote:

I haven't read all the posts in this thread, so I'm not sure that the method I'm advancing for measuring line
loss has not already been discussed.

The method I'm suggesting measures the input impedance (R and jX) of the line in question with the opposite
end of the line terminated first with an open circuit and then with a short, at a frequency where the line
will be close to lambda/8. With low-loss lines the R component will be very small, requiring an RF impedance
bridge that can produce accurate values of R in the 0.2 to 2-ohm range. (The General Radio GR-1606A is a
typical example, which, while using this procedure, will yield answers of greater accuracy than the methods
described in the current posts.) With the 1/8wl line the + and - reactances appearing in the measurements will
be approximately the value of the line Zo.

The open and short circuit values are then plugged into a BASIC program that I wrote years ago, which appears
in Chapter 15 of both Reflections 1 and 2. It also appears on my web page at www.w2du.com. The program outputs
the line attenuation, the complex Zo, and the electrical length of the line. The program solves the equations
appearing in Chipman's "Theory and Problems of Transmission Lines," Page 135.

On my web page go to 'View Chapters of Reflections 2' and click on Chapter 15 to see the detailed explanation
of the procedure.

The BASIC program TRANSCON (Transmission-line Constants) is listed there, but to save your having to load the
program from the typed list I will email a copy of the actual operable program to anyone who requests it,
addressing your request to .

I hope this suggestion will prove to be of value.

Walt, W2DU

Some additional information that can make it easier to follow the procedure. The section of Chapter 15
describing the line attenuation measurement procedure is Sec 15.3.1, Calibration of the Feedline, Page 15-3,
Chapter 15, and the table showing the results using the TRANSCON program is Fig 15-1, Page 15-4. The TRANSCON
program listing is on Page 15-22.

Walt, W2DU

Jim Lux August 13th 07 07:35 PM

measuring cable loss
 

Down in the lab here at work we have a whole rack of precision
misterminations (1.1:1, 1.2:1, 1.5:1, etc.) that some talented engineer
built and calibrated some decades ago. They're built on the Maury
bluedot N terminations.



I have always thought that a budget-priced set of mismatches would be real
handy, and have wondered why MFJ (or someone else for that matter) doesn't
offer a set for checking / calibration of the MFJ259B etc.

Owen


I've looked into this, and it's non-trivial to do in small quantities at
a (ham-)reasonable price. My late father-in-law owned a small company
doing video and audio equipment for the broadcast industry, and one of
their larger-selling products was a high quality 75 ohm BNC termination.
However, it took a fair amount of research on his part to find
appropriate components from which to assemble them, and even so, I doubt
his manufacturing cost was low enough to get to a reasonable retail
price for the ham market.


FWIW, 75 ohm terminations are readily available, as a mismatch standard
that might be useful. In F connectors they are available for less than
a dollar, but I'd be a bit leery of their precision (1% DC resistance is
easy to get, but over DC-100 MHz, a bit trickier). Good quality ones
(mechanical and electrical) seem to run about $4-5 each.

In small quantities, it would be hard to make and sell such things for
less than about $5 each (by the time you factor in mfr time, material
costs), and then there would be shipping. You might be able to do
better with selling a set (say, open, short, 50, 25, 75, 100, etc.)
because there would be economies of scale, both in mfr and in
shipping/handling.

Maybe $30/set plus shipping?

I can see all the folks going.. $30, for half a dozen connectors and
resistors? I have a box of connectors out in the shack, and resistors,
and I'll just fire up the soldering iron...

Jim Lux August 13th 07 07:50 PM

measuring cable loss
 
John Ferrell wrote:
On Thu, 9 Aug 2007 08:13:45 -0400, "Jimmie D"
wrote:


I need to measure the loss of approximately 200 ft of coax at a frequency of
1 GHz. The normal procedure for doing this is to inject a signal at one end
and measure the power out at the other. Using available test equipment this
is a real pain to do. I propose to disconnect the cable at the top of the
tower, terminating it in either a short or open, and measure the return loss
at the source end. I have done this and measured 6.75 dB, and I am assuming
that 1/2 of this would be the actual loss of the cable. These numbers do fall
within the established norms for this cable. Can you think of a reason this
method would not be valid?


Jimmie


This is way too complicated for me!
My solution would be to build/buy an RF probe and permanently mount it
at the top of the tower. Bring a pair of wires (Coax if you want it to
look really professional) down to the bottom and measure it whenever
or even all the time.


Considering he needs sub-1 dB accuracy, this is challenging; it would
work if you assume your RF probe never needs calibration and is stable
over the environmental range of interest. Not a trivial thing to do.

A diode and a voltmeter certainly won't do it. (A typical diode detector
might vary 1 dB over a 20 degree C range, judging from the Krytar 700
series data sheet I have sitting here. Granted, that's a microwave
detector (100 MHz to 40 GHz), but I'd expect similar from most other
diodes.) I've given the link to an Agilent app note that describes
various detectors in excruciating detail.

A diode, voltmeter, and temperature sensor might work, though.

useful stuff at
http://rfdesign.com/mag/608RFDF2.pdf
http://cp.literature.agilent.com/lit...5966-0784E.pdf
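Jim's "diode, voltmeter, and temperature sensor" suggestion amounts to applying a calibration correction to the reading. A minimal sketch, where the linear tempco model and its value are purely hypothetical placeholders for a real characterisation of the detector over temperature:

```python
def corrected_dbm(raw_dbm, temp_c, tempco_db_per_degc=0.05, cal_temp_c=25.0):
    """Remove an assumed-linear detector temperature drift from a reading.
    tempco_db_per_degc is a made-up figure; a real detector would be
    characterised over temperature before installation."""
    return raw_dbm - tempco_db_per_degc * (temp_c - cal_temp_c)

print(corrected_dbm(-30.0, 45.0))  # -31.0: reading backed off on a hot day
```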

K7ITM August 13th 07 08:26 PM

measuring cable loss
 
On Aug 13, 11:50 am, Jim Lux wrote:
John Ferrell wrote:
On Thu, 9 Aug 2007 08:13:45 -0400, "Jimmie D"
wrote:


I need to measure the loss of approximately 200 ft of coax at a frequency of
1 GHz. The normal procedure for doing this is to inject a signal at one end
and measure the power out at the other. Using available test equipment this
is a real pain to do. I propose to disconnect the cable at the top of the
tower, terminating it in either a short or open, and measure the return loss
at the source end. I have done this and measured 6.75 dB, and I am assuming
that 1/2 of this would be the actual loss of the cable. These numbers do fall
within the established norms for this cable. Can you think of a reason this
method would not be valid?


Jimmie


This is way too complicated for me!
My solution would be to build/buy an RF probe and permanently mount it
at the top of the tower. Bring a pair of wires (Coax if you want it to
look really professional) down to the bottom and measure it whenever
or even all the time.


Considering he needs sub 1dB accuracy, this is challenging..it would
work if you assume your RF probe never needs calibration and is stable
over the environmental range of interest. Not a trivial thing to do.

A diode and a voltmeter certainly won't do it. (A typical diode detector
might vary 1 dB over a 20 degree C range.. judging from the Krytar 700
series data sheet I have sitting here. Granted that's a microwave
detector (100MHz to 40 GHz), but I'd expect similar from most other
diodes. I've given the link to an Agilent Ap note that describes various
detectors in excruciating detail.

A diode, voltmeter, and temperature sensor might work, though.

useful stuff at
http://rfdesign.com/mag/608RFDF2.pdf
http://cp.literature.agilent.com/litweb/pdf/5966-0784E.pdf


Seems like modern RF detector ICs offer much better stability than
diodes. An AD8302, for example, has a typical +/- 0.25dB variation
from -40C to +85C, with a -30dBm signal level. The temperature
variation could be calibrated before installation; if necessary, an
especially temperature-stable part could be selected from a batch.
Then knowing the ambient within 20C would be sufficient. You'd need
to arrange sampling at a low level, which could be a well-constructed
90 degree hybrid. With two channels in the AD8302, you could even
monitor antenna reflection coefficient (including angle), and be aware
of changes there. Analog Devices and Linear Technology both seem to
be strong in the RF power monitor IC area.

Cheers,
Tom


Jim Lux August 13th 07 09:09 PM

measuring cable loss
 
K7ITM wrote:
On Aug 13, 11:50 am, Jim Lux wrote:

John Ferrell wrote:

On Thu, 9 Aug 2007 08:13:45 -0400, "Jimmie D"
wrote:


I need to measure the loss of approximately 200 ft of coax at a frequency of
1 GHz. The normal procedure for doing this is to inject a signal at one end
and measure the power out at the other. Using available test equipment this
is a real pain to do. I propose to disconnect the cable at the top of the
tower, terminating it in either a short or open, and measure the return loss
at the source end. I have done this and measured 6.75 dB, and I am assuming
that 1/2 of this would be the actual loss of the cable. These numbers do fall
within the established norms for this cable. Can you think of a reason this
method would not be valid?


Jimmie


This is way too complicated for me!
My solution would be to build/buy an RF probe and permanently mount it
at the top of the tower. Bring a pair of wires (Coax if you want it to
look really professional) down to the bottom and measure it whenever
or even all the time.


Considering he needs sub 1dB accuracy, this is challenging..it would
work if you assume your RF probe never needs calibration and is stable
over the environmental range of interest. Not a trivial thing to do.

A diode and a voltmeter certainly won't do it. (A typical diode detector
might vary 1 dB over a 20 degree C range.. judging from the Krytar 700
series data sheet I have sitting here. Granted that's a microwave
detector (100MHz to 40 GHz), but I'd expect similar from most other
diodes. I've given the link to an Agilent Ap note that describes various
detectors in excruciating detail.

A diode, voltmeter, and temperature sensor might work, though.

useful stuff at
http://rfdesign.com/mag/608RFDF2.pdf
http://cp.literature.agilent.com/litweb/pdf/5966-0784E.pdf



Seems like modern RF detector ICs offer much better stability than
diodes. An AD8302, for example, has a typical +/- 0.25dB variation
from -40C to +85C, with a -30dBm signal level.

indeed...
The temperature
variation could be calibrated before installation; if necessary, an
especially temperature-stable part could be selected from a batch.
Then knowing the ambient within 20C would be sufficient. You'd need
to arrange sampling at a low level, which could be a well-constructed
90 degree hybrid.


or, even simpler, what about a resistive tap (or a pair of resistive
taps separated by a short length of transmission line)? If you're
sending, say, 100 W (+50 dBm) up the wire, and you want, say, -30 dBm out,
you need an 80 dB coupler. Or, something like a 50k resistor into a 50
ohm load will be about 60 dB down, and you could put a 10-20 dB pad in
before the detector. Calibration would take care of the coupling ratio,
although you might want to be careful about the tempco of the resistor.
With two channels in the AD8302, you could even
monitor antenna reflection coefficient (including angle), and be aware
of changes there. Analog Devices and Linear Technology both seem to
be strong in the RF power monitor IC area.


Those are truly nifty parts, and form the basis of some very interesting
ham products over the past couple years (like LP-100 vector wattmeter
and various ham-oriented VNAs). What would be very cool is if AD would
combine something like the 8302 and the A/D so it would have a serial
digital output. Pretty close to a powermeter on a chip.

Functionally, this would be close to what you get with a Rohde & Schwarz
NRP series, a Boonton 52000, or an Agilent U2000.

Cheers,
Tom


Richard Clark August 14th 07 06:56 AM

measuring cable loss
 
On Mon, 13 Aug 2007 13:09:09 -0700, Jim Lux
wrote:

Or, something like a 50k resistor into a 50
ohm load will be about 60 dB down,


Hi Jim,

Unlikely.

With parasitic capacitance at a meager 1 pF across the 50K, its Z at
10 MHz would compromise the attenuation, presenting closer to 50 dB
down. At 1 GHz it would plunge like a rock. This, of course, presumes
a 1/4 watt resistor.
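These figures are easy to check numerically. A sketch of the divider ratio versus frequency, idealising the tap as a pure 50k resistance with 1 pF across it, feeding a 50-ohm load:

```python
import cmath
import math

def tap_attenuation_db(freq_hz, r_tap=50e3, c_par=1e-12, r_load=50.0):
    """Attenuation of a series (R || C) tap into a 50-ohm load."""
    zc = 1 / (2j * math.pi * freq_hz * c_par)   # parasitic C across the tap R
    z_series = (r_tap * zc) / (r_tap + zc)
    return 20 * math.log10(abs(r_load / (z_series + r_load)))

for f in (1e3, 10e6, 1e9):
    print(f, round(tap_attenuation_db(f), 1))
# ~ -60 dB at low frequency, only ~ -50 dB at 10 MHz, ~ -10 dB at 1 GHz
```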

A better solution is to use surface mount resistors where the
parasitics are down at 100 aF - but then you will have a frequency
dependent divider unless you can guarantee that the parasitic
capacitance of the 50 Ohm resistor is 100pF (sort of casts us back
into using a 1/4 watt resistor with a padding cap). At 1 GHz, it is
not going to look like a trivial 50K load anymore.

A Pi attenuator will do it better.

73's
Richard Clark, KB7QHC

K7ITM August 14th 07 07:51 AM

measuring cable loss
 
On Aug 13, 1:09 pm, Jim Lux wrote:
K7ITM wrote:
On Aug 13, 11:50 am, Jim Lux wrote:


John Ferrell wrote:


On Thu, 9 Aug 2007 08:13:45 -0400, "Jimmie D"
wrote:


I need to measure the loss of approximately 200 ft of coax at a frequency of
1 GHz. The normal procedure for doing this is to inject a signal at one end
and measure the power out at the other. Using available test equipment this
is a real pain to do. I propose to disconnect the cable at the top of the
tower, terminating it in either a short or open, and measure the return loss
at the source end. I have done this and measured 6.75 dB, and I am assuming
that 1/2 of this would be the actual loss of the cable. These numbers do fall
within the established norms for this cable. Can you think of a reason this
method would not be valid?


Jimmie


This is way too complicated for me!
My solution would be to build/buy an RF probe and permanently mount it
at the top of the tower. Bring a pair of wires (Coax if you want it to
look really professional) down to the bottom and measure it whenever
or even all the time.


Considering he needs sub 1dB accuracy, this is challenging..it would
work if you assume your RF probe never needs calibration and is stable
over the environmental range of interest. Not a trivial thing to do.


A diode and a voltmeter certainly won't do it. (A typical diode detector
might vary 1 dB over a 20 degree C range, judging from the Krytar 700
series data sheet I have sitting here. Granted, that's a microwave
detector (100 MHz to 40 GHz), but I'd expect similar from most other
diodes.) I've given the link to an Agilent app note that describes various
detectors in excruciating detail.


A diode, voltmeter, and temperature sensor might work, though.


useful stuff at http://rfdesign.com/mag/608RFDF2.pdf and http://cp.literature.agilent.com/...


Seems like modern RF detector ICs offer much better stability than
diodes. An AD8302, for example, has a typical +/- 0.25dB variation
from -40C to +85C, with a -30dBm signal level.


indeed...
The temperature
variation could be calibrated before installation; if necessary, an
especially temperature-stable part could be selected from a batch.
Then knowing the ambient within 20C would be sufficient. You'd need
to arrange sampling at a low level, which could be a well-constructed
90 degree hybrid.


or, even simpler, what about a resistive tap (or a pair of resistive
taps separated by a short length of transmission line). If you're
sending, say, 100W (+50dBm) up the wire, and you want, say, -30dBm out,
you need an 80 dB coupler. Or, something like a 50k resistor into a 50
ohm load will be about 60 dB down, and you could put a 10-20dB pad in
before the detector. Calibration would take care of the coupling ratio,
although, you might want to be careful about the tempco of the resistor.
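The coupling ratio and power budget above are easy to sanity-check. A Python sketch of the purely resistive case (parasitics deliberately ignored, as discussed elsewhere in the thread; the function name is mine):

```python
import math

def tap_coupling_db(r_tap, z_load):
    """Ideal (purely resistive, no parasitics) coupling of a series
    tap resistor feeding a matched load."""
    return 20 * math.log10(z_load / (r_tap + z_load))

p_through_dbm = 10 * math.log10(100 / 1e-3)          # 100 W = +50 dBm
coupled_dbm = p_through_dbm + tap_coupling_db(50e3, 50)
print(round(tap_coupling_db(50e3, 50), 1))           # -60.0 dB
print(round(coupled_dbm - 20, 1))                    # -30.0 dBm after a 20 dB pad
```

So the 50k-into-50-ohm tap is indeed almost exactly 60 dB down, and a further 10-20 dB pad puts +50 dBm of through power in the -20 to -30 dBm range at the detector.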

....

The OP said this is at 1GHz. It's really tough to get a reliable
resistive divider at 1GHz, with that sort of ratio. Actually, a
capacitive divider probably stands a better chance of working, though
getting that really right isn't trivial. (We used to worry about
variation in humidity and atmospheric pressure affecting the
dielectric constant of air, in using a capacitive sampler...though
admittedly that was for work to a level well beyond 1dB accuracy.)

I am rather fond of the coupled-line hybrid idea: it can be built in
a way that everything stays ratiometric, so the coupling ratio is very
nearly constant over temperature, and of course the directionality
lets you observe things you can't just from monitoring voltage at a
point. It's possible to build one with low coupling without too much
trouble; -60dB coupling isn't out of the question, for sure. I'm
imagining a design I could make reliably with simple machine tools
that would work well for the OP's application: 100 watts at about
1GHz as I recall in the through line, and coupling on the order of
-60dB to get to about -10dBm coupled power and have negligible effect
on the through line. There's a free field-solver software package
that will accurately predict the coupling, and with the right design
and normal machine shop tolerances the coupling and impedance should
be accurate to a fraction of a dB and better than a percent,
respectively. Perhaps I can run some examples to see if I'm off-base
on that, but that's what my mental calculations tell me at the moment.

Cheers,
Tom



K7ITM August 14th 07 08:14 AM

measuring cable loss
 
On Aug 13, 10:56 pm, Richard Clark wrote:
On Mon, 13 Aug 2007 13:09:09 -0700, Jim Lux
wrote:

Or, something like a 50k resistor into a 50
ohm load will be about 60 dB down,


Hi Jim,

Unlikely.

With parasitic capacitance at a meager 1pF across the 50K, its Z at
10MHz would compromise the attenuation presenting closer to 50 dB
down. At 1GHz it would plunge like a rock. This, of course, presumes
a 1/4 watt resistor.

A better solution is to use surface mount resistors where the
parasitics are down at 100aF - but then you will have a frequency
dependent divider unless you can guarantee that the parasitic
capacitance of the 50 Ohm resistor is 100pF (sort of casts us back
into using a 1/4 watt resistor with a padding cap). At 1GHz, it is
not going to look like a trivial 50K load anymore.


100aF??? :-) X(100aF)/X(100pF) = 50k/50 ??? ;-) ;-)


J. B. Wood August 14th 07 12:23 PM

measuring cable loss
 
In article EgEui.4923$MT3.3995@trnddc05, "Jerry Martes"
wrote:


I consider "return loss" to be a ratio related to the mismatch of the load
to the line. A short on the end of a low loss line will have high Return
Loss. You probably did some math that isnt apparent in the statement "I am
assuming that 1/2 (of 6.75 dB) is the actual loss". .


Hello, and you don't have to "consider" what return loss is. At an
interface/boundary it is the ratio of incident power to reflected power.
Mismatch loss is the ratio of incident power to that dissipated in the
load at the interface/boundary. These losses in terms of VSWR are given
by

RL (dB) = 20*log[(S + 1)/(S - 1)]

ML (dB) = 10*log[(S + 1)^2/(4*S)]

where S is the VSWR and logarithms are to base 10.
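For reference, the two formulas above evaluate as follows (a quick Python sketch; function names are mine):

```python
import math

def return_loss_db(s):
    """Return loss (dB) at an interface with VSWR s (s > 1)."""
    return 20 * math.log10((s + 1) / (s - 1))

def mismatch_loss_db(s):
    """Mismatch loss (dB) at an interface with VSWR s."""
    return 10 * math.log10((s + 1) ** 2 / (4 * s))

print(round(return_loss_db(2.0), 2))    # 9.54 dB for a 2:1 VSWR
print(round(mismatch_loss_db(2.0), 2))  # 0.51 dB
```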

A lossless transmission line fed at one end and ideally short-circuited on
the other end would display a feedpoint impedance that is totally reactive
(no resistive component). If a resistive component is present it must be
due to dissipative loss in the line, and since power has to travel to the
load (short) and return to the feedpoint, the measured loss corresponds to
twice the one-way dissipative loss of the line.
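Applied to the measurement that started the thread, that round-trip argument is just a division by two (a sketch; the 6.75 dB figure is the OP's measured value):

```python
# With the far end shorted (or open), essentially all the incident power
# is reflected, so the measured return loss is the round-trip cable loss
# and the one-way loss is half of it.
measured_rl_db = 6.75               # the OP's measured return loss
one_way_loss_db = measured_rl_db / 2
print(one_way_loss_db)              # 3.375 dB one-way at 1 GHz
```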

The challenge here is, given a transmission line of certain physical
length, to find a measurable value at the operating frequency (or frequencies). An RF
signal source with a surplus (but in proper operating order) General Radio
(Genrad) impedance bridge is good for this type of measurement. Keep in
mind that any coupling from the line to nearby structures will affect the
measurement. Sincerely, and 73s from N4GGO,

John Wood (Code 5550) e-mail:
Naval Research Laboratory
4555 Overlook Avenue, SW
Washington, DC 20375-5337

Jim Lux August 14th 07 04:44 PM

measuring cable loss
 
Richard Clark wrote:
On Mon, 13 Aug 2007 13:09:09 -0700, Jim Lux
wrote:


Or, something like a 50k resistor into a 50
ohm load will be about 60 dB down,



Hi Jim,

Unlikely.

With parasitic capacitance at a meager 1pF across the 50K, its Z at
10MHz would compromise the attenuation presenting closer to 50 dB
down. At 1GHz it would plunge like a rock. This, of course, presumes
a 1/4 watt resistor.

A better solution is to use surface mount resistors where the
parasitics are down at 100aF - but then you will have a frequency
dependent divider unless you can guarantee that the parasitic
capacitance of the 50 Ohm resistor is 100pF (sort of casts us back
into using a 1/4 watt resistor with a padding cap). At 1GHz, it is
not going to look like a trivial 50K load anymore.

A Pi attenuator will do it better.


A resistive 30dB tap into a 30 dB pi attenuator, or something like that?
That would get the resistor in the tap down to a reasonable value,
and, as you point out, at 1 GHz layout and component selection would be
critical.
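For reference, the standard symmetric pi-pad design equations give the following values (a Python sketch; the function name is mine, and K = 10^(dB/20)):

```python
def pi_pad(atten_db, z0=50.0):
    """Resistor values for a symmetric pi attenuator: (shunt, series)."""
    k = 10 ** (atten_db / 20)
    r_shunt = z0 * (k + 1) / (k - 1)     # each of the two shunt legs
    r_series = z0 * (k * k - 1) / (2 * k)
    return r_shunt, r_series

shunt, series = pi_pad(30)
print(round(shunt, 1), round(series, 1))   # 53.3 789.8 (ohms)
```

A 30 dB pi pad in a 50 ohm system works out to roughly 53 ohm shunt legs and a 790 ohm series element, all values that are easy to realize in surface mount at 1 GHz.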

I suppose if you're building a circuit board, a small parallel line
coupler would work just as well, and probably be easier.

In any case, the use of those nifty parts from AD does open up a lot of
interesting applications.



73's
Richard Clark, KB7QHC


