measuring cable loss

#1
K7ITM wrote:
On Aug 13, 11:50 am, Jim Lux wrote: John Ferrell wrote: On Thu, 9 Aug 2007 08:13:45 -0400, "Jimmie D" wrote:

I need to measure the loss of approximately 200 ft of coax at a frequency of 1 GHz. The normal procedure for doing this is to inject a signal at one end and measure the power out at the other. Using available test equipment, this is a real pain to do. I propose to disconnect the cable at the top of the tower, terminating it in either a short or an open, and measure the return loss at the source end. I have done this and measured 6.75 dB, and I am assuming that half of this would be the actual loss of the cable. These numbers do fall within the established norms for this cable. Can you think of a reason this method would not be valid? Jimmie

This is way too complicated for me! My solution would be to build/buy an RF probe and permanently mount it at the top of the tower. Bring a pair of wires (coax if you want it to look really professional) down to the bottom and measure it whenever, or even all the time.

Considering he needs sub-1 dB accuracy, this is challenging. It would work if you assume your RF probe never needs calibration and is stable over the environmental range of interest. Not a trivial thing to do. A diode and a voltmeter certainly won't do it. (A typical diode detector might vary 1 dB over a 20 degree C range, judging from the Krytar 700 series data sheet I have sitting here. Granted, that's a microwave detector (100 MHz to 40 GHz), but I'd expect similar from most other diodes. I've given the link to an Agilent app note that describes various detectors in excruciating detail.) A diode, voltmeter, and temperature sensor might work, though. Useful stuff at http://rfdesign.com/mag/608RFDF2.pdf and http://cp.literature.agilent.com/litweb/pdf/5966-0784E.pdf

Seems like modern RF detector ICs offer much better stability than diodes. An AD8302, for example, has a typical +/-0.25 dB variation from -40 C to +85 C, with a -30 dBm signal level.

Indeed...
The temperature variation could be calibrated before installation; if necessary, an especially temperature-stable part could be selected from a batch. Then knowing the ambient to within 20 C would be sufficient. You'd need to arrange sampling at a low level, which could be a well-constructed 90 degree hybrid.

Or, even simpler, what about a resistive tap (or a pair of resistive taps separated by a short length of transmission line)? If you're sending, say, 100 W (+50 dBm) up the wire, and you want, say, -30 dBm out, you need an 80 dB coupler. Or, something like a 50k resistor into a 50 ohm load will be about 60 dB down, and you could put a 10-20 dB pad in before the detector. Calibration would take care of the coupling ratio, although you might want to be careful about the tempco of the resistor. With two channels in the AD8302, you could even monitor antenna reflection coefficient (including angle) and be aware of changes there.

Analog Devices and Linear Technology both seem to be strong in the RF power monitor IC area. Those are truly nifty parts, and they form the basis of some very interesting ham products over the past couple of years (like the LP-100 vector wattmeter and various ham-oriented VNAs). What would be very cool is if AD would combine something like the 8302 and the A/D so it would have a serial digital output. Pretty close to a power meter on a chip. Functionally, this would be close to what you get with a Rohde & Schwarz NRP series, a Boonton 52000, or an Agilent U2000.

Cheers, Tom
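The arithmetic behind the numbers in the posts above can be sketched in Python. The inputs (6.75 dB measured return loss, a 50k tap resistor into a 50 ohm load, 100 W of forward power) come from the thread; the script itself is just an illustration of the dB bookkeeping:

```python
import math

# Open/short return-loss method: the test signal travels down the cable
# and back, so the measured return loss is roughly twice the one-way loss.
measured_return_loss_db = 6.75
one_way_loss_db = measured_return_loss_db / 2
print(f"estimated one-way loss: {one_way_loss_db:.3f} dB")  # 3.375 dB

# Resistive tap: ideal low-frequency voltage-divider ratio of a 50k
# series resistor feeding a 50 ohm load.
r_tap, r_load = 50e3, 50.0
coupling_db = 20 * math.log10(r_load / (r_tap + r_load))
print(f"tap coupling: {coupling_db:.1f} dB")  # about -60 dB

# Power budget: 100 W is +50 dBm, so a -60 dB tap delivers about -10 dBm,
# and a further 10-20 dB pad puts the detector near -20 to -30 dBm.
p_in_dbm = 10 * math.log10(100 / 1e-3)
print(f"{p_in_dbm:.0f} dBm in, {p_in_dbm + coupling_db:.1f} dBm at the tap")
```

Note this divider ratio is the ideal resistive case; the later posts in the thread discuss why parasitics spoil it at 1 GHz.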
#2
On Mon, 13 Aug 2007 13:09:09 -0700, Jim Lux wrote:

> Or, something like a 50k resistor into a 50 ohm load will be about 60 dB down,

Hi Jim,

Unlikely. With parasitic capacitance at a meager 1 pF across the 50K, its Z at 10 MHz would compromise the attenuation, presenting closer to 50 dB down. At 1 GHz it would plunge like a rock. This, of course, presumes a 1/4 watt resistor.

A better solution is to use surface mount resistors, where the parasitics are down at 100 aF - but then you will have a frequency-dependent divider unless you can guarantee that the parasitic capacitance of the 50 Ohm resistor is 100 pF (sort of casts us back into using a 1/4 watt resistor with a padding cap). At 1 GHz, it is not going to look like a trivial 50K load anymore. A Pi attenuator will do it better.

73's
Richard Clark, KB7QHC
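Richard's numbers can be checked with a quick script. This treats the tap as a 50k resistor shunted by its parasitic capacitance, feeding a matched 50 ohm load - a simplified model (it ignores lead inductance and the load resistor's own parasitics), but it reproduces the figures he quotes:

```python
import cmath
import math

def tap_attenuation_db(r_tap, c_par, freq, r_load=50.0):
    """Coupling of a series tap resistor, shunted by its parasitic
    capacitance c_par, into a matched r_load."""
    z_tap = r_tap / (1 + 1j * 2 * math.pi * freq * r_tap * c_par)
    return 20 * math.log10(abs(r_load / (z_tap + r_load)))

# 50k quarter-watt resistor with ~1 pF of parasitic capacitance:
print(tap_attenuation_db(50e3, 1e-12, 1e5))   # low frequency: about -60 dB
print(tap_attenuation_db(50e3, 1e-12, 10e6))  # 10 MHz: roughly -50 dB
print(tap_attenuation_db(50e3, 1e-12, 1e9))   # 1 GHz: only about -10 dB
```

The 1 GHz figure is the "plunge like a rock": the intended 60 dB tap couples some 50 dB too much power.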
#3
On Aug 13, 10:56 pm, Richard Clark wrote:
> On Mon, 13 Aug 2007 13:09:09 -0700, Jim Lux wrote:
>> Or, something like a 50k resistor into a 50 ohm load will be about 60 dB down,
>
> [...] A better solution is to use surface mount resistors, where the parasitics are down at 100 aF - but then you will have a frequency-dependent divider unless you can guarantee that the parasitic capacitance of the 50 Ohm resistor is 100 pF [...]

100 aF??? :-)

X(100aF)/X(100pF) = 50k/50 ??? ;-) ;-)
#4
On Tue, 14 Aug 2007 00:14:31 -0700, K7ITM wrote:
> On Aug 13, 10:56 pm, Richard Clark wrote:
>> [Richard's parasitic-capacitance comments snipped - see post #2]
>
> 100 aF??? :-)
>
> X(100aF)/X(100pF) = 50k/50 ??? ;-) ;-)

S/B 100 fF (trying to watch the Perseids and do math at the same time).
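The correction checks out numerically: for the capacitive divider to preserve the 50k/50 resistive ratio, the reactance ratio must also be 1000:1, which means 100 fF against 100 pF - 100 aF would give a ratio of a million. A quick check of the arithmetic:

```python
import math

def x_c(c, freq):
    """Magnitude of capacitive reactance, 1 / (2*pi*f*C)."""
    return 1 / (2 * math.pi * freq * c)

f = 10e6  # the ratio is frequency-independent; 10 MHz is arbitrary
resistive_ratio = 50e3 / 50                # 1000, the target divider ratio
print(x_c(100e-15, f) / x_c(100e-12, f))   # 100 fF vs 100 pF -> 1000.0
print(x_c(100e-18, f) / x_c(100e-12, f))   # 100 aF vs 100 pF -> 1000000.0
```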
#5
Richard Clark wrote:
> On Mon, 13 Aug 2007 13:09:09 -0700, Jim Lux wrote:
>> Or, something like a 50k resistor into a 50 ohm load will be about 60 dB down,
>
> Unlikely. With parasitic capacitance at a meager 1 pF across the 50K, its Z at 10 MHz would compromise the attenuation, presenting closer to 50 dB down. At 1 GHz it would plunge like a rock. [...] At 1 GHz, it is not going to look like a trivial 50K load anymore. A Pi attenuator will do it better.
>
> 73's
> Richard Clark, KB7QHC

A resistive 30 dB tap into a 30 dB pi attenuator, or something like that? That would get the resistor in the tap down to a reasonable value. And, as you point out, at 1 GHz layout and component selection would be critical. I suppose if you're building a circuit board, a small parallel-line coupler would work just as well, and probably be easier.

In any case, the use of those nifty parts from AD does open up a lot of interesting applications.
#6
On Aug 13, 1:09 pm, Jim Lux wrote:
> [earlier exchange quoted in post #1 snipped]
> The temperature variation could be calibrated before installation; if necessary, an especially temperature-stable part could be selected from a batch. Then knowing the ambient to within 20 C would be sufficient. You'd need to arrange sampling at a low level, which could be a well-constructed 90 degree hybrid. Or, even simpler, what about a resistive tap (or a pair of resistive taps separated by a short length of transmission line)? If you're sending, say, 100 W (+50 dBm) up the wire, and you want, say, -30 dBm out, you need an 80 dB coupler. Or, something like a 50k resistor into a 50 ohm load will be about 60 dB down, and you could put a 10-20 dB pad in before the detector. Calibration would take care of the coupling ratio, although you might want to be careful about the tempco of the resistor.

The OP said this is at 1 GHz. It's really tough to get a reliable resistive divider at 1 GHz with that sort of ratio. Actually, a capacitive divider probably stands a better chance of working, though getting that really right isn't trivial. (We used to worry about variation in humidity and atmospheric pressure affecting the dielectric constant of air in using a capacitive sampler... though admittedly that was for work to a level well beyond 1 dB accuracy.)

I am rather fond of the coupled-line hybrid idea: it can be built in a way that everything stays ratiometric, so the coupling ratio is very nearly constant over temperature, and of course the directionality lets you observe things you can't just from monitoring voltage at a point. It's possible to build one with low coupling without too much trouble; -60 dB coupling isn't out of the question, for sure. I'm imagining a design I could make reliably with simple machine tools that would work well for the OP's application: 100 watts at about 1 GHz, as I recall, in the through line, and coupling on the order of -60 dB to get to about -10 dBm coupled power and have negligible effect on the through line.

There's a free field-solver software package that will accurately predict the coupling, and with the right design and normal machine-shop tolerances the coupling and impedance should be accurate to a fraction of a dB and better than a percent, respectively. Perhaps I can run some examples to see if I'm off-base on that, but that's what my mental calculations tell me at the moment.

Cheers, Tom
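As a rough sanity check on the -60 dB figure, the standard quarter-wave coupled-line relations can be inverted to see what even- and odd-mode impedances such a weak coupler implies. This is textbook coupled-line theory, not the specific mechanical design Tom has in mind:

```python
import math

z0 = 50.0              # system impedance, ohms
c_db = -60.0           # desired mid-band coupling
k = 10 ** (c_db / 20)  # voltage coupling coefficient, here 1e-3

# For a quarter-wave coupled-line coupler:
#   k = (Zoe - Zoo) / (Zoe + Zoo)  and  Z0 = sqrt(Zoe * Zoo)
z_even = z0 * math.sqrt((1 + k) / (1 - k))
z_odd = z0 * math.sqrt((1 - k) / (1 + k))
print(f"Zoe = {z_even:.3f} ohm, Zoo = {z_odd:.3f} ohm")

# Power budget from the thread: 100 W (+50 dBm) in the through line.
p_coupled_dbm = 10 * math.log10(100 / 1e-3) + c_db
print(f"coupled power: {p_coupled_dbm:.1f} dBm")
```

The even/odd split is only about 0.1 ohm around 50 ohms, which is why a -60 dB coupler has essentially no effect on the through line, and the coupled power lands at the -10 dBm Tom mentions.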
#7
> I am rather fond of the coupled-line hybrid idea: it can be built in a way that everything stays ratiometric, so the coupling ratio is very nearly constant over temperature, and of course the directionality lets you observe things you can't just from monitoring voltage at a point. [...] coupling on the order of -60 dB to get to about -10 dBm coupled power and have negligible effect on the through line. [...]

Actually, the exact coupling ratio probably isn't important in this application, because it could be "calibrated out". Stability would be a bigger concern, and it's certainly possible to design a coupler that is very temperature stable by choosing the right dimensions so that things change in the right ratios.
#8
On Aug 14, 8:45 am, Jim Lux wrote:
>> I am rather fond of the coupled-line hybrid idea [...]
>
> Actually, the exact coupling ratio probably isn't important in this application, because it could be "calibrated out". Stability would be a bigger concern, and it's certainly possible to design a coupler that is very temperature stable by choosing the right dimensions so that things change in the right ratios.

Bingo. It's that ratiometric thing that is a big plus for stability. In a coupler made of all the same metal, or at least metals that have nearly equal coefficients of expansion, the ratios stay the same, and it's the dimensional ratios that establish the coupling and impedances, not the absolute size. Actually, the change in length does matter, but if you make the assembly a quarter wave long, d(coupling)/d(length) is zero at that point anyway.

In any event, I suppose the thermal coefficient of expansion of the metals you'd be most likely to use is small enough that you'd be fine with a shorter coupler. There doesn't need to be anything terribly complex about the geometry of the whole thing, either. It's probably safe to say that changes in the dielectric constant of air due to air pressure and humidity aren't going to be significant in this case. ;-)

Cheers, Tom
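Tom's point that d(coupling)/d(length) vanishes at a quarter wave can be illustrated with the usual coupled-line frequency response, |C(theta)| = k*sin(theta)/sqrt(1 - k^2*cos^2(theta)), where theta is the electrical length. This is a standard result; the k value is just the -60 dB example from earlier in the thread:

```python
import math

def coupling_mag(k, theta_deg):
    """Coupled-port voltage magnitude of a coupled-line coupler of
    electrical length theta (standard coupled-line frequency response)."""
    th = math.radians(theta_deg)
    return k * math.sin(th) / math.sqrt(1 - (k * math.cos(th)) ** 2)

k = 1e-3                    # -60 dB mid-band coupling
c90 = coupling_mag(k, 90)   # exactly a quarter wave: the maximum
c88 = coupling_mag(k, 88)   # a 2-degree shift off the quarter wave
c45 = coupling_mag(k, 45)   # an eighth wave, well off the flat point

print(f"2 deg off quarter wave: {(c90 - c88) / c90:.2%} change")
print(f"at an eighth wave: {(c90 - c45) / c90:.2%} change")
```

A two-degree length error (far larger than any thermal expansion) moves the coupling by well under 0.1%, while the same fractional change matters enormously away from the stationary point - which is why the quarter-wave length buys stability.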
#9
On Tue, 14 Aug 2007 09:53:31 -0700, K7ITM wrote:
> Bingo. It's that ratiometric thing that is a big plus for stability. [...] It's probably safe to say that changes in the dielectric constant of air due to air pressure and humidity aren't going to be significant in this case. ;-)
>
> Cheers, Tom

Tom, I thought this thread concerned measurement of attenuation in transmission lines. On the 11th I posted a procedure that involves measuring the line input impedances with the line terminated in both a short circuit and an open circuit, then plugging the measured data into a BASIC program that outputs the attenuation, complex Zo, and electrical length.

My thoughts were that this procedure gives results with more accuracy and precision than the procedures discussed before my post appeared. However, I noticed that my post drew zero response. Is my procedure out-of-line, or outdated?

Walt, W2DU
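For readers who missed Walt's earlier post, the classical open/short method he describes rests on two identities: Zo = sqrt(Zoc * Zsc) and tanh(gamma*l) = sqrt(Zsc / Zoc), with the matched-line loss in dB equal to 8.686 times the real part of gamma*l. A Python sketch of that math (the textbook relations, not Walt's actual BASIC program), verified here against a synthetic line:

```python
import cmath
import math

def line_from_open_short(z_oc, z_sc):
    """Recover Zo and matched loss (dB) from the input impedances measured
    with the far end open (z_oc) and shorted (z_sc)."""
    z0 = cmath.sqrt(z_oc * z_sc)
    t = cmath.sqrt(z_sc / z_oc)      # tanh(gamma * l), up to sign
    if t.real < 0:
        t = -t                       # pick the branch with positive loss
    gamma_l = cmath.atanh(t)
    loss_db = (20 / math.log(10)) * gamma_l.real  # 8.686 dB per neper
    return z0, loss_db

# Synthetic test line: Zo = 50 ohms, 3.375 dB matched loss,
# arbitrary electrical length (2.5 radians).
z0_true = 50.0
gamma_l_true = 3.375 / (20 / math.log(10)) + 2.5j
z_sc = z0_true * cmath.tanh(gamma_l_true)  # shorted-line input impedance
z_oc = z0_true / cmath.tanh(gamma_l_true)  # open-line input impedance

z0_est, loss_est = line_from_open_short(z_oc, z_sc)
print(f"Zo = {abs(z0_est):.2f} ohm, loss = {loss_est:.3f} dB")
```

One caveat Walt's full procedure presumably handles: atanh only returns the electrical length modulo a half wave, so the total length has to be resolved from an approximate idea of the cable length, but the attenuation (the real part) comes out unambiguously.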
#10
On Tue, 14 Aug 2007 15:01:32 -0400, Walter Maxwell wrote:

> My thoughts were that this procedure gives results with more accuracy and precision than the procedures discussed before my post appeared. However, I noticed that my post drew zero response. Is my procedure out-of-line, or outdated?

Hi Walt,

I provided a posting on how to determine the extent of error that was similarly ignored - don't feel bad. Accuracy isn't all that it's cracked up to be. :-)

73's
Richard Clark, KB7QHC