#1
Roy Lewallen wrote:
> Ian Wade G3NRW wrote:
>> Mike, with the 4170 I can calibrate the instrument to compensate for the feeder impedance. After calibration, the indicated impedance at the TX end of the feeder is actually the antenna feedpoint impedance. This makes life a *lot* easier.
>
> Easy, yes. But if you're not careful, this can be a great example of garbage in, garbage out. I frequently calculate out the feedline transformation when making antenna measurements. But it's essential that you realize a small error in estimating the feedline loss(*) or length can sometimes result in a very large error in calculated impedance. This is particularly true if there's a large impedance mismatch between the line and antenna. Transmission line impedance, which can vary a lot from the specified nominal value (I've seen +/-20% with coax, more with ladder line), also has an effect on the result.
>
> So whenever I need accurate results, or whenever the line Z0 is quite different from the antenna impedance, I start by carefully measuring the properties of the actual transmission line I'll be using. If you're not convinced, spend a few minutes playing with something like N6BV's TLW calculator that comes with the ARRL Antenna Book.
>
> (*) Some simplified techniques ignore transmission line loss altogether. This can lead to very inaccurate results in some situations. And loss is often quite different than the specified value, so it really has to be measured if it makes a significant difference.
>
> Roy Lewallen, W7EL

The 4170 makes this a lot easier as you can measure the feedline actual parameters as well as calibrate out their effects.

--
Jim Pennino
Remove .spam.sux to reply.
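As a concrete picture of the transformation Ian and Roy are talking about, here is a minimal sketch of the standard lossy-line equation in Python. The cable numbers are merely RG-8X-ish placeholders and the function is mine; it is not the 4170's internal math, just the textbook model that calibration is trying to undo.

```python
import cmath
import math

def input_impedance(z_load, z0, vf, loss_db_100ft, f_hz, length_ft):
    """What an analyzer at the TX end sees for a given feedpoint impedance.
    Standard lossy-line model: Zin = Z0*(ZL + Z0*tanh(g*l)) / (Z0 + ZL*tanh(g*l))."""
    alpha = loss_db_100ft * math.log(10) / 20 / (100 * 0.3048)   # matched loss, Np per metre
    beta = 2 * math.pi * f_hz / (299_792_458 * vf)               # phase constant, rad per metre
    t = cmath.tanh(complex(alpha, beta) * length_ft * 0.3048)
    return z0 * (z_load + z0 * t) / (z0 + z_load * t)

# Illustrative only: a 300-ohm-ish feedpoint seen through 100 ft of 50-ohm coax at 30 MHz.
zin = input_impedance(300 - 100j, 50 - 0.5j, 0.80, 1.9, 30e6, 100)
print(f"TX-end reading = {zin.real:.1f} {zin.imag:+.1f}j ohms")
```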
#3
Michael Coslo wrote:
> wrote:
>> The 4170 makes this a lot easier as you can measure the feedline actual parameters as well as calibrate out their effects.
>
> This is a dumb question on my part, but what you are saying is that the mitigating effects that the cable has on the VSWR, making it look better in general, can not only be calculated and "calibrated out", but that the actual SWR of your antenna at the feedpoint is then given?
>
> As you get closer to 1.1:1 at the actual antenna, would accuracy then suffer? If feedline loss can bring an antenna that is not near that to a level approaching that, wouldn't it mean that the calibration is somewhere in the noise?
>
> Like I say, this could be a really stoopid question.
>
> - 73 de Mike N3LI -

Basically what you do is calibrate the instrument at the measurement point, whether that point is the instrument connector or at the end of a length of coax. You attach an open, a short and a known resistance; 50 ohms by default, but it is user definable. The instrument then frequency sweeps and stores the results in a user-definable calibration file. When you make a measurement of an unknown, you define which calibration file to use and the instrument corrects the readings to display the characteristics at the measurement point.

Given that this is a $500 instrument and not a $20,000 laboratory instrument, there are going to be limits to how accurate all this is. After having used the AIM for a while, my opinion is that it far exceeds what is required for practical amateur usage.

If you want to see some actual numbers, you can find a comparison of the results of an AIM 4170 compared to HP lab equipment at:

http://www.bnk.com/w0qe/AIM4170_page1.html

--
Jim Pennino
Remove .spam.sux to reply.
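The AIM's internal correction routine isn't documented in this thread, but the textbook one-port open/short/load error model shows the idea behind "calibrate, then correct the readings". The sketch below assumes ideal standards (reflection coefficients of exactly +1, -1 and 0), which real calibration kits only approximate, and the function names and sample numbers are mine, not the instrument's.

```python
import numpy as np

def solve_error_terms(gm_open, gm_short, gm_load):
    """One-port error terms from raw reflection readings of ideal O/S/L standards.
    Model: Gm = e00 + e10e01*G / (1 - e11*G), rewritten as a linear system in
    (e00, e11, delta) with delta = e00*e11 - e10e01."""
    knowns = [1.0, -1.0, 0.0]                     # ideal open, short, load
    measured = [gm_open, gm_short, gm_load]
    a = np.array([[1.0, g * gm, -g] for g, gm in zip(knowns, measured)], dtype=complex)
    e00, e11, delta = np.linalg.solve(a, np.array(measured, dtype=complex))
    return e00, e11, e00 * e11 - delta            # e00, e11, e10e01

def correct(gm_raw, e00, e11, e10e01):
    """Corrected reflection coefficient of an unknown from its raw reading."""
    return (gm_raw - e00) / (e10e01 + e11 * (gm_raw - e00))

# Made-up raw readings at one frequency point, just to show the flow.
terms = solve_error_terms(0.98 + 0.02j, -0.97 + 0.01j, 0.01 - 0.01j)
g_dut = correct(0.30 + 0.40j, *terms)
z_dut = 50 * (1 + g_dut) / (1 - g_dut)            # back to impedance, 50-ohm reference
print(f"corrected Z = {z_dut.real:.1f} {z_dut.imag:+.1f}j ohms")
```

The same three-term correction applies at each frequency of the sweep, and it works whether the reference plane is the instrument connector or the far end of a length of coax; the calibration simply moves the reference plane to wherever the standards were attached.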
#4
> [snip]
> Basically what you do is calibrate the instrument at the measurement point, whether that point is the instrument connector or at the end of a length of coax. You attach an open, a short and a known resistance; 50 ohms by default, but it is user definable. The instrument then frequency sweeps and stores the results in a user-definable calibration file. When you make a measurement of an unknown, you define which calibration file to use and the instrument corrects the readings to display the characteristics at the measurement point. Given that this is a $500 instrument and not a $20,000 laboratory instrument, there are going to be limits to how accurate all this is.

When I inspected antennas, we had two multi-kilobuck "Site Master" instruments from Anritsu, mentioned here, that had a set of calibrated terminations. IIRC, to calibrate the unit(s), we had to connect the terminations, a short, a 50-ohm resistor and a shielded open circuit, one at a time, to the instrument and tell it which one was connected. It swept the frequencies of interest and stored its own baseline behavior over that band of interest. Then, anything connected to it was referenced to that baseline.

We could also store a range of sweep frequencies (usually by the name or type of antenna we intended to sweep) and it would recall all the parameters. Automated, repeatable sweep testing is not available (yet) in lower cost instruments.

I presume we could have calibrated any given cable, too. (Never required by our test memos.)

Sal
#5
Sal M. Onella wrote:
>> [snip]
>> Basically what you do is calibrate the instrument at the measurement point, whether that point is the instrument connector or at the end of a length of coax. You attach an open, a short and a known resistance; 50 ohms by default, but it is user definable. The instrument then frequency sweeps and stores the results in a user-definable calibration file. When you make a measurement of an unknown, you define which calibration file to use and the instrument corrects the readings to display the characteristics at the measurement point. Given that this is a $500 instrument and not a $20,000 laboratory instrument, there are going to be limits to how accurate all this is.
>
> When I inspected antennas, we had two multi-kilobuck "Site Master" instruments from Anritsu, mentioned here, that had a set of calibrated terminations. IIRC, to calibrate the unit(s), we had to connect the terminations, a short, a 50-ohm resistor and a shielded open circuit, one at a time, to the instrument and tell it which one was connected. It swept the frequencies of interest and stored its own baseline behavior over that band of interest. Then, anything connected to it was referenced to that baseline. We could also store a range of sweep frequencies (usually by the name or type of antenna we intended to sweep) and it would recall all the parameters. Automated, repeatable sweep testing is not available (yet) in lower cost instruments.

The $600 TenTec/TAPR VNA does open/short/thru/load calibration with sweeps, etc. I don't have the AIM, but I'll bet it does too. These days, it's not a big deal to include it.
#6
![]() "Jim Lux" wrote in message ... Sal M. Onella wrote: . snip Basically what you do is calibrate the instrument at the measurement point, whether that point is the instrument connector or at the end of a length of coax. You attach an open, a short and a known resistance; 50 ohms by default but it is user definable. The instrument than frequency sweeps and stores the results in a user definable calibration file. When you make a measurement of an unknown, you define which calibration file to use and the instrument corrects the readings to display the characteristics at the measurement point. Given that this is a $500 insturment and not a $20,000 labratory instrument there are going to be limits to how accurate all this is. When I inspected antennas, we had two multi-kilobuck "Site Master" instruments from Anritsu, mentioned here, that had a set of calibrated terminations. IIRC, to calibrate the unit(s), we had to connect the terminations, a short, a 50-ohm resistor and a shielded open circuit, one at a time, to the instrument and tell it which one was connected. It swept the frequencies of interest and stored its own baseline behavior over that band of interest. Then, anything connected to it was referenced to that baseline. We could also store a range of sweep frequencies (usually by the name or type of antenna we intended to sweep) and it would recall all the parameters. Automated, repeatable sweep testing is not available (yet) in lower cost instruments. The $600 TenTec/TAPR VNA does open/short/thru/load calibration with sweeps, etc. I don't have the AIM, but I'll bet it does too. These days, it's not a big deal to include it. Thanks for bringing me into the present. I have an analyzer already. When I drop it off the roof (inevitable), I will look at the TenTec/TAPR VNA. |
#7
Michael Coslo wrote:
> wrote:
>> The 4170 makes this a lot easier as you can measure the feedline actual parameters as well as calibrate out their effects.
>
> This is a dumb question on my part, but what you are saying is that the mitigating effects that the cable has on the VSWR, making it look better in general, can not only be calculated and "calibrated out", but that the actual SWR of your antenna at the feedpoint is then given? As you get closer to 1.1:1 at the actual antenna, would accuracy then suffer? If feedline loss can bring an antenna that is not near that to a level approaching that, wouldn't it mean that the calibration is somewhere in the noise? Like I say, this could be a really stoopid question.
>
> - 73 de Mike N3LI -

Not at all. Imagine that you have a very lossy line. You'll read very nearly the cable Z0 regardless of what's at the other end. Extreme changes in far-end impedance will make very little difference at the input end, so it's impossible to tell with any accuracy what's at the far end by looking at the near-end impedance.

Another case to consider is one where the Z0 of the cable is very different than the Z of the load. In that case, a tiny change in line Z0, length, or loss changes the input Z for a given load Z. It can be impossible to measure the line length, impedance, or loss with sufficient accuracy to find the far end impedance with decent accuracy.

This doesn't mean you can't get measurements good enough for amateur or even professional use. But on the other hand, your measurements can be total garbage in spite of your cable measurements if you fail to realize just how sensitive the result can be to small errors. A careful experimenter will do a sensitivity analysis, which tells how large an error in results is caused by an error in measuring the feedline or in the input impedance measurement; then the probable measurement errors are estimated to determine the probable error in the calculated result.

While a mathematical sensitivity analysis is the rigorous way to do this, something like N6BV's TLW program is just fine for most amateur purposes. Or, if you're using one of the instruments that does the calculation for you, try telling it the line is a few percent longer or shorter, or has a Z0 or loss a few percent different from what it said or you measured. See how much it changes the result. If the change is small, no problem. But if it's large, it means that extreme care and maybe some other techniques have to be used to get a good measurement.

Let me give an example, done with TLW. Suppose we're measuring the impedance of an antenna at 30 MHz through 100 feet of RG-8X. TLW gives these nominal values for RG-8X:

Z0: 50.2 - j0.47 ohms
Velocity factor: 0.8
Loss: 1.926 dB/100 ft

And suppose that these are exactly what our measurement of the cable said. We measure 21 + j20 at the input end, and conclude that the impedance of the antenna is 374 - j84 ohms.

But suppose the measurement at the input end was inaccurate by about 5%, and that the actual input end Z was 21 + j20. Then the load Z is 322 - j105, about 15% off in R, 25% in X.

Or maybe the cable measurement was off by just 1%, and the cable is really 101 and not 100 feet long. In that case, the antenna Z is really 129 + j166 ohms. We're even on the other side of resonance from where we thought.

Or maybe the velocity factor was rounded a bit and it's really closer to 0.81 than 0.8. How much difference would that small error make? Well, the antenna Z would be 53 - j120 ohms with our input measurement of 21 + j20!

So, what's the real antenna impedance? 374 - j84, 322 - j105, 129 + j166, or 53 - j120? You're fooling yourself if you think you really know. It's easy to get lulled into believing that just because we read a value to six decimal places, it's accurate. But you're usually doing very well to get within a few percent in spite of all those digits. And when that few percent results in a much bigger error in calculated results, it's even more important to realize the limitations of your accuracy.

Roy Lewallen, W7EL
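Roy's what-if exercise is easy to reproduce approximately in a few lines, again using the standard lossy-line model rather than TLW itself, so the numbers will not match his to the digit; the point is how far the inferred load moves for a 1-foot length error or a 0.01 change in velocity factor when the mismatch is large.

```python
import cmath, math

def z_load(z_in, z0, vf, loss_db_100ft, f_hz, length_ft):
    """Back out the far-end impedance from a near-end measurement (lossy-line model)."""
    alpha = loss_db_100ft * math.log(10) / 20 / (100 * 0.3048)   # matched loss, Np per metre
    beta = 2 * math.pi * f_hz / (299_792_458 * vf)               # phase constant, rad per metre
    t = cmath.tanh(complex(alpha, beta) * length_ft * 0.3048)
    return z0 * (z_in - z0 * t) / (z0 - z_in * t)

z0, z_in, f = 50.2 - 0.47j, 21 + 20j, 30e6        # nominal RG-8X values from the post

for label, vf, feet in [("nominal        ", 0.80, 100.0),
                        ("length + 1 ft  ", 0.80, 101.0),
                        ("VF 0.80 -> 0.81", 0.81, 100.0)]:
    z = z_load(z_in, z0, vf, 1.926, f, feet)
    print(label, f"{z.real:8.1f} {z.imag:+8.1f}j ohms")
```

Perturbing the loss or Z0 by a few percent in the same way shows the same kind of swing, which is the whole argument for measuring the actual cable rather than trusting the catalog numbers.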
#8
Corrections:
I apologize; I misinterpreted my scribbled notes. The conclusion is the same, but some of the quoted numbers are a little off. Here are the correct ones.

------------
And suppose that these are exactly what our measurement of the cable said. We measure 21 + j20 at the input end, and conclude that the impedance of the antenna is 322 - j105 ohms.

But suppose the measurement at the input end was inaccurate by about 5%, and that the actual input end Z was 22 + j21. Then the load Z is 273 - j125, about 15% off in R, 20% in X. . .
------------

Roy Lewallen wrote:
> And suppose that these are exactly what our measurement of the cable said. We measure 21 + j20 at the input end, and conclude that the impedance of the antenna is 374 - j84 ohms. But suppose the measurement at the input end was inaccurate by about 5%, and that the actual input end Z was 21 + j20. Then the load Z is 322 - j105, about 15% off in R, 25% in X. . .
#9
Roy Lewallen wrote:
> And suppose that these are exactly what our measurement of the cable said. We measure 21 + j20 at the input end, and conclude that the impedance of the antenna is 322 - j105 ohms. But suppose the measurement at the input end was inaccurate by about 5%, and that the actual input end Z was 22 + j21. Then the load Z is 273 - j125, about 15% off in R, 20% in X. . .

Thanks Roy,

My knowledge of the matter is at the noobie level, but given that loss eventually gets you reading only the Z0 of the cable, it's what I suspected. I hadn't thought about a big mismatch between the cable and antenna, so there's another data point.

- 73 de Mike N3LI -
#10
In article , Michael Coslo wrote:
> My knowledge of the matter is at the noobie level, but given that loss eventually gets you reading only the Z0 of the cable, it's what I suspected. I hadn't thought about a big mismatch between the cable and antenna, so there's another data point.
>
> - 73 de Mike N3LI -

Hello, and I ran into this issue years ago when trying to measure high (>10 or 20) VSWR loads (in this case out-of-band shipboard HF antenna feedpoint impedance) connected via a length of transmission line. Accurate determination of load resistance is difficult to ascertain under these conditions. A 1953 AIEE (a progenitor of the IEEE) paper by W.W. Macalpine, "Computation of Impedance and Efficiency of Transmission Lines with High Standing Wave Ratio," describes the problem.

I was, however, able to obtain accurate results when I included the small imaginary part (frequency dependent) of the characteristic impedance that is present in a low-loss line. Outside the high VSWR load issue, the imaginary part can be ignored.

Sincerely, and 73s from N4GGO,

John Wood (Code 5550)   e-mail:
Naval Research Laboratory
4555 Overlook Avenue, SW
Washington, DC 20375-5337
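John's observation about the small, frequency-dependent imaginary part of Z0 drops straight out of the RLGC transmission-line model, Z0 = sqrt((R + jwL) / (G + jwC)). The sketch below uses made-up, coax-like per-metre values purely for illustration; they are not measured data for any particular cable.

```python
import cmath, math

def char_impedance(R, L, G, C, f_hz):
    """Exact Z0 = sqrt((R + jwL) / (G + jwC)) from per-metre RLGC values."""
    w = 2 * math.pi * f_hz
    return cmath.sqrt(complex(R, w * L) / complex(G, w * C))

# Coax-like per-metre values, invented purely for illustration.
R, L, G, C = 0.1, 250e-9, 1e-7, 100e-12      # ohm/m, H/m, S/m, F/m

for f in (1e6, 10e6, 30e6):
    z0 = char_impedance(R, L, G, C, f)
    print(f"{f/1e6:5.1f} MHz   Z0 = {z0.real:6.2f} {z0.imag:+7.3f}j ohms")
```

For a low-loss line this reduces to roughly sqrt(L/C) * (1 - j*(R/(wL) - G/(wC))/2), which is why the imaginary part is small, negative and shrinks with frequency, and why it only becomes significant when the standing wave ratio on the line is very high.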