Thread: NEC Evaluations
December 31st 08, 04:50 PM, posted to rec.radio.amateur.antenna
From: jimlux@earthlink.net

On Dec 23, 12:39 pm, Richard Clark wrote:
> On Tue, 23 Dec 2008 10:29:05 -0800, Jim Lux wrote:

>> At HF and VHF, you should be able to do power measurements to a tenth
>> of a dB, with moderate care (obviously, you'd have to deal with
>> measuring the mismatch, etc.). A run-of-the-mill power meter should
>> give you 5% accuracy (0.2 dB) without too much trouble. An 8902
>> measuring receiver can do substantially better.


> Nothing astonishes me more than the simple dash-off notes that claim
> power measurement is a snap. I can well imagine, Jim, that you don't
> do these measurements with traceability to the limits you suggest.


In point of fact, I *do* make measurements like that, and as I said,
it requires "moderate care" and good technique and instrumentation. A
random diode measured with your $5 Harbor Freight DMM isn't going to
hack it. Neither is most of the stuff sold to hams. It is hardly a
"snap", but it *is* within the reach of someone at home with a lot of
time and care to substitute for expensive gear and calibrations
(basically, you have to do your own calibration).
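
(As a quick check of that 5%/0.2 dB equivalence, a few lines of Python;
this is just the worked conversion, nothing instrument-specific:)

import math

# Convert a fractional power uncertainty (5%) to its dB equivalents.
frac = 0.05
print(10 * math.log10(1 + frac))   # about +0.21 dB
print(10 * math.log10(1 - frac))   # about -0.22 dB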



> For the other readers:
>
> We will specifically start with the 8902 measuring receiver. A
> premier instrument indeed, but it falls FAR short of actually
> measuring power without a considerable body of necessary
> instrumentation (well illustrated by Mac's observation found in that
> fig. 15 already cited). Most claimants peer at one line in a spec
> sheet and figure that is the end of the discussion. Glances elsewhere
> begin to reveal the actual accuracy obtainable through the chain of
> errors that accumulate. For instance, with a 1 mW input in the VHF
> band:
>
> Internal power standard: ±1.2%, and we have yet to look at the
> measurement head's error contribution. The so-called "run of the mill
> power meters" are drawing close, too close, to this precision set's
> expensive quality, such that the 5% estimate already claimed for them
> is suspect.


The standard power-measuring head on an Agilent power meter is better
than 5% at HF, probably in the range of 1% for one head in comparison
measurements over a short time. The 8902 is sort of a special case,
but it can do very accurate relative measurements. FWIW, the 8902
calibrates out the measurement-head effects.



> Scale error demands a full-scale indication simply to keep the error
> contribution down to 0.1% (a 1/10th-scale indication would jump that
> error to 1%) ±1 digit.

This oversimplifies a bit. Typically, you'll have some uncertainty
that is proportional to the signal measured (e.g. mismatch will affect
the signal the same way regardless of level) and some that is
absolute, independent of the signal level (e.g. the analog noise in
the voltmeter). As you say, bigger signals are easier to measure
precisely; the real limiting factor is the accuracy with which you
know the attenuation of the attenuators you're using to get the steps.
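
To put toy numbers on the proportional-versus-absolute split (the
coefficients below are invented for illustration, not from any real
meter):

import math

def reading_uncertainty(p_watts, prop=0.01, floor=1e-6):
    # Proportional term (e.g. mismatch, 1% of reading) combined in
    # quadrature with an absolute term (e.g. 1 uW of voltmeter noise).
    return math.sqrt((prop * p_watts) ** 2 + floor ** 2)

for p in (1e-3, 1e-4):  # full scale vs. 1/10 scale on a 1 mW range
    u = reading_uncertainty(p)
    print(f"{p:.0e} W: {100 * u / p:.2f}% of reading")  # 1.00% vs 1.41%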

With regard to mismatch, if you're interested in tenth-dB accuracies,
you're going to have to measure the mismatch and account for it. It's
not that hard, just tedious. The typical power meter head doesn't
change its Z very much, so once you've measured YOUR head, keep the
data around and you're good to go for the future. (And do your tests
at the same temperature, don't use the head for a door stop, etc.)
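
For the mismatch bookkeeping, the worst-case limits come straight from
the two reflection-coefficient magnitudes. A sketch with assumed
values (roughly a 26 dB return-loss source into a 30 dB return-loss
head; plug in your own measured numbers):

import math

def mismatch_limits_db(gamma_src, gamma_head):
    # Worst-case mismatch uncertainty from the magnitudes of the
    # source and power-head reflection coefficients.
    m = gamma_src * gamma_head
    return 20 * math.log10(1 + m), 20 * math.log10(1 - m)

hi, lo = mismatch_limits_db(0.05, 0.032)  # assumed |Gamma| values
print(f"+{hi:.3f} dB / {lo:.3f} dB")      # roughly +/-0.014 dB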

As far as calculating uncertainties: you bet, it's not just stacking
them up. But that's true of ANY precision measurement, so if one is
quoting better-than-half-dB numbers (i.e. if you give any digits to
the right of the decimal point) one should be able to back it up with
the uncertainty analysis (which is all described on the NIST website
and in the tech notes). This isn't hard, it's just tedious. But the
whole point of high-quality amateur measurements is that you're
trading off your time to do tedious extra measurements and analysis in
exchange for not sending a cal lab a check.

The how-to is all out there. What was "state of the art" for a
national laboratory in 1970 is fairly straightforward garage work
these days, and a heck of a lot easier, because you've got inexpensive
automation for making the measurements and inexpensive computer power
for doing the calibration calculations and uncertainty analysis.
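
A minimal sketch of that kind of budget, in the style of the NIST tech
notes; every entry below is a placeholder, not a real calibration:

import math

# Hypothetical standard uncertainties, all in dB.
budget = {
    "reference standard": 0.05,
    "mismatch":           0.015,
    "scale/linearity":    0.01,
    "repeatability":      0.02,
}

u_c = math.sqrt(sum(u ** 2 for u in budget.values()))    # combined, RSS
print(f"u_c = {u_c:.3f} dB, U(k=2) = {2 * u_c:.3f} dB")  # ~95% coverage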

The slide 36 discussion refers to measuring a signal at -110 dBm, which
I would venture to say is well below the levels that most hams will be
interested in measuring. And they are talking about where the source
Z is unconstrained. In a typical ham situation, these things probably
aren't the case. If you were interested in measuring, for instance,
the loss of a piece of coax or the output of a 0 dBm buffer amplifier
to a tenth of a dB, that's a whole lot easier than a -110 dBm signal
from some probe into an 8902. The context of this discussion was
making measurements of antennas, and for that, one can normally
arrange to have decent signal levels, etc. OR, one is interested in
relative measurements, rather than absolute calibration. It's a whole
lot easier to measure a 0.1 dB difference between two signals.
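
Here's why the relative case is so much friendlier: any multiplicative
error common to both readings (say, a calibration-factor offset on the
one head you use for both) cancels in the ratio. Toy numbers:

import math

p1, p2 = 1.000e-3, 0.977e-3  # two readings, same head, same range
cal_err = 1.03               # hypothetical +3% common calibration error

print(10 * math.log10(p1 / p2))                          # ~0.101 dB
print(10 * math.log10((p1 * cal_err) / (p2 * cal_err)))  # identical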

You suggested 5 alternatives:
> See if you can cook up a method that doesn't hammer you into the
> ground. I can anticipate some:

> 1. Throw a boxcar of money at the problem;
Or, throw some time at the problem. This is the classic ham tradeoff:
"I don't have money, but I do have time." It's no different than
grinding your own telescope mirrors, building your own Yagi or wire
antenna, etc.



> 2. Buy lab time at NIST;
That's the money thing (and it doesn't require boxcar loads; perhaps a
kilobuck or two), and for some folks it's worth it, although I can't
see any amateur radio need. I can see doing it as part of a hobby
involving precision, like the folks on time-nuts who operate multiple
Cs clocks and build very high performance quartz oscillators for the
thrill of getting to 1E-14 or 1E-16 Allan deviation. Folks who do home
nuclear fusion also might avail themselves of pro cal services for
their neutron detectors, because there isn't a convenient way of doing
home cals, unlike for RF power, where it's at least possible.


> 3. Write a report that runs to book length (I've carried most of that
> water by providing the link above) - or xerox the book that already
> exists: "Microwave Theory and Applications," Stephen F. Adam;

Or any of a variety of sources. One doesn't need a book for this, but
one does need some care in technique and some background knowledge.
It's like reading John Strong's book on building scientific
instruments (back in the 40s, one built one's physics experimental
gear and calibrated it oneself).


> 4. Do it with precision components employing best practices to the
> best achievable accuracy - you will need further instruction into
> best practices, however;
>
> 5. Ignore reality.

---