NEC Evaluations
#25, January 4th 09, 07:19 AM, posted to rec.radio.amateur.antenna
Richard Clark
On Sat, 3 Jan 2009 16:50:39 -0800 (PST), wrote:
>> Some may appeal that their Relative Accuracy has the merit of Absolute
>> Accuracy - until you ask them to measure the actual, absolute value of
>> their ad-hoc standard. I have built primary standards and measured
>> them to 7 places. Over several years they migrated in value by 2 of
>> those least significant digits, but only in comparison to standards
>> that had "aged" and been calibrated at a national primary standards
>> laboratory. By themselves, I could have fooled myself (and perhaps
>> others less sophisticated) that they were absolutely accurate to the
>> extent of the number of digits my instruments could resolve.
> So, you measured to 1E-7, and over several years, they changed by
> 1E-5? That's a whole heck of a lot better than 1E-2 (which is what
> 0.1dB implies).
I have accurately calibrated instruments beyond the resolution of
1E-12, but not for power. Metrology encompasses many physical and
electrical phenomena. The standard mentioned above was a resistor,
not power. I said it was a primary standard, not a sensor.
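For concreteness, the arithmetic behind the figures being traded here
can be sketched in a few lines (the dB-to-ratio conversion is standard;
the script and its printout are illustrative only):

import math

def db_to_fraction(db):
    """Fractional power error implied by an uncertainty of `db` dB."""
    return 10 ** (db / 10) - 1

# 7 displayed digits resolve 1 part in 1e7; drift in the last two of
# them is of order 1 part in 1e5; 0.1dB works out to about 2 parts in 1e2.
print(db_to_fraction(0.1))                       # ~0.0233, roughly 2.3%
print(math.log10(db_to_fraction(0.1) / 1e-5))    # ~3.4 decades of separation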
Would the atypical Ham have the capacity to build a standard resistor?
Sure. Could that atypical Ham qualify the standard resistor? Maybe.
Could that atypical Ham calibrate the standard resistor? Not in a
decade, except by frequent shipment to the nearest primary standards
lab on their terms of calibration frequency - which would be very
frequent given the lack of an aging history. In that sense, the
atypical Ham is wholly incapable of calibration beyond the sense of it
being performed through proxy. However, as you have already left the
frequency of calibration to the whim of desire - what price
performance?
Of course, that same atypical Ham could simply take one certificate of
accuracy for 1 ppm, derate it to 0.1%, and probably live with that
accuracy for the rest of his natural life. Unfortunately, 0.1%
accuracy at DC doesn't lend itself to precision, much less accuracy,
at 50 MHz. In fact, that standard resistor would probably be off by
several orders of magnitude (10,000% error). I feel safe in saying
"probably" because that error would be overwhelming to the point of
saturating disbelief.
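A minimal sketch of why, modeling the standard as R in series with lead
inductance, shunted by stray capacitance (the 100 nH and 2 pF below are
assumed, illustrative parasitics, not measurements of any real
standard):

import math

def z_resistor(r_ohm, l_h, c_f, f_hz):
    """Impedance of (R + jwL) in parallel with the stray capacitance C."""
    w = 2 * math.pi * f_hz
    z_series = complex(r_ohm, w * l_h)
    z_shunt = 1 / complex(0, w * c_f)
    return z_series * z_shunt / (z_series + z_shunt)

z = z_resistor(r_ohm=1.0, l_h=100e-9, c_f=2e-12, f_hz=50e6)
print(abs(z), abs(z) / 1.0 - 1)   # ~32 ohms on a 1 ohm standard: thousands of percent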
> Those 3 orders of magnitude are why I think it's
> reasonable and plausible for hams to make measurements to 0.1dB.
Try fishing harder than simply throwing money at this.
You still don't have a METHOD for achieving this vaunted 0.1dB power
determination, do you? Summoning up vague references to foggy
characteristics isn't going to gather up even a 10% accurate
determination.
> [...] is something
> eminently doable for ham use (See Larsen's paper from 1975) and mostly
> depends on a "really good" DVM for its accuracy.
This is quite a joke. "Really good" indeed - you may as well add
praying to an idol on an altar mounted in the corner of the shack to
boost results.
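To be fair to the arithmetic, the DC side of a DVM-based power
determination is easy to budget (a sketch assuming power is inferred as
P = V**2 / R; this is my own error propagation, not a summary of
Larsen's paper):

import math

def power_error(dv_over_v, dr_over_r):
    """Worst-case fractional error in P = V**2 / R."""
    return 2 * dv_over_v + dr_over_r

frac = power_error(1e-3, 1e-3)            # a 0.1% DVM into a 0.1% load
print(frac, 10 * math.log10(1 + frac))    # ~0.3%, i.e. ~0.013 dB - at DC

That tidy figure is exactly the problem: none of the RF-side error
terms that dominate at 50 MHz appear in it.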
> The original post had to do with comparing measured antenna patterns
> against NEC models. That IS the case there.
And I would say that you have done quite a job of mocking the
performance Mac achieved - performance that conformed to the best
practices and METHODs offered in the modern links I provided. Those
best practices and METHODs have survived intact through the
intervening decades because of their fundamental necessity. You
haven't reduced any of his errors except by an automated sensor
system. Given the other irreducible errors that still inhabited his
measurements (something you have yet to respond to), power
determination still resides in the domain of tens of percent error.
The case is that this error still eclipses your 0.1dB by a vast
margin.
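A sketch of how such a budget lands there (every entry below is an
assumed, illustrative value, not a number drawn from Mac's work):

import math

budget = {                                  # fractional power errors, assumed
    "mismatch, 2*|Gs|*|Gl| with VSWR 1.5 at each end": 2 * 0.2 * 0.2,
    "sensor calibration factor": 0.05,
    "instrument linearity": 0.03,
}
rss = math.sqrt(sum(v * v for v in budget.values()))
print(rss, 10 * math.log10(1 + rss))        # ~10%, i.e. ~0.4 dB - not 0.1dB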
73's
Richard Clark, KB7QHC