November 26th 04, 08:27 PM
Avery Fineman
 

In article , "William E. Sabin"
writes:

My receiver has a custom-made, computer-printed scale using a calibrated sig
gen, and there are two trimpot adjustments, one for the low end and one for
the high end. This circuit uses voltage-regulated op amps. The S-meter
dynamics are adjusted using RC time constants.

My S-meter is accurate within +/- 2 dB from 160 meters to 10 meters, because the
receiver is designed for this accuracy. Because of the IF and RF circuit
design, the scale calibration is fairly correct and reliable, as I
mentioned.

Bill W0IYH
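
The two-trimpot adjustment described above amounts to a straight-line fit
against the signal generator levels, provided the meter drive is roughly
linear in dB between the two calibration points. A minimal sketch of that
arithmetic in Python, using hypothetical voltages and levels rather than
Bill's actual numbers:

# Hedged sketch of a two-point ("low end" / "high end") scale calibration:
# a calibrated signal generator fixes two known levels, and the meter-drive
# voltage is assumed linear in dB between them.  All values are hypothetical.

def two_point_scale(dbm_lo, v_lo, dbm_hi, v_hi):
    """Return (volts_to_dbm, dbm_to_volts) for a meter drive that is
    linear in dB between the two calibration points."""
    slope = (dbm_hi - dbm_lo) / (v_hi - v_lo)        # dB per volt
    volts_to_dbm = lambda v: dbm_lo + slope * (v - v_lo)
    dbm_to_volts = lambda d: v_lo + (d - dbm_lo) / slope
    return volts_to_dbm, dbm_to_volts

# Hypothetical calibration points: -110 dBm -> 0.50 V, -40 dBm -> 4.00 V.
volts_to_dbm, dbm_to_volts = two_point_scale(-110.0, 0.50, -40.0, 4.00)

# Tick positions for a computer-printed scale, one mark every 10 dB:
for dbm in range(-110, -39, 10):
    print(f"{dbm:5d} dBm -> {dbm_to_volts(dbm):5.2f} V")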


Thanks, Bill. I'm doing essentially the same...and expect the overall
receiver response to be flat within +/- 1 dB over an octave-and-a-half
tuning range. Accuracy of the S-meter is only as good as the RF level
accuracy of the calibrating RF source, but that's another task and I have
confidence in that. But I have to start someplace, and that is why I asked
about a "standard." I know that the U.S. military hasn't bothered with any
receiver S-meter calibration standard since around 1980, settling for
approximate differential signal-strength readings if there was an indicator
at all.
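
For reference, the convention most often cited on the amateur side (commonly
attributed to IARU Region 1) puts S9 at 50 microvolts across 50 ohms, about
-73 dBm, on HF, with 6 dB per S-unit below S9 and levels above S9 quoted as
"S9 + n dB." A small Python sketch of that arithmetic, assuming that
convention rather than any formal standard:

# Convert levels to S-meter readings under the commonly cited amateur
# convention: S9 = -73 dBm (50 uV across 50 ohms) at HF, 6 dB per S-unit.

import math

def dbm_from_microvolts(uv, z_ohms=50.0):
    """Convert an RF level in microvolts (RMS, across z_ohms) to dBm."""
    watts = (uv * 1e-6) ** 2 / z_ohms
    return 10.0 * math.log10(watts / 1e-3)

def s_reading(dbm, s9_dbm=-73.0, db_per_s_unit=6.0):
    """Format a level in dBm as an S-meter reading under the assumed
    S9 = -73 dBm, 6 dB/S-unit convention."""
    if dbm >= s9_dbm:
        return f"S9 + {dbm - s9_dbm:.0f} dB"
    s = 9 + (dbm - s9_dbm) / db_per_s_unit
    return f"S{max(s, 0):.1f}"

print(dbm_from_microvolts(50.0))   # about -73 dBm, i.e. S9
print(s_reading(-85.0))            # two S-units below S9 -> S7.0
print(s_reading(-53.0))            # S9 + 20 dB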