Highland Ham wrote:
I DID however, at around the same time, own a 2 meter Japanese all-mode
transceiver whose "S"-meter accuracy I happened to measure with an HP
signal generator. It turned out that 2 microvolts was S1 and THREE
microvolts was S9.
==============================
Is it correct that, for frequencies up to 30 MHz, an S9 signal is 50
microvolts into 50 ohms (or -73 dBm), but that for higher frequencies an
S9 signal is 5 microvolts into 50 ohms (or -93 dBm)?
If that is the agreed norm, was it ever formally sanctioned by the IARU?
Wrong way around: those standards (along with 6 dB per S-point) were
formally sanctioned by the IARU, but almost no amateur receiver has ever
met them.
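
For reference, here is the arithmetic behind those figures, as a quick
sketch in Python (the function names and defaults are mine, not taken
from any standard document):

import math

def uv_to_dbm(microvolts, r_ohms=50.0):
    """Power in dBm of an RMS voltage, given in microvolts, across r_ohms."""
    volts = microvolts * 1e-6
    watts = volts ** 2 / r_ohms
    return 10.0 * math.log10(watts / 1e-3)

def s_reading(dbm, s9_dbm=-73.0, db_per_s=6.0):
    """S-units relative to the S9 reference (-73 dBm at HF, -93 dBm above 30 MHz)."""
    return 9.0 + (dbm - s9_dbm) / db_per_s

print(uv_to_dbm(50.0))             # about -73 dBm, i.e. S9 below 30 MHz
print(uv_to_dbm(5.0))              # about -93 dBm, i.e. S9 above 30 MHz
print(s_reading(uv_to_dbm(0.2)))   # roughly S1 on the HF scale

On the same figures, S1 at HF works out to about 0.2 microvolts, or
roughly -121 dBm.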
I can hardly believe that any of the Far Eastern rice boxes have a
properly calibrated S-meter. Also, the top end of the S-meter scale is
usually rather 'compressed', which surprises me, since ICs with a
logarithmic input/output characteristic must be readily available.
Conventional S-meters don't actually measure signal strength; they
measure the AGC voltage. The S-meter could only be accurate if the
voltage/gain characteristic of the AGC-controlled stages happened to be
accurately logarithmic across the entire dynamic range of the receiver,
which is almost never true.
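
To see the shape of the problem, here is a toy model in Python (every
AGC figure in it is invented, purely for illustration). A meter scaled
for a perfectly logarithmic AGC law tracks the true level; a gain law
that flattens out on strong signals gives the familiar compressed top
end of the scale:

import math

S9_DBM = -73.0     # IARU HF reference
DB_PER_S = 6.0

def true_s_units(dbm):
    """The reading an ideal S-meter would give for this input level."""
    return 9.0 + (dbm - S9_DBM) / DB_PER_S

def agc_volts_log(dbm):
    """Hypothetical perfectly logarithmic AGC: 50 mV per dB above -121 dBm."""
    return max(0.0, (dbm + 121.0) * 0.05)

def agc_volts_compressed(dbm):
    """Hypothetical AGC law that flattens out as the signal gets stronger."""
    return 4.0 * (1.0 - math.exp(-max(0.0, dbm + 121.0) / 40.0))

def meter_s_units(agc_volts):
    """Meter scaled so that the logarithmic AGC law reads correctly."""
    return 1.0 + agc_volts / (DB_PER_S * 0.05)

for dbm in (-121, -103, -85, -73, -53, -33):
    print(f"{dbm:6.0f} dBm  true S{true_s_units(dbm):5.1f}  "
          f"log AGC S{meter_s_units(agc_volts_log(dbm)):5.1f}  "
          f"compressed AGC S{meter_s_units(agc_volts_compressed(dbm)):5.1f}")

With the compressed law the meter climbs only two or three S-points over
the 40 dB from S9 to S9+40, which is exactly the 'compressed top end'
complained about above.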
An accurate S-meter will also need some compensation for variations in
gain across the HF bands. Above all, a true reading of 'signal strength'
should NOT change when you switch in a preamp or an attenuator, or vary
the RF gain.
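
Put another way, a 'true' reading has to be referred back to the antenna
socket, corrected for whatever gain happens to be switched in ahead of
the detector. A hypothetical correction (all calibration figures made
up) might look like this:

def dbm_at_antenna(dbm_at_detector, band_gain_db, preamp_db=0.0, atten_db=0.0):
    """Remove per-band gain, preamp gain and attenuator loss so the figure
    reflects the level at the antenna socket rather than at the detector."""
    return dbm_at_detector - band_gain_db - preamp_db + atten_db

# 20 dB of band gain, 10 dB preamp in, 6 dB attenuator in (invented numbers):
print(dbm_at_antenna(-29.0, band_gain_db=20.0, preamp_db=10.0, atten_db=6.0))
# -53.0 dBm, i.e. S9+20 dB, and it stays there whether the preamp or
# attenuator is switched in or out, because the correction tracks them.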
For the long story, see 'S Meter Blues' by W8WWV:
http://tinyurl.com/8nme6
--
73 from Ian GM3SEK 'In Practice' columnist for RadCom (RSGB)
http://www.ifwtech.co.uk/g3sek