After 36+ years in the radcoms industry as a technician I have seen
the move from S/N to SINAD at first hand.
SINAD has been around throughout my career, but it is really only in
the last 20 years that it has become prevalent. The only reason I can
think of is the improvement in semiconductor manufacturing and the
consequent reduction in device noise, together with the much wider use
of FETs in front ends, which has made S/N an unrealistic measurement
now that the noise floor is so low. Add to this the now commonplace
use of SMD - which is itself 'quieter' - and improved circuit design,
and distortion in the modulation becomes of significantly greater
importance.
Having said that, 10% distortion was the norm when I started, whereas
2% or even less is today's standard; such are the improvements on that
side of transmission that SINAD is probably now the only viable
measurement.
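
For anyone who has not met the two figures side by side: S/N compares
the wanted signal with the noise alone, while SINAD compares the whole
audio output with everything unwanted in it (noise plus distortion).
A rough sketch in Python, with voltage values invented purely for
illustration:

    import math

    # Hypothetical RMS voltages at the receiver's audio output
    signal = 1.00      # wanted audio (V RMS)
    noise = 0.05       # residual noise (V RMS)
    distortion = 0.02  # harmonic distortion products (V RMS)

    # S/N looks at the noise alone
    snr_db = 20 * math.log10(signal / noise)

    # SINAD lumps noise and distortion together, so it also
    # penalises a dirty audio chain. Uncorrelated components add
    # as powers, hence the quadrature (root-sum-square) combination.
    nd = math.hypot(noise, distortion)
    total = math.hypot(signal, nd)
    sinad_db = 20 * math.log10(total / nd)

    print(f"S/N   = {snr_db:.1f} dB")    # -> 26.0 dB
    print(f"SINAD = {sinad_db:.1f} dB")  # -> 25.4 dB

SINAD can only ever be the smaller of the two numbers, and the gap
widens as the distortion rises.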
Think of it like this: in the mid-70s most receivers would do 12dB
S/N at around 0.6uV for 60% system modulation on 12.5kHz channel
spacing; today it is not uncommon to see 12dB SINAD (let alone S/N!)
at not much above 0.2uV, and if you look at equipment of Far Eastern
origin, which may not have such good parameters (selectivity, adjacent
channel rejection, etc.) as UK/European designed units, it can be 3dB
better than that! Perhaps that's why you see a lot of Far Eastern
mobiles and portables but most of the associated base stations,
especially those used on communal sites, are designed or made
elsewhere (UK, USA, Germany, and New Zealand to mention but a few).
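
For scale: going from 0.6uV to 0.2uV is a three-to-one voltage ratio,
i.e. roughly 9.5dB of extra sensitivity (treating the two 12dB
yardsticks as comparable). A quick check:

    import math

    old_uv, new_uv = 0.6, 0.2
    print(f"{20 * math.log10(old_uv / new_uv):.1f} dB")  # -> 9.5 dB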
Why 12dB? My two penn'orth is that the human ear is very tolerant of
even-harmonic distortion, especially 2nd, which would predominate in
the audio chain due to filtering, so someone somewhere defined 12dB
(or about 25% distortion) as the maximum tolerable level for usable
speech.
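
That 25% figure drops straight out of the arithmetic, if you care to
check it - SINAD is a voltage ratio, so 12dB works back to the
noise-plus-distortion being about a quarter of the total output:

    import math

    sinad_db = 12.0
    # sinad_db = 20 * log10((S+N+D) / (N+D))
    ratio = 10 ** (sinad_db / 20)   # about 3.98
    nd_fraction = 1 / ratio         # (N+D) as a fraction of the total
    print(f"{nd_fraction:.1%}")     # -> 25.1%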
--
Woody
harrogate2 at ntlworld dot com