Old November 28th 04, 03:56 AM
Frank
 


"Ed Price" wrote in message
news:Unbqd.4490$KO5.3803@fed1read02...

"Frank" wrote in message
news:wN4qd.217617$9b.158132@edtnps84...
I think S meters should be calibrated in dB uV/m, or at least dBm input.

Frank



You cannot do that. Even on a good meter (like a spectrum analyzer or an
EMI receiver), the meter is calibrated only for input power at its front
panel connector. You still have to add a frequency dependent correction
factor for cable loss and antenna efficiency. Very expensive measurement
systems operate under computer control, with a calibrated analyzer and the
computer adding the appropriate factors for cable loss and antenna factor.
(This allows for flexibility; a different coax or antenna can be
substituted at any time, so long as the computer has a table of factors
for the new device.)
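(The correction chain described above can be sketched in a few lines. This is a minimal illustration, assuming a 50-ohm system and made-up loss/antenna-factor numbers; real systems use frequency-dependent tables for both.)

```python
import math

def dbm_to_dbuv(p_dbm, z_ohms=50.0):
    """Convert power in dBm to voltage in dBuV across z_ohms.
    For 50 ohms this works out to dBuV = dBm + 107."""
    p_w = 10 ** (p_dbm / 10.0) * 1e-3   # dBm -> watts
    v = math.sqrt(p_w * z_ohms)          # volts across the load
    return 20 * math.log10(v / 1e-6)     # volts -> dB relative to 1 uV

def field_strength_dbuv_m(rx_dbm, cable_loss_db, antenna_factor_db):
    """Field strength E (dBuV/m) = receiver reading (dBuV)
    + cable loss (dB) + antenna factor (dB/m)."""
    return dbm_to_dbuv(rx_dbm) + cable_loss_db + antenna_factor_db

# Example: -60 dBm at the analyzer, 1.5 dB of coax loss,
# antenna factor 12 dB/m (illustrative values only)
e = field_strength_dbuv_m(-60.0, 1.5, 12.0)   # about 60.5 dBuV/m
```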

As for how you mark the S-meter scale, I agree that we use an archaic
system with S-units. It would be more rational to use a simple dB scale
referenced to something like 1 picowatt (which would then become 0 dBpW).
OTOH, the purpose of an S-meter is not to provide absolute measurements.
It is used as a tuning indicator and for relative signal strength
comparisons. And the archaic marking system works fine for that need.
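(For what it's worth, the usual HF convention — per IARU Region 1 Recommendation R.1 — pegs S9 at -73 dBm (50 uV into 50 ohms) with 6 dB per S-unit, so the archaic scale does map onto dBm if your meter were actually calibrated to it. A quick sketch of that mapping:)

```python
def s_unit_to_dbm(s, over_db=0.0):
    """HF convention (IARU Region 1 Rec. R.1): S9 = -73 dBm,
    6 dB per S-unit. 'over_db' handles readings like '20 over S9'."""
    return -73.0 + (s - 9) * 6.0 + over_db

# S1 corresponds to -121 dBm; "20 over S9" to -53 dBm
```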

Ed
wb6wsn


Yes, I agree with your comments, and am familiar with EMC and ATR
measurement techniques. Mentioning dBuV/m was a bit tongue in cheek, but
all the same, if you could add the antenna factor etc., it would be nice to
be able to give accurate reports of field strength. Still, dBm (or dBpW) is
certainly no big deal. Even if not laboratory accurate, it still means far
more than the S-unit signal strength scale.

Frank