From: LenAnderson@ieee.org
Date: August 20th 06, 04:12 AM
Newsgroup: rec.radio.amateur.homebrew
Subject: AGC signal/noise question...

From: Andrea Baldoni on Sat, Aug 19 2006 4:05 pm:

: I'm researching the matter and I just read that, in a BJT for
: instance, emitter current is inversely proportional to the noise. So,
: if AGC reduces the gain (and thus the current), does SNR degrade?

: Not necessarily true. Noise, true natural noise, in a bipolar
: ...
: of how many factors go into noise generation within the
: transistor. :-)

So, if you have to engineer a (let's start with HF) receiver, do you think
it may be better to:


There's no "engineering" involved, just a crunching of numbers
AFTER you find the input levels versus AGC and how much noise
is actually generated...and approximately WHERE this excess
noise is coming from.

1)
find a way to automatically insert stepped attenuation (maybe using a
diode-switched resistor network) and leave the amplifiers without AGC,
thus optimizing them for a particular gain


I see no need for that at this point. "Getting fancy" with
extra circuitry is rather useless without knowing what problem
all this fancy circuitry is supposed to cure.

2)
build circuits with such high dynamic range that it's completely impossible
for input signals to overload them (what dynamic range should one normally
expect at the antenna input, excluding obvious limit cases where the
transmitter output is fed into the receiver input...?)


That's NOT the issue here.

Noise and signal-to-noise ratios are only important at the LOWEST
signal levels, not the highest.

3)
use the usual AGC


Why not? Decades of designs in many countries have successfully
operated with "usual AGC." [voltage-controlled, sometimes
current-controlled gain stages driven by a DC control line]

...I'm thinking option 1 could be a good solution if the demodulator had to
be a digital one. That way, a calibrated attenuator simply adds bits to the
ADC. However, option 2 is very attractive, provided everything is analog, or
the ADC dynamic range is better than what could come from the antenna...


Experiment any way you want, but I can't see that as your cure.

I've read that most use option 3, digitizing the AGC signal, maybe with a
second ADC channel, to get a sort of extra resolution anyway.
So probably I'm wrong and the right solution is option 3... but only if
adding AGC never ruins amplifier performance.


A rather common (for decades of designs and production) AGC action
is no more than 6 dB change in output for 60 to 100 dB of input
signal (carrier) change.
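
To put a number on that (a quick arithmetic sketch of my own, in
Python; the 80 dB input swing is just an example figure):

# If the output moves only 6 dB while the input moves 80 dB, the AGC
# itself must absorb the difference as gain reduction.
input_change_db = 80.0    # example carrier change at the antenna
output_change_db = 6.0    # allowed change at the detector
gain_reduction_db = input_change_db - output_change_db
slope = output_change_db / input_change_db
print(f"gain swing needed: {gain_reduction_db:.0f} dB "
      f"({slope:.3f} dB out per dB in)")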

AGC should be approached from the standpoint of a servo loop.
The "error signal" is the change in carrier level at the
detector. The controlled items are the RF and IF amplifiers.
The time-constant of the error feedback loop (what is commonly
called "the AGC line") is quite slow but fast enough to try to
keep detector level constant through flutter (rapid reflections
at VHF and up) and ionospheric path variations.

If "the AGC line" somehow has some noise in it, that noise is
probably going to change RF-IF amplifier gain. However, the
frequency of that noise is going to be low; it is band-
limited by the usual AGC line decoupling.
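
Treating it as a servo loop can be simulated directly. Here's a minimal
first-order sketch (my own toy model in Python; the time constant,
levels, and setpoint are all assumed numbers, not from any particular
receiver):

# Detector-level error drives a slow control line that sets RF/IF gain.
setpoint_db = 0.0        # desired detector level, dBV (assumed)
tau_s = 0.1              # AGC line time constant, seconds (assumed)
dt = 0.001               # simulation step, seconds
gain_db = 60.0           # initial RF+IF gain

carrier_dbv = -80.0      # steady input carrier, dBV (assumed)
for step in range(2000):
    if step == 1000:
        carrier_dbv = -50.0              # a 30 dB jump in carrier level
    detector_dbv = carrier_dbv + gain_db
    error_db = detector_dbv - setpoint_db
    gain_db -= (dt / tau_s) * error_db   # slow correction toward setpoint

print(f"final gain: {gain_db:.1f} dB")   # settles near setpoint - carrier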

Let's look at SNR with low to higher antenna input levels:
1. Assume you have (for example) 1 uV of noise at no-signal.
2. If the RF signal is 3.16 uV then the signal-plus-noise
to noise ratio is 10 dB.
3. If the RF signal is 10 uV then the signal-plus-noise to
noise ratio is 20 dB.
4. If the RF signal is 31.6 uV then the signal-plus-noise
to noise ratio is 30 dB.
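
Those figures check out numerically (a quick Python check of my own;
strictly, signal and noise powers add, so (S+N):N runs a shade above
the plain S:N figure):

import math

noise_uv = 1.0
for sig_uv in (3.16, 10.0, 31.6):
    snr_db = 20 * math.log10(sig_uv / noise_uv)                # S:N
    spnn = 10 * math.log10((sig_uv**2 + noise_uv**2) / noise_uv**2)
    print(f"{sig_uv:5.2f} uV  S:N = {snr_db:4.1f} dB  (S+N):N = {spnn:4.1f} dB")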

The common (for about 40+ years, internationally) level of
receiver sensitivity for AM mode signals is a 10 dB signal-
plus-noise to noise ratio. That's an easy test, done by
connecting an AC voltmeter (one that can measure RMS voltage)
to the detector output. With no signal input, all you get
is front-end noise; note that reading. Apply a known-level RF
source to the antenna input and adjust that level to be 10 dB
higher than the noise level measured with no signal input.
Note the RF source level; that is the "minimum
sensitivity" level for the common "10 dB S+N:N" criterion.
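
As arithmetic, the test boils down to this (a sketch with a made-up
detector reading):

# Measure the detector output with no signal, then raise the generator
# level until the detector output is 10 dB above that noise floor.
noise_mv_rms = 2.0                             # no-signal reading (assumed)
target_mv_rms = noise_mv_rms * 10**(10 / 20)   # +10 dB in voltage terms
print(f"raise RF level until detector reads {target_mv_rms:.1f} mV RMS")
# The generator output that produces this reading is the "minimum
# sensitivity" for the 10 dB S+N:N criterion.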

For FM or PM it is a bit more complicated. FM and PM
rely on quieting through the Limiter stages ahead of the
FM detector. For most tests of FM/PM sensitivity you NEED
a known signal-source level to determine the quieting.

: There isn't much FM on HF. What there is would be in
: narrow-band Data mode signals. Some of that Data is a
: combination of AM and PM similar to a wireline modem's
: modulation.

In fact I was receiving 144 MHz using a converter.


That data was omitted. Have you checked whether the converter
itself adds noise? You can get a rough comparison
by using another HF receiver.

Have you checked your internal (to HF receiver) FM
demodulator characteristics? Do you have the manufacturer's
specifications on that? Since nearly all FM/PM demods
use Limiters, they normally operate with AGC off.

: What is needed in an investigation of this is a reasonably-
: well-calibrated signal generator with a calibrated attenuator.

Unfortunately I have only an Instek function generator, and I'm not
very satisfied with any instrument I've bought from this firm...

Anyway, sooner or later I'll build a DDS one...


You can't work in the dark (without instruments) when
trying to troubleshoot electronics.

A DDS (Direct Digital Synthesis) signal generator gives
you very precise FREQUENCY. For years there have been
L-C-oscillator-based signal generators stable enough
in frequency to determine AGC action.

What you really need to investigate the AGC is PRECISE
RF ATTENUATION -and- a way to calibrate the maximum RF
output. [an ordinary diode detector could do that if it
were itself calibrated against a known RF source LEVEL]
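
Once the maximum output is calibrated, a precise step attenuator gives
you every lower level by arithmetic alone (a sketch assuming a 100 mV
calibrated maximum):

max_out_uv = 100_000.0    # calibrated maximum output, 100 mV (assumed)
for atten_db in (40, 60, 80, 100):
    level_uv = max_out_uv * 10**(-atten_db / 20)
    print(f"{atten_db:3d} dB pad -> {level_uv:8.2f} uV at the receiver")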

Ah, another question. I have a very precise digital voltmeter. Very
precise, 6.5 digits (this time from Agilent)... Unfortunately, it's
absolutely unable to handle RF.
I would like to build an "RF" front end for it... Any ideas?
I'm thinking of a precision rectifier built with an op-amp followed by
an op-amp integrator...


The usual method of making a "precise" RF voltmeter is to
begin with a wideband video amplifier with gain controls
setting the gain in the full-scale ranges desired.

However, the BACK END needs attention, particularly if you
want TRUE RMS measurement. The "less precise" HP3400A
AC voltmeter could measure True RMS within 1% using an
analog meter readout (mirrored scale on meter). The
3400A used a pair of matched heaters and thermocouples.
Amplified AC heated one heater. A high-gain DC op-amp
had inputs (opposing) from both thermocouples. Op-amp
output heated the second heater. This was self-balancing.
The AC Voltage indicated actually came from the DC op-amp
output.
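
That self-balancing idea reduces to a simple feedback relation: the DC
output is driven until its heating power matches the AC input's, so the
DC voltage equals the AC RMS voltage. A toy model of my own (numbers
assumed, not HP's actual loop constants):

ac_rms = 0.70          # unknown AC input, volts RMS (assumed)
vdc = 0.0              # op-amp output driving the second heater
gain = 0.5             # loop gain per iteration (assumed)
for _ in range(200):
    error = ac_rms**2 - vdc**2   # heater power goes as V^2 into equal loads
    vdc += gain * error          # integrate the thermocouple difference
print(f"DC readout: {vdc:.4f} V  (RMS of input: {ac_rms} V)")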

If you are going to measure AC-RF volts of both sinewaves
and noise, you need True RMS indication. Without that, the
noise (random stuff) read by simple averaging rectifiers
will be DOWN by as much as 50% compared to a sinewave input.
There are three basic types of AC voltmeters made: Rectify-
average (common in handheld multimeters); Logarithmic (now
a standard of high-end bench multimeters), using special ICs
for True RMS conversion to DC; and Thermal (now out of favor
in new designs, but using the first principles of measuring
the effective heating of a resistive load). Thermocouple
sensors are reliable and can handle overloads; a diode
string biased into forward conduction can also serve, producing
a DC voltage change of about -2 mV per degree C of heating.
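
The direction of that averaging error is easy to show numerically (my
own check for Gaussian noise; meters with other detector laws can be
further off). An average-responding meter is calibrated by the sine
form factor, and that calibration is wrong for noise:

import math
import random

random.seed(1)
n = 200_000
sine = [math.sin(2 * math.pi * k / 100) for k in range(n)]
noise = [random.gauss(0.0, 1.0) for _ in range(n)]

def true_rms(x):
    return math.sqrt(sum(v * v for v in x) / len(x))

def avg_responding(x):
    # rectify-average, scaled by the sine form factor pi / (2*sqrt(2))
    return (sum(abs(v) for v in x) / len(x)) * math.pi / (2 * math.sqrt(2))

for name, sig in (("sine", sine), ("noise", noise)):
    print(f"{name}: true RMS {true_rms(sig):.3f}, "
          f"avg-responding {avg_responding(sig):.3f}")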

For some references, you can search the Internet for
"RMS to DC" conversion, or begin at www.ednmag.com, go to
their Archives button, select issue for May 11, 2000, and
look at the "How It Works" article by Jim Williams of
Linear Technology Corporation. LTC made an IC that was a
dual heater-sensor, the LT1088, but that IC is now
discontinued. The article shows a "front end" as well as
the whole AC voltmeter circuit.