August 25th 06, 09:52 PM, posted to rec.radio.amateur.homebrew
tim gorman
AGC signal/noise question...

Andrea Baldoni wrote:



tim gorman wrote:

: Are you sure you are seeing an AGC problem? What you describe above,
: with slightly less signal in the center, is typical of a *filter* with
: a dip in its passband.

I have the filter dip; it was not cured by realigning, but realigning
was still helpful and gained a dB or two.
I also see that disabling the AGC causes less noise in FM while listening to
2m converted to HF by the internal converter of the Yaesu FR-101. I didn't
check whether enabling or disabling the AGC causes any change in the filter
dip; I'll check and report soon.

: Without knowing more about the receiver I can't make any guesses as to
: what is in play here but I question if this is an AGC artifact.

It uses an MC1496G as the mixer and two CA3053s in the IF, plus some
dual-gate FETs in the first RF amplifier and after the mixer... What
information do you need?

Ciao,
AB

... Andrea Baldoni, 2002: message not protected by copyright.


A block diagram would be helpful. Do you know if the AGC is being derived
from the audio chain or from a sampling circuit in the IF chain?

Where are you turning off the AGC? In the FR-101? It sounds like you have a
frequency converter feeding an HF receiver with an FM position. Is that
correct? Could you just as easily tune in 2m SSB as well as 2m FM?

I can't seem to grasp why turning off the AGC would result in *less* noise,
especially on FM.

The "noise factor" of the system is probably fixed by the converter, not the
receiver. It would be like hooking an antenna to an HF receiver. Usually
the noise in the receiver is set by the atmospheric noise the antenna picks
up, not by the noise factor of the receiver. (On the higher HF bands, 15m
and 10m, this may not always be the case.) The same would apply to the 2m
converter. Unless it is designed very well, its contribution to the noise at
the antenna of the receiver would probably mask the noise factor of the
receiver itself.
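
As a rough picture of the Friis cascade arithmetic behind that, here is a
small Python sketch. The gain and noise-figure numbers are invented examples,
not measurements of the FR-101 or its converter:

import math

def db_to_lin(db):
    """Convert decibels to a linear power ratio."""
    return 10.0 ** (db / 10.0)

def lin_to_db(lin):
    """Convert a linear power ratio to decibels."""
    return 10.0 * math.log10(lin)

def cascade_noise_factor(stages):
    """Friis formula: stages is a list of (gain_dB, noise_figure_dB)
    tuples, first stage first. Returns the total noise factor (linear)."""
    total_f = 0.0
    total_gain = 1.0
    for i, (gain_db, nf_db) in enumerate(stages):
        f = db_to_lin(nf_db)
        if i == 0:
            total_f = f
        else:
            total_f += (f - 1.0) / total_gain
        total_gain *= db_to_lin(gain_db)
    return total_f

# Invented example numbers: a 2m converter with 20 dB gain and a 4 dB
# noise figure ahead of an HF receiver with a 10 dB noise figure.
converter = (20.0, 4.0)
receiver = (0.0, 10.0)   # the receiver's own gain doesn't affect the cascade NF

total_nf_db = lin_to_db(cascade_noise_factor([converter, receiver]))
print("system noise figure = %.1f dB" % total_nf_db)
# -> about 4.2 dB: with numbers like these the converter, not the
#    receiver, sets the system noise.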

If you can kill the power to the converter, you could probably test for this
by doing just that and seeing what happens to the noise out of the receiver
speaker. If the noise goes down, then the noise factor of the receiver is
irrelevant. If it doesn't change, then the converter is contributing less
noise than the receiver itself.
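
Continuing with the same invented numbers, here is a rough prediction of what
that power-off test might show, ignoring AGC action and any loss through the
unpowered converter:

# Assumed example values, same as in the sketch above: converter with
# 20 dB gain and 4 dB NF, receiver with 10 dB NF, giving roughly a
# 4.2 dB system noise figure.
conv_gain_db = 20.0
system_nf_db = 4.2
rx_nf_db = 10.0

# Converter off ~= a room-temperature termination on the receiver input,
# so the speaker noise is set by the receiver alone (proportional to F_rx).
# Converter on: speaker noise is proportional to G_conv * F_system.
noise_jump_db = (conv_gain_db + system_nf_db) - rx_nf_db
print("expected noise rise when the converter powers up: %.1f dB" % noise_jump_db)
# -> about 14 dB with these assumptions: if you see something like that,
#    the receiver's own noise factor is irrelevant, as described above.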

If turning off the AGC causes less noise output, then my first guess would be
to look at what "turning off the AGC" is actually doing. Is it breaking the
AGC loop so the AGC inputs to the CA3053 amps are left floating? Or does
turning off the AGC mean putting a fixed bias on the CA3053s? Either case
could potentially cause the gain of the CA3053s to go down with the AGC
turned off, and that might be what is going on.
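
To picture how that could happen, here is a toy model of one gain-controlled
IF stage. The control curve and voltages are invented purely for illustration;
they are not the real CA3053 characteristic:

# Toy model of one gain-controlled IF stage. The curve is invented for
# illustration only; the real numbers would have to come from the data sheet.

def stage_gain_db(agc_volts):
    """Hypothetical gain-control curve: full gain near 0 V on the AGC
    line, gain falling off as the AGC voltage rises."""
    full_gain_db = 40.0
    return max(0.0, full_gain_db - 15.0 * agc_volts)

# Case 1: AGC loop closed, sitting at its no-signal quiescent voltage.
quiescent_v = 0.4
# Case 2: "AGC off" implemented as a fixed bias on the control line.
fixed_bias_v = 1.6

print("gain with loop at quiescent point: %.0f dB" % stage_gain_db(quiescent_v))
print("gain with 'AGC off' fixed bias:    %.0f dB" % stage_gain_db(fixed_bias_v))
# If the fixed "off" bias lands above the loop's quiescent voltage, the
# stage ends up with less gain with the AGC switched off -- which would
# also mean less noise out of the speaker.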

Can you actually monitor the AGC loop to see what happens to the AGC voltage
when the AGC is turned off?

tim ab0wr