#1
"N9NEO" ) writes:
Greetings, I have just got a sony ICF sw7600GR and it is a very nice radio. The sync detector seems to take care of a lot of the distortion, but the audio continues fading in and out and is quite annoying. Could the fading be mitigated to any extent by using another stage of agc? I am going to be doing some experiments with the 455kc if out on my Red Sun RP2100 whenever it gets here. Detectors, filters, SSB, etc... I thought that along with other experiments I might want to try some outboard agc. Synchronous detectors have never been about dealing with fading. They are about ensuring there is enough "carrier" to beat the sidebands down to audio. So there's fading on the incoming signal. That means the amplitude of the sidebands is varying with that fading. A locally generated "carrier" at the receiver ensures that there is something to beat those sidebands down to audio, even if the transmitter's carrier has faded too much to do the proper job. But a constant level "carrier" at the receiver beats the sideband down to audio intact, ie an ideal mixer would not add anything to the signal. So if the sideband is fading, of course the audio output of the receiver will vary with that fading. What the sync detector brings you is the ability to decode that signal even if the carrier goes missing, because of selective fading. Dealing with the fading of the sidebands is in a different realm, and obviously a miraculous receiver that eliminates fading has long been sought after. Armstrong dealt with it in part, by moving to FM and using limiters in the receiver, but that only works when the signal is above a certain level. Below it, the signal levels are too low for the limiters to kick in, and that fading is obvious. Beyond a certain point, you get conflict. 
Have a scheme that does a really good job of eliminating the fading, and likely that starts affecting the "fidelity" of the signal, because how do you discriminate between the voice at the transmitter end varying in amplitude, because the speaker starts talking more quietly or even just because sounds are made up of varying levels, and the signal fading as it travels to the receiver? It's easy to counter some of the fading, but it gets harder the more you try to conquer it. Michael |
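Michael's point, that a synchronous detector supplies a clean local carrier but cannot restore the faded sideband amplitude, can be sketched numerically. Every parameter below is illustrative (sample rate, carrier, tone, and fade rate are invented for the demo, not taken from any real receiver):

```python
import numpy as np

fs = 100_000                      # sample rate, Hz (illustrative)
fc, fm = 10_000, 400              # carrier and audio-tone frequencies, Hz
t = np.arange(0, 0.5, 1/fs)

audio = np.cos(2*np.pi*fm*t)
fade  = 0.5 + 0.5*np.cos(2*np.pi*2.0*t)   # slow 2 Hz flat fade, 0..1
rx    = fade * (1 + 0.8*audio) * np.cos(2*np.pi*fc*t)

# Synchronous detection: multiply by a constant-amplitude local "carrier",
# then low-pass (a crude moving average) to keep only the audio band.
mixed = rx * 2*np.cos(2*np.pi*fc*t)
taps  = fs // 2000
demod = np.convolve(mixed, np.ones(taps)/taps, mode='same')

# The recovered audio still tracks the fade: the detector supplies a clean
# carrier but cannot restore the faded sideband amplitude.
strong = np.abs(demod[fs//100 : fs//20]).max()        # near the fade maximum
weak   = np.abs(demod[fs//4 : fs//4 + fs//50]).max()  # near the fade minimum
print(f"audio peak while strong: {strong:.2f}, while faded: {weak:.3f}")
```

The recovered audio peak collapses along with the fade even though the local carrier never varies, which is exactly the behaviour N9NEO is hearing.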
#4
Michael Black wrote:

> Beyond a certain point, you get a conflict. Have a scheme that does a
> really good job of eliminating the fading, and it likely starts
> affecting the "fidelity" of the signal, because how do you discriminate
> between the voice at the transmitter end varying in amplitude and the
> signal fading as it travels to the receiver? It's easy to counter some
> of the fading, but it gets harder the more you try to conquer it.

Has anyone tried to measure the fading of the carrier and use that as a guide as to whether the level change in the sideband is fading related or program content related?

Bob
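Bob's idea can be sketched: since the transmitted carrier level is constant, a narrow-band measurement of the received carrier tracks only the fade, and dividing it out leaves the program content. A minimal numpy sketch under idealized assumptions (flat fade, no noise, all parameters invented for the demo):

```python
import numpy as np

fs = 100_000                      # sample rate, Hz (illustrative)
fc, fm = 10_000, 400              # carrier and audio-tone frequencies, Hz
t = np.arange(0, 0.5, 1/fs)
audio = np.cos(2*np.pi*fm*t)
fade  = 0.6 + 0.4*np.cos(2*np.pi*2.0*t)   # slow flat fade, kept above zero
rx    = fade * (1 + 0.8*audio) * np.cos(2*np.pi*fc*t)

def lowpass(x, taps):             # crude moving-average low-pass
    return np.convolve(x, np.ones(taps)/taps, mode='same')

# Ordinary synchronous demod: output is fade * (1 + 0.8*audio).
demod = lowpass(rx * 2*np.cos(2*np.pi*fc*t), fs//2000)

# Carrier-level estimate: a much narrower low-pass (25 ms) averages out the
# 400 Hz audio but follows the 2 Hz fade -- the carrier as a fade reference.
carrier_est = lowpass(demod, fs//40)

corrected = demod / carrier_est   # divide the measured fade back out

# Audio peaks near the fade maximum and minimum are now nearly equal:
p_strong = corrected[int(0.05*fs):int(0.08*fs)].max()
p_faded  = corrected[int(0.24*fs):int(0.27*fs)].max()
print(f"peak while strong: {p_strong:.2f}, while faded: {p_faded:.2f}")
```

This only works when the fade is flat and the carrier measurement is clean: under selective fading the carrier no longer tracks the sidebands, and near a deep fade the division amplifies noise, which is one face of the fidelity trade-off Michael describes.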
#5
Michael Black wrote:

> "N9NEO" writes:
>
>> Greetings, I have just got a Sony ICF-SW7600GR and it is a very nice
>> radio. The sync detector seems to take care of a lot of the
>> distortion, but the audio continues fading in and out and is quite
>> annoying. Could the fading be mitigated to any extent by using another
>> stage of AGC?
>
> [snip]
>
> Synchronous detectors have never been about dealing with fading. They
> are about ensuring there is enough "carrier" to beat the sidebands down
> to audio.

Narrow-band signals have less fading, thus sync demod will have less fading. However, the result isn't all that significant, since all you have done is cut the bandwidth in half.

> So there's fading on the incoming signal. That means the amplitude of
> the sidebands is varying with that fading. A locally generated
> "carrier" at the receiver ensures that there is something to beat those
> sidebands down to audio, even if the transmitter's carrier has faded
> too much to do the proper job. But a constant-level "carrier" at the
> receiver beats the sidebands down to audio intact, i.e. an ideal mixer
> would not add anything to the signal. So if the sidebands are fading,
> of course the audio output of the receiver will vary with that fading.

With an envelope detector, the carrier isn't beating down the sideband. If you just look at the math of AM modulation, you would see that the carrier is just there for the ride.

> What the sync detector brings you is the ability to decode the signal
> even if the carrier goes missing because of selective fading.
>
> [snip]
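The "math of AM modulation" referred to here is the identity (1 + m cos w_m t) cos w_c t = cos w_c t + (m/2) cos((w_c + w_m)t) + (m/2) cos((w_c - w_m)t): the carrier term's amplitude is independent of the modulation, so all the program information rides in the two sidebands. A quick numpy check (tone frequencies chosen arbitrarily for the demo):

```python
import numpy as np

fs, fc, fm, m = 8_000, 1_000, 100, 0.5   # illustrative values
t = np.arange(0, 1, 1/fs)                # 1 s record -> 1 Hz FFT bins
s = (1 + m*np.cos(2*np.pi*fm*t)) * np.cos(2*np.pi*fc*t)

amps = 2*np.abs(np.fft.rfft(s))/len(t)   # per-component amplitudes
print(amps[fc], amps[fc - fm], amps[fc + fm])   # carrier, LSB, USB
```

The carrier bin shows amplitude 1 regardless of m, and each sideband carries m/2 = 0.25, consistent with the identity above.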
#7
Telamon wrote:

>> Narrow-band signals have less fading, thus sync demod will have less
>> fading.
>
> Narrow-band signals do not have less fading.
>
>> With an envelope detector, the carrier isn't beating down the
>> sideband. If you just look at the math of AM modulation, you would
>> see that the carrier is just there for the ride.
>
> Selective fading occurs when conditions cause a very narrow band of
> frequencies to be received at very low amplitude while most of the
> sideband information is present at levels that your receiver can
> ordinarily demodulate properly. When part of the sideband is being
> notched out it does not sound all that bad, but when the carrier gets
> weakened the AM demodulator can't process the sideband information
> properly and there is horrendous distortion. The carrier, which is at
> the right frequency and phase relative to the sideband information,
> keeps the detector in the linear region, so distortion is minimized.
>
> A sync detector uses a local oscillator in a way similar to how SSB is
> detected, with the difference that it is phase locked to the signal
> carrier and mixed with it, so when the carrier fades out this near
> perfect copy of the carrier allows the demodulator to continue to
> detect the sideband or sidebands without distortion during a carrier
> fading condition. The necessary frequency and phase information carried
> by the "carrier" is retained by the sync circuitry.
>
> [snip]
>
> Michael has it right.

Why do you insist that the atmosphere treats the carrier differently from the rest of the signal? Geez. You have a spectrum produced by modulation. If the modulation is AM, then a carrier is present. Now you are saying the atmosphere is sucking out the narrow-band carrier and leaving the wideband spectrum untouched. Fiction at best.
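Telamon's distortion claim is easy to reproduce: if the carrier component is notched out entirely, what remains is double-sideband suppressed-carrier, which an envelope detector rectifies into badly distorted audio (the energy lands at twice the modulating frequency), while a synchronous detector still recovers the tone cleanly. A numpy sketch with made-up frequencies:

```python
import numpy as np

fs, fc, fm = 100_000, 10_000, 400         # illustrative frequencies, Hz
t = np.arange(0, 0.1, 1/fs)
audio   = np.cos(2*np.pi*fm*t)
carrier = np.cos(2*np.pi*fc*t)
faded   = 0.8 * audio * carrier           # AM with the carrier notched out

def lowpass(x, taps):                     # crude moving-average low-pass
    return np.convolve(x, np.ones(taps)/taps, mode='same')

env  = lowpass(np.abs(faded), fs//2000)      # envelope-detector output
sync = lowpass(faded * 2*carrier, fs//2000)  # synchronous-detector output

mid = slice(1000, 9000)                   # drop filter edge transients; 0.08 s
se = np.abs(np.fft.rfft(env[mid]))
ss = np.abs(np.fft.rfft(sync[mid]))
k = int(fm * 0.08)                        # FFT bin of the 400 Hz tone
print("envelope  400/800 Hz:", se[k], se[2*k])   # energy shifted to 800 Hz
print("sync      400/800 Hz:", ss[k], ss[2*k])   # clean 400 Hz tone
```

With the carrier gone, the envelope detector's output is dominated by the 800 Hz rectification product (the "horrendous distortion"), while the sync detector's output stays at 400 Hz.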
#8
>> Selective fading occurs when conditions cause a very narrow band of
>> frequencies to be received at very low amplitude while most of the
>> sideband information is present at levels that your receiver can
>> ordinarily demodulate properly.
>>
>> [snip]
>>
>> Michael has it right.
>
> Why do you insist that the atmosphere treats the carrier differently
> from the rest of the signal? Geez. You have a spectrum produced by
> modulation. If the modulation is AM, then a carrier is present. Now you
> are saying the atmosphere is sucking out the narrow-band carrier and
> leaving the wideband spectrum untouched. Fiction at best.

What makes you interpret my remarks as saying the "atmosphere" treats the carrier differently than the rest of the signal? You mean ionosphere. Phase cancellation is a frequency-dependent phenomenon, and the modulation scheme, or part of it, does not matter. When the phase cancellation is narrow and right at the carrier frequency, the carrier level goes down while the rest of the sideband signal is still good. You been talking to John Smith or something?

--
Telamon
Ventura, California
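Telamon's mechanism, frequency-dependent phase cancellation, falls straight out of a two-path propagation model: a direct ray plus an echo delayed by tau gives a channel response H(f) = 1 + exp(-j*2*pi*f*tau), with deep nulls wherever f*tau is an odd multiple of 1/2. With a hypothetical differential delay around a millisecond (chosen here purely so the null lands on the carrier), such a null is only a few hundred hertz wide and can sit squarely on the carrier while leaving sidebands a few hundred hertz away nearly untouched:

```python
import numpy as np

tau = 1.05e-3                     # differential path delay, s (hypothetical)
# lower sideband, carrier, upper sideband frequencies, Hz (illustrative):
f = np.array([9_600.0, 10_000.0, 10_400.0])
H = np.abs(1 + np.exp(-2j*np.pi*f*tau))   # two-ray channel magnitude
print(H)   # carrier nulled; sidebands near the maximum of 2
```

The carrier sits exactly in a null (f*tau = 10.5) while the sidebands 400 Hz away see almost the full two-ray gain, which is the selective-fading condition Telamon describes.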
#10
Carter-k8vt wrote:

>> Why do you insist that the atmosphere treats the carrier differently
>> from the rest of the signal?
>
> Because it does. See below...
>
>> Geez. You have a spectrum produced by modulation. If the modulation is
>> AM, then a carrier is present. Now you are saying the atmosphere is
>> sucking out the narrow-band carrier and leaving the wideband spectrum
>> untouched. Fiction at best.
>
> Bzzzzt. Wrong! Yes, the "atmosphere" [ionosphere] DOES "suck out a
> narrow band" or even a single frequency. Ask any amateur radio operator
> who has used RTTY (radio teletype).
>
> The RTTY modulation mode is FSK, or frequency-shift keying. At any
> given instant, the transmitter is sending either a "mark" or a "space"
> (essentially two carriers, if you will, 170 Hz apart) that represent
> the 5-level Baudot code as used in ham RTTY.
>
> As an aid to tuning, an oscilloscope is used as a tuning indicator: the
> mark signal from your RTTY decoder is connected to the horizontal
> plates of the 'scope, the space signal to the vertical plates. On the
> screen of the CRT (due to the persistence of the CRT phosphors and your
> eyes), this shows what appears to be a "+" sign, also known as the
> classic "cross display". When you see the cross on your screen, you
> know you are tuned in properly.
>
> So, "What does this have to do with the discussion above?" you ask.
> Remember, you are looking at essentially two "carriers", 170 Hz apart,
> one on the horizontal axis and one on the vertical axis. During
> disturbed ionospheric conditions, many times you will see one signal or
> the other disappear; i.e., the cross turns into a single line, either a
> "-" or a "|", depending on whether the mark or the space faded. And
> yes, sometimes both fade, but it is more common to see one or the other
> disappear. This phenomenon is known to hams as "selective fading", is
> quite common, and is interesting to observe.
>
> So, yes, the ionosphere CAN suck out one signal separated from another
> by as little as 170 Hz.

But is it "sucking it out", or merely propagating it somewhere else other than the particular spot where your antenna is? And that "somewhere else" might not be very far away, but merely a few wavelengths in distance.

dxAce
Michigan USA
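Carter's mark/space observation is consistent with a two-path model: a direct ray plus an echo delayed by tau gives a channel H(f) = 1 + exp(-j*2*pi*f*tau), with nulls wherever f*tau is an odd multiple of 1/2. A null narrow enough to take out one RTTY tone but not the other, only 170 Hz away, needs a long differential delay; with a hypothetical delay of 1/340 s (about 2.9 ms, chosen so a null lands exactly on the mark tone), the mark vanishes while the space sits at a reinforcement peak. The tone frequencies below are illustrative, not actual RTTY allocations:

```python
import numpy as np

tau = 1/340.0                      # differential delay, s (hypothetical, ~2.9 ms)
mark, space = 10_030.0, 10_200.0   # FSK tones 170 Hz apart (illustrative)
H = lambda f: np.abs(1 + np.exp(-2j*np.pi*f*tau))   # two-ray channel magnitude
print(H(mark), H(space))           # mark in the null; space at a peak
```

This also fits dxAce's point: nothing is absorbed; the mark energy is merely cancelled at this antenna location and delay geometry, and a receiver a few wavelengths away can see a different null pattern.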