#11 - January 18th 07, 04:54 AM - posted to rec.radio.shortwave
Sync detectors and fading


Telamon wrote:
wrote:

Michael Black wrote:
"N9NEO" ) writes:
Greetings,

I have just got a Sony ICF-SW7600GR and it is a very nice radio.
The sync detector seems to take care of a lot of the distortion,
but the audio continues fading in and out and is quite annoying.
Could the fading be mitigated to any extent by using another
stage of AGC? I am going to be doing some experiments with the
455 kHz IF out on my Red Sun RP2100 whenever it gets here.
Detectors, filters, SSB, etc... I thought that along with other
experiments I might want to try some outboard AGC.

Synchronous detectors have never been about dealing with fading.
They are about ensuring there is enough "carrier" to beat the
sidebands down to audio.


Narrow band signals have less fading, thus sync demod will have less
fading. However, the result isn't all that significant since all you
have done is cut the bandwidth in half.


Narrow band signals do not have less fading.

So there's fading on the incoming signal. That means the amplitude
of the sidebands is varying with that fading. A locally generated
"carrier" at the receiver ensures that there is something to beat
those sidebands down to audio, even if the transmitter's carrier
has faded too much to do the proper job. But a constant level
"carrier" at the receiver beats the sideband down to audio intact,
i.e., an ideal mixer would not add anything to the signal. So if the
sideband is fading, of course the audio output of the receiver will
vary with that fading.


With an envelope detector, the carrier isn't beating down the
sideband. If you just look at the math of AM modulation, you would
see that the carrier is just there for the ride.
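
A minimal worked sketch of that math, for a single modulating tone (the
symbols here are illustrative, not taken from any of the posts):

\[
s(t) = A_c\cos(\omega_c t)
     + \tfrac{m A_c}{2}\cos\big((\omega_c+\omega_m)t\big)
     + \tfrac{m A_c}{2}\cos\big((\omega_c-\omega_m)t\big)
\]

The envelope of that sum is A_c [1 + m cos(w_m t)], so an envelope
detector is distortion free only while the bracket stays positive, i.e.
while the received carrier term is at least as large as the sum of the
sidebands (effective m <= 1). Multiply instead by a constant-amplitude
local carrier cos(w_c t) and low-pass filter -- a product, or
synchronous, detector -- and you get (A_c/2)[1 + m cos(w_m t)]: the
recovered audio depends only on the sidebands, so for that detector the
carrier really is just along for the ride. Its amplitude matters very
much to an envelope detector, though, once selective fading knocks it
down and pushes the effective modulation past 100%.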


Selective fading occurs when propagation conditions cause a very narrow
band of frequencies to be received at very low amplitude while most of
the side band information is still present at levels that your receiver
can ordinarily demodulate properly.

When part of the side band is notched out it does not sound all that
bad, but when the carrier gets weakened the AM demodulator can't process
the side band information properly and there is horrendous distortion.
The carrier, which is at the right frequency and phase relative to the
side band information, keeps the detector in the linear region so
distortion is minimized.

A sync detector uses a local oscillator in a way similar to how SSB is
detected, with the difference that the oscillator is phase locked to the
signal's carrier. When the carrier fades out, this near perfect copy of
the carrier allows the demodulator to continue to detect the side band
or bands without distortion. The necessary frequency and phase
information carried by the "carrier" is retained by the sync circuitry.

What the sync detector brings you is the ability to decode that
signal even if the carrier goes missing, because of selective
fading.



Snip

Michael has it right.

--
Telamon
Ventura, California


Why do you insist that the atmosphere treats the carrier differently
from the rest of the signal? Geez. You have a spectrum produced by
modulation. If the modulation is AM, then a carrier is present. Now you
are saying the atmosphere is sucking out the narrow band carrier and
leaving the wideband spectrum untouched. Fiction at best.

#12 - January 18th 07, 05:40 AM - posted to rec.radio.shortwave
Sync detectors and fading

wrote:

Snip

Why do you insist that the atmosphere treats the carrier differently
from the rest of the signal? Geez. You have a spectrum produced by
modulation. If the modulation is AM, then a carrier is present. Now you
are saying the atmosphere is sucking out the narrow band carrier and
leaving the wideband spectrum untouched. Fiction at best.


What makes you interpret my remarks as saying the "atmosphere" treats
the carrier differently than the rest of the signal?

You mean the ionosphere. Phase cancellation is a frequency-dependent
phenomenon, and the modulation scheme, or any part of it, does not
matter.

When the phase cancellation is narrow and sits right at the carrier
frequency, the carrier level goes down while the rest of the side band
signal is still good.
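
To make the frequency dependence concrete, here is a minimal two-path
sketch: one direct ray plus one echo of relative amplitude a and extra
delay Δt (the symbols and numbers are illustrative only):

\[
|H(f)|^2 = \big|1 + a\,e^{-j 2\pi f \Delta t}\big|^2
         = 1 + a^2 + 2a\cos(2\pi f \Delta t)
\]

Deep nulls occur where 2πfΔt is an odd multiple of π, so adjacent nulls
are spaced 1/Δt apart, and each null is narrow when a is close to 1.
With a path-delay difference of, say, 1 ms the nulls are about 1 kHz
apart -- easily narrow enough to sit right on the carrier while most of
a several-kHz-wide side band gets through.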

You been talking to John Smith or something?

--
Telamon
Ventura, California
#13 - January 18th 07, 07:23 AM - posted to rec.radio.shortwave
Sync detectors and fading

N9NEO wrote:
Greetings,

I have just got a Sony ICF-SW7600GR and it is a very nice radio. The
sync detector seems to take care of a lot of the distortion, but the
audio continues fading in and out and is quite annoying. Could the
fading be mitigated to any extent by using another stage of AGC? I am
going to be doing some experiments with the 455 kHz IF out on my Red Sun
RP2100 whenever it gets here. Detectors, filters, SSB, etc... I thought
that along with other experiments I might want to try some outboard
AGC.

regards,
NEO


The sync is doing what it's designed for by reducing the distortion
caused by selective fading, but you need a longer time constant (release
time) for the AGC to help smooth out the fading. You could lengthen the
time constant of the AGC circuit in the 7600GR, but then it wouldn't
work well for other conditions where a shorter time constant is needed.
This is why a good table-top receiver has more than one AGC rate, which
can be selected by the user. The AGC in the Drake R8 series uses a decay
rate of 300 ms for the fast setting and a much longer rate of about
2 seconds in the slow mode. The latter really aids in reducing the
effects of rapid fading.
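
A minimal sketch of the kind of two-rate AGC being described, using the
decay figures above as the release times; the code and numbers are
illustrative, not the actual 7600GR or R8 circuitry:

import numpy as np

def agc(x, fs, target=0.5, attack_ms=2.0, release_ms=300.0):
    """Simple envelope-following AGC.

    x          : detected audio (or IF envelope) samples
    fs         : sample rate in Hz
    attack_ms  : how fast the gain is reduced when the signal jumps up
    release_ms : how fast the gain recovers when the signal fades
    """
    atk = np.exp(-1.0 / (fs * attack_ms / 1000.0))
    rel = np.exp(-1.0 / (fs * release_ms / 1000.0))
    env = 1e-6
    y = np.empty(len(x))
    for i, s in enumerate(x):
        mag = abs(s)
        # fast-attack, slow-release envelope follower
        if mag > env:
            env = atk * env + (1.0 - atk) * mag
        else:
            env = rel * env + (1.0 - rel) * mag
        y[i] = s * (target / max(env, 1e-6))
    return y

# "Fast" AGC: release_ms=300.  "Slow" AGC for long, deep fades: release_ms=2000.

Lengthening only the release (decay) time is what smooths out the slow
fades without making the receiver sluggish on the attack side.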
#14 - January 18th 07, 01:47 PM - posted to rec.radio.shortwave
Sync detectors and fading

wrote:

Why do you insist that the atmosphere treats the carrier differently
from the rest of the signal?


Because it does. See below...

Geez. You have a spectrum produced by
modulation. If the modulation is AM, then a carrier is present. Now you
are saying the atmosphere is sucking out the narrow band carrier and
leaving the wideband spectrum untouched. Fiction at best.


Bzzzzt. Wrong!

Yes, the "atmosphere" [ionosphere] DOES "suck out a narrow band" or even
a single frequency. Ask any amateur radio operator who has used RTTY
(radio teletype).

The RTTY "modulation mode" used is FSK or frequency shift keying. At any
given instant, the transmitter is sending either a "mark" or "space",
essentially two carriers if you will, 170 Hertz apart that represent the
5-level Baudot code as used in ham RTTY.

As an aid to tuning, an oscilloscope is often used as an indicator; the
mark signal from your RTTY decoder is connected to the horizontal plates
of the 'scope, the space signal to the vertical plates. On the screen of
the CRT (due to the persistence of the CRT phosphors and your eyes),
this shows what appears to be a "+" sign, also known as the classic
"cross display". When you see the cross on your screen, you know you are
tuned in properly.

So, "What does this have to do with the discussion above?" you ask.

Remember, you are looking at essentially two "carriers", 170 Hz apart,
one on the horizontal axis and one displayed on the vertical axis.
During disturbed ionospheric conditions, many times you will see one
signal or the other disappear; i.e., the cross turns into a single line,
either a "-" or a "|", depending if the mark or space faded--and yes,
sometimes both fade, but it is more common to see one or the other
disappear.

This phenomenon, known to hams as "selective fading", is quite common
and interesting to observe.

So, yes, the ionosphere CAN suck out one signal separated from another
by as little as 170 Hz.
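
A quick numerical sketch of that, assuming a simple two-path channel
(direct ray plus one echo); the delay, echo amplitude and tone
frequencies below are illustrative only:

import numpy as np

# Two-path channel: H(f) = 1 + a*exp(-j*2*pi*f*delay).  Deep nulls
# repeat every 1/delay Hz and are narrow when a is close to 1, so a
# null can land on one RTTY tone while barely touching the other.
a = 0.95
delay = 6.5 / 2125.0          # about 3.06 ms, chosen to put a null on the mark tone

def gain_db(f_hz):
    h = 1 + a * np.exp(-2j * np.pi * f_hz * delay)
    return 20 * np.log10(abs(h))

mark, space = 2125.0, 2295.0  # common RTTY audio tones, 170 Hz apart
print(f"mark  ({mark:.0f} Hz): {gain_db(mark):+.1f} dB")
print(f"space ({space:.0f} Hz): {gain_db(space):+.1f} dB")
# mark comes out roughly 26 dB down while space is actually slightly
# enhanced: one leg of the RTTY cross disappears, the selective fade
# described above.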

Carter
K8VT
#15 - January 18th 07, 01:58 PM - posted to rec.radio.shortwave
Sync detectors and fading



Carter-k8vt wrote:

Snip

So, yes, the ionosphere CAN suck out one signal separated from another
by as little as 170 Hz.


But, is it 'sucking it out' or merely propagating it somewhere else other than
that particular spot where your antenna is?

And that 'somewhere else' might not be very far away, but merely a few
wavelengths in distance.

dxAce
Michigan
USA




#17 - January 18th 07, 02:33 PM - posted to rec.radio.shortwave
Sync detectors and fading

dxAce wrote:

But, is it 'sucking it out' or merely propagating it somewhere else other than
that particular spot where your antenna is?

And that 'somewhere else' might not be very far away, but merely a few
wavelengths in distance.

dxAce
Michigan
USA


Before satellites carried most of the milcom, they used "diversity
receivers": two or more receivers tuned to the same frequency but
located some distance apart. The logic being that when the signal faded
at one location, it didn't fade at the other location at the same time.
The more important the comm circuit, the more receivers spread over a
wider area.

A friend and I played with our receivers feeding phone patches, and
since we live 30 miles apart it was clear this approach was workable.
With signals that experienced deep fades, we were able to listen nearly
all of the time. Real (commercial or military) installations had
AGC-based voting systems to decide which signal to pass. We ran into
issues of our audio phases shifting, producing very odd sounding
"flanging" effects.
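
A minimal sketch of the voting idea, hypothetical rather than any
particular commercial or military system: compare the two receivers'
signal levels block by block and pass whichever is stronger, using RMS
level as a stand-in for the AGC voltage.

import numpy as np

def diversity_vote(rx_a, rx_b, fs, block_ms=50):
    """Crude two-receiver diversity 'voting'.

    rx_a, rx_b : equal-length audio sample arrays from the two receivers
    fs         : sample rate in Hz
    For each block, pass the audio from whichever receiver currently has
    the stronger signal (RMS level used here in place of an AGC voltage).
    """
    n = max(1, int(fs * block_ms / 1000))
    out = np.empty(len(rx_a))
    for start in range(0, len(rx_a), n):
        a = rx_a[start:start + n]
        b = rx_b[start:start + n]
        out[start:start + n] = a if np.sqrt(np.mean(a**2)) >= np.sqrt(np.mean(b**2)) else b
    return out

Hard-switching like this is exactly what produces the clicks and phasey
"flanging" effects mentioned above when the two audio paths drift in
phase; a practical system would want to switch more gently or cross-fade.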

I have often thought about trying this with receivers whose antennas
are only a few hundred to a thousand feet apart. I never have gotten
around to it.

The military also used frequency diversity, sending the same signal on
more than one frequency. Kind of like listening to WWV on 5, 10 and
15 MHz at the same time.

Terry

#18 - January 18th 07, 04:02 PM - posted to rec.radio.shortwave
Sync detectors and fading



wrote:

Snip


Exactly the point I was trying to make.

dxAce
Michigan
USA

#19 - January 18th 07, 04:56 PM - posted to rec.radio.shortwave
Sync detectors and fading

wrote:
Snip


Fascinating. It sounds like a couple of antennae, maybe even on the
same property but spaced some modest distance apart, maybe a few
hundred feet, and phased into the same radio, might also be a solution
to the problem.

Anyone try this with a 50-acre lot and a phasing harness?

Bruce Jensen

#20 - January 18th 07, 05:15 PM - posted to rec.radio.shortwave
Sync detectors and fading



bpnjensen wrote:

Snip

Fascinating. It sounds like a couple of antennae, maybe even on the
same property but spaced some modest distance apart, maybe a few
hundred feet, and phased into the same radio, might also be a solution
to the problem.

Anyone try this with a 50-acre lot and a phasing harness?


I don't think that would work properly. In practice, I think you need the 'voting
machine' that works on the two receivers' AGC to pick the best signal.

I do recall some folks trying to emulate this to a certain degree by having two
receivers and two antennas widely separated (more than a wavelength), and feeding the
audio to headphones (one receiver in the right ear, one in the left).

dxAce
Michigan
USA

