Paul Keinanen, February 6th 04, 07:56 AM

On 05 Feb 2004 17:35:31 GMT, (Avery Fineman)
wrote:

> Signal to noise ratio changes as the _square_root_ of bandwidth
> change.


I do not quite understand this.

Usually the signal to noise ratio is defined as the signal power S
compared to the noise power N (or compared to S+N). In a white-noise
environment with constant noise density, the noise power is directly
proportional to the bandwidth.

If the noise bandwidth is larger than the required signal bandwidth,
reducing the noise bandwidth will not affect the signal but will only
reduce the noise power in direct proportion to the bandwidth
(dropping the bandwidth to 1/4 drops the noise power to 1/4, i.e. by
6 dB, and increases the SNR by 6 dB).
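As a sanity check, here is a minimal Python sketch of that
arithmetic (the noise density and bandwidth figures are arbitrary
example values, not anything from the OP's setup):

  import math

  n0 = 1e-15                # white noise density in W/Hz (made-up example)
  b_wide = 4000.0           # original noise bandwidth in Hz (example)
  b_narrow = b_wide / 4.0   # bandwidth cut to 1/4

  n_wide = n0 * b_wide      # noise power N = n0 * B
  n_narrow = n0 * b_narrow

  print(10.0 * math.log10(n_wide / n_narrow))   # 6.02 dB SNR improvement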

However, if you are looking at the noise _voltage_, it drops as the
square root of the bandwidth. But dropping the bandwidth to 1/4 drops
the noise voltage to 1/2, which is again 6 dB.
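The same 6 dB falls out of the voltage view with the 20*log10 rule
(again just a sketch with arbitrary units):

  import math

  v_wide = 1.0                              # rms noise voltage, arbitrary units
  v_narrow = v_wide * math.sqrt(1.0 / 4.0)  # voltage scales as sqrt(bandwidth)

  print(20.0 * math.log10(v_wide / v_narrow))   # 6.02 dB, same as the power view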

In the OP's case, some of the signal sidebands are also cut, thus also
reducing the signal power. Are you assuming something about the
spectral or phase distribution of these sidebands? (E.g. adding
coherent sidebands produces the sum of the sideband _voltages_, while
adding noise from two sidebands with random phases only produces the
sum of the sideband _powers_.)
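The difference is easy to see numerically. A small sketch, assuming
two equal-amplitude sidebands (all values purely illustrative):

  import cmath, math, random

  a = 1.0   # amplitude of each sideband, arbitrary units

  # Coherent case: voltages add, so power quadruples (+6 dB over one sideband)
  p_coherent = (a + a) ** 2

  # Random-phase case: average power over many trials; the powers add (+3 dB)
  trials = 100000
  p_sum = 0.0
  for _ in range(trials):
      phi = random.uniform(0.0, 2.0 * math.pi)
      p_sum += abs(a + a * cmath.exp(1j * phi)) ** 2
  p_random = p_sum / trials

  print(10.0 * math.log10(p_coherent / a ** 2))  # 6.0 dB
  print(10.0 * math.log10(p_random / a ** 2))    # ~3.0 dB on average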

Paul OH3LWR