March 21st 10, 08:49 PM, posted to rec.radio.amateur.antenna
Oliver Mattos
Noise Prediction

Hi,

All the communication equations and formulae I know of today (e.g. the
Shannon-Hartley Theorem) give limits on data transmission for given
signal and noise power levels.
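(For reference, that limit is C = B * log2(1 + S/N) bit/s, where B is
the bandwidth, S the signal power and N the noise power, so for a fixed
B and S the only lever left is reducing N.)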

Most models assume that the received data is the sum of the original
signal and Gaussian noise. More advanced models apply a transfer
function to the signal to simulate multipath and other radio phenomena.

My question comes from the observation that in many cases at least part
of the noise is not entirely unpredictable. If that part could be
predicted, it could be subtracted from the received signal, so it would
no longer count as noise as far as the Shannon-Hartley Theorem goes,
allowing a higher channel throughput when all other conditions are the
same.

Examples of "predictable" interference would be EMI from other man-made
devices, such as oscillators in power supplies.
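
To put rough numbers on the kind of gain I have in mind, here is a
quick Python sketch; the power figures are completely made up and only
show the shape of the effect:

import numpy as np

# Made-up figures, just to show the shape of the gain:
B = 3000.0        # channel bandwidth, Hz
S = 1e-9          # received signal power, W
N_rand = 2e-10    # truly random (thermal) noise power, W
N_pred = 8e-10    # predictable interference power, W (e.g. a PSU oscillator)

C_now = B * np.log2(1 + S / (N_rand + N_pred))   # all noise treated as irreducible
C_ideal = B * np.log2(1 + S / N_rand)            # predictable part subtracted perfectly
print(C_now, C_ideal)                            # about 3 kbit/s vs 7.8 kbit/s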

My idea for doing this would be to receive a given signal (assumed
digital), demodulate it, and apply error correction to recover the
original data. Next, re-encode and modulate the data exactly as the
transmitter did; at this point the receiver has a precise copy of the
transmitted data. Then apply a transfer function which simulates the
channel (this part would have to be self-tuning to minimise error).
The receiver now has a copy of the data as it would have been received
if there were no external noise sources (but including the effects of
signal reflection and fading, which are captured in the transfer
function).
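
As a rough Python sketch of that loop, assuming BPSK, hard decisions
instead of real error correction, and a one-tap LMS filter standing in
for the self-tuning transfer function (all of those are just
placeholders):

import numpy as np

def rebuild_reference(rx, mu=0.01):
    # Decision-directed rebuild of the "ideal" received signal.
    h = 1.0 + 0j                              # channel estimate, starts at unity gain
    ideal = np.zeros(len(rx), dtype=complex)
    for i, r in enumerate(rx):
        sym = 1.0 if r.real >= 0 else -1.0    # hard BPSK decision (demodulate, no FEC here)
        ref = h * sym                         # re-modulate and pass through the channel model
        err = r - ref
        h += mu * err * sym                   # LMS update keeps the channel model self-tuning
        ideal[i] = ref
    return ideal, rx - ideal                  # reference signal and residual (noise estimate)

# Toy usage: flat channel with gain 0.8 plus a little Gaussian noise
rng = np.random.default_rng(0)
tx = rng.integers(0, 2, 2000) * 2 - 1
rx = (0.8 * tx + 0.05 * rng.standard_normal(2000)).astype(complex)
ideal, residual = rebuild_reference(rx)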

Next, the receiver could subtract this "ideal" received data from the
actual received data, obtaining the noise that was received. Some of
this noise is predictable, and some is truly random (assume true
Gaussian). The noise record could then be Fourier transformed,
time-shifted, and inverse Fourier transformed to obtain a prediction of
the noise, which could in turn be subtracted from the incoming signal
for the next piece of received data.
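
Concretely, the Fourier-transform/time-shift step might look something
like this in Python. It only helps with interference that fits a whole
number of periods into the block, which is really what I mean by
"predictable"; the truly random part just decorrelates:

import numpy as np

def predict_ahead(noise_block, advance):
    # Shift the measured noise `advance` samples into the future using the
    # DFT time-shift property.  The shift is circular, so it only predicts
    # the periodic part of the noise correctly.
    n = len(noise_block)
    k = np.arange(n)
    spectrum = np.fft.fft(noise_block) * np.exp(2j * np.pi * k * advance / n)
    return np.fft.ifft(spectrum).real

# Toy usage: a 50 Hz square-wave "PSU buzz" plus Gaussian noise
fs, n, advance = 8000, 8000, 2500
rng = np.random.default_rng(2)
t = np.arange(n + advance + n) / fs
record = np.sign(np.sin(2 * np.pi * 50 * t)) + 0.1 * rng.standard_normal(len(t))
prediction = predict_ahead(record[:n], advance)
future = record[advance:advance + n]
print(np.var(future), np.var(future - prediction))   # the periodic part cancels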

Similar ideas could be used for removing unwanted signals. For
example, imagine two people transmitting on the same channel. If you
know what type of modulation and error correction both are using, it
seems feasible that the stronger signal could be decoded, subtracted
from the incoming signal, leaving enough information about the weaker
signal to decode that as well. If neither signal can be decoded
"first" (i.e. while treating the other signal as random noise), then I
guess that by representing the data streams as a system of linear
equations it should still be possible to decode both, as long as the
sum of the signals' data rates is less than the channel capacity.
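
A toy version of the first case (two BPSK users at very different
powers, which is just my assumption to keep the example short):

import numpy as np

rng = np.random.default_rng(1)
n = 10000
a = rng.integers(0, 2, n) * 2 - 1        # strong user's symbols
b = rng.integers(0, 2, n) * 2 - 1        # weak user's symbols
rx = 1.0 * a + 0.3 * b + 0.05 * rng.standard_normal(n)

a_hat = np.sign(rx)                      # decode the strong user, treating the weak one as noise
residual = rx - 1.0 * a_hat              # subtract its reconstructed contribution
b_hat = np.sign(residual)                # the weak user is now decodable
print(np.count_nonzero(a_hat != a), np.count_nonzero(b_hat != b))   # error counts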

Does any of the above sound vaguely plausible? Has it been done
before? How much of real-world noise is "predictable"? How complex
would my noise prediction models need to be to get any real benefit?
Is this the kind of thing I could investigate with a software defined
radio and lots of MATLAB?

Thanks
Oliver Mattos
Imperial College London Undergraduate

(Cross-posted from sci.physics.electromag; I can't seem to find any
directly relevant groups.)