September 28th 06, 09:27 PM, posted to rec.radio.amateur.homebrew
radiofreeq@gmail.com
 
Input Bandwidth of an ADC

Hello All,
I was going through an ADC tutorial and noticed that it frequently lists
"input bandwidth" as one of the important factors to consider before
choosing an ADC.

Based on my search, I found that "the input bandwidth is something
that determines the maximum bandwidth of an input signal that can be
accurately sampled, regardless of the stated sample rate".

http://www.diamondsystems.com/slides...utorial&page=4

I am not sure why this input bandwidth differs from Fs/2. In one of the
examples from the above link it is Fs/4. Do we use a sampling frequency
four times the bandwidth of the signal just to gain more precision?

I assume we oversample for practical reasons, to gain more precision
(correct me if I'm wrong), but the Nyquist criterion still holds for
sampling frequencies up to Fs/2. That means I can still recover the
original signal, though I am not sure about the accuracy.
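For what it's worth, here is a small NumPy sketch of my reasoning. The
numbers (an 8 kHz sample rate, 1 kHz and 5 kHz test tones) are just
assumed example values: a tone below Fs/2 shows up at its true
frequency, while one above Fs/2 aliases.

```python
import numpy as np

fs = 8000                  # assumed sample rate, Hz
n = fs                     # one second of samples -> 1 Hz FFT bins
t = np.arange(n) / fs

# A 1 kHz tone is below Fs/2 = 4 kHz, so it samples cleanly.
in_band = np.sin(2 * np.pi * 1000 * t)
peak_in = np.argmax(np.abs(np.fft.rfft(in_band))) * fs / n

# A 5 kHz tone is above Fs/2, so it aliases down to 8000 - 5000 = 3000 Hz.
out_band = np.sin(2 * np.pi * 5000 * t)
peak_out = np.argmax(np.abs(np.fft.rfft(out_band))) * fs / n

print(peak_in, peak_out)   # 1000.0 3000.0
```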


Also, can I say that if an ADC samples at four times the signal
bandwidth (Fs = 4*BW, i.e. twice the Nyquist rate), then my maximum
input frequency is given by Fs/4? Assume a single-channel case.
Correct me if I am wrong.
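Spelling out the arithmetic I have in mind, with a made-up example
value of 10 kHz for the signal bandwidth:

```python
bw = 10_000        # assumed signal bandwidth, Hz
fs = 4 * bw        # sampling at four times the bandwidth

nyquist_rate = 2 * bw      # minimum sample rate for this signal
print(fs / nyquist_rate)   # 2.0 -> Fs = 4*BW is twice the Nyquist rate
print(fs / 2)              # 20000.0 Hz: alias-free limit of the sampler
print(fs / 4)              # 10000.0 Hz: top of the signal band (= BW)
```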

Thanks for your responses.