Hello All,

I was going through an ADC tutorial and noticed that the term "input bandwidth" comes up frequently as one of the important factors to consider before choosing an ADC. From my search, I found that "the input bandwidth is something that determines the maximum bandwidth of an input signal that can be accurately sampled, regardless of the stated sample rate".

http://www.diamondsystems.com/slides...utorial&page=4

I am not sure why this input bandwidth is different from Fs/2. In one of the examples from the above link, it is Fs/4. Is the sampling frequency chosen to be four times the signal bandwidth just to gain more precision? I assume that for practical reasons we oversample, and that this is for extra precision (correct me if I am wrong), but the Nyquist criterion should still hold for frequencies up to Fs/2, meaning I can still recover the original signal. I am not sure about the accuracy, though.

Also, if an ADC is oversampled at four times the Nyquist rate (Fs = 4*BW), is my maximum input frequency then given by Fs/4? Assume the single-channel case.

Correct me if I am wrong. Thanks for your responses.