What is the point of digital voice?
On 3/8/2015 6:06 PM, Brian Reay wrote:
On 08/03/15 19:58, rickman wrote:
On 3/8/2015 3:31 PM, Brian Reay wrote:
On 08/03/15 18:46, rickman wrote:
On 3/8/2015 9:53 AM, Brian Reay wrote:
Jerry Stuckle wrote:
On 3/8/2015 7:35 AM, Brian Reay wrote:
Jeff wrote:
I will finally point out that your use of the term "slope detecting ADC" is invalid. Google returns exactly 4 hits when this term is entered with quotes. The name of this converter may have slope in it, but that is because the circuit generates a slope, not because it is detecting a slope. Please look up the circuit and use a proper name for it, such as integrating ADC or dual-slope ADC. The integrating converter is not at all sensitive to the slope of the input signal, otherwise it would not be able to measure a DC signal, which has a slope of zero.
I'm only replying so that others are not confused by your misstatements.
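As an aside, the point that an integrating converter is insensitive to input slope can be shown with a minimal idealized sketch of a dual-slope conversion (the function name, component values, and clock figures below are invented for illustration, not taken from any post in the thread):

```python
def dual_slope_convert(vin, vref, run_up_clocks, clock_hz):
    """Idealized dual-slope ADC.

    Phase 1: integrate vin for a fixed number of clocks (this is where
    the circuit *generates* a slope).
    Phase 2: de-integrate with -vref and count clocks back to zero.
    The count depends only on the average of vin, so a DC input
    (input slope of zero) converts perfectly well.
    """
    t_up = run_up_clocks / clock_hz    # fixed run-up time
    charge = vin * t_up                # integrator output after phase 1
    t_down = charge / vref             # time to ramp back down to zero
    return round(t_down * clock_hz)    # count = run_up_clocks * vin / vref

# A 1.25 V DC input against a 2.5 V reference with a 1000-clock run-up
# gives a count of 1000 * 1.25 / 2.5 = 500.
```

Note the result, `run_up_clocks * vin / vref`, contains no term for the input's rate of change.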
He is probably referring to a CVSD, otherwise known as a Delta Modulator.
Jeff
I don't think so. In fact, I have to say Jerry seems a bit confused in this particular area; perhaps I have missed something.
ADCs tend to have a sample and hold prior to the actual ADC converter, thus the value converted is that at the beginning of the sample period, OR, if another approach to conversion is used, you get some kind of average over the conversion period. (There are other techniques, but those are the main ones.)
If you think about it, a S/H is required if the rate of change of the input signal means it can change by 1/2 lsb during the conversion time for a SAR ADC. This limits the overall BW of the ADC process. (I recall spending some time convincing a 'seat of the pants' engineer of this when his design wouldn't work. Even when he adopted the suggested changes he insisted his design would have worked if the ADC was more accurate. In fact, it would have made it worse.)
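The bandwidth limit described here can be put into rough numbers. The sketch below is my own worked example (not from the thread): it computes the highest full-scale sine frequency whose value moves less than 1/2 lsb during one conversion, for an N-bit SAR with no S/H in front of it.

```python
import math

def max_sine_freq_without_sh(n_bits, t_conv_s):
    """Highest full-scale sine frequency that stays within 1/2 lsb
    over one conversion time t_conv_s (in seconds).

    A full-scale sine (FS/2)*sin(2*pi*f*t) has peak slew rate pi*f*FS.
    Requiring pi*f*FS*t_conv < FS / 2**(n_bits + 1) and solving for f:
    """
    return 1.0 / (math.pi * t_conv_s * 2 ** (n_bits + 1))

# A 12-bit SAR with a 10 us conversion time can only track a full-scale
# sine of a few hertz without a sample and hold:
print(max_sine_freq_without_sh(12, 10e-6))  # ~3.9 Hz
```

The exponential dependence on resolution is the striking part: every extra bit halves the usable input frequency, which is why even modest-speed SAR designs need the S/H.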
No, Brian, I am not confused. It is a form of delta modulation, but is used in an ADC. Two samples are taken, 2 or more times the sample rate (i.e. if the sample rate were 20us, the first sample would be taken every 20us, with the second sample following by 10us or less). The difference is converted to a digital value for transmission. On the other end, the reverse happens.
Yes, the signal can change by 1/2 lsb - but that's true of any ADC. For any sufficiently high sample rate (i.e. 3x input signal or more), this method is never less accurate than a simple voltage detecting ADC, and in almost every case is more accurate. However, it is a more complex circuit (on both ends), samples a much smaller analog value, and requires more exacting components and a higher cost (which is typically the case for any circuit improvements).
As I said - we studied them in one of my EE courses back in the 70's. I played with them for a while back then, but at the time the ICs were pretty expensive for a college student.
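The difference-coding idea described here (digitize the change between samples, accumulate it back on the far end) can be sketched in a few lines of software. This is a generic differential-PCM toy, not the exact circuit under discussion; the step size `q` and the function names are my own:

```python
def dpcm_encode(samples, q):
    """Quantize the difference between each sample and the value the
    decoder will have reconstructed so far (step size q)."""
    codes, recon = [], 0.0
    for s in samples:
        code = round((s - recon) / q)
        codes.append(code)
        recon += code * q        # mirror the decoder's accumulator
    return codes

def dpcm_decode(codes, q):
    """The reverse operation: accumulate the quantized differences."""
    out, acc = [], 0.0
    for c in codes:
        acc += c * q
        out.append(acc)
    return out
```

Because the encoder tracks the decoder's accumulator rather than the raw previous sample, the reconstruction error never exceeds half a step no matter how long the stream runs; this mirrors the point above that the small difference signal is what demands the more exacting components.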
Ok Jerry. You can, of course, find the rate of change (slope) by that method if you know (or assume) the signal is either only increasing or decreasing between the samples. (A Nyquist matter.)
However, the 1/2 lsb matter I mentioned is more for during the conversion, rather than for different samples. It is particularly important for slower ADC types, such as SAR implementations.
Can you explain your 1/2 lsb effect? What type of ADC are you referring to? Different ADC types do require a S/H on the input for signals that are not *highly* oversampled. For example, a flash converter can mess up and be quite a bit off if the signal is slewing during conversion. Same with SAR converters. But I don't know of any effect where 1/2 lsb is a threshold.
What threshold would you expect? As I recall, 1/2 lsb is the limit to
ensure that the conversion would be the 'same' over the conversion time.
I'm not sure what you mean by "the conversion would be the 'same' over
the conversion time", but I don't see how 1/2 lsb is any magic threshold.
If you are working with a flash converter, there are a number of comparators, each with a different threshold. The input signal could be right at the edge of one of these thresholds, so that a very tiny change in the input signal will cause that threshold to be crossed during the conversion.
Maybe I'm not understanding your point.
Sorry, I was referring to SAR converters. I should have been more precise.
With an SAR converter, if the signal changes during the conversion period, then the converter will fail (at best)* if the change is more than 1/2 lsb. Therefore, the signal must remain constant (within 1/2 lsb) for the period of the conversion. If the maximum rate of change of the signal is known to be such that this will be the case, all is well; if not, you need a sample and hold. You sample the signal, convert the sample, and repeat the process for the next sample.
The S/H is designed to minimise the sampling time while ensuring the required hold time is maintained, i.e. the sample stays within the required 1/2 lsb for the conversion period.
Of course, some SAR ADCs have the S/H incorporated within the device; others require either an external one, or have provision for the C to be external to permit design flexibility.
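To make the failure mode concrete, here is a toy bit-by-bit successive-approximation loop (my own sketch, with a deliberately exaggerated slew for illustration). Because the MSB decision is made first and is never revisited, an input that crosses a major code boundary mid-conversion can leave the result stuck far from where the input ends up:

```python
def sar_convert(sample_fn, n_bits, full_scale=1.0):
    """Successive approximation, MSB first. sample_fn(step) returns the
    voltage seen while deciding bit number `step`; with an ideal S/H in
    front it would return the same value for every step."""
    code = 0
    for step in range(n_bits):
        trial = code | (1 << (n_bits - 1 - step))
        # keep the trial bit if the input is at or above the trial level
        if sample_fn(step) >= trial * full_scale / (1 << n_bits):
            code = trial
    return code

held_start = sar_convert(lambda s: 0.49, 8)             # -> 125
held_end   = sar_convert(lambda s: 0.84, 8)             # -> 215
# Input ramps from 0.49 V to 0.84 V across the 8 bit trials:
moving     = sar_convert(lambda s: 0.49 + 0.05 * s, 8)  # -> 127
```

The moving input crosses mid-scale just after the MSB was decided low, so the conversion saturates at 0111 1111 (127): the held conversions bracket it at 125 and 215, so the slewing result is some 88 lsb away from the code the final voltage deserved.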
I understand what you are describing, but you still have not explained the basis of the 1/2 lsb threshold. In an SAR converter the thresholds are still fixed, so the amount of room for noise depends on the value of the signal. If the signal is 1/4 of an lsb from the next conversion threshold, then 1/4 lsb of noise will cause a wrong reading. If the signal is within 0.001 lsb of the threshold, then 0.001 lsb of change in the signal will cause an error.
*By fail: it rather depends on the converter. You will at least get a false reading. I recall using one ADC which set a bit indicating a failure to 'find' a 'match'.
I recall the details of the parameters of the S/H design being in the application notes of the various ADCs I used over the years; I expect if you look at some you can see for yourself.
By their nature (and application) flash converters don't require an S/H but lack the resolution of SAR ADCs. They have other limitations, of course. If memory serves, one is that they are not monotonic, which was a requirement in the applications where I tended to apply ADCs (control circuits; feedback loops don't like non-monotonic converters).
Actually, even flash converters work better with a S/H in front of them. A S/H circuit can have a very small aperture window, while the converter itself often has a much larger window. Remember that all of these comparators work in parallel with different delays. Even if those delays are small, these devices are designed to sample the fastest signals possible, and the variations can only be minimized, not eliminated. So a slewing signal will not convert as accurately, and can cause the sort of error where the thermometer code output from the comparators is not self-consistent, having more than one 0/1 transition in the code. Some flash devices have circuitry to prevent this from causing an output error, but it can add inaccuracy to the result.
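The inconsistent-thermometer-code ("bubble") error described above can be shown in a few lines. This is a toy software model; real flash ADCs do the correction in logic, not code. With a clean thermometer code, "find the first zero" and "count the ones" agree; a bubble makes the naive decoder badly wrong while the counting decoder degrades gracefully:

```python
def decode_first_zero(therm):
    """Naive thermometer decoder: position of the first 0 from the
    bottom. Assumes the code has a single 0/1 transition."""
    for i, bit in enumerate(therm):
        if bit == 0:
            return i
    return len(therm)

def decode_count_ones(therm):
    """Bubble-tolerant decoder: just count the set comparators."""
    return sum(therm)

clean   = [1, 1, 1, 1, 0, 0, 0, 0]  # one 0/1 transition
bubbled = [1, 1, 0, 1, 1, 0, 0, 0]  # slewing input: two transitions
```

On `clean` both decoders return 4; on `bubbled` the naive decoder returns 2 while the counting decoder still returns 4, which is the sort of correction circuitry Rick alludes to.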
--
Rick