(ray) wrote:
> Well considering that P=E*E/R then 10*10/20 equals 5 watts in my book.


Did anyone have a different book? You didn't even discuss that
in your previous article, and that was well established in my
response.
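For anyone who wants to verify the arithmetic, a couple of lines of
Python will do it (the variable names are mine, purely for
illustration):

    # Power delivered when 10 volts appears across a 20 ohm load.
    E = 10.0          # volts
    R = 20.0          # ohms
    P = E * E / R
    print(P)          # 5.0 watts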

> Audio amps generally have 10% THD as a rule of thumb, at 8 ohms.


An interesting statement, but not at all true. Still, it hasn't
got anything at all to do with this discussion.

> And
> an amp that is rated for 5 watts at 8 ohms won't deliver 5 watts into
> 20 ohms.


That is true, as I carefully explained in the article you replied
to, though you did not quote the text explaining why it is true.

What you said previously was:

"sure you keep an 8 ohm load on the audio amp. The amp would
be operating way out in the nonlinear portion without 8 ohm
load, the result would be distortion. Figure your'e going to
need 1.2 watts and up to 12 watts of audio. Depending on how
loud you want it on the Rx end."

And I noted that is not true.

Solid state audio amplifiers do not need a specific load for
linearity. The common limitation is that if the load impedance
is too low, at the maximum output voltage there will be
excessive current flow, which can damage the amplifier. On the
other hand, with a higher load impedance the *same* maximum
voltage is reached, but less current flows, and therefore less
power is delivered. There is *no* significant change in the
operating characteristics of the amplifier; it just delivers
less power.

> And that's because the Vcc doesn't go up when you put 20 ohms
> on. The Vcc stays the same.


Which is why I pointed that out in the article you are
responding to, but have not quoted. The *voltage* still has the
same range, but since the load impedance is 2.5 times higher,
the current will necessarily be 2.5 times lower, and thus the
power delivered is lower too.
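Here is a quick sketch of that relationship in Python, using the
numbers from this discussion (nothing here is specific to any
particular amplifier):

    # Fixed output voltage into two different load impedances.
    E = 10.0                # volts, set by the amplifier's supply rails
    for R in (8.0, 20.0):   # ohms
        I = E / R           # current drops as impedance rises
        P = E * E / R       # so delivered power drops too
        print(R, I, P)      # 8 ohms -> 1.25 A, 12.5 W
                            # 20 ohms -> 0.5 A, 5.0 W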

> So if you put 20 ohms on a 12.5 watt amp
> rated for 8 ohms, it will draw only 5 watts of power at full blast.


That is exactly the case. However, you've missed the point.
Nobody is concerned about *power* in this discussion, as the
requirement was for *voltage*. 10 volts, to be exact. The
question therefore is whether any given amplifier can deliver
10 volts. And what was shown is that if an amplifier is rated
to provide 12.5 watts into 8 ohms, that tells us it *must* be
producing, at peak output,

P = E*E/R

as you said, which is

12.5 = E*E/8

which can be solved for E:

E = sqrt(100)
Or precisely 10 volts. You can draw the conclusion that such an
amplifier will be able to deliver 10 volts. You can also
conclude that if an amplifier is rated to provide less than 12.5
watts to an 8 ohm load, it will *not* be able to generate the
requisite 10 volts.
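The same calculation as a Python sketch, rearranging the power
formula to solve for voltage (the 5 watt case is my own added
comparison):

    from math import sqrt

    # An amplifier rated at P watts into R ohms must swing
    # E = sqrt(P * R) volts at full output.
    print(sqrt(12.5 * 8))   # 10.0 volts -- just enough
    print(sqrt(5.0 * 8))    # ~6.3 volts -- a 5 watt amp falls short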

> Believe me I've subbed out 16 ohm speakers for 8 ohm, not only does
> the power drop but the distortion goes way up to the point that it
> would give you a headache.


Not true.

The power drops, but if the distortion goes way up it is *only*
because you have increased the volume in an attempt to get more
power than the amplifier can provide. If instead you had left
the gain control in exactly the same spot, and allowed it to
deliver exactly the same voltage (and a lower power), there
would *not* be any difference in distortion that you could
detect without better-than-average test equipment.

Regardless, the trouble is that the output of the amp we are
actually dealing with is not going to be provided with an 8 ohm
load. Or even a 20 ohm load. It will have a 70,000 ohm load.

And YIKES, it seems you missed the error I made in the previous
article, where I put the decimal in the wrong place and said
that would be 0.00014 watts.

Hmmmm...

        E * E
    P = -----
          R

         100
    P = ------
        70,000

or

         1
    P = ---
        700

Or 0.0014 watts (not the 0.00014 I stated in the previous article, sorry!)

So... 10 volts across 70,000 ohms will result in 0.0014 watts
being delivered to the load. That is just more than a
milliwatt, or 0 dBm, which is all the power actually needed to
accomplish the task.
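For anyone checking along, the same figures in Python, including
the conversion to dBm (dBm being decibels relative to one
milliwatt):

    from math import log10

    E = 10.0                 # volts
    R = 70_000.0             # ohms
    P = E * E / R            # ~0.00143 watts, about 1.4 milliwatts
    print(P)
    print(10 * log10(P / 0.001))   # ~1.5 dBm, just more than 0 dBm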

Hence you do *not* need a 12.5 watt minimum output amplifier at
all. What is needed is only something slightly more than 0 dBm,
and that also means it must be rated at that power into 70,000
ohms, or else an impedance transformation must be used to match
the amplifier to the load. That is easy, and brings up the
point that 8 ohm amplifiers are probably not the best possible
solution, since a more obvious one is readily available.

Typically "amplifiers" with 600 ohm "line" outputs have a
maximum output rating of 3 to 6 dB over the nominal 0 dBm output
intended to be considered "normal". That means those amplifiers
(virtually all stereo preamps, tuners, etc) are quite adequate
for the use at hand, which needs 1.4 milliwatts.

The only trick necessary is choosing an appropriate matching
transformer to step the impedance up from 600 to 70,000 so that
the 1.4 milliwatts will appear as 10 volts across the 70,000
ohm load. The impedance ratio is 600:70,000, so a transformer
giving approximately a 1:100 impedance ratio should be just
fine, and most likely anything from a slightly lower ratio to a
considerably higher ratio will also work very well.
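A sketch of that transformer arithmetic in Python (the 0.0014
watt figure comes from above; the turns ratio of a transformer
is the square root of its impedance ratio):

    from math import sqrt

    Z_amp  = 600.0             # ohms, amplifier line output
    Z_load = 70_000.0          # ohms, the actual load
    n = sqrt(Z_load / Z_amp)   # ~10.8, the required turns ratio

    E_in  = sqrt(0.0014 * Z_amp)   # ~0.92 volts on the 600 ohm side
    E_out = E_in * n               # ~9.9 volts across the 70,000 ohm load
    print(n, E_in, E_out)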

--
Floyd L. Davidson http://web.newsguy.com/floyd_davidson
Ukpeagvik (Barrow, Alaska)