In article ,
Telamon wrote:
> Every codec compresses audio/video/pictures to some extent so the data
> can be transmitted in less bandwidth. The lossless ones don't compress
> enough, so people employ lossy ones that distort the data in
> supposedly acceptable ways. Whether the distortion is "acceptable"
> depends on the listener/viewer and the material being compressed. The
> situation is actually a very complex mix of the depth or detail of the
> program material and the person hearing or viewing it. Some
> combinations work well, others poorly. Personally, I have a low
> tolerance for audio and video artifacts.
> I think what people should understand is that employing a codec is
> similar to adding noise to the program material in a psycho-acoustic
> way (for radio). It just kills me when some goober comes along and
> touts the use of a compression codec as an "improvement" in the
> transmit/receive system.
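
To make that "codec as added noise" point concrete, here's a rough
Python sketch. A crude uniform quantizer stands in for a real lossy
codec (purely an assumption for illustration; real codecs like MP3 or
AAC shape their noise psycho-acoustically rather than uniformly):

import numpy as np

rate = 48000
t = np.arange(rate) / rate
original = 0.5 * np.sin(2 * np.pi * 440.0 * t)  # 1 s, 440 Hz test tone

def lossy_roundtrip(x, bits=6):
    # Stand-in for encode+decode: quantize to 2**bits levels.
    levels = 2 ** bits
    return np.round(x * levels) / levels

decoded = lossy_roundtrip(original)
noise = decoded - original  # the "noise" the codec added

print("peak added error:", np.max(np.abs(noise)))

Whatever the codec, decode the output and subtract the input and you
get exactly such an error signal; the codec's whole game is hiding it.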
Well, I guess what I'm saying is that somebody needs to come up with
some "real world but worst possible" test cases that can be used to put
a quantitative number on the distortion, instead of the media manglers
being able to say "it sounds OK to 50% of our audience, so what's
your problem?".
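
One crude version of such a number is signal-to-noise ratio in dB,
scored on the *worst* test case rather than the average, so a bad case
can't hide behind a 50% pass rate. A sketch along those lines (the test
signals and the quantizer stand-in above are illustrative assumptions,
not any standardized battery):

import numpy as np

rate = 48000
t = np.arange(rate) / rate
test_cases = {
    "tone_440": 0.5 * np.sin(2 * np.pi * 440.0 * t),
    "white_noise": np.random.default_rng(0).uniform(-0.5, 0.5, rate),
    "click_train": np.where(np.arange(rate) % 4800 == 0, 0.9, 0.0),
}

def lossy_roundtrip(x, bits=6):
    # Same crude quantizer as above, standing in for a real codec.
    levels = 2 ** bits
    return np.round(x * levels) / levels

def snr_db(orig, dec):
    # Ratio of signal power to added-error power, in dB.
    err = dec - orig
    return 10 * np.log10(np.sum(orig ** 2) / np.sum(err ** 2))

scores = {name: snr_db(x, lossy_roundtrip(x))
          for name, x in test_cases.items()}
for name, db in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{name}: {db:.1f} dB")
worst = min(scores.values())
print(f"worst case: {worst:.1f} dB  <- the number to report")

Raw SNR is only a stand-in here; a real perceptual metric such as PEAQ
(ITU-R BS.1387) weights the error by what the ear actually notices. But
the "report the worst case, not the average" principle is the same.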
One thing that bothers me is that there is probably certain program
material that gets censored by default because it looks or sounds bad
when passed through the compression, so the broadcasters or DVD
distributors just won't bother to run or sell it.
Mark Zenier
Googleproofaddress(account:mzenier provider:eskimo domain:com)