Fox News 2012: HD Radio one of "The Biggest CES Flops of All Time" LMFAO!!!!!!!!!
On 11/01/2012 16:03, J G Miller wrote:
On Wednesday, January 11th, 2012, at 07:09:26h -0800, SmS 88 declared:
The iBiquity codec is based upon the AAC+ (HE-AAC) codec. "Scientific
testing by the European Broadcasting Union has indicated that HE-AAC at
48 kbit/s was ranked as 'Excellent' quality using the MUSHRA scale.[8]"
Since the iBiquity codec is *based upon* but is not *the* AAC+ (HE-AAC)
codec, it is not valid to use tests of the original AAC+ (HE-AAC) codec
as evidence that the iBiquity codec itself delivers that quality.
This is a good point.
And I would also add that this claim about HE-AAC comes from Coding
Technologies, who developed the codec (or at least the SBR part) and so
have a vested interest in making it sound as good as possible.
In reality AAC+ can sound good at 48k, but it is not CD quality, as
Coding Technologies would like you to believe. I also suspect that many
broadcasters don't use it under ideal conditions. When many internet
broadcasters previously used 64k AAC+, it sounded acceptable rather than
excellent, as there were some audible SBR artifacts.
Add to this the fact that most HD Radio broadcasters don't actually use
bit rates higher than 40k. At 40k even AAC+ sounds poor, and presumably
the HD Radio codec sounds even worse.
If HD Radio used the maximum bit rate of 96k, it would probably sound
acceptable under everyday listening conditions. But even at 96k it would
be lower quality than a good FM signal, and it would rule out any hi-fi
listening via terrestrial radio. That is something I just don't feel
right about.
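To put some rough numbers behind the bit rates discussed above, here is a
minimal arithmetic sketch (my own illustration, not from the thread)
comparing each rate with the bit rate of uncompressed CD audio (44.1 kHz,
16-bit, stereo), which shows how aggressive the compression at 40-96k
really is:

```python
# Uncompressed CD audio: 44.1 kHz sample rate * 16 bits * 2 channels
CD_BITRATE_KBPS = 44.1 * 16 * 2  # = 1411.2 kbit/s

# Bit rates mentioned in the discussion above (kbit/s)
for kbps in (40, 48, 64, 96):
    ratio = CD_BITRATE_KBPS / kbps
    print(f"{kbps:3d} kbit/s -> roughly {ratio:.0f}:1 compression vs CD")
```

Even the 96k maximum is about a 15:1 reduction, so "CD quality" claims at
48k rest entirely on how gracefully the codec discards information.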
Richard E.