February 26th 15, 10:47 PM, posted to rec.radio.amateur.equipment
jimp@specsol.spam.sux.com
What is the point of digital voice?

In rec.radio.amateur.equipment Jerry Stuckle wrote:
On 2/26/2015 3:28 PM, rickman wrote:
On 2/26/2015 10:09 AM, Jerry Stuckle wrote:
On 2/26/2015 3:58 AM, AndyW wrote:
On 25/02/2015 13:55, Jerry Stuckle wrote:

Not really true, at least in the United States. All TVs here use the
same (proprietary) chipsets to decompress the digital signal. However,
the resolution in use, e.g. 720p, 1080p, 1080i or UHD, makes a huge
difference. The difference is in what happens after the signal is
decompressed.

I am unsure of US TV. In the UK, terrestrial TV is all digital;
analog(ue) was switched off a few years ago.
I am referring to the whole box from antenna to screen; most of our TVs
come with built-in 'Freeview'.

I have a digital set about 6 years old that struggles to handle complex
images, but my new toy handles them perfectly. My newer TV uses a newer
chipset and a more efficient decoding algorithm, made possible by the
chipset's greater processing power.

The older chipsets are still in production and still being sold;
presumably the TV manufacturers can buy them cheaply, stick them in the
TV and rely on marketing buzz over technical demonstration to sell them
at a larger markup. Most people I know buy on screen size anyway.

My understanding - which may be incorrect - is that the TV has a fixed
amount of time, based on the frame rate, in which to decode the image
and display it before it has to start on the next frame.
Better-quality TVs are capable of fully decompressing the image and
displaying it between frames, but the cheaper and older ones cannot
handle a new image every frame, so when the decoder runs out of time the
image just gets sent to the screen, tessellations and all.
Standing ready to be corrected.

Andy
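
For what it's worth, the constraint Andy describes is basically a hard
per-frame deadline. A rough Python sketch, assuming a 60 Hz display and
a hypothetical decode_frame() call - not how any particular chipset
actually works:

import time

FRAME_RATE_HZ = 60                     # assumed display rate, illustration only
FRAME_BUDGET_S = 1.0 / FRAME_RATE_HZ   # ~16.7 ms to decode each frame

def display_loop(stream, decode_frame, show):
    # Decode and display frames under a fixed per-frame time budget.
    # If decoding overruns the budget, whatever has been decoded so far
    # is sent to the screen anyway, blockiness ("tessellations") and all.
    for compressed in stream:
        deadline = time.monotonic() + FRAME_BUDGET_S
        frame = decode_frame(compressed, deadline)  # may stop early at the deadline
        show(frame, complete=(time.monotonic() <= deadline))

A faster chipset simply finishes decode_frame() well inside the budget
every time, which is the difference Andy is seeing between his old and
new sets.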



Andy,

I don't know what the Europeans use, so I can't speak for you guys. But
here in the United States, everything is digital also, and has been for
years (here come the trolls).

Yes, the TV only has a certain amount of time to decode the signal. But
in the U.S., the method used is proprietary to one company. The
chipsets required to decode the signal are all produced by this company,
so all TVs have similar decoding.


I think you are confusing all chip makers using the same algorithm with
all TV makers buying their chips from the same chip maker.

http://www.toshiba.com/taec/componen...GProdBrief.pdf


http://www.broadcom.com/products/Cab...utions/BCM3560

http://www.fujitsu.com/cn/fsp/home-e...t/MB86H01.html

Are you suggesting that all of these chip makers are reselling one
company's products?


If you had bothered to understand what you referenced, you would see
that NONE of these chipsets is hi-def (1080).

And yes, H.264 is a proprietary algorithm, with only one company
providing the chipsets.


http://en.wikipedia.org/wiki/H.264/MPEG-4_AVC

begin quote
A hardware H.264 encoder can be an ASIC or an FPGA.

ASIC encoders with H.264 encoder functionality are available from many
different semiconductor companies, but the core design used in the ASIC
is typically licensed from one of a few companies such as Chips&Media,
Allegro DVT, On2 (formerly Hantro, acquired by Google), Imagination
Technologies, NGCodec. Some companies have both FPGA and ASIC product
offerings.[56]

Texas Instruments manufactures a line of ARM + DSP cores that perform
DSP H.264 BP encoding 1080p at 30fps.[57] This permits flexibility
with respect to codecs (which are implemented as highly optimized DSP
code) while being more efficient than software on a generic CPU.
end quote
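
To put numbers on that last figure (plain arithmetic, nothing
vendor-specific): at 1080p the decoder or encoder has to keep up with
roughly 2 million pixels per frame, and the frame rate sets the time
budget.

# Per-frame time budget and pixel throughput implied by the frame rate.
WIDTH, HEIGHT = 1920, 1080          # 1080p
PIXELS_PER_FRAME = WIDTH * HEIGHT   # 2,073,600 pixels

for fps in (30, 60):
    budget_ms = 1000.0 / fps                       # 33.3 ms at 30 fps, 16.7 ms at 60 fps
    mpixels_per_s = PIXELS_PER_FRAME * fps / 1e6   # ~62 at 30 fps, ~124 at 60 fps
    print(f"{fps} fps: {budget_ms:.1f} ms per frame, {mpixels_per_s:.0f} Mpixels/s")

That is the per-frame window the TI parts above are quoted as meeting
for encoding at 30 fps, and the same kind of window a slow decoder
chipset blows when it sends a half-decoded frame to the screen.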

See also:

http://en.wikipedia.org/wiki/H.264/M...mplementations

http://en.wikipedia.org/wiki/MPEG_LA


--
Jim Pennino