What is the point of digital voice?

#1 - February 25th 15, 02:55 PM - posted to uk.radio.amateur,rec.radio.amateur.equipment

On 2/25/2015 2:30 AM, AndyW wrote:
On 24/02/2015 17:00, Jerry Stuckle wrote:

But you forget compression. For instance, unless there is a scene
change, the vast majority of a television picture does not change from
frame to frame. Even if the camera moves, the picture shifts but
doesn't change all that much. Why waste all of that bandwidth resending
information the receiver already has?


Which is why, on cheaper televisions, the picture tessellates when
showing random images such as rain, fire, waterfalls, etc.
The true test of a quality television is to watch a waterfall or flames
and see it pin-sharp.
Cheaper TVs use cheap, lower-powered decoding systems, and for complex
images they do not have enough time to fully decode the image before the
next frame arrives.

Andy


Not really true, at least in the United States. All TVs here use the
same (proprietary) chipsets to decompress the digital signal. However,
it makes a huge difference on the resolution being used, i.e. 720P,
1080P, 1080I, UHD... The difference is in what happens after the signal
is decompressed.
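
To put the "only resend what changed" idea quoted above in concrete terms, here is a toy sketch (plain Python on greyscale frames held as numpy arrays; my own illustration, not how any real broadcast codec is built - the real ones add motion compensation, block transforms and entropy coding on top of this):

import numpy as np

def encode_frame(prev, curr, block=16, threshold=2.0):
    # Toy inter-frame coder: compare the new frame with the one the
    # receiver already has and keep only the 16x16 blocks that changed
    # noticeably. Everything else is simply not transmitted.
    changed = []
    h, w = curr.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            p = prev[y:y + block, x:x + block].astype(int)
            c = curr[y:y + block, x:x + block].astype(int)
            if np.abs(c - p).mean() > threshold:
                changed.append((y, x, curr[y:y + block, x:x + block].copy()))
    return changed

def decode_frame(prev, changed):
    # The receiver starts from the frame it already has and patches in
    # the blocks that were sent; static areas cost no bandwidth at all.
    out = prev.copy()
    for y, x, c in changed:
        out[y:y + c.shape[0], x:x + c.shape[1]] = c
    return out

On a talking-head shot almost nothing gets sent from one frame to the next; point the camera at rain or a waterfall and nearly every block changes every frame, which is exactly where the cheap decoders struggle.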

--
==================
Remove the "x" from my email address
Jerry, AI0K

==================
#2 - February 26th 15, 09:58 AM - posted to uk.radio.amateur,rec.radio.amateur.equipment

On 25/02/2015 13:55, Jerry Stuckle wrote:

Not really true, at least in the United States. All TVs here use the
same (proprietary) chipsets to decompress the digital signal. However,
it makes a huge difference on the resolution being used, i.e. 720P,
1080P, 1080I, UHD... The difference is in what happens after the signal
is decompressed.


I am unsure of US TV. In the UK terrestrial TV is all digital.
Analog(ue) was switched off a few years ago.
I am referring to the whole box from antenna to screen, most of our TVs
come with built-in 'Freeview'.

I have a digital set about 6 years old that struggles to handle complex
images but my new toy handles it perfectly. My newer TV uses a newer
chipset and more efficient decoding algorithm that is made possible
because of the higher power chipset.

The older chipsets are still in production and still being sold,
presumably the TV manufacturers can buy them cheaply, stick them in the
TV and rely on marketing buzz over technical demonstration to sell them
for a larger markup. Most people I know buy on screen size anyway.

My understanding - which may be incorrect - is that the TV has a fixed
time based upon the framerate in which to decode the image and display
it before it has to start on the next frame.
Better quality TVs are capable of fully decompressing the image and
displaying it between frames but the cheaper and older ones cannot
handle a new image every frame and so, when it runs out of time decoding
the image, it just gets sent to the screen, tessellations and all.
Standing ready to be corrected.
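
For what it is worth, the toy picture I have in my head is something like this (Python, with invented numbers; a real decoder is pipelined hardware rather than a loop, but the fixed per-frame budget is the point):

FRAME_RATE = 25                        # UK broadcast rate; the budget is set by the frame rate
FRAME_BUDGET_MS = 1000.0 / FRAME_RATE  # 40 ms to get each picture ready

def simulate(stream, decode_ms_per_block, blocks_per_frame=8160):
    # blocks_per_frame is roughly the 16x16 macroblock count of a
    # 1080-line picture. Each entry in 'stream' is a made-up complexity
    # factor: 1.0 = mostly static scene, 3.0 = rain/fire/waterfall.
    for n, complexity in enumerate(stream):
        cost_per_block_ms = decode_ms_per_block * complexity
        blocks_done = min(blocks_per_frame, int(FRAME_BUDGET_MS / cost_per_block_ms))
        if blocks_done < blocks_per_frame:
            print(f"frame {n}: out of time, {blocks_per_frame - blocks_done} "
                  f"blocks go to the screen stale or blocky")
        else:
            print(f"frame {n}: fully decoded within the budget")

stream = [1.0, 1.0, 3.0, 3.0, 1.0]             # easy frames with a burst of 'rain'
simulate(stream, decode_ms_per_block=0.004)    # cheap/older chipset
simulate(stream, decode_ms_per_block=0.001)    # newer, faster chipset

Same stream and same deadline in both runs; only the per-block decode speed differs, and that is the difference between a pin-sharp waterfall and a blocky one.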

Andy






#3 - February 26th 15, 04:09 PM - posted to uk.radio.amateur,rec.radio.amateur.equipment

On 2/26/2015 3:58 AM, AndyW wrote:
On 25/02/2015 13:55, Jerry Stuckle wrote:

Not really true, at least in the United States. All TVs here use the
same (proprietary) chipsets to decompress the digital signal. However,
it makes a huge difference on the resolution being used, i.e. 720P,
1080P, 1080I, UHD... The difference is in what happens after the signal
is decompressed.


I am unsure of US TV. In the UK terrestrial TV is all digital.
Analog(ue) was switched off a few years ago.
I am referring to the whole box from antenna to screen, most of our TVs
come with built-in 'Freeview'.

I have a digital set about 6 years old that struggles to handle complex
images but my new toy handles it perfectly. My newer TV uses a newer
chipset and more efficient decoding algorithm that is made possible
because of the higher power chipset.

The older chipsets are still in production and still being sold,
presumably the TV manufacturers can buy them cheaply, stick them in the
TV and rely on marketing buzz over technical demonstration to sell them
for a larger markup. Most people I know buy on screen size anyway.

My understanding - which may be incorrect - is that the TV has a fixed
time based upon the framerate in which to decode the image and display
it before it has to start on the next frame.
Better quality TVs are capable of fully decompressing the image and
displaying it between frames but the cheaper and older ones cannot
handle a new image every frame and so, when it runs out of time decoding
the image, it just gets sent to the screen, tessellations and all.
Standing ready to be corrected.

Andy



Andy,

I don't know what the Europeans use, so I can't speak for you guys. But
here in the United States, everything is digital also, and has been for
years (here come the trolls).

Yes, the TV only has a certain amount of time to decode the signal. But
in the U.S., the method used is proprietary to one company. The
chipsets required to decode the signal are all produced by this company,
so all TVs have similar decoding.

There is a limit to how much information can be transferred in the
allotted bandwidth, so a complete change in picture can't be compressed
perfectly. But at the 30 fps used here, even a scene change is picked
up within a few frames and isn't noticeable to the eye unless you know
what you're looking for.
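
Putting rough numbers on that (Python; the three-to-four-frame recovery figure is just an assumption for illustration):

FPS = 30                       # nominal US rate (ATSC is really 30000/1001)
frame_time_ms = 1000.0 / FPS   # about 33.3 ms per frame

# A hard scene cut forces mostly intra-coded (full-picture) data, so the
# decoder takes a few frames to settle. A 1-4 frame recovery is only:
for frames in (1, 2, 3, 4):
    print(f"{frames} frame(s) = {frames * frame_time_ms:.1f} ms")

A tenth of a second or so right at a cut is hard to spot unless, as above, you know what you are looking for.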

However, what happens after the decoding can cause more problems. Sets
with lower resolutions such as 720p and 1080i typically use less
expensive circuitry to take the decoded signal and process it for the
display, and they may or may not have the speed required to change all
of the elements in the display before the next image comes along.

Higher-resolution displays such as 1080p and UHD (4K) have more
expensive circuitry to prepare the signal for the display. This
circuitry is better able to keep up with the decoded signal, so a
complete scene change is less noticeable. You may see the difference
when a 720p set and a 1080p set running in 720p mode sit next to each
other displaying the same material.
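
To put numbers on "keep up", here is the raw pixel rate the post-decode stages have to push to the panel (Python; the refresh rates are just the common US broadcast cases, not a spec):

# width, height, pictures per second (1080i counted as 30 full frames)
formats = {
    "720p60":   (1280,  720, 60),
    "1080i30":  (1920, 1080, 30),
    "1080p60":  (1920, 1080, 60),
    "UHD-4K60": (3840, 2160, 60),
}
for name, (w, h, rate) in formats.items():
    print(f"{name:>9}: {w * h * rate / 1e6:6.1f} Mpixel/s")

That works out to roughly 55, 62, 124 and 498 Mpixel/s respectively, so the higher-resolution sets need several times the throughput of a 720p set, which is why they tend to get the faster (and more expensive) video processing.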

Of course, this is a generalization, and each set needs to be evaluated
on its own. Some lower resolution sets do quite well, while
occasionally you'll find a higher resolution set which doesn't do so
well. But it's not very common any more.

--
==================
Remove the "x" from my email address
Jerry, AI0K

==================
#4 - February 26th 15, 07:57 PM - posted to rec.radio.amateur.equipment

In rec.radio.amateur.equipment Jerry Stuckle wrote:

snip

I don't know what the Europeans use, so I can't speak for you guys. But
here in the United States, everything is digital also, and has been for
years (here come the trolls).


trolls trolls trolls trolls



--
Jim Pennino
#5 - February 26th 15, 09:28 PM - posted to uk.radio.amateur,rec.radio.amateur.equipment

On 2/26/2015 10:09 AM, Jerry Stuckle wrote:
On 2/26/2015 3:58 AM, AndyW wrote:
On 25/02/2015 13:55, Jerry Stuckle wrote:

Not really true, at least in the United States. All TVs here use the
same (proprietary) chipsets to decompress the digital signal. However,
it makes a huge difference on the resolution being used, i.e. 720P,
1080P, 1080I, UHD... The difference is in what happens after the signal
is decompressed.


I am unsure of US TV. In the UK terrestrial TV is all digital.
Analog(ue) was switched off a few years ago.
I am referring to the whole box from antenna to screen, most of our TVs
come with built-in 'Freeview'.

I have a digital set about 6 years old that struggles to handle complex
images but my new toy handles it perfectly. My newer TV uses a newer
chipset and more efficient decoding algorithm that is made possible
because of the higher power chipset.

The older chipsets are still in production and still being sold,
presumably the TV manufacturers can buy them cheaply, stick them in the
TV and rely on marketing buzz over technical demonstration to sell them
for a larger markup. Most people I know buy on screen size anyway.

My understanding - which may be incorrect - is that the TV has a fixed
time based upon the framerate in which to decode the image and display
it before it has to start on the next frame.
Better quality TVs are capable of fully decompressing the image and
displaying it between frames but the cheaper and older ones cannot
handle a new image every frame and so, when it runs out of time decoding
the image, it just gets sent to the screen, tessellations and all.
Standing ready to be corrected.

Andy



Andy,

I don't know what the Europeans use, so I can't speak for you guys. But
here in the United States, everything is digital also, and has been for
years (here come the trolls).

Yes, the TV only has a certain amount of time to decode the signal. But
in the U.S., the method used is proprietary to one company. The
chipsets required to decode the signal are all produced by this company,
so all TVs have similar decoding.


I think you are confusing all chip makers using the same algorithm with
all TV makers buying their chips from the same chip maker.

http://www.toshiba.com/taec/componen...GProdBrief.pdf

http://www.broadcom.com/products/Cab...utions/BCM3560

http://www.fujitsu.com/cn/fsp/home-e...t/MB86H01.html

Are you suggesting that all of these chip makers are reselling one
company's products?

The decoding is very much *not* proprietary to one company. There is a
consortium of companies who own patents for the MPEG-2 decoder alone...

http://www.mpegla.com/main/programs/...ts/m2-att1.pdf

--

Rick


#6 - February 26th 15, 11:04 PM - posted to uk.radio.amateur,rec.radio.amateur.equipment

On 2/26/2015 3:28 PM, rickman wrote:
On 2/26/2015 10:09 AM, Jerry Stuckle wrote:
On 2/26/2015 3:58 AM, AndyW wrote:
On 25/02/2015 13:55, Jerry Stuckle wrote:

Not really true, at least in the United States. All TVs here use the
same (proprietary) chipsets to decompress the digital signal. However,
it makes a huge difference on the resolution being used, i.e. 720P,
1080P, 1080I, UHD... The difference is in what happens after the signal
is decompressed.

I am unsure of US TV. In the UK terrestrial TV is all digital.
Analog(ue) was switched off a few years ago.
I am referring to the whole box from antenna to screen, most of our TVs
come with built-in 'Freeview'.

I have a digital set about 6 years old that struggles to handle complex
images but my new toy handles it perfectly. My newer TV uses a newer
chipset and more efficient decoding algorithm that is made possible
because of the higher power chipset.

The older chipsets are still in production and still being sold,
presumably the TV manufacturers can buy them cheaply, stick them in the
TV and rely on marketing buzz over technical demonstration to sell them
for a larger markup. Most people I know buy on screen size anyway.

My understanding - which may be incorrect - is that the TV has a fixed
time based upon the framerate in which to decode the image and display
it before it has to start on the next frame.
Better quality TVs are capable of fully decompressing the image and
displaying it between frames but the cheaper and older ones cannot
handle a new image every frame and so, when it runs out of time decoding
the image, it just gets sent to the screen, tessellations and all.
Standing ready to be corrected.

Andy



Andy,

I don't know what the Europeans use, so I can't speak for you guys. But
here in the United States, everything is digital also, and has been for
years (here come the trolls).

Yes, the TV only has a certain amount of time to decode the signal. But
in the U.S., the method used is proprietary to one company. The
chipsets required to decode the signal are all produced by this company,
so all TVs have similar decoding.


I think you are confusing all chip makers using the same algorithm with
all TV makers buying their chips from the same chip maker.

http://www.toshiba.com/taec/componen...GProdBrief.pdf


http://www.broadcom.com/products/Cab...utions/BCM3560

http://www.fujitsu.com/cn/fsp/home-e...t/MB86H01.html

Are you suggesting that all of these chip makers are reselling one
company's products?


If you would bother to understand what you referenced, NONE of these
chipsets are hi-def (1080).

And yes, H.264 is a proprietary algorithm, with only one company
providing the chipsets.

The decoding is very much *not* proprietary to one company. There is a
consortium of companies who own patents for the MPEG-2 decoder alone...

http://www.mpegla.com/main/programs/...ts/m2-att1.pdf


Once again you show you don't understand the technology, but have to
argue anyway. MPEG-2 is NOT H.264.

--
==================
Remove the "x" from my email address
Jerry, AI0K

==================
#7 - February 26th 15, 11:47 PM - posted to rec.radio.amateur.equipment

In rec.radio.amateur.equipment Jerry Stuckle wrote:
On 2/26/2015 3:28 PM, rickman wrote:
On 2/26/2015 10:09 AM, Jerry Stuckle wrote:
On 2/26/2015 3:58 AM, AndyW wrote:
On 25/02/2015 13:55, Jerry Stuckle wrote:

Not really true, at least in the United States. All TVs here use the
same (proprietary) chipsets to decompress the digital signal. However,
it makes a huge difference on the resolution being used, i.e. 720P,
1080P, 1080I, UHD... The difference is in what happens after the signal
is decompressed.

I am unsure of US TV. In the UK terrestrial TV is all digital.
Analog(ue) was switched off a few years ago.
I am referring to the whole box from antenna to screen, most of our TVs
come with built-in 'Freeview'.

I have a digital set about 6 years old that struggles to handle complex
images but my new toy handles it perfectly. My newer TV uses a newer
chipset and more efficient decoding algorithm that is made possible
because of the higher power chipset.

The older chipsets are still in production and still being sold,
presumably the TV manufacturers can buy them cheaply, stick them in the
TV and rely on marketing buzz over technical demonstration to sell them
for a larger markup. Most people I know buy on screen size anyway.

My understanding - which may be incorrect - is that the TV has a fixed
time based upon the framerate in which to decode the image and display
it before it has to start on the next frame.
Better quality TVs are capable of fully decompressing the image and
displaying it between frames but the cheaper and older ones cannot
handle a new image every frame and so, when it runs out of time decoding
the image, it just gets sent to the screen, tessellations and all.
Standing ready to be corrected.

Andy



Andy,

I don't know what the Europeans use, so I can't speak for you guys. But
here in the United States, everything is digital also, and has been for
years (here come the trolls).

Yes, the TV only has a certain amount of time to decode the signal. But
in the U.S., the method used is proprietary to one company. The
chipsets required to decode the signal are all produced by this company,
so all TVs have similar decoding.


I think you are confusing all chip makers using the same algorithm with
all TV makers buying their chips from the same chip maker.

http://www.toshiba.com/taec/componen...GProdBrief.pdf


http://www.broadcom.com/products/Cab...utions/BCM3560

http://www.fujitsu.com/cn/fsp/home-e...t/MB86H01.html

Are you suggesting that all of these chip makers are reselling one
company's products?


If you would bother to understand what you referenced, NONE of these
chipsets are hi-def (1080).

And yes, H.264 is a proprietary algorithm, with only one company
providing the chipsets.


http://en.wikipedia.org/wiki/H.264/MPEG-4_AVC

begin quote
A hardware H.264 encoder can be an ASIC or an FPGA.

ASIC encoders with H.264 encoder functionality are available from many
different semiconductor companies, but the core design used in the ASIC
is typically licensed from one of a few companies such as Chips&Media,
Allegro DVT, On2 (formerly Hantro, acquired by Google), Imagination
Technologies, NGCodec. Some companies have both FPGA and ASIC product
offerings.[56]

Texas Instruments manufactures a line of ARM + DSP cores that perform
DSP H.264 BP encoding 1080p at 30fps.[57] This permits flexibility
with respect to codecs (which are implemented as highly optimized DSP
code) while being more efficient than software on a generic CPU.
end quote

See also:

http://en.wikipedia.org/wiki/H.264/M...mplementations

http://en.wikipedia.org/wiki/MPEG_LA


--
Jim Pennino
#8 - February 27th 15, 02:41 AM - posted to uk.radio.amateur,rec.radio.amateur.equipment

On 2/26/2015 5:04 PM, Jerry Stuckle wrote:
On 2/26/2015 3:28 PM, rickman wrote:
On 2/26/2015 10:09 AM, Jerry Stuckle wrote:

Yes, the TV only has a certain amount of time to decode the signal. But
in the U.S., the method used is proprietary to one company. The
chipsets required to decode the signal are all produced by this company,
so all TVs have similar decoding.


I think you are confusing all chip makers using the same algorithm with
all TV makers buying their chips from the same chip maker.

http://www.toshiba.com/taec/componen...GProdBrief.pdf


http://www.broadcom.com/products/Cab...utions/BCM3560

http://www.fujitsu.com/cn/fsp/home-e...t/MB86H01.html

Are you suggesting that all of these chip makers are reselling one
company's products?


If you would bother to understand what you referenced, NONE of these
chipsets are hi-def (1080).

And yes, H.264 is a proprietary algorithm, with only one company
providing the chipsets.

The decoding is very much *not* proprietary to one company. There is a
consortium of companies who own patents for the MPEG-2 decoder alone...

http://www.mpegla.com/main/programs/...ts/m2-att1.pdf


Once again you show you don't understand the technology, but have to
argue anyway. MPEG-2 is NOT H.264.


"The BCM3560 combines a cable/terrestrial 4/1024 QAM and 8/16-VSB
receiver, an out-of-band QPSK receiver, NTSC demodulator, DVI/HDMI
receiver, a transport processor, a digital audio processor, a
high-definition (HD) MPEG video decoder, 2D graphics processing, digital
processing of analog video and audio, analog video digitizer and DAC
functions, stereo high-fidelity audio DACs, a 250-MHz MIPS processor,
and a peripheral control unit providing a variety of television control
functions."

I am happy to admit I don't know everything about digital TV. But I do
know a ridiculous statement when I see it. "But in the U.S., the method
used is proprietary to one company. The chipsets required to decode the
signal are all produced by this company, so all TVs have similar
decoding." qualifies as a ridiculous statement. No one in the industry
would have allowed the FCC to entrench one company as the sole
manufacturer of decoder chips for digital TV.

BTW, you are right that MPEG-2 is not H.264. It's just not relevant.
They are both used for digital TV.

--

Rick
#9 - February 27th 15, 02:55 AM - posted to uk.radio.amateur,rec.radio.amateur.equipment

On 2/26/2015 8:41 PM, rickman wrote:
On 2/26/2015 5:04 PM, Jerry Stuckle wrote:
On 2/26/2015 3:28 PM, rickman wrote:
On 2/26/2015 10:09 AM, Jerry Stuckle wrote:

Yes, the TV only has a certain amount of time to decode the signal. But
in the U.S., the method used is proprietary to one company. The
chipsets required to decode the signal are all produced by this company,
so all TVs have similar decoding.

I think you are confusing all chip makers using the same algorithm with
all TV makers buying their chips from the same chip maker.

http://www.toshiba.com/taec/componen...GProdBrief.pdf



http://www.broadcom.com/products/Cab...utions/BCM3560


http://www.fujitsu.com/cn/fsp/home-e...t/MB86H01.html

Are you suggesting that all of these chip makers are reselling one
company's products?


If you would bother to understand what you referenced, NONE of these
chipsets are hi-def (1080).

And yes, H.264 is a proprietary algorithm, with only one company
providing the chipsets.

The decoding is very much *not* proprietary to one company. There is a
consortium of companies who own patents for the MPEG-2 decoder alone...

http://www.mpegla.com/main/programs/...ts/m2-att1.pdf


Once again you show you don't understand the technology, but have to
argue anyway. MPEG-2 is NOT H.264.


"The BCM3560 combines a cable/terrestrial 4/1024 QAM and 8/16-VSB
receiver, an out-of-band QPSK receiver, NTSC demodulator, DVI/HDMI
receiver, a transport processor, a digital audio processor, a
high-definition (HD) MPEG video decoder, 2D graphics processing, digital
processing of analog video and audio, analog video digitizer and DAC
functions, stereo high-fidelity audio DACs, a 250-MHz MIPS processor,
and a peripheral control unit providing a variety of television control
functions."

I am happy to admit I don't know everything about digital TV. But I do
know a ridiculous statement when I see it. "But in the U.S., the method
used is proprietary to one company. The chipsets required to decode the
signal are all produced by this company, so all TVs have similar
decoding." qualifies as a ridiculous statement. No one in the industry
would have allowed the FCC to entrench one company as the sole
manufacturer of decoder chips for digital TV.

BTW, you are right that MPEG-2 is not H.264. It's just not relevant.
They are both used for digital TV.


No, you don't know a "ridiculous statement when you see it". You have
proven multiple times you don't even know your arse from a hole in the
ground.

You really should stick with things you know something about. Maybe
eventually you can figure out what those things are.

--
==================
Remove the "x" from my email address
Jerry, AI0K

==================