On 2/26/2015 3:58 AM, AndyW wrote:
On 25/02/2015 13:55, Jerry Stuckle wrote:
Not really true, at least in the United States. All TVs here use the
same (proprietary) chipsets to decompress the digital signal.
However, the resolution being used, i.e. 720p, 1080p, 1080i, UHD,
makes a huge difference. The difference is in what happens after the
signal is decompressed.
I am unsure of US TV. In the UK terrestrial TV is all digital.
Analog(ue) was switched off a few years ago.
I am referring to the whole box from antenna to screen, most of our TVs
come with built-in 'Freeview'.
I have a digital set about 6 years old that struggles to handle
complex images, but my new toy handles them perfectly. My newer TV
uses a newer chipset and a more efficient decoding algorithm, made
possible by the chipset's greater processing power.
The older chipsets are still in production and still being sold;
presumably the TV manufacturers can buy them cheaply, stick them in
the TV and rely on marketing buzz over technical demonstration to
sell them at a larger markup. Most people I know buy on screen size
anyway.
My understanding - which may be incorrect - is that the TV has a
fixed amount of time, based on the frame rate, in which to decode the
image and display it before it has to start on the next frame.
Better-quality TVs are capable of fully decompressing the image and
displaying it between frames, but the cheaper and older ones cannot
handle a new image every frame, so when the set runs out of time
decoding the image, it just gets sent to the screen, tessellations
and all.
Standing ready to be corrected.
Andy
Andy,
I don't know what the Europeans use, so I can't speak for you guys. But
here in the United States, everything is digital also, and has been for
years (here come the trolls).
Yes, the TV only has a certain amount of time to decode the signal.
But in the U.S., the method used is proprietary to one company. The
chipsets required to decode the signal are all produced by that
company, so all TVs have similar decoding.
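
To make that "fixed time per frame" idea concrete, here's a little
Python toy model I knocked up (purely illustrative - the block counts
and per-block timings are assumptions, not measurements of any real
chipset). A decoder that can't finish all of a frame's macroblocks
before the next frame is due has to put a partially decoded picture
on the screen:

    FRAME_RATE = 30.0                      # roughly 30 fps broadcasts
    FRAME_BUDGET_MS = 1000.0 / FRAME_RATE  # ~33.3 ms to decode each frame

    def blocks_decoded(blocks_in_frame, ms_per_block):
        """Toy model: how many 16x16 macroblocks a decoder finishes
        before the frame deadline. Anything short of the full count
        hits the screen half-decoded."""
        affordable = int(FRAME_BUDGET_MS // ms_per_block)
        return min(blocks_in_frame, affordable)

    # A 1080-line frame is 8,160 macroblocks (120 x 68 blocks of 16x16).
    FRAME_BLOCKS = 8160
    for ms_per_block in (0.003, 0.006):   # assumed "newer" vs "older" chip
        done = blocks_decoded(FRAME_BLOCKS, ms_per_block)
        verdict = "clean frame" if done == FRAME_BLOCKS else "partial, blocky frame"
        print(f"{ms_per_block} ms/block: {done}/{FRAME_BLOCKS} blocks -> {verdict}")

With the slower (made-up) per-block time the decoder only gets
through about 5,500 of the 8,160 blocks before the deadline, which is
exactly the "tessellations and all" picture Andy described.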
There is a limit to how much information can be transferred in the
allotted bandwidth, so a complete change of picture can't be encoded
perfectly within a single frame. But at the 30 fps used here, even a
scene change is picked up within a few frames and isn't noticeable to
the eye unless you know what you're looking for.
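
Some rough back-of-the-envelope numbers on that (just arithmetic,
nothing broadcast-specific):

    FPS = 30.0
    FRAME_MS = 1000.0 / FPS          # ~33.3 ms per frame
    for frames in (1, 2, 3, 5):      # "a few frames" of catch-up after a cut
        print(f"{frames} frame(s) = {frames * FRAME_MS:.0f} ms")

Even five frames of catch-up is under a fifth of a second, roughly
the length of an eye blink, so unless you're watching for it you
won't see it.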
However, what happens after the decoding can cause more problems.
Lower-resolution sets, such as 720p and 1080i models, typically use
less expensive circuitry to take the decoded signal and process it
for the display. They may or may not have the speed required to
change all of the elements in the display before the next image comes
along.
Higher-resolution displays such as 1080p and UHD (4K) have more
expensive circuitry to prepare the signal for the display. That
circuitry is better able to keep up with the decoded signal, so a
complete scene change is less noticeable. You may see the difference
when a 720p set and a 1080p set running in 720p mode are sitting next
to each other, displaying the same material.
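
For a sense of why that post-decode circuitry matters, here's some
rough arithmetic on the pixel throughput the display-processing stage
has to sustain (illustrative only - the frame rates are just picked
to make the comparison, not tied to what any particular station
sends):

    # Pixels per second the post-decode stage has to push, per format.
    FORMATS = {
        "720p/60":  (1280,  720, 60),
        "1080i/30": (1920, 1080, 30),  # interlaced: ~30 full frames/sec
        "1080p/60": (1920, 1080, 60),
        "UHD/60":   (3840, 2160, 60),
    }
    for name, (w, h, fps) in FORMATS.items():
        print(f"{name:9s}: {w * h * fps / 1e6:7.1f} Mpixels/s")

The UHD set is moving roughly nine times the pixels of a 720p set in
the same frame time, which is where the more expensive circuitry
earns its keep.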
Of course, this is a generalization, and each set needs to be
evaluated on its own. Some lower-resolution sets do quite well, while
occasionally you'll find a higher-resolution set that doesn't do so
well. But it's not very common anymore.
--
==================
Remove the "x" from my email address
Jerry, AI0K
==================