#11
In message , Jeff writes
It really depends on how good your old analogue NTSC was. For a noiseless picture, you would need around 43dB CNR, but pictures were still more-than-watch-able at 25dB, and the picture was often still lockable at ridiculously low CNRs (when you certainly wouldn't bother watching it). Digital signals can work at SNRs down to around 15dB for 64QAM and 20dB for 256QAM (although if it's a little below this, and you will suddenly get nothing). That has not been our experience. We had a number of customers here in the DC area who had great pictures on NTSC sets, but got either heavy pixilation or no picture at all when the switchover occurred. We sent them to a company which does tv antenna installations (we do a lot of low voltage, including tv - but not antennas). In every case, installing a better outdoor antenna solved the problem. No one said the NTSC had to be noiseless. But the 43dB is a bit high, even for older sets. Input from the cable tv company to our equipment was 10-20dB; we tried to push 10dB to all of the outputs but never had a problem even down to 7dB (the lowest we would let it drop to). That makes no sense; a 7dB CNR would be pretty much unwatchable on analogue, it would be a very very noisy picture, if it even locked at all! I'm also not sure what the figures mean. From distant memory, the NCTA minimum RF input level (for NTSC) was 0dBmV (into a TV set - it might have been a bit more for set-top boxes), and the CNR 43dB. The UK cable TV level (for PAL set-tops) was 3dBmV to 15dBmV, with no more than 3dB between the levels of adjacent channels, and when digital signals came along, these were run around 15dB below the analogues. [Note that for both the US and the UK, one of the reasons for these obviously high signal levels is because cable set-top boxes have relatively appalling noise figures compared with your modern TV set.] 
UK off-air transmissions were somewhat similar, with digitals being run at 10, 16 and even occasionally 20dB below the analogues. However, when all the analogues were turned off, the digitals were turned up to typically 7dB below what the analogues had been. This would suggest that digital receivers (including HD) are evidently happy with at least 7dB less signal than analogue - and in practice, all other things being equal, digital receivers work down to much lower signal levels than would be considered satisfactory for analogue. The only obvious proviso is that while (so far) UK SD transmissions are 64QAM, HD transmissions are 256QAM, and therefore need maybe 6dB more signal (which will only be apparent where reception is marginal).

--
ian

--- news://freenews.netfront.net/ - complaints: ---
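Incidentally, that "maybe 6dB" gap between 64QAM and 256QAM falls straight out of the constellation geometry: for square M-QAM at equal average power, the minimum distance between symbols scales as sqrt(3/(M-1)), so the extra SNR needed to hold the same symbol spacing is 10*log10((M_high-1)/(M_low-1)). A quick Python check (my own back-of-envelope, not a figure from the thread):

```python
import math

def extra_snr_db(m_low, m_high):
    """Approximate extra SNR (dB) needed to keep the same symbol-error
    performance when moving from m_low-QAM to m_high-QAM, assuming square
    constellations at equal average power (d_min^2 scales as 3/(M-1))."""
    return 10 * math.log10((m_high - 1) / (m_low - 1))

print(round(extra_snr_db(64, 256), 1))   # ~6.1 dB for 64QAM -> 256QAM
```

That 6.1dB is before coding gain, so real receivers with different FEC rates will differ somewhat, but it matches the rule of thumb above.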