On 3/17/2014 12:15 PM, Ian Jackson wrote:
In message , Jerry Stuckle
writes
On 3/17/2014 11:32 AM, Ian Jackson wrote:
In message , Jerry Stuckle
writes
On 3/17/2014 3:45 AM, Jeff wrote:
7dBm is an absolutely colossal signal for a TV set. Even 0dBm is an
absolutely colossal signal!
Not in the United States. It is the minimum that the cable industry
provides to the TV set.
We are talking about a 4.25MHz wide signal, not SSB or CW.
dBm is not a bandwidth-dependent measurement, whereas CNR is.
Putting +7dBm into a TV receiver is madness; it would cause severe
overload and intermods. +7dBm is 5mW, and that equates to about 610mV
in a 75 ohm system, which is an enormous signal.
Jeff
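Just to put numbers on that conversion, here is a quick sketch (Python;
the matched 75 ohm load is the only assumption, and it is purely
illustrative):

  import math

  def dbm_to_mw(dbm):
      # 0 dBm is 1 mW by definition
      return 10 ** (dbm / 10.0)

  def mw_to_vrms(mw, r_ohms=75.0):
      # P = V^2 / R, so V = sqrt(P * R); convert mW to W first
      return math.sqrt((mw / 1000.0) * r_ohms)

  p_mw = dbm_to_mw(7)        # ~5 mW
  v_rms = mw_to_vrms(p_mw)   # ~0.61 V rms into 75 ohms
  print(f"+7 dBm = {p_mw:.1f} mW = {v_rms * 1000:.0f} mV rms")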
Wrong. TVs are made to handle at least 20 dBm. And cable TV
companies must deliver at least 10 dBm to the premises.
You do realise that 20dBm (appx 68dBmV) is a massive 100mW? With a
modest 50 channel analogue cable TV system, that would be a total input
power of 5W - which would have a TV set or set-top box sagging at the
knees - if not even beginning to smoke!
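The per-channel figure adds up quickly, too. A rough tally, simply
summing 50 uncorrelated carriers at +20dBm each (illustrative only):

  # 50 analogue channels at +20 dBm (100 mW) each
  channels = 50
  per_channel_mw = 10 ** (20 / 10)            # +20 dBm = 100 mW
  total_w = channels * per_channel_mw / 1000  # watts
  print(f"Total input power: {total_w:.1f} W")  # ~5 W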
TV signals (at least in the U.S.) are not measured by CNR
Well of course they aren't. CNR is a ratio - not a level.
- they are measured in dBm.
No. The US and UK cable TV industry definitely uses dBmV.
Which is generally shortened to dBm here.
I must emphasise that you are simply WRONG. None of the professional
cable TV engineers I've ever been associated with (both in the UK and
the USA) has ever used the term 'dBm' when they mean 'dBmV'. Can you
think of a reason why? [Clue - There's 48dB difference between the two
units.]
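For anyone keeping score, the offset falls straight out of Ohm's law in
a 75 ohm system; a small sketch showing where the figure (more precisely
48.75dB) comes from:

  import math

  # Power of a 0 dBmV (1 mV rms) signal across 75 ohms, expressed in dBm
  p_w = (1e-3) ** 2 / 75.0                # P = V^2 / R
  p_dbm = 10 * math.log10(p_w * 1000.0)   # ~ -48.75 dBm
  print(f"0 dBmV = {p_dbm:.2f} dBm, i.e. the units differ by {-p_dbm:.2f} dB")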
We aren't talking about professional cable TV engineers here. We are
talking about installers and cable pullers (a much larger group, BTW).
They barely know what a volt is - much less the difference between dBmW
and dBmV.
TV technicians at least know what a volt is. But most of them don't
know the difference between dBmV and dBmW.
What you are talking about is dBmW - which, unfortunately, is also
often shortened to dBm. But most people on this side of the pond who
are in the business understand that.
I can live with that. The incorrect use of 'dBm' to mean 'dBmW' is a de
facto industry standard - and I'm not going to try and change the world
by pretending that I don't understand the incorrect 'dBm'.
It depends on the industry you are in.
0dBmV is 1mV - a reasonable signal to feed to a TV set (especially
directly from an antenna).
0dBm is appx 48dBmV (250mV) - and that's one hell of a TV signal!
With a 75 ohm source impedance (antenna and coax), and no significant
levels of outside noise-like interference, a 0dBmV (1mV) analogue NTSC
signal, direct from an antenna, will have a CNR of around 57dB. A TV set
with a decent tuner noise figure (5dB?) or a set-top box (8dB) will
produce essentially noise-free pictures.
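As a rough cross-check on that figure: the thermal noise floor (kTB) in
a roughly 4MHz NTSC noise bandwidth works out near -59dBmV, which puts a
0dBmV carrier in the upper 50s of dB before the tuner's own noise is
added. A sketch - the 4MHz bandwidth, 290K temperature and 5dB noise
figure are illustrative assumptions, not measured values:

  import math

  k = 1.380649e-23                      # Boltzmann constant, J/K
  t = 290.0                             # assumed reference temperature, K
  b = 4.0e6                             # assumed NTSC noise bandwidth, Hz

  noise_w = k * t * b                   # available thermal noise power
  noise_dbm = 10 * math.log10(noise_w * 1000.0)  # ~ -108 dBm
  noise_dbmv = noise_dbm + 48.75        # into dBmV for a 75 ohm system

  signal_dbmv = 0.0                     # the 0 dBmV (1 mV) antenna signal
  cnr_source = signal_dbmv - noise_dbmv # ~59 dB at the antenna terminals
  cnr_tuner = cnr_source - 5.0          # less an assumed 5 dB tuner NF

  print(f"Noise floor ~{noise_dbmv:.1f} dBmV")
  print(f"CNR: ~{cnr_source:.1f} dB at source, ~{cnr_tuner:.1f} dB after tuner NF")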
However, with an analogue TV signal from a large cable TV system, the
signal CNR will be much worse than 57dB (regardless of its level). If I
recall correctly, the NCTA (National Cable Television Association)
minimum spec is a CNR of 43dB (UK is 6B). At this ratio, it is judged
that picture noise is just beginning to become visible.
CNR is not important because the bandwidth does not change.
You're havin' a laff - surely?!
Nope.
OK. Are you by any chance related to John McEnroe?
http://www.youtube.com/watch?v=ekQ_Ja02gTY
Not everyone works the same way.
Your insistence on using CNR shows you know nothing about how the
industry measures signal strength.
I'm not insisting on anything. However, an analogue signal with a poor
CNR will produce noisy pictures - regardless of the signal level.
Similarly, a digital signal with too poor an SNR/MER will fail to decode -
regardless of the signal level. I think the UK cable TV spec for digital
signals is 25dB (although a good set-top box will decode down to the
mid-teens).
External noise is somewhat consistent. Front ends are pretty much
comparable in their S/N ratio. The only problems with noise generally
arise when you have something generating noise locally. But that is not
a problem with either the signal or the receiver.
That is why the real world uses signal strength to determine proper
signal levels. CNR in TV is not used nor is it required when the
other parameters are known.
So pray tell me why, in my many years in the cable TV industry, I spent
so many pointless hours measuring (among all the other parameters) CNR?
Maybe because you're talking to people who design front ends, etc. They
are only a small group in the entire industry.
--
==================
Remove the "x" from my email address
Jerry Stuckle
==================