
March 17th 14, 01:15 PM
posted to uk.radio.amateur,rec.radio.amateur.equipment,rec.radio.amateur.misc
Quad shield coax & dielectric?
On 3/17/2014 8:27 AM, Frank Turner-Smith G3VKI wrote:
"Ian Jackson" wrote in message ...
In message , Jerry Stuckle writes
On 3/16/2014 7:17 PM, Ian Jackson wrote:
In message , Jerry Stuckle writes
On 3/16/2014 1:26 PM, Jeff wrote:
It really depends on how good your old analogue NTSC was. For a noiseless picture, you would need around 43dB CNR, but pictures were still more-than-watchable at 25dB, and the picture was often still lockable at ridiculously low CNRs (when you certainly wouldn't bother watching it). Digital signals can work at SNRs down to around 15dB for 64QAM and 20dB for 256QAM (although if it's a little below this, you will suddenly get nothing).
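To make the comparison concrete, here is a minimal Python sketch of the rule-of-thumb figures quoted above (the thresholds are the ones in the paragraph; real receivers vary, and the function names are illustrative only):

# Rule-of-thumb CNR/SNR thresholds in dB, as quoted above.
ANALOGUE_NOISELESS = 43.0            # NTSC looks essentially noise-free
ANALOGUE_WATCHABLE = 25.0            # noisy but more-than-watchable
DIGITAL_THRESHOLD = {"64QAM": 15.0, "256QAM": 20.0}

def analogue_quality(cnr_db):
    """Analogue degrades gracefully: the picture gets noisier as CNR falls."""
    if cnr_db >= ANALOGUE_NOISELESS:
        return "noiseless"
    if cnr_db >= ANALOGUE_WATCHABLE:
        return "watchable but noisy"
    return "very noisy, may still lock"

def digital_quality(snr_db, modulation="64QAM"):
    """Digital falls off a cliff: a picture, or suddenly nothing."""
    return "perfect" if snr_db >= DIGITAL_THRESHOLD[modulation] else "nothing"

print(analogue_quality(25.0))           # watchable but noisy
print(digital_quality(18.0, "64QAM"))   # perfect
print(digital_quality(18.0, "256QAM"))  # nothing

The point of the sketch is the shape of the two failure modes: analogue fails gradually, digital all at once.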
That has not been our experience. We had a number of customers here in the DC area who had great pictures on NTSC sets, but got either heavy pixelation or no picture at all when the switchover occurred. We sent them to a company which does TV antenna installations (we do a lot of low voltage, including TV - but not antennas). In every case, installing a better outdoor antenna solved the problem.

No one said the NTSC had to be noiseless. But the 43dB is a bit high, even for older sets. Input from the cable TV company to our equipment was 10-20dB; we tried to push 10dB to all of the outputs but never had a problem even down to 7dB (the lowest we would let it drop to).
That makes no sense; a 7dB CNR would be pretty much unwatchable on analogue. It would be a very, very noisy picture, if it even locked at all!

Jeff
I'm not talking CNR - I'm talking signal strength. 7dBm is plenty of signal. Most later TVs would work even at 0dBm. HDTV, not so much.
7dBm is an absolutely colossal signal for a TV set. Even 0dBm is an absolutely colossal signal!
Not in the United States. It was the minimum that the cable industry provided to the TV set. We are talking a signal 4.25MHz wide, not SSB or CW.
The TV signal levels quoted for analogue cable TV don't really involve bandwidth. The level is always the 'RMS during sync' (or 'RMS during peak'), which is the RMS level of the vision RF envelope during the horizontal (or vertical) sync period. This has the advantage of remaining constant regardless of the video content (ie it's the same for a completely black picture or a completely white picture). The only requirement is that the measuring instrument has sufficient bandwidth to embrace enough of the low frequency sideband content of the video signal to give a reading which IS independent of the video content. On a spectrum analyser, 300kHz resolution will display the demodulated RF waveform (thus enabling you to read the RF level), but IIRC many field strength meters have an IF bandwidth of typically 30kHz. However, regardless of the actual measuring bandwidth, noise levels are normalised to a bandwidth of 4.2MHz (NTSC) and 5.2MHz (PAL), and signal-to-noise measurements are adjusted accordingly.
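As an aside, the normalisation described here is simple arithmetic: for flat noise, a reading taken in a narrow measuring bandwidth is corrected to the reference bandwidth by 10*log10(Bref/Bmeas). A Python sketch (the helper name is mine):

import math

def noise_correction_db(b_meas_hz, b_ref_hz):
    """dB to add to a noise reading made in b_meas_hz to normalise it
    to b_ref_hz, assuming flat (thermal-like) noise."""
    return 10.0 * math.log10(b_ref_hz / b_meas_hz)

# A 30kHz field-strength-meter noise reading, normalised to NTSC's 4.2MHz:
print(round(noise_correction_db(30e3, 4.2e6), 1))   # 21.5 dB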
Note that the cable TV industry generally uses units of dBmV (dB with respect to 1mV - traditionally considered a 'good' level to feed to a TV set). This is because most of the levels the cable TV guys work with are generally in excess of 0dBmV (typically 0 to 60dBmV).

The off-air TV guys often use dBuV (dB wrt 1 microvolt), as they are usually dealing with weaker signals. As a result, cable TV guys are always having to mentally deduct 60dB.

RF communications guys (and domestic satellite) tend to use dBm (which is a slovenly version of 'dBmW' - dB wrt 1mW) - despite the fact that a lot of their levels are large negative numbers. Also note that dBm tends to imply a Zo of 50 ohms, and dBmV/dBuV 75 ohms - but it ain't always necessarily so.

Anyone working in the RF industry would be well advised to ensure that they always use the correct units - for example, don't say 'dB' or 'dBm' when you really mean dBmV. Failure to do so can often result in people needlessly arguing and talking at cross-purposes.
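Since the whole argument above turns on mixed units, a small Python sketch of the conversions just described may help (function names are mine; the 60dB dBmV/dBuV offset and the 75-ohm cable-TV convention are as stated):

import math

def dbmv_to_dbuv(dbmv):
    # 1mV = 1000uV, so dBmV and dBuV differ by exactly 60dB.
    return dbmv + 60.0

def dbm_to_dbmv(dbm, z_ohms=75.0):
    # 0dBm = 1mW; V(rms) = sqrt(P * Z), then expressed in dB above 1mV.
    v_rms_at_0dbm = math.sqrt(1e-3 * z_ohms)
    offset_db = 20.0 * math.log10(v_rms_at_0dbm / 1e-3)
    return dbm + offset_db

print(dbmv_to_dbuv(0.0))                  # 60.0 dBuV
print(round(dbm_to_dbmv(0.0), 2))         # 48.75 dBmV in 75 ohms
print(round(dbm_to_dbmv(0.0, 50.0), 2))   # 46.99 dBmV in 50 ohms

This is also why 7dBm really is colossal for a TV set: in 75 ohms it works out to about +55.8dBmV, far above the 0dBmV Ian mentions as the traditional 'good' level at the set.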
Back in the 1970s I was involved with the assessment of the coverage of analogue UHF TV. At that time the service limit was defined as 70dB relative to 1uV/m field strength (3.16mV/m).

At 600MHz a half wave dipole is near enough 25cm, and so would capture about a quarter of this voltage.

A typical outdoor TV antenna of the time had a gain of at least 6dB, so the available signal level before feeder loss would be on the order of 1 to 2mV.
HTH
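Frank's arithmetic above, as a Python sketch. It uses his rough approximation that the 25cm dipole captures about a quarter of the field voltage per metre; the textbook effective length of a half-wave dipole (lambda/pi) is a little shorter, which is consistent with his quoted 1 to 2mV range:

import math

def field_v_per_m(db_rel_1uv_per_m):
    """Convert a field strength in dB relative to 1uV/m into volts/metre."""
    return 1e-6 * 10 ** (db_rel_1uv_per_m / 20.0)

def terminal_mv(field_db, freq_mhz, antenna_gain_db):
    e = field_v_per_m(field_db)         # 70dB rel 1uV/m -> 3.16mV/m
    half_wave_m = 150.0 / freq_mhz      # ~0.25m at 600MHz
    v_dipole = e * half_wave_m          # 'about a quarter of this voltage'
    v_antenna = v_dipole * 10 ** (antenna_gain_db / 20.0)  # 6dB gain doubles it
    return v_antenna * 1e3              # millivolts, before feeder loss

print(round(terminal_mv(70.0, 600.0, 6.0), 2))   # ~1.58, i.e. 1 to 2mV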
Obviously it was not the U.S. industry you were advising...
--
==================
Remove the "x" from my email address
Jerry, AI0K
==================