Quad shield coax & dielectric?
75-ohm RG-6 coax: quad shield differs from "standard" RG-6 in that the
dielectric is reduced in diameter to accommodate the extra shielding. How does this affect the performance? I'm looking at 1 GHz (HDTV use). Thanks.
Quad shield coax & dielectric?
On 3/14/2014 6:51 PM, Bob E. wrote:
> 75-ohm RG-6 coax: quad shield differs from "standard" RG-6 in that the dielectric is reduced in diameter to accommodate the extra shielding. How does this affect the performance? I'm looking at 1 GHz (HDTV use). Thanks.

How are you going to use it for HDTV? HDTV is a TV signal protocol, not a communications method.
-- 
Jerry, AI0K
Quad shield coax & dielectric?
> How are you going to use it for HDTV? HDTV is a TV signal protocol, not
> a communications method.

Not to be rude, but it's a simple question. The answer doesn't depend on the use to which it will be put. Just trying to keep replies on-topic...
Quad shield coax & dielectric?
75-ohm cable, with a loss of roughly 6 dB per 30 m at 1 GHz.
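The rule of thumb above (~6 dB per 30 m at 1 GHz) is easy to turn into a quick run-length estimate. A minimal sketch in Python, assuming loss scales linearly with length (true for matched coax); the run lengths are illustrative:

```python
# Estimate total cable attenuation from a per-length loss figure.

LOSS_DB_PER_30M = 6.0  # figure quoted above for RG-6 at 1 GHz

def cable_loss_db(length_m: float) -> float:
    """Total attenuation in dB for a run of `length_m` metres."""
    return LOSS_DB_PER_30M * (length_m / 30.0)

if __name__ == "__main__":
    for metres in (15, 30, 100):
        print(f"{metres:>4} m -> {cable_loss_db(metres):.1f} dB")
```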
Quad shield coax & dielectric?
Bob E wrote:
>> How are you going to use it for HDTV? HDTV is a TV signal protocol, not a communications method.
> Not to be rude, but it's a simple question asked. The answer doesn't involve the use to which it will be put.

But surely that does matter - especially the signal frequency at which you will use it, and the tolerable losses in the run of cable. "HDTV" by itself does not tell enough. It could be terrestrial broadcast, cable TV, or satellite TV. Each of them has different characteristics w.r.t. the frequencies in use and the losses that are tolerable.
Quad shield coax & dielectric?
In message, Bob E. writes:
> 75-ohm RG-6 coax: quad shield differs from "standard" RG-6 in that the dielectric is reduced in diameter to accommodate the extra shielding. How does this affect the performance? I'm looking at 1 GHz (HDTV use). Thanks.

I note that there have been some replies, but none seem to make much attempt at answering your question.

RG6Q is used extensively in the UK cable TV industry as 'drop' cable - ie from the taps in the street cabinet to the home. It is used to provide a high degree of immunity from ingress of interfering signals - especially those at the lower frequencies (in the reverse-path part of the spectrum - typically between 5 and 65 MHz). RG6 is not a particularly low-loss cable, and for long drop runs, RG11 is sometimes used.

As for the attenuation differences between RG6 and RG6Q, I've done a bit of Googling, and I can't see anything which is immediately pointed out. Even on this site http://www.ehow.com/list_7605813_difference-between-rg6-rg6q.html all it says is that "RG-6 and RG-6Q share nearly the exact same outer dimensions and have similar flexibility. RG-6Q is slightly stiffer due to the increased amount of inner shielding".

I suspect that even if the diameter of the RG6Q dielectric is slightly less (something which I've never really noticed) - requiring a slightly thinner inner conductor in order to preserve the Zo - the increase in attenuation won't be very much. However, I'm sure that a bit more intensive Googling on RG6 physical and electrical specs will reveal the true answer!
-- 
Ian
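The point about preserving Zo can be checked with the standard coaxial-impedance formula, Zo = (138/sqrt(eps_r)) * log10(D/d). A minimal sketch; the dimensions and dielectric constant below are typical published RG-6 values (foam PE, eps_r ~ 1.5, dielectric OD ~ 4.6 mm, centre conductor ~ 1.0 mm), not measured from any particular cable:

```python
import math

def coax_z0(D_mm: float, d_mm: float, eps_r: float) -> float:
    """Characteristic impedance of coax: Zo = (138 / sqrt(eps_r)) * log10(D/d)."""
    return 138.0 / math.sqrt(eps_r) * math.log10(D_mm / d_mm)

def inner_for_z0(D_mm: float, eps_r: float, z0: float = 75.0) -> float:
    """Inner-conductor diameter needed to hit a target Zo for a given dielectric OD."""
    return D_mm / 10 ** (z0 * math.sqrt(eps_r) / 138.0)

if __name__ == "__main__":
    # Typical foam-PE RG-6: dielectric OD ~4.6 mm, centre conductor ~1.0 mm
    print(f"Zo ~ {coax_z0(4.6, 1.0, 1.5):.1f} ohm")
    # If the dielectric is shaved by 0.3 mm, the inner must shrink to keep 75 ohm:
    print(f"inner for 75 ohm at D=4.3 mm: {inner_for_z0(4.3, 1.5):.2f} mm")
```

A thinner inner conductor raises the conductor loss, which is why a reduced dielectric can mean slightly higher attenuation at the same impedance.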
Quad shield coax & dielectric?
On 3/15/2014 1:29 AM, Bob E. wrote:
>> How are you going to use it for HDTV? HDTV is a TV signal protocol, not a communications method.
> Not to be rude, but it's a simple question asked. The answer doesn't involve the use to which it will be put. Just trying to keep replies on-topic...

Yes, it does. The question was completely on topic. How you use it will determine whether RG6-quad is usable for your needs or not.
-- 
Jerry Stuckle
Quad shield coax & dielectric?
On 3/15/2014 9:18 AM, Ian Jackson wrote:
> I note that there have been some replies, but none seem to make much attempt at answering your question.
> [...]
> However, I'm sure that a bit more intensive Googling on RG6 physical and electrical specs will reveal the true answer!

No one has answered his question because the information is insufficient. FYI - my company (a home automation company) installs thousands of feet of coax every year (even more twisted pair). But we never specify what to use until we know how it is being used. Additionally, it depends on whether he needs to send power over the coax also, and if so, how much.

For instance, the new specs for HDTV (ultra-high-def, 3D at 240 frames/sec) require bandwidths of up to 18 GHz. It's something we have to take into consideration on ANY installation. Just saying it's going to be used for HDTV is not sufficient.
-- 
Jerry, AI0K
Quad shield coax & dielectric?
> I note that there have been some replies, but none seem to make much
> attempt at answering your question.

THANK YOU IAN!! A thousand points for noting this.

> RG6Q is used extensively in the UK cable TV industry as 'drop' cable - ie from the taps in the street cabinet to the home. [...] "RG-6 and RG-6Q share nearly the exact same outer dimensions and have similar flexibility. RG-6Q is slightly stiffer due to the increased amount of inner shielding".

And another thousand points for answering the question - which was about the cable's specs, NOT ABOUT ITS APPROPRIATENESS FOR A SPECIFIC APPLICATION.

> I suspect that even if the diameter of the RG6Q dielectric is slightly less (something which I've never really noticed) - requiring a slightly thinner inner conductor in order to preserve the Zo - the increase in attenuation won't be very much. However, I'm sure that a bit more intensive Googling on RG6 physical and electrical specs will reveal the true answer!

The question was how RG6 compares to RG6Q, specifically whether or not the reduced diameter of the dielectric affects its specifications. Best to you.
Quad shield coax & dielectric?
In message, Jerry Stuckle writes:
> No one has answered his question because the information is insufficient. [...] Just saying it's going to be used for HDTV is not sufficient.

My immediate lateral-thinking guess is that the OP has acquired some RG6Q, and is wondering whether he can use it as antenna drop cable for UHF TV (which, in the UK, includes HD). He has specifically said that it's for use at less than 1 GHz. His main concern is probably that quad-shield might be more lossy than RG6 (which indeed it could be, as a smaller-diameter dielectric would require a smaller-diameter inner in order to maintain a Zo of 75 ohms, and this would increase the attenuation). Of course, he could also be concerned about some of the many other parameters - but I suspect not. If it's not attenuation that's concerning him, I'm sure he will tell us.
-- 
Ian
Quad shield coax & dielectric?
Bob E wrote:
> And another thousand points for answering the question--which was about the cable's specs, NOT ABOUT ITS APPROPRIATENESS FOR A SPECIFIC APPLICATION.

Your posting came across on usenet differently from what you intended. From your posting: "How does this affect the performance? I'm looking at 1 GHz (HDTV use)." That clearly is a question about appropriateness for a specific application. You did not ask about the loss, you asked about the performance. So that means "there may be loss, but does it affect the results?" The answer clearly is: it depends on further details, like what margin you have on the signal. Shouting does not help you, just face the facts.
Quad shield coax & dielectric?
Ian Jackson wrote:
> My immediate lateral-thinking guess is that the OP has acquired some RG6Q, and is wondering whether he can use it as antenna drop cable for UHF TV (which, in the UK, includes HD). [...] If it's not attenuation that's concerning him, I'm sure he will tell us.

This was my understanding as well. But he reverted to shouting and indicated that we have all misunderstood him. He asked about the performance. As he was pointing to loss, I would guess he would like to know the performance w.r.t. loss. But as he also indicated a use case, I think he wants (or needs) to know whether the loss is too high for the use case he has.

That cannot be determined given the info there is. We need to know what margin he has on the signal and how long his cable run is. The margin is determined by the type of signal (terrestrial, cable, satellite - we can rule out satellite because he said 1 GHz). When terrestrial, we need to know how close he is to the transmitter. Even with all such general information, it probably is not possible to close in enough on the calculation to know if a couple of dB or so of extra loss per 100 m is going to affect the performance of the system.
Quad shield coax & dielectric?
In message, Rob writes:
> The answer clearly is: it depends on further details, like what margin you have on the signal.

OK, Bob E - it appears that the ball is in your court. In the interests of peace and harmony, and to prevent confusion, could you please tell us exactly (and I mean EXACTLY) which RG-6 vs RG-6Q parameters you are concerned about?
-- 
Ian
Quad shield coax & dielectric?
> OK, Bob E - it appears that the ball is in your court. In the interests
> of peace and harmony, and to prevent confusion, could you please tell us exactly (and I mean EXACTLY) which RG-6 vs RG-6Q parameters you are concerned about?

OK, thanks for the discussions. I have a VHF/UHF omnidirectional antenna with integral amplifier for TV reception:
http://www.amazon.com/Antennacraft-5...mplified-HDTV-Antenna/dp/B007Z7YOKS

Several broadcast towers surround me, from 40 to 50 miles:
http://www.tvfool.com/?option=com_wrapper&Itemid=29&q=id%3d5b9405cba93e15

Terrain is pretty flat. The antenna is currently connected to RG6 located indoors, up high in a 1-story cathedral-ceiling home. Signal reception is marginal, gauged by the HDTV's (relative) Signal Strength display; dropouts occur regularly on some channels.

I plan to mount the antenna outdoors on the peak of the roof. I was planning to use RG6 quad-shield, but wanted to check whether it is truly a better solution or not. Cable run indoors now is about 50 ft. From the roof location this will increase to 75 or 100 ft, depending on the route I choose, hence my question. Thanks.
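The extra attenuation from the longer run can be bounded quickly. A minimal sketch, assuming the commonly published RG-6 attenuation of roughly 6 to 7.5 dB per 100 ft at 1 GHz (the range quoted elsewhere in this thread); the run lengths are the 50/75/100 ft figures above:

```python
# Compare total attenuation of the existing 50 ft run against the
# proposed 75 ft and 100 ft runs, at both ends of a typical RG-6
# loss range (6 to 7.5 dB per 100 ft at 1 GHz).

def run_loss_db(length_ft: float, loss_per_100ft: float) -> float:
    """Total attenuation in dB for a run of `length_ft` feet."""
    return loss_per_100ft * length_ft / 100.0

if __name__ == "__main__":
    for per100 in (6.0, 7.5):
        base = run_loss_db(50, per100)
        for new_len in (75, 100):
            extra = run_loss_db(new_len, per100) - base
            print(f"{per100} dB/100ft: {new_len} ft adds {extra:.2f} dB over 50 ft")
```

Even in the worst case (100 ft at 7.5 dB/100 ft), the move adds under 4 dB relative to the existing run.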
Quad shield coax & dielectric?
Bob E wrote:
> I plan to mount the antenna outdoors on the peak of the roof. I was planning to use RG6 quad-shield, but wanted to check whether it is truly a better solution or not. Cable run indoors now is about 50 ft. From the roof location this will increase to 75 or 100, depending on the route I choose, hence my question. Thanks.

When you have an antenna with an integrated amplifier, the loss of the cable normally will not be a prime concern. Of course this only holds true when the antenna+amplifier is well designed. I don't know the situation in the USA, but here in Europe there are only very few good manufacturers and all the rest sell crap and snake-oil. I don't know what category your antenna is in.

With a bare antenna (without amplifier), loss is very important, as the signal from the antenna is attenuated while the noise at the input of the receiver is constant, so your signal/noise ratio worsens. However, with an amplifier near the antenna, the signal should be raised sufficiently to be above the noise at the receiver, and the signal/noise ratio at the input of the amplifier becomes the predominant factor. In this case, the loss from your coax should not matter too much. The good-quality shielding is often more important.

Note that in digital TV, the occurrence of dropouts is not only determined by signal strength, but also by signal quality. This will improve dramatically when you put the antenna on the roof, especially when this results in a more or less clear view of the transmitter. What you receive now is probably a jumble of reflections. While digital TV is more tolerant of that than old analog TV, it still eats into the margin that you need for dropout-free reception.
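The argument that an amplifier near the antenna makes the cable loss largely irrelevant is the Friis cascade formula, F_total = F1 + (F2 - 1)/G1 + ... A minimal sketch; the gain and noise-figure values below are illustrative, not taken from any specific product:

```python
import math

# Friis formula for cascaded noise figure (linear quantities):
#   F_total = F1 + (F2 - 1)/G1 + (F3 - 1)/(G1*G2) + ...

def db_to_lin(db: float) -> float:
    return 10 ** (db / 10.0)

def cascade_nf_db(stages) -> float:
    """stages: list of (gain_dB, noise_figure_dB) tuples in signal order."""
    f_total = db_to_lin(stages[0][1])
    g_running = db_to_lin(stages[0][0])
    for gain_db, nf_db in stages[1:]:
        f_total += (db_to_lin(nf_db) - 1.0) / g_running
        g_running *= db_to_lin(gain_db)
    return 10 * math.log10(f_total)

if __name__ == "__main__":
    amp = (20.0, 3.0)     # illustrative: 20 dB gain, 3 dB NF masthead preamp
    cable = (-6.0, 6.0)   # 6 dB of coax loss: gain -6 dB, noise figure 6 dB
    tuner = (0.0, 8.0)    # illustrative: 8 dB NF receiver front end
    print(f"amp at antenna: system NF = {cascade_nf_db([amp, cable, tuner]):.1f} dB")
    print(f"amp at TV set:  system NF = {cascade_nf_db([cable, amp, tuner]):.1f} dB")
```

With the amplifier first, the cable's loss contributes almost nothing to the system noise figure; with the cable first, its full loss adds directly.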
Quad shield coax & dielectric?
In message, Rob writes:
> When you have an antenna with integrated amplifier, the loss of the cable normally will not be a prime concern. Of course this only holds true when the antenna+amplifier is well designed. I don't know the situation in the USA, but here in Europe there are only very few good manufacturers and all the rest sell crap and snake-oil. Don't know what category your antenna is in.

I would say that all omnidirectional TV antennas (amplified or not) tend to be in the snake-oil category, and should not be used unless there is no more-sanitary alternative. The antenna itself has low gain (at least 3 dB down on a halfwave dipole - so analogue pictures could be noisy), and offers no protection against the effects of multipath reception (analogue pictures could have lots of ghosts). That said, an omnidirectional does have its uses - provided it works well enough for what you want. The advent of digital TV has meant that - up to a point - reception is much more tolerant of the impairments that often gave you poor analogue reception.

> With a bare antenna (without amplifier), loss is very important as the signal from the antenna is attenuated and the noise at the input of the receiver is constant, so your signal/noise ratio worsens. However, with an amplifier near the antenna, the signal should be raised sufficiently to be above the noise at the receiver, and the signal/noise ratio at the input of the amplifier becomes the predominant factor.

Indeed. If you need an amplifier, it should be located at or near the antenna. This gives you the best signal-to-noise ratio (whatever the length of the drop cable is).

> In this case, the loss from your coax should not matter too much. The good-quality shielding is often more important. Note that in digital TV, the occurrence of dropouts is not only determined by signal strength, but also by signal quality. This will improve dramatically when you put the antenna on the roof, especially when this results in a more or less clear view of the transmitter. What you receive now is probably a jumble of reflections.

Quite.

> While digital TV is more tolerant to that than old analog TV, it still eats from the margin that you need for dropout-free reception.

In the UK, I don't think that many homes use installed omnidirectional antennas. You see some on caravans and mobile homes, and on boats, but never on houses. Those living close to the transmitter might use 'rabbit's ears' set-top antennas (or some fancy variant) - especially now that all TV is UHF (small antenna) and 'you can get away with murder' digital - but you don't see any proper installations.

One big difference between the UK and many other countries (and in particular the USA) is that we have generally received all our TV signals from one direction (initially from closely-located transmitter masts, and latterly from a single mast). It is only in outlying 'fringe' areas where you used to see homes with two (or more) antennas pointing in different directions - and as the TV signals were weak, these were always high-gain yagis.

Regarding the original question, on looking at the specs for RG-6, it appears that Mr Heinz and his '57 varieties' is left standing. 'RG-6' seems to be a generic number for many types of coax. Various parameters differ - including the loss (typically 6 to 7.5 dB per 100' at 1000 MHz) and - in the case of RG-6Q - the outside diameter could be 1 mm more (in which case the diameter of the dielectric is probably the same as ordinary RG-6). One caveat sometimes mentioned is the relatively high loop resistance (because the inner is copper-plated steel, not all copper), and this can cause problems if you're line-powering up the drop cable. It's unlikely to affect the working of (say) a straightforward, relatively low-current antenna preamplifier, but with a satellite LNB the voltage drop could confuse the band-switching operation.

In the OP's situation, it's pretty obvious that the addition of another 50' of RG-6 will drop the signal at the TV set by (at most) around 3 dB - and (with luck) this will probably be more than made up for by mounting the antenna outside, higher, and in the clear. [Depending on the roofing material that the TV signal is presently having to pass through to reach the antenna, the received signal could be a lot stronger.] All the OP can really do is try it, and see what happens. If that doesn't provide satisfactory reception, the best advice might be to consider an antenna with inherent gain - possibly with a rotator to enable him to get all the transmissions.
-- 
Ian
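The line-powering caveat above is plain Ohm's law. A minimal sketch; the loop-resistance figure (around 45 ohms per 1000 ft for copper-clad-steel RG-6) and the current draws are rough illustrative values, not datasheet numbers:

```python
# Voltage drop when line-powering equipment up the drop cable.
# V_drop = I * R_loop, where R_loop is the round-trip resistance of
# centre conductor plus shield return.

LOOP_OHMS_PER_1000FT = 45.0  # illustrative figure for copper-clad-steel RG-6

def volt_drop(length_ft: float, current_a: float) -> float:
    """DC voltage lost across `length_ft` of cable at `current_a` amps."""
    return current_a * LOOP_OHMS_PER_1000FT * length_ft / 1000.0

if __name__ == "__main__":
    # A masthead preamp drawing ~60 mA barely notices 100 ft of cable:
    print(f"preamp: {volt_drop(100, 0.060):.2f} V drop")
    # A satellite LNB drawing ~200 mA loses noticeably more, which can
    # matter when 13 V / 18 V supply levels are used for band switching:
    print(f"LNB:    {volt_drop(100, 0.200):.2f} V drop")
```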
Quad shield coax & dielectric?
Ian Jackson wrote:
> One big difference between the UK and many other countries (and in particular the USA) is that we have generally received all our TV signals from one direction (initially from closely-located transmitter masts, and latterly from a single mast). It is only in outlying 'fringe' areas where you used to see homes with two (or more) antennas pointing in different directions - and as the TV signals were weak, these were always high-gain yagis.

Here in the Netherlands, the original state TV programs were transmitted from about 8 high towers spread around our (small, flat) country, and yagis were used by everyone: close by for the required directivity to avoid ghost pictures, further away for the additional gain. In the seventies and early eighties, all cities got cable TV. Commercial TV and programmes from other countries were introduced only on cable TV, and later on direct-to-home satellite; they were not transmitted on the analog network. Yagis disappeared from the rooftops.

Later a digital terrestrial TV network was deployed in the most densely populated areas of the country, and it includes both state and commercial TV, but it operates using a dense network of lower-powered and lower-height transmitters (usually on tall buildings), so the nearest transmitter is often at most 10 km away. This means users can often employ small nondirective antennas, typically placed on the windowsill near the TV. It often does not work completely satisfactorily, but the digital terrestrial TV network is inferior in quality and channel repertoire to cable and satellite anyway, and is mainly used by those with low quality requirements and by mobile users.

When the analog network was shut down, the existing towers were fitted with transmitters for the digital terrestrial network, to act as coverage for less-populated areas. People there would have to use yagis, but they are seldom seen, as most had installed satellite dishes by then. The programming and distribution companies are separate, so there is no issue with receiving different programmes from different directions (all transmitters transmit all TV programmes available on the system). This is probably the same as in the UK.
Quad shield coax & dielectric?
"Rob" wrote in message:
> The programming and distribution companies are separate, so there is no issue with receiving different programmes from different directions (all transmitters transmit all TV programmes available on the system). This is probably the same as in the UK.

With the exception of regional "opt-outs", this is true.
-- 
;-) 73 de Frank Turner-Smith G3VKI - mine's a pint.
http://turner-smith.co.uk
Quad shield coax & dielectric?
>> The programming and distribution companies are separate, so there is no issue with receiving different programmes from different directions (all transmitters transmit all TV programmes available on the system). This is probably the same as in the UK.
> With the exception of regional "opt-outs", this is true.

Not really; many of the low-powered relays do not carry the full set of programmes - they only carry the 3 public service multiplexes. See: http://stakeholders.ofcom.org.uk/binaries/research/tv-research/no3factsheet.pdf

Jeff
Quad shield coax & dielectric?
On 3/15/2014 8:45 PM, Bob E. wrote:
> I plan to mount the antenna outdoors on the peak of the roof. I was planning to use RG6 quad-shield, but wanted to check whether it is truly a better solution or not. Cable run indoors now is about 50 ft. From the roof location this will increase to 75 or 100, depending on the route I choose, hence my question. Thanks.

Bob,

You have two problems here. The first one is that the antenna is located inside the house. This results in significant signal loss. Your second problem is that you're using an omnidirectional antenna. I agree with Rob - there isn't a decent omnidirectional antenna around.

HDTV requires a stronger signal than the old NTSC. If you're looking at 40-50 miles, even if it is flat terrain, you're going to have a weak signal on an omnidirectional antenna. The preamp will help - but it's not going to be a good solution. And whether you use RG-6 or RG-6 quad will make no noticeable difference (as long as both are good quality - there are good brands and bad brands in coax, also).

Putting the antenna outside will, of course, help. It might even be satisfactory if you're willing to put up with some pixelation and dropout. But if you want a good signal, get a directional antenna and rotor. It will make a huge difference.

BTW - we only use RG6-quad in our installations. The extra shielding means less signal leakage - both into and out of the cable. The loss difference is negligible.
-- 
Jerry, AI0K
Quad shield coax & dielectric?
In message , Jerry Stuckle
writes HDTV requires a stronger signal than the old NTSC. It really depends on how good your old analogue NTSC was. For a noiseless picture, you would need around 43dB CNR, but pictures were still more than watchable at 25dB, and the picture was often still lockable at ridiculously low CNRs (when you certainly wouldn't bother watching it). Digital signals can work at SNRs down to around 15dB for 64QAM and 20dB for 256QAM (although if it's a little below this, you will suddenly get nothing). -- Ian --- news://freenews.netfront.net/ - complaints: --- |
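The thresholds quoted above can be collected into a small lookup for comparing a measured CNR against each mode. A sketch only; the dB figures are the rough values from the post, not formal specifications:

```python
# Approximate CNR/SNR thresholds (dB) as quoted in the thread.
THRESHOLDS_DB = {
    "NTSC noiseless": 43,
    "NTSC watchable": 25,
    "64QAM lock": 15,
    "256QAM lock": 20,
}

def margin_db(measured_cnr_db, mode):
    """Positive margin means the signal clears the quoted threshold."""
    return measured_cnr_db - THRESHOLDS_DB[mode]

# A 19 dB signal: noisy but lockable analogue, fine for 64QAM,
# but just under the 256QAM "cliff edge".
print(margin_db(19, "NTSC watchable"))  # -6
print(margin_db(19, "64QAM lock"))      # 4
print(margin_db(19, "256QAM lock"))     # -1
```

This illustrates the "cliff edge" point: analogue degrades gracefully below threshold, while digital goes from perfect to nothing within a couple of dB.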
Quad shield coax & dielectric?
On 3/16/2014 11:42 AM, Ian Jackson wrote:
In message , Jerry Stuckle writes HDTV requires a stronger signal than the old NTSC. [...] Digital signals can work at SNRs down to around 15dB for 64QAM and 20dB for 256QAM. That has not been our experience. We had a number of customers here in the DC area who had great pictures on NTSC sets, but got either heavy pixelation or no picture at all when the switchover occurred. We sent them to a company which does TV antenna installations (we do a lot of low voltage, including TV - but not antennas). In every case, installing a better outdoor antenna solved the problem. No one said the NTSC had to be noiseless. But the 43dB is a bit high, even for older sets. Input from the cable TV company to our equipment was 10-20dB; we tried to push 10dB to all of the outputs but never had a problem even down to 7dB (the lowest we would let it drop to). I don't know what specs the techs are using now; I don't get into the field much any more. But I would be surprised if it were less than 15-20dB. -- ================== Remove the "x" from my email address Jerry Stuckle ================== |
Quad shield coax & dielectric?
It really depends on how good your old analogue NTSC was. [...] Digital signals can work at SNRs down to around 15dB for 64QAM and 20dB for 256QAM. [...] Input from the cable tv company to our equipment was 10-20dB; we tried to push 10dB to all of the outputs but never had a problem even down to 7dB (the lowest we would let it drop to). That makes no sense; a 7dB CNR would be pretty much unwatchable on analogue - it would be a very, very noisy picture, if it even locked at all! Jeff |
Quad shield coax & dielectric?
In message , Jeff writes
It really depends on how good your old analogue NTSC was. [...] Input from the cable tv company to our equipment was 10-20dB; we tried to push 10dB to all of the outputs but never had a problem even down to 7dB (the lowest we would let it drop to). That makes no sense; a 7dB CNR would be pretty much unwatchable on analogue. I'm also not sure what the figures mean. From distant memory, the NCTA minimum RF input level (for NTSC) was 0dBmV (into a TV set - it might have been a bit more for set-top boxes), and the CNR 43dB. The UK cable TV level (for PAL set-tops) was 3dBmV to 15dBmV, with no more than 3dB between the levels of adjacent channels, and when digital signals came along, these were run around 15dB below the analogues. [Note that for both the US and the UK, one of the reasons for these obviously high signal levels is that cable set-top boxes have relatively appalling noise figures compared with your modern TV set.] 
UK off-air transmissions were somewhat similar, with digitals being run at 10, 16 and even occasionally 20dB below the analogues. However, when all the analogues were turned off, the digitals were turned up to typically 7dB below what the analogues had been. This would suggest that digital receivers (including HD) are at least perfectly happy with 7dB less signal than analogue - and in practice, all other things being equal, digital receivers work down to much lower signal levels than would be considered satisfactory for analogue. The only obvious proviso is that while (so far) UK SD transmissions are 64QAM, HD transmissions are 256QAM, and therefore need maybe 6dB more signal (which will only be apparent where reception is marginal). -- ian --- news://freenews.netfront.net/ - complaints: --- |
Quad shield coax & dielectric?
"Jeff" wrote in message ...
The programming and distribution companies are separate, so there is no issue with receiving different programmes from different directions. (all transmitters transmit all TV programmes available on the system) This is probably the same as in the UK. With the exception of regional "opt-outs", this is true. Not really, many of the low powered relays do not carry the full set of programmes, they only carry the 3 public service multiplexes not the full set. See :http://stakeholders.ofcom.org.uk/binaries/research/tv-research/no3factsheet.pdf Jeff True, but I thought the topic was the advantage, if any, of receiving more than one high power main station. Even in the analogue days not all relays carried Channel 5, for instance. -- ;-) .. 73 de Frank Turner-Smith G3VKI - mine's a pint. .. http://turner-smith.co.uk |
Quad shield coax & dielectric?
On 3/16/2014 1:26 PM, Jeff wrote:
It really depends on how good your old analogue NTSC was. [...] That makes no sense; a 7dB CNR would be pretty much unwatchable on analogue, it would be a very very noisy picture, if it even locked at all! Jeff I'm not talking CNR - I'm talking signal strength. 7dbm is plenty of signal. Most later TV's would work even at 0dbm. HDTV, not so much. -- ================== Remove the "x" from my email address Jerry Stuckle ================== |
Quad shield coax & dielectric?
In message , Jerry Stuckle
writes [...] I'm not talking CNR - I'm talking signal strength. 7dbm is plenty of signal. Most later TV's would work even at 0dbm. HDTV, not so much. 7dBm is an absolutely colossal signal for a TV set. Even 0dBm is an absolutely colossal signal! -- Ian --- news://freenews.netfront.net/ - complaints: --- |
Quad shield coax & dielectric?
On 3/16/2014 7:17 PM, Ian Jackson wrote:
In message , Jerry Stuckle writes [...] 7dBm is an absolutely colossal signal for a TV set. Even 0dBm is an absolutely colossal signal! Not in the United States. It was the minimum that the cable industry provides to the TV set. We are talking a 4.25 MHz wide signal, not SSB or CW. -- ================== Remove the "x" from my email address Jerry, AI0K ================== |
Quad shield coax & dielectric?
No one said the NTSC had to be noiseless. But the 43dB is a bit high, even for older sets. [...] I'm not talking CNR - I'm talking signal strength. 7dbm is plenty of signal. Most later TV's would work even at 0dbm. Well, the "43dB" that you were stating "was a bit high" was expressed as CNR, so it is reasonable to think that your other figures were also CNR, as you did not state otherwise. Also, 7dBm (5mW) is a very high signal and would cause most sets to intermod like crazy. Perhaps you meant 7dBmV. Jeff |
Quad shield coax & dielectric?
7dBm is an absolutely colossal signal for a TV set. Even 0dBm is an absolutely colossal signal! Not in the United States. It was the minimum that the cable industry provides to the TV set. We are talking a 4.25 MHz wide signal, not SSB or CW. dBm is not a bandwidth-dependent measurement, whereas CNR is. Putting +7dBm into a TV receiver is madness; it would cause severe overload and intermods. +7dBm is 5mW, which equates to about 610mV in a 75 ohm system - an enormous signal. Jeff |
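The power/voltage arithmetic in this exchange is easy to get wrong by a factor of ten, so here is a small sketch of the standard conversions. Nothing is assumed beyond the definition of dBm (dB relative to 1 mW) and a 75-ohm system:

```python
import math

def dbm_to_mw(dbm):
    """dBm -> milliwatts (dBm is dB relative to 1 mW)."""
    return 10 ** (dbm / 10)

def dbm_to_mv_rms(dbm, z0=75.0):
    """dBm -> RMS millivolts across an impedance z0, from P = V^2 / R."""
    p_watts = dbm_to_mw(dbm) / 1000.0
    return math.sqrt(p_watts * z0) * 1000.0

print(f"{dbm_to_mw(7):.1f} mW")      # 7 dBm is about 5.0 mW
print(f"{dbm_to_mv_rms(7):.0f} mV")  # about 613 mV RMS in 75 ohms
```

This is why +7 dBm really would be a huge input for a TV tuner; the cable-TV figure being remembered in the thread is almost certainly 7 dBmV (about 2.2 mV), nearly 49 dB lower.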
Quad shield coax & dielectric?
In message , Jerry Stuckle
writes [...] Not in the United States. It was the minimum that the cable industry provides to the TV set. We are talking a 4.25 MHz wide signal, not SSB or CW. The TV signal levels quoted for analogue cable TV don't really involve bandwidth. 
The level is always the 'RMS during sync' (or 'RMS during peak'), which is the RMS level of the vision RF envelope during the horizontal (or vertical) sync period. This has the advantage of remaining constant regardless of the video content (ie it's the same for a completely black picture or a completely white picture). The only requirement is that the measuring instrument has sufficient bandwidth to embrace enough of the low frequency sideband content of the video signal to give a reading which IS independent of the video content. On a spectrum analyser, 300kHz resolution will display the demodulated RF waveform (thus enabling you to read the RF level), but IIRC many field strength meters have an IF bandwidth of typically 30kHz. However, regardless of the actual measuring bandwidth, noise levels are normalised to a bandwidth of 4.2MHz (NTSC) and 5.2MHz (PAL), and signal-to-noise measurements are adjusted accordingly. Note that the cable TV industry generally uses units of dBmV (dB with respect to 1mV - traditionally considered a 'good' level to feed to a TV set). This is because most of the levels the cable TV guys work with are generally in excess of 0dBmV (typically 0 to 60dBmV). The off-air TV guys often use dBuV (dB wrt 1 microvolt), as they are usually dealing with weaker signals. As a result, cable TV guys are always having to mentally deduct 60dB. RF communications guys (and domestic satellite) tend to use dBm (which is a slovenly version of 'dBmW' - dB wrt 1mW) - despite the fact that a lot of their levels are large negative numbers. Also note that dBm tends to imply a Zo of 50 ohms, and dBmV/dBuV 75 ohms - but it ain't always necessarily so. Anyone working in the RF industry would be well advised to ensure that they always use the correct units - for example, don't say 'dB' or 'dBm' when you really mean dBmV. Failure to do so can often result in people needlessly arguing and talking at cross-purposes. 
-- Ian --- news://freenews.netfront.net/ - complaints: --- |
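Ian's point about units can be made concrete with the standard offsets. A sketch; the only assumptions are the definitions themselves: dBmV and dBuV are voltage ratios, dBm is a power ratio, and an impedance (here 75 ohms) ties them together:

```python
import math

def dbuv_from_dbmv(dbmv):
    """dBuV = dBmV + 60, since 1 mV = 1000 uV and 20*log10(1000) = 60 dB."""
    return dbmv + 60.0

def dbm_from_dbmv(dbmv, z0=75.0):
    """Power in dBm of a signal whose level is given in dBmV across z0 ohms."""
    p0_watts = (1e-3) ** 2 / z0             # power of 0 dBmV (1 mV RMS)
    p0_dbm = 10 * math.log10(p0_watts * 1000)  # that power expressed in dBm
    return dbmv + p0_dbm

print(f"{dbm_from_dbmv(0):.2f} dBm")    # 0 dBmV in 75 ohms is -48.75 dBm
print(f"{dbuv_from_dbmv(0):.0f} dBuV")  # 0 dBmV is 60 dBuV
```

So a "colossal" +7 dBm and a perfectly ordinary +7 dBmV differ by nearly 49 dB - which is exactly the talking-at-cross-purposes this thread fell into.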
Quad shield coax & dielectric?
"Ian Jackson" wrote in message
... [...] Anyone working in the RF industry would be well advised to ensure that they always use the correct units - for example, don't say 'dB' or 'dBm' when you really mean dBmV. Failure to do so can often result in people needlessly arguing and talking at cross-purposes. 
Back in the 1970s I was involved with the assessment of the coverage of analogue UHF TV. At that time the service limit was defined as 70dB rel 1uV/m field strength. (3.16mV/m). At 600MHz a half wave dipole is near enough 25cm and so would capture about a quarter of this voltage. A typical outdoor TV antenna of the time had a gain of at least 6dB, so the available signal level before feeder loss would be in the order of 1 - 2mV. HTH -- ;-) .. 73 de Frank Turner-Smith G3VKI - mine's a pint. .. http://turner-smith.co.uk |
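Frank's back-of-envelope figure can be reproduced with the usual effective-length formula for a thin half-wave dipole, V = E x lambda/pi. A sketch only; the 6 dB antenna gain and the 70 dBuV/m (3.16 mV/m) service limit are the rough values from his post:

```python
import math

def dipole_terminal_mv(field_mv_per_m, freq_mhz, antenna_gain_db=0.0):
    """Voltage (mV) captured by a half-wave dipole in a field E (mV/m).

    Effective length of a thin half-wave dipole is lambda/pi; antenna gain
    in dB over a dipole scales the voltage (20 dB per decade).
    """
    lam = 300.0 / freq_mhz              # wavelength in metres
    v_dipole = field_mv_per_m * lam / math.pi
    return v_dipole * 10 ** (antenna_gain_db / 20.0)

# 70 dBuV/m service limit = 3.16 mV/m, at 600 MHz with a 6 dB antenna:
print(f"{dipole_terminal_mv(3.16, 600, 6):.2f} mV")  # about 1 mV
```

This lands at about 1 mV before feeder loss, consistent with the "order of 1 - 2mV" figure in the post.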
Quad shield coax & dielectric?
On 3/17/2014 3:45 AM, Jeff wrote:
7dBm is an absolutely colossal signal for a TV set. Even 0dBm is an absolutely colossal signal! Not in the United States. It was the minimum that the cable industry provides to the TV set. We are talking a signal 4.25Mhz wide signal, not SSB or CW. dBm is not a bandwidth dependant measurement such as CNR which is. Putting +7dBm into a tv receiver is madness, it would cause severe overload and inter mods. +7dBm is 50mW and that equates to about 61mV in a 75 ohm system which is an enormous signal. Jeff Wrong. TV's are made to handle at least 20 dbm. And cable tv companies must deliver at least 10 dbm to the premises. TV signals (at least in the U.S.) are not measured by CNR - they are measured by dbm. CNR is not important because the bandwidth does not change. Your insistence on using CNR shows you know nothing about how the industry measures signal strength. -- ================== Remove the "x" from my email address Jerry, AI0K ================== |
Quad shield coax & dielectric?
On 3/17/2014 8:27 AM, FranK Turner-Smith G3VKI wrote:
"Ian Jackson" wrote in message ... [...] Anyone working in the RF industry would be well advised to ensure that they always use the correct units - for example, don't say 'dB' or 'dBm' when you really mean dBmV. 
Back in the 1970s I was involved with the assessment of the coverage of analogue UHF TV. [...] the available signal level before feeder loss would be in the order of 1 - 2mV. HTH Obviously it was not the U.S. industry you were advising... -- ================== Remove the "x" from my email address Jerry, AI0K ================== |
Quad shield coax & dielectric?
On 3/17/2014 3:38 AM, Jeff wrote:
No one said the NTSC had to be noiseless. [...] Also 7dBm (5mW) is a very high signal and would cause most sets to intermod like crazy. Perhaps you meant 7dBmV. Jeff Yes, I should have been clearer. It is 7dBmV - but the TV industry generally shortens it to dBm (and that's how the test equipment is labeled), just as other industries which use dBmW generally shorten it to dBm. Sorry for the confusion - it's been about 10 years since I was in the field; I've been away from it for too long. -- ================== Remove the "x" from my email address Jerry Stuckle ================== |
Quad shield coax & dielectric?
"Jerry Stuckle" wrote in message
... [...] 
Back in the 1970s I was involved with the assessment of the coverage of analogue UHF TV. At that time the service limit was defined as 70dB rel 1uV/m field strength (3.16mV/m). At 600MHz a half wave dipole is near enough 25cm, and so would capture about a quarter of this voltage. A typical outdoor TV antenna of the time had a gain of at least 6dB, so the available signal level before feeder loss would be in the order of 1 - 2mV. HTH

Obviously it was not the U.S. industry you were advising...

BBC UK -- ;-)

..
73 de Frank Turner-Smith G3VKI - mine's a pint.
..
http://turner-smith.co.uk
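[Editor's note: the field-strength arithmetic in the post above can be reproduced as a rough sketch. The "quarter of this voltage" rule of thumb and the 6dB antenna gain are taken from the post; the variable names are my own.]

```python
# Service-limit field strength from the post: 70 dB rel 1 uV/m
field_dbuv_per_m = 70.0
e_volts_per_m = 1e-6 * 10 ** (field_dbuv_per_m / 20)   # = 3.16 mV/m

# The post's rule of thumb: a ~25 cm half-wave dipole at 600 MHz
# captures about a quarter of the per-metre field voltage.
v_dipole = e_volts_per_m * 0.25

# A 6 dB gain antenna roughly doubles the captured voltage.
antenna_gain_db = 6.0
v_antenna = v_dipole * 10 ** (antenna_gain_db / 20)

print(round(v_dipole * 1e3, 2), "mV")    # 0.79 mV at the bare dipole
print(round(v_antenna * 1e3, 2), "mV")   # 1.58 mV - the post's '1 - 2mV'
```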
Quad shield coax & dielectric?
Jerry Stuckle wrote:
On 3/16/2014 11:42 AM, Ian Jackson wrote: In message , Jerry Stuckle writes

HDTV requires a stronger signal than the old NTSC.

It really depends on how good your old analogue NTSC was. For a noiseless picture, you would need around 43dB CNR, but pictures were still more-than-watchable at 25dB, and the picture was often still lockable at ridiculously low CNRs (when you certainly wouldn't bother watching it). Digital signals can work at SNRs down to around 15dB for 64QAM and 20dB for 256QAM (although if it's a little below this, you will suddenly get nothing).

That has not been our experience. We had a number of customers here in the DC area who had great pictures on NTSC sets, but got either heavy pixelation or no picture at all when the switchover occurred. We sent them to a company which does TV antenna installations (we do a lot of low voltage, including TV - but not antennas). In every case, installing a better outdoor antenna solved the problem.

Most likely the company reduced the transmitted power by a factor of 10 at the time of the switchover, to put the added link margin in their own pockets. (Transmitting a megawatt of ERP, as was regular in the analog days, puts a serious dent in your electricity bill, even when you have a lot of antenna gain.)
Quad shield coax & dielectric?
On 3/17/2014 10:45 AM, Rob wrote:
Jerry Stuckle wrote: On 3/16/2014 11:42 AM, Ian Jackson wrote: In message , Jerry Stuckle writes

HDTV requires a stronger signal than the old NTSC.

It really depends on how good your old analogue NTSC was. For a noiseless picture, you would need around 43dB CNR, but pictures were still more-than-watchable at 25dB, and the picture was often still lockable at ridiculously low CNRs (when you certainly wouldn't bother watching it). Digital signals can work at SNRs down to around 15dB for 64QAM and 20dB for 256QAM (although if it's a little below this, you will suddenly get nothing).

That has not been our experience. We had a number of customers here in the DC area who had great pictures on NTSC sets, but got either heavy pixelation or no picture at all when the switchover occurred. We sent them to a company which does TV antenna installations (we do a lot of low voltage, including TV - but not antennas). In every case, installing a better outdoor antenna solved the problem.

Most likely the company reduced the transmitted power by a factor of 10 at the time of the switchover, to put the added link margin in their own pockets. (Transmitting a megawatt of ERP, as was regular in the analog days, puts a serious dent in your electricity bill, even when you have a lot of antenna gain.)

Not at all. If anything, they raised their power.

--
==================
Remove the "x" from my email address
Jerry Stuckle
==================
Quad shield coax & dielectric?
In message , Jerry Stuckle
writes On 3/17/2014 3:45 AM, Jeff wrote:

7dBm is an absolutely colossal signal for a TV set. Even 0dBm is an absolutely colossal signal!

Not in the United States. It was the minimum that the cable industry provides to the TV set. We are talking a 4.25MHz wide signal, not SSB or CW.

dBm is not a bandwidth-dependent measurement such as CNR, which is. Putting +7dBm into a TV receiver is madness; it would cause severe overload and intermods. +7dBm is 5mW, and that equates to about 612mV in a 75 ohm system, which is an enormous signal.

Jeff

Wrong. TV's are made to handle at least 20 dbm. And cable tv companies must deliver at least 10 dbm to the premises.

You do realise that 20dBm (appx 68dBmV) is a massive 100mW? With a modest 50 channel analogue cable TV system, that would be a total input power of 5W - which would have a TV set or set-top box sagging at the knees - if not even beginning to smoke!

TV signals (at least in the U.S.) are not measured by CNR

Well of course they aren't. CNR is a ratio - not a level.

- they are measured by dbm.

No. The US and UK cable TV industry definitely uses dBmV. 0dBmV is 1mV - a reasonable signal to feed to a TV set (especially directly from an antenna). 0dBm is appx 48dBmV (250mV) - and that's one hell of a TV signal!

With a 75 ohm source impedance (antenna and coax) - and no significant levels of outside noise-like interference - a 0dBmV (1mV) analogue NTSC signal, direct from an antenna, will have a CNR of around 57dB. A TV set with a decent tuner noise figure (5dB?) or a set-top box (8dB) will produce essentially noise-free pictures. However, with an analogue TV signal from a large cable TV system, the signal CNR will be much worse than 57dB (regardless of its level). If I recall correctly, the NCTA (National Cable Television Association) minimum spec is a CNR of 43dB (UK is 6B). At this ratio, it is judged that picture noise is just beginning to become visible.

CNR is not important because the bandwidth does not change.
You're havin' a laff - surely?!

Your insistence on using CNR shows you know nothing about how the industry measures signal strength.

I'm not insisting on anything. However, an analogue signal with a poor CNR will produce noisy pictures - regardless of the signal level. Similarly, a digital signal with too poor an SNR/MER will fail to decode - regardless of the signal level. I think the UK cable TV spec for digital signals is 25dB (although a good set-top box will decode down to the mid-teens).

--
Ian
--- news://freenews.netfront.net/ - complaints: ---
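[Editor's note: the "CNR of around 57dB" figure quoted above for a 0dBmV off-air signal can be sanity-checked against the thermal (kTB) noise floor of a 75-ohm source. This sketch is my own, with assumed standard constants; it gives a floor of about -59dBmV in the NTSC noise bandwidth, i.e. a theoretical CNR of about 59dB before tuner noise figure and real-world losses - consistent with the ~57dB quoted.]

```python
import math

K_BOLTZMANN = 1.380649e-23   # Boltzmann constant, J/K
T0 = 290.0                   # standard reference temperature, K

def noise_floor_dbmv(bandwidth_hz, z0=75.0):
    """Thermal (kTB) noise floor expressed in dBmV at impedance z0."""
    p_watts = K_BOLTZMANN * T0 * bandwidth_hz   # available noise power
    v_rms = math.sqrt(p_watts * z0)             # voltage across a matched load
    return 20 * math.log10(v_rms / 1e-3)        # relative to 1 mV

floor = noise_floor_dbmv(4.2e6)   # 4.2 MHz NTSC noise bandwidth
cnr = 0.0 - floor                 # 0 dBmV signal straight off the antenna
print(round(floor, 1), round(cnr, 1))   # about -59.0 dBmV and 59.0 dB
```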
Quad shield coax & dielectric?
In message , Jerry Stuckle
writes On 3/17/2014 3:38 AM, Jeff wrote:

No one said the NTSC had to be noiseless. But the 43dB is a bit high, even for older sets. Input from the cable TV company to our equipment was 10-20dB; we tried to push 10dB to all of the outputs but never had a problem even down to 7dB (the lowest we would let it drop to).

That makes no sense; a 7dB CNR would be pretty much unwatchable on analogue; it would be a very, very noisy picture, if it even locked at all!

Jeff

I'm not talking CNR - I'm talking signal strength. 7dbm is plenty of signal. Most later TV's would work even at 0dbm.

Well, the "43dB" that you were stating "was a bit high" was expressed as CNR, so it is reasonable to think that your other figures were also CNR, as you did not state otherwise. Also, 7dBm (5mW) is a very high signal and would cause most sets to intermod like crazy. Perhaps you meant 7dBmV.

Jeff

Yes, I should have been more clear. It is 7dBmV - but the TV industry generally shortens it to dbm (and that's how the test equipment is labeled). Just like other industries which use dBmW generally shorten it to dbm.

No. You are absolutely wrong. No one in the professional cable TV industry would even think of referring to 'dBmV' as 'dBm'. There's around 48dB difference between the two. However, you are right about 'dBmW' - which is invariably (and regrettably) shortened to 'dBm'.

Sorry for the confusion - it's been about 10 years since I've been in the field - I've been away from it for too long.

Well, I think it is beginning to show! [Sorry for being personal, as it's something I always try to avoid.]

--
Ian
--- news://freenews.netfront.net/ - complaints: ---
Powered by vBulletin® Copyright ©2000 - 2025, Jelsoft Enterprises Ltd.
RadioBanter.com