Question on SWR
In message, Antonio Vernucci writes:

since most of the loss in practical coax cables is due to I^2R loss (compared to V^2G)

A quick question. If most of the cable loss is due to I^2R, how can one explain that the foam versions of common coaxial cables show a much lower loss than versions having solid PE insulation? For instance, RG-213 is rated at 8.5 dB loss for 100 meters at 144 MHz, while RG-213 foam at only 4.5 dB. If G is relatively unimportant with regard to loss, how can one explain that a change of insulation material yields such a tremendous change in loss?

Thanks and 73
Tony I0JX

Lower-k dielectric -> larger-diameter inner conductor -> lower resistance -> lower loss.

--
Ian
Question on SWR
"Antonio Vernucci" wrote in
: since most of the loss in practical coax cables is due to I^2R loss (compared to V^2G) A quick question. If most of the the cable loss is due to I^2R, how can one explain that the foam versions of common coaxial cables show a much lower loss than versions having solid PE insulation? If you construct a cable of similar outside dimensions but using a foam dielectric, it needs a larger diameter inner conductor. That accounts for the lower loss at lower frequencies (typically below the GHz range.) For instance RG-213 is rated at 8.5dB loss for 100 meters at 144 MHz, while RG-213 foam at only 4.5 dB. If G is relatively unimportant with regard to loss, how can one explain that a change of insulation material yields such a tremendous change in loss? See above. If you use my calculator (link in earlier posting), it gives you the coefficients of two terms of the loss model, one is due to I^2R and the other V^2G. You can evaluate them at any given frequency and determine the contribution that conductor and dielectric losses make at that frequency for that cable type. Does that help? Owen Thanks and 73 Tiony I0JX |
Question on SWR
Antonio Vernucci wrote:
For instance, RG-213 is rated at 8.5 dB loss for 100 meters at 144 MHz, while RG-213 foam at only 4.5 dB. If G is relatively unimportant with regard to loss, how can one explain that a change of insulation material yields such a tremendous change in loss?

Those statements about most loss being due to I^2*R losses are at *HF* frequencies. 144 MHz is NOT HF. The difference between RG-213 and RG-213 foam is only 0.2 dB at 10 MHz, while the difference between RG-58 and RG-213 is about 0.7 dB at 10 MHz.

--
73, Cecil http://www.w5dxp.com

"According to the general theory of relativity, space without ether is unthinkable." Albert Einstein
Question on SWR
"Owen Duffy" wrote in message ... "Dave" wrote in : ... yeah, when you use the full complex Z0 and probably the full transmission line equations it gets a bit more complex. but for amateur use that graph is close enough. the difference between 5 and 500 ohm loads of .07db or so for 100m just ain't worth quibbling about for normal amateur hf use. unless you want to argue it out with cecil. You either misread my example (it was 1m not 100m) or you labour under the misapprehension that loss per unit length under mismatched conditions is constant at all displacements along the cable. When approximations that hold under some conditions replace the underlying principles, we dumb down. The formula and graphs for "additional loss due to VSWR" without statement of the assumptions under which it is valid are an example... where now, so many people accept the concept that VSWR necessarily increases loss. The OP was trying to reconcile calculated losses in a particular scenario, and one of the contributions to error was the "additional loss due to VSWR". It is fine with me that understanding doesn't appeal to you Dave, but you don't need to press that approach on the rest of us. Owen (PS: if we take a length of 50 ohm coax sufficiently short that current distribution is approximately uniform, and consider the losses under matched conditions and under a 500 ohm load with same load power where voltage is three times and current is one third, it is intuitive that since most of the loss in practical coax cables is due to I^2R loss (compared to V^2G), that loss will be LESS (than Matched Line Loss)... approximately one tenth in that case... so why swallow the ROT that high VSWR necessarily increases loss.) why should i swallow your rot that shows when you push the limits of the equations you get results that 'seem' to defy logic. I understand perfectly well what you are saying, and i do understand the full complex forms of the transmission line equations. what i am saying is that for most 'normal' amateur use the graph presented in the arrl book is adequate. if you insist on pushing computerized calculations to the absurd limits i'll lump you in with art and his over optimized ball of wire antenna. |
Question on SWR
Antonio Vernucci wrote:
since most of the loss in practical coax cables is due to I^2R loss (compared to V^2G)

A quick question. If most of the cable loss is due to I^2R, how can one explain that the foam versions of common coaxial cables show a much lower loss than versions having solid PE insulation? For instance, RG-213 is rated at 8.5 dB loss for 100 meters at 144 MHz, while RG-213 foam at only 4.5 dB. If G is relatively unimportant with regard to loss, how can one explain that a change of insulation material yields such a tremendous change in loss?

In reasonably well constructed coax cables, the main source of loss up to about 1 GHz is the I^2R loss in the centre conductor. The inside of the shield carries an equal (and opposite) current, but the current density there is lower, so the I^2R loss in the shield is less important. Dielectric loss is usually less important still.

In low-loss cables that have the same outside diameter as the classic PE cables they are replacing, the reduction in loss is almost entirely due to a larger centre conductor. But that change cannot be made on its own: in order to maintain a 50 ohm impedance and keep the same outside diameter, it is necessary to reduce the dielectric constant of the insulation material. In other words, they're using foam or semi-airspaced construction because they *have* to. Replacing some of the solid PE with gas may make a small contribution to the lower losses, but nowhere near as much as the advertisers would have you believe. The main contributor is always the reduced I^2R loss in a larger centre conductor.

--
73 from Ian GM3SEK
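[Editor's note: a rough illustration of Ian's point, not taken from any of the posts. For an ideal coax of fixed dielectric OD, the 50 ohm requirement fixes the ratio D/d via Z0 = (59.96/sqrt(er))*ln(D/d), so a lower-permittivity dielectric forces a fatter centre conductor. The 7.25 mm dielectric OD and the permittivities below are assumed, roughly RG-213-sized values.]

import math

def inner_diameter(Z0, D, er):
    # Inner conductor diameter for a coax of impedance Z0, dielectric OD D,
    # relative permittivity er, using the lossless formula
    # Z0 = 59.96 / sqrt(er) * ln(D/d).
    return D / math.exp(Z0 * math.sqrt(er) / 59.96)

D = 7.25e-3                             # dielectric OD in metres (assumed)
d_pe   = inner_diameter(50, D, 2.25)    # solid polyethylene
d_foam = inner_diameter(50, D, 1.5)     # foam PE, typical effective er (assumed)

print(f"solid PE: d = {d_pe*1e3:.2f} mm")
print(f"foam PE : d = {d_foam*1e3:.2f} mm")
# With skin effect, RF resistance scales roughly as 1/d, so the fatter
# foam-cable centre conductor has proportionally lower R and lower I^2R loss.
print(f"RF resistance ratio (foam/solid) ~ {d_pe/d_foam:.2f}")   # ~0.8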
Question on SWR
Owen Duffy wrote:

.... If you use my calculator (link in earlier posting), it gives you the coefficients of the two terms of the loss model, one due to I^2R and the other to V^2G. You can evaluate them at any given frequency and determine the contribution that conductor and dielectric losses make at that frequency for that cable type.

Lest someone confuse this with an incorrect calculation or estimate: from TLLC, the matched line loss in dB per metre of LMR-400 (a foam coax of similar OD to RG-213) is 3.941e-6*f^0.5 + 1.031e-11*f, with f in Hz. The first term is due to R and the second due to G. At 144 MHz, the percentage of power lost per metre due to R is (1 - 10^(-(3.941e-6*f^0.5)/10))*100 = 1.08%. Doing the same for G, the loss is 0.034%, so the loss in R is more than 30 times the loss in G. The numbers lead to a better understanding.

Does this make sense? Did I get it correct?

(BTW, for RG-213 at 144 MHz the power lost per metre due to R is more than 6 times the loss due to G. Most of the loss advantage of LMR-400 comes from reduction of the R loss component per metre from 1.6% to 1.1%, due to the larger-diameter inner conductor.)

Owen
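[Editor's note: the same split as a small check script, built directly from the coefficients Owen quotes (loss in dB per metre, f in Hz); the printed values match his figures.]

import math

f = 144e6                              # frequency in Hz

loss_R = 3.941e-6 * math.sqrt(f)       # dB/m due to conductor (I^2R) loss
loss_G = 1.031e-11 * f                 # dB/m due to dielectric (V^2G) loss

pct_R = (1 - 10 ** (-loss_R / 10)) * 100   # percent of power lost per metre in R
pct_G = (1 - 10 ** (-loss_G / 10)) * 100   # percent of power lost per metre in G

print(f"R loss: {pct_R:.2f}% per metre")    # ~1.08 %
print(f"G loss: {pct_G:.3f}% per metre")    # ~0.034 %
print(f"ratio R/G: {pct_R / pct_G:.0f}")    # ~32, i.e. more than 30 times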
Question on SWR
Antonio Vernucci wrote:
since most of the loss in practical coax cables is due to I^2R loss (compared to V^2G)

A quick question. If most of the cable loss is due to I^2R, how can one explain that the foam versions of common coaxial cables show a much lower loss than versions having solid PE insulation? For instance, RG-213 is rated at 8.5 dB loss for 100 meters at 144 MHz, while RG-213 foam at only 4.5 dB. If G is relatively unimportant with regard to loss, how can one explain that a change of insulation material yields such a tremendous change in loss?

Thanks and 73
Tony I0JX

144 MHz isn't HF, which is where the original statement is valid. At frequencies above around 50 MHz, depending on the dielectric, the dielectric loss starts to become more significant.

Another trap for the unwary, when comparing coax losses, has to do with skin effect and the thickness of the copper or silver cladding on the center conductor. You could have an air-insulated coax with silver plated over stainless steel where the loss is actually greater at low frequencies than at higher ones, because the skin depth is greater at low frequencies and the current is flowing mostly in the stainless steel rather than the copper. (Such coax is used in cryogenic applications, lest one think it's an overly contrived example.)
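[Editor's note: a quick sketch of the skin-depth point, not from the post. With delta = sqrt(rho/(pi*f*mu0)) for a non-magnetic conductor, a thin copper/silver cladding only contains the current at frequencies where the skin depth is smaller than the plating. The 10 um plating thickness is an assumption.]

import math

MU0 = 4e-7 * math.pi

def skin_depth(rho, f):
    # Skin depth in metres for resistivity rho (ohm*m) at frequency f (Hz),
    # assuming a non-magnetic conductor.
    return math.sqrt(rho / (math.pi * f * MU0))

rho_cu  = 1.68e-8      # copper resistivity (silver is similar)
plating = 10e-6        # assumed 10 um cladding thickness

for f in (1e6, 10e6, 100e6, 1e9):
    d = skin_depth(rho_cu, f)
    where = "inside the plating" if d < plating else "reaches the steel core"
    print(f"{f/1e6:>6.0f} MHz: skin depth = {d*1e6:5.1f} um  ({where})")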
Question on SWR
In reasonably well constructed coax cables, the main source of loss up to about 1 GHz is the I^2R loss in the centre conductor. The inside of the shield carries an equal (and opposite) current, but the current density is lower so the I^2R loss there is less important. Dielectric loss is usually less important still.

Ian and others, thanks for your clear explanation, but I still have a doubt that you may kindly clarify. The 300-ohm TV flat ribbon specifications show an attenuation generally lower than that of plain RG-8, even though the conductors of the ribbon are by far thinner than those of RG-8 (especially the cable shield). What am I missing now?

Thanks & 73
Tony I0JX
Question on SWR
"Antonio Vernucci" wrote in
: The 300-ohm TV flat ribbon specifications show an attenuation generally lower than that of plain RG-8, despite the conductors of the ribbon are by far thinner than those of RG-8 (especially than the cable shield). Under matched line conditions, the 300 ohm line transfers the power at higher voltage and lower current, one sixth of the current, so even though the conductors might seem relatively thin (RF R is proportional to diameter for wide spaced line), I^2R loss is lower. Owen |
Question on SWR
Antonio Vernucci wrote:
In reasonably well constructed coax cables, the main source of loss up to about 1 GHz is the I^2R loss in the centre conductor. The inside of the shield carries an equal (and opposite) current, but the current density is lower so the I^2R loss there is less important. Dielectric loss is usually less important still.

Ian and others, thanks for your clear explanation, but I still have a doubt that you may kindly clarify. The 300-ohm TV flat ribbon specifications show an attenuation generally lower than that of plain RG-8, even though the conductors of the ribbon are by far thinner than those of RG-8 (especially the cable shield). What am I missing now?

300 ohms vs 50 ohms. Since I^2R losses dominate at these frequencies, reducing current reduces loss. The higher Z means more voltage and less current for the same power: the current drops by a factor of sqrt(6), so the I^2R loss drops to about 1/6th, assuming all the conductor resistances are the same.
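[Editor's note: the arithmetic behind that, as a tiny sketch with an arbitrary power level. On a matched line the current scales as 1/sqrt(Z0), so the I^2R heating for a given conductor resistance scales as 1/Z0.]

import math

P = 100.0                               # watts, illustrative
for Z in (50.0, 300.0):
    I = math.sqrt(P / Z)
    print(f"Z0 = {Z:>5.0f} ohm: I = {I:.2f} A, relative I^2 heating = {I**2:.2f}")
# I(300)/I(50) = 1/sqrt(6) ~ 0.41, so I^2R loss is 1/6 for equal conductor resistance.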