I've read for years (and never asked why) that when you're operating into a high SWR, a high-impedance feedline (say 450-ohm ladder line vs. 52-ohm coax) has much less loss. I think I recall someone in this group saying that it's mostly current losses. Does the high-impedance line have higher voltage points along its length, and therefore less current flow for a given power level (say 100 watts), than the 52-ohm coax?

If the above is true, I guess an analogy could be made to the 120 kV+ power lines on the tall steel towers about 500 feet behind my shack. (Lucky me!) They have much less loss than trying to deliver the same wattage to homes, businesses, etc. at, say, 120 volts, with all the current flow that would entail. I can imagine the size of the conductors required to deliver the same amount of wattage at 120 V vs. 120 kV.

Thanks .... Gary
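A quick back-of-envelope sketch of both parts of the question, in Python. The current figures follow directly from I = sqrt(P/Z) for a resistive impedance; the loss figures use the standard total-line-loss formula from matched loss and SWR. The specific matched-loss values (0.1 dB vs. 1.0 dB per run) are illustrative assumptions, not measurements of any particular cable:

```python
import math

def current_for_power(p_watts, z_ohms):
    """RMS current when p_watts flows into a purely resistive impedance."""
    return math.sqrt(p_watts / z_ohms)

def total_line_loss_db(matched_loss_db, swr):
    """Total line loss (dB) including the extra loss caused by standing waves.

    Standard transmission-line formula: with matched loss ratio
    a = 10**(ML/10) and reflection-coefficient magnitude gamma,
    total loss = 10*log10((a**2 - gamma**2) / (a * (1 - gamma**2))).
    """
    gamma = (swr - 1) / (swr + 1)       # |reflection coefficient| from SWR
    a = 10 ** (matched_loss_db / 10)    # matched loss as a power ratio
    return 10 * math.log10((a**2 - gamma**2) / (a * (1 - gamma**2)))

# 100 W into each line impedance: the 450-ohm line carries far less current.
print(current_for_power(100, 450))  # ~0.47 A on 450-ohm ladder line
print(current_for_power(100, 52))   # ~1.39 A on 52-ohm coax

# Same 10:1 SWR, but assumed matched losses of 0.1 dB (ladder line)
# vs. 1.0 dB (coax) for the run: the low-loss line suffers far less
# additional loss from the standing waves.
print(total_line_loss_db(0.1, 10))  # ~0.48 dB total
print(total_line_loss_db(1.0, 10))  # ~3.4 dB total
```

The point the numbers make: the extra loss caused by SWR multiplies the line's matched loss, so a line that starts out nearly lossless (ladder line) stays nearly lossless even at high SWR, while lossy coax gets much worse. The power-line analogy holds for the current part: 100 kW delivered at 120 kV needs under 1 A, while the same power at 120 V needs over 800 A, and resistive loss goes as I²R.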
Similar Threads (Antenna forum):
- QST Article: An Easy to Build, Dual-Band Collinear Antenna
- Length of Coax Affecting Incident Power to Meter?
- Variable stub
- 50 Ohms "Real Resistive" impedance a Misnomer?
- Conservation of Energy