Cecil Moore wrote:
> Jeff wrote:
>> There is an apocryphal story that 50 ohms started out as a common
>> impedance for coax because that happened to be the number that came
>> out using common British copper pipe sizes.
>
> I vaguely remember something about 50 ohms being good
> for transmitting and 73 ohms being good for receiving.
> --
> 73, Cecil http://www.w5dxp.com
:-) It shouldn't be vague. It should be crystal-clear. If lowest
loss is important and you're going to use coax with smooth conductors,
the same metal for the inner and outer conductors, and a fixed
outer-conductor diameter, you want the ratio of the inner diameter of
the outer conductor to the diameter of the inner conductor to be
3.59:1. That assumes negligible dielectric loss. It's not difficult to
find the ratio for other cases if you know the ratio of RF
resistivities of the inner and outer conductors and the dielectric
loss. The loss increases only slowly as you move away from that ratio,
but that's the ratio for lowest loss.

With air dielectric, that 3.59:1 ratio gives you 76.7 ohms. With solid
polyethylene dielectric, it gives you about 50.6 ohms. Foam dielectric
would give you roughly 60 ohms.
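Those impedance figures follow from the standard coax formula, Z0 = (eta0 / 2*pi) * ln(D/d) / sqrt(eps_r), where eta0 is the impedance of free space (~377 ohms). A quick sketch to check them; the relative permittivities (2.3 for solid polyethylene, ~1.6 for a typical foam) are my assumed values, not from the post:

```python
import math

ETA0 = 376.730313668        # impedance of free space, ohms
K = ETA0 / (2 * math.pi)    # ~59.96 ohms, the familiar coax constant

def coax_z0(ratio, eps_r=1.0):
    """Characteristic impedance for outer/inner diameter ratio D/d
    and dielectric relative permittivity eps_r."""
    return K * math.log(ratio) / math.sqrt(eps_r)

print(coax_z0(3.5911))        # air dielectric: ~76.7 ohms
print(coax_z0(3.5911, 2.3))   # solid PE (assumed eps_r=2.3): ~50.5 ohms
print(coax_z0(3.5911, 1.6))   # foam (assumed eps_r~1.6): ~60.6 ohms
```

With the assumed permittivities this lands within a fraction of an ohm of the 76.7 / 50.6 / ~60 figures quoted above.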
There are different conductor diameter ratios for maximum voltage
handling (assuming a uniform dielectric breakdown rating and a fixed
outer-conductor size, you want the ratio that minimizes the peak
voltage gradient, i.e., the gradient next to the center conductor) and
for maximum power handling (assuming the line is voltage-limited,
which generally is the case only at very low duty cycle, like radar
pulses). If the line is thermally limited, which is almost always the
case in typical ham installations, lowest attenuation will give you
very close to the highest power handling; the details depend on how
well the center conductor can get rid of heat.
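For fixed outer diameter D and ratio x = D/d, the textbook model gives each optimum as the extremum of a simple function of x: conductor loss goes as (1 + x)/ln(x), peak gradient at the center conductor goes as x/ln(x), and voltage-limited power goes as ln(x)/x^2. A sketch that recovers all three optima numerically (the bisection helper and bracketing intervals are mine, not from the post):

```python
import math

def bisect(f, lo, hi, tol=1e-12):
    """Plain bisection root finder; assumes f(lo) and f(hi) differ in sign."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Lowest loss: minimize (1 + x)/ln(x); setting the derivative to zero
# gives x*ln(x) = x + 1, whose root is the 3.59:1 ratio above.
x_loss = bisect(lambda x: x * math.log(x) - (x + 1), 2.0, 5.0)

# Lowest peak gradient: minimize x/ln(x); derivative zero at ln(x) = 1,
# i.e., x = e (about 60 ohms with air dielectric).
x_volts = bisect(lambda x: 1 - math.log(x), 2.0, 4.0)

# Highest voltage-limited power: maximize ln(x)/x**2; derivative zero at
# ln(x) = 1/2, i.e., x = sqrt(e) (about 30 ohms with air dielectric).
x_power = bisect(lambda x: 1 - 2 * math.log(x), 1.2, 2.5)

print(x_loss)   # ~3.591
print(x_volts)  # ~2.718 (e)
print(x_power)  # ~1.649 (sqrt(e))
```

Multiplying each ratio's ln by the ~59.96-ohm coax constant gives roughly 76.7, 60, and 30 ohms for air dielectric, which is why 50 ohms is often described as a compromise between the loss and power optima.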
Cheers,
Tom